Abstract: METHODS AND SYSTEMS FOR CONTROLLING AGRICULTURAL IMPLEMENT FOR REMOVING WEEDS Embodiments disclosed herein relate to agricultural vehicles and more particularly, to detecting and removing weeds using agricultural implement(s) attached to an agricultural vehicle. The system (100) comprises a sensor module (102), a camera position adjusting system (108) and a de-weeding blade position adjusting system (110). The sensor module (102) includes media acquisition devices (102a), a speed sensor (102b) and a load sensor (102c). The sensor module (102) monitors at least one parameter corresponding to identification and removal of at least one weed, wherein the sensor module (102) is configured to capture at least one crop view. A processing unit (104) processes the captured at least one parameter to generate at least one control output signal to a control unit (106) corresponding to the identification and removal of the at least one weed. The system precisely identifies and removes the weeds in an optimized way in a real-time scenario. FIG. 1
Claims:We claim:
1. A method (600) for controlling an agricultural implement (D) for removing weeds, the method (600) comprising:
monitoring, by a sensor module (102), at least one parameter corresponding to identification and removal of at least one weed;
processing, by a processing unit (104), the at least one parameter received from the sensor module (102) to generate at least one control output signal to a control unit (106); and
altering, by a de-weeding blade position adjusting system (110), a position of at least one de-weeding blade (B) of the agricultural implement (D) to a weed removing position in which the de-weeding blade (B) removes the at least one weed based on the at least one control output signal received from the control unit (106).
2. The method (600) as claimed in claim 1, wherein said monitoring, by the sensor module (102), at least one parameter corresponding to the identification and removal of at least one weed comprises,
capturing and communicating, by at least one media acquisition device (102a), images of crops and weeds, to the processing unit (104).
3. The method (600) as claimed in claim 2, wherein said capturing and communicating, by at least one media acquisition device (102a), images of crops and weeds, to the processing unit (104) includes,
capturing and communicating, by at least one media acquisition device (102a), at least one of video of the agricultural field, image of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104).
4. The method (600) as claimed in claim 3, wherein said monitoring, by the sensor module (102), at least one parameter corresponding to the identification and removal of at least one weed comprises,
detecting and communicating, by a speed sensor (102b), a speed of an agricultural vehicle to the processing unit (104); and
detecting and communicating, by a load sensor (102c), a load of the agricultural implement (D) to the processing unit (104).
5. The method (600) as claimed in claim 2, wherein said method comprises altering, by a camera position adjusting system (108), a position of at least one media acquisition device (102a) based on instructions received from a controller (104b) of the processing unit (104), where the at least one media acquisition device (102a) is provided on at least one of the agricultural implement (D) and an agricultural vehicle.
6. The method (600) as claimed in claim 5, wherein said altering, by the camera position adjusting system (108), the position of the at least one media acquisition device (102a) comprises at least one of,
moving, by at least one first linear actuator (108F), the at least one media acquisition device (102a) through a first movable member (108MF) thereby altering the position of the at least one media acquisition device (102a) along a lengthwise direction of the agricultural vehicle based on instruction received from the controller (104b) of the processing unit (104);
moving, by at least one second linear actuator (108S), the at least one media acquisition device (102a) through at least one second movable member (108MS) thereby altering the position of the at least one media acquisition device (102a) along a width wise direction of the agricultural vehicle based on instruction received from the controller (104b); and
moving, by at least one third linear actuator, the at least one media acquisition device (102a) thereby altering the position of the at least one media acquisition device (102a) along a height wise direction of the agricultural vehicle based on instruction received from the controller (104b).
7. The method (600) as claimed in claim 1, wherein said altering, by the de-weeding blade position adjusting system (110), the position of the at least one de-weeding blade (B) of the agricultural implement (D) comprises at least one of,
moving, by at least one first linear actuator (110F), the at least one de-weeding blade (B) through corresponding tine (T) and a linkage system (110L) thereby altering the position of the at least one de-weeding blade (B) with respect to a movement axis (A) of corresponding tine (T) based on the control output signal received from the control unit (106); and
moving, by at least one second linear actuator (110S), the at least one de-weeding blade (B) with respect to the corresponding tine (T) along a vertical direction thereby altering the position of the de-weeding blade (B) along a height wise direction of the agricultural vehicle based on the control output signal received from the control unit (106).
8. The method (600) as claimed in claim 2, wherein said method (600) comprises,
classifying, by a learning module (104a) of the processing unit (104), the captured at least one parameter from the at least one media acquisition device (102a) to a list of crops and weeds based on a list of trained data thereby identifying the at least one weed.
9. The method (600) as claimed in claim 4, wherein said method (600) comprises,
providing, by a user interface unit (204), at least one user defined input to an optimization module (202), where the at least one user defined input is at least one of depth of the weed to be removed, nature of weed, nature of crop, crop height, weed height, crop growth stage, weed growth stage, depth of the soil, geo-coordinates, nature of the soil and weather condition;
processing, by the optimization module (202), inputs received from the speed sensor (102b), the learning module (104a), the load sensor (102c) and the user interface unit (204); and
generating, by the optimization module (202), an optimized control signal to the control unit (106) to identify and remove the weed in an optimal manner.
10. The method (600) as claimed in claim 2, wherein said monitoring, by the sensor module (102), at least one parameter corresponding to identification and removal of at least one weed comprises,
capturing and communicating, by at least one media acquisition device (102a) of at least one drone, at least one of images of crops, images of weeds, video of the agricultural field, image of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104); and
communicating, by a positioning module of the at least one drone, the geographical position to the controller (104b), where the geographical position is at least one of geographical position of at least one of crops, weeds and agricultural field.
11. The method (600) as claimed in claim 10, wherein said method (600) comprises,
generating and communicating, by the controller (104b), at least one weed removal map to the control unit (106) based on the information received from the at least one drone;
communicating, by a positioning module of the sensor module (102), the geographical position to the controller (104b), where the geographical position is at least one of geographical position of at least one of crops, weeds, agricultural field and the agricultural vehicle; and
guiding, by the controller (104b), the agricultural vehicle to reach at least one target weed by selecting an optimum path to reach the at least one target weed in the agricultural field.
12. A system (100) for controlling an agricultural implement (D) for removing weeds, the system (100) comprising:
a sensor module (102) configured to monitor at least one parameter corresponding to identification and removal of at least one weed;
a processing unit (104) configured to process the at least one parameter received from said sensor module (102) to generate at least one control output signal to a control unit (106); and
a de-weeding blade position adjusting system (110) adapted to alter a position of at least one de-weeding blade (B) of the agricultural implement (D) to a weed removing position in which the at least one de-weeding blade (B) removes at least one weed based on the at least one control output signal received from the control unit (106).
13. The system (100) as claimed in claim 12, wherein said sensor module (102) comprises at least one media acquisition device (102a) adapted to capture and communicate at least images of crops and weeds to said processing unit (104).
14. The system (100) as claimed in claim 13, wherein said media acquisition device (102a) is adapted to capture and communicate at least one of video of the agricultural field, image of row crops, images of weeds grown along the crop and images of crop growth stages, to said processing unit (104).
15. The system (100) as claimed in claim 12, wherein said sensor module (102) comprises,
a speed sensor (102b) adapted to detect and communicate a speed of an agricultural vehicle to said processing unit (104); and
a load sensor (102c) adapted to detect and communicate a load of the agricultural implement (D) to said processing unit (104).
16. The system (100) as claimed in claim 12, wherein said system (100) comprises a camera position adjusting system (108) configured to alter a position of at least one media acquisition device (102a) based on instruction received from a controller (104b) of said processing unit (104), where the at least one media acquisition device (102a) is provided on at least one of the agricultural implement (D) and the agricultural vehicle, wherein said camera position adjusting system (108) comprises,
at least one first linear actuator (108F) configured to move the at least one media acquisition device (102a) through a first movable member (108MF) thereby altering the position of the media acquisition devices (102a) along a lengthwise direction of an agricultural vehicle based on instruction received from the controller (104b) of the processing unit (104);
at least one second linear actuator (108S) configured to move the media acquisition device (102a) through corresponding second movable member (108MS) thereby altering the position of the media acquisition device (102a) along a width wise direction of the agricultural vehicle based on instruction received from the controller (104b); and
at least one third linear actuator configured to move the media acquisition device (102a) thereby altering the position of the media acquisition device (102a) along a height wise direction of the agricultural vehicle based on instruction received from the controller (104b).
17. The system (100) as claimed in claim 12, wherein said de-weeding blade position adjusting system (110) comprises,
at least one first linear actuator (110F) configured to move the de-weeding blade (B) through corresponding tine (T) and a linkage system (110L) thereby altering the position of the at least one de-weeding blade (B) with respect to a movement axis (A) of corresponding tine (T) based on the control output signal received from the control unit (106); and
at least one second linear actuator (110S) configured to move the at least one de-weeding blade (B) with respect to the tine (T) along a vertical direction thereby altering the position of de-weeding blade (B) along a height wise direction of agricultural vehicle based on the control output signal received from the control unit (106).
18. The system (100) as claimed in claim 12, wherein said processing unit (104) includes a learning module (104a) adapted to classify the captured at least one parameter received from the at least one media acquisition device (102a) to a list of crops and weeds based on a list of trained data thereby identifying the at least one weed.
19. The system (100) as claimed in claim 18, wherein said system (100) comprises,
a user interface unit (204) adapted to provide at least one user defined input to an optimization module (202) of said processing unit (104), where the at least one user defined input is at least one of depth of the weed to be removed, nature of weed, nature of crop, crop height, weed height, crop growth stage, weed growth stage, depth of the soil, geo-coordinates, nature of the soil and weather condition,
wherein
said optimization module (202) is adapted to receive inputs from said speed sensor (102b), said learning module (104a), said load sensor (102c) and said user interface unit (204); and
said optimization module (202) is configured to generate and communicate an optimized control signal to said control unit (106) to identify and remove the weed in an optimal manner.
20. The system (100) as claimed in claim 13, wherein said at least one media acquisition device (102a) is provided on at least one drone,
wherein
said drone includes a positioning module adapted to provide geographical position to the controller (104b), where the geographical position is at least one of geographical position of at least one of crops, weeds and agricultural field;
said controller (104b) is configured to generate and communicate at least one weed removal map to the control unit (106) based on the information received from the at least one drone;
said sensor module (102) includes a positioning module adapted to provide geographical position to the controller (104b), where the geographical position is at least one of geographical position of at least one of crops, weeds, agricultural field and the agricultural vehicle; and
said controller (104b) is configured to guide the agricultural vehicle to reach at least one target weed by selecting an optimum path to reach the at least one target weed in the agricultural field.
Description: TECHNICAL FIELD
[001] Embodiments disclosed herein relate to agricultural vehicles and more particularly, to detecting and removing weeds using agricultural implement(s) attached to an agricultural vehicle.
BACKGROUND
[002] Weeds are unwanted plants that compete with healthy crops for essential resources such as space, water, nutrients, light and carbon dioxide. Weed distribution is heterogeneous in agricultural fields, and weed presence reduces overall agricultural output, thereby reducing total productivity. Identification and removal of weeds is an important process in agricultural practice, as weeds competing with crop plants may reduce the growth of the crops by consuming nutrients and other resources provided to the crops, thereby reducing the yield of the food crops. While detecting a weed, it is important to identify the species of the weed, the spread of the weed, the weed growth stage and the like. The evaluation and assessment of weeds among crops is typically performed by field personnel, either on foot or using a motor vehicle. However, the physical area of weed growth cannot be determined at a glance.
[003] In conventional approaches, removal of weeds between the crops may involve removing a large quantity of weeds between crops grown in rows. Usually, the weeds are removed manually using weed removing tools. Physical removal of weeds requires strenuous digging and pulling by hand, and getting on hands and knees to pull the weeds puts strain on the back and legs. Further, manual removal of weeds is tedious and time consuming, especially in vast agricultural fields. Some weeds have long roots that require extra digging to remove the entire weed. Moreover, weed removing tools attached to the agricultural vehicle may remove the weeds along with the plants, or may remove the plants themselves, without proper identification of weeds. Thus, weed removal must be optimized to distinguish between the weeds and the crops before removal of unwanted plants. Manually driven strategies for removing weeds can be error prone and inefficient due to dependency on the skill set of a farmer, especially a novice, and may therefore result in incorrect agricultural practices.
[004] In addition, the conventional agricultural vehicles do not have real-time information related to the agricultural vehicle and the agricultural implement such as, but not limited to, speed match of the agricultural implement and the agricultural vehicle, time required to remove the weeds, excessive push or pull information required to remove the weeds and so on.
[005] In addition, in the conventional approaches, the removal of weeds may not use smart methods to precisely identify the weeds along the plants. The weed removal tools attached to the agricultural vehicle may not precisely remove the weed due to a mismatch in the time or speed required by the agricultural vehicle to remove the weed, resulting in improper or wrong removal of plants, which is undesirable, disrupts agricultural practice and affects the production of crops.
[006] Also, crop-cultivation practices vary across regions and countries, so it is difficult to manufacture a single system that accommodates all the subtle differences in crop practices.
OBJECTS
[007] The principal object of embodiments herein is to disclose methods and systems for controlling agricultural implement for removing weeds.
[008] Another object of the embodiments relates to precise identification and removal of weeds in an optimized way in real-time scenario.
[009] These and other objects of embodiments herein will be better appreciated and understood when considered in conjunction with following description and accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[0010] The embodiments are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0011] FIG. 1 illustrates a de-weeding control system for controlling agricultural implement for removing weeds, where the agricultural implement is attached to an agricultural vehicle, according to embodiments as disclosed herein;
[0012] FIG. 2 illustrates an optimization module required to achieve synchronous operation in identifying and removing the weed in an optimal manner, according to embodiments as disclosed herein;
[0013] FIG. 3 illustrates a processing module for identifying and removing weeds using agricultural implement, according to embodiments as disclosed herein;
[0014] FIG. 4 depicts media acquisition device(s) and camera position adjusting system mounted on the agricultural implement, according to embodiments as disclosed herein;
[0015] FIG. 5 depicts a de-weeding blade position adjusting system mounted on the agricultural implement, according to embodiments as disclosed herein;
[0016] FIG. 6 depicts first linear actuator(s) and second linear actuator(s) of the de-weeding blade position adjusting system, according to an embodiment as disclosed herein; and
[0017] FIG. 7 is a flow diagram illustrating a method for controlling agricultural implement for removing weeds, according to embodiments as disclosed herein.
DETAILED DESCRIPTION
[0018] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0019] The embodiments herein disclose methods and systems for controlling de-weeding agricultural implement for removing weeds. Referring now to the drawings, and more particularly to FIGS. 1 through 7, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0020] FIG. 1 illustrates a de-weeding control system 100 for controlling agricultural implement for removing weeds, where the agricultural implement (D) is attached to an agricultural vehicle, according to an embodiment. The agricultural vehicle herein refers to any vehicle or farm machinery that can be used for performing agricultural related operations in agricultural fields. An example of the agricultural vehicle can be, but not limited to, a tractor. Embodiments herein are further explained considering the tractor as the agricultural vehicle, but it may be obvious to a person of ordinary skill in the art that any suitable vehicle or agricultural machines can be considered.
[0021] The agricultural vehicle can include at least one agricultural implement (D) attached to the vehicle. In an embodiment, the agricultural implement (D) can be attached to the vehicle permanently. In an embodiment, the agricultural implement (D) can be attached to the vehicle using a detachable means, such as a three-point hitch/linkage, and so on. Examples of the agricultural implement can be, but are not limited to, sprayers, harrows, plows, planters, harvesters/reapers, fertilizer spreaders and so on.
[0022] The de-weeding control system 100 can be mounted on at least one of the agricultural implement and the agricultural vehicle. In an embodiment, the de-weeding control system 100 can be dust proof, leak proof and able to withstand dry land and wet land cultivation and vibration as per the farm requirements. The de-weeding control system 100 is configured to accurately identify and remove the weeds in an optimized way in a real-time scenario. The de-weeding control system 100 includes a sensor module 102, a processing unit 104, a control unit 106, a storage unit 107, a camera position adjusting system (108) and a de-weeding blade position adjusting system (110). The sensor module 102 can be coupled to the processing unit 104 and the control unit (106) through a communication network. The communication network can be, but is not limited to, the Internet, a wired network (a Local Area Network (LAN), a Controller Area Network (CAN), a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, a cellular network, a Wi-Fi hotspot, Bluetooth, Zigbee and so on) and so on.
[0023] The sensor module (102) is configured to monitor the parameters corresponding to identification and removal of weeds. Examples of the parameters can be, but are not limited to, media related to identification of the weed, speed of the vehicle, load of the agricultural implement, and so on. In an embodiment, the sensor module (102) includes a plurality of media acquisition devices (102a), a speed sensor (102b) and a load sensor (102c).
[0024] The media acquisition device 102a is adapted to capture media related to identification of the weed. The captured media can include, but is not limited to, video of the agricultural field, images of row crops, images of weeds, images of crops, images of weeds along the crop, images of crop growth stages and so on. The media acquisition device 102a can include, but is not limited to, cameras, RGB (red, green and blue) cameras, a thermal camera, an ultraviolet (UV) camera, multispectral cameras, and so on. However, it is also within the scope of the embodiments disclosed herein to use any type of camera without otherwise deterring the intended function of the media acquisition device 102a, as can be deduced from this description and the corresponding drawings. The media acquisition devices 102a and the camera position adjusting system (108) can be mounted on the agricultural implement (D) such that the position of the media acquisition devices 102a can be moved in a suitable direction to capture best quality images of crops, crops grown in rows, different growth stages of the crops, the weeds grown along the crops, and so on. In another embodiment, the plurality of media acquisition devices (102a) and the camera position adjusting system (108) can be mounted on a mounting structure (not shown) which is to be attached at one of a front end, rear end, side and mid of the agricultural vehicle. The positions of plants (weeds, crops and off variety plants) are geographically referenced for identification of the plants at respective locations. In an embodiment, for example, the media acquisition device 102a can be moved forward, backward and sideways in a horizontal plane parallel to the ground, and also in a vertical direction, by using the camera position adjusting system (108). Adjusting the position of the media acquisition devices 102a allows capturing of the crop and weed for optimally identifying and removing the weed from the crops.
[0025] The speed sensor 102b is adapted to measure the speed of the agricultural vehicle, which is to be processed by the control unit (106) through the processing unit (104).
[0026] The load sensor 102c can be mounted on at least one of the three-point hitch and the agricultural implement (D) of the agricultural vehicle. The load sensor 102c measures the operating load of the agricultural implement (D). In an embodiment, the operating load of the agricultural implement can include but not limited to load acting on at least one of the three-point hitch and the agricultural implement during operation of the agricultural implement, growth of the weed, depth of the weed to be cut, depth of the soil, nature of the soil, nature of the weed along the crop and the like.
[0027] The processing unit 104 can be configured to collect and process the measured parameters from the sensor module 102. In an embodiment, the measured parameters can be collected for a pre-determined time period of operation of the agricultural vehicle and the agricultural implement. In an embodiment, the measured parameters can be collected on one or more pre-determined events occurring. In an embodiment, the processing unit 104 may process the measured parameters from the sensor module 102. For example, the processing unit 104 may process the video or image captured by the media acquisition device 102a. The processing unit 104 may analyze the video or other media captured by the media acquisition device 102a and process the captured image or video frame-wise to obtain the basic images of the weed. In another embodiment, the processing unit 104 can be configured to operate other components/devices associated with the agricultural vehicle. For example, the processing unit 104 may control the speed of the agricultural vehicle. Also, the processing unit 104 may keep track of the amount of weed to be removed from the field by the agricultural implement. The processing unit 104 includes a learning module 104a and a controller 104b.
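The frame-wise processing described above interacts with vehicle speed: the faster the vehicle moves, the fewer camera frames cover each metre of ground. As an illustration only (the patent does not specify any sampling scheme, and the function name and parameters below are hypothetical), the following sketch keeps frames so that consecutive kept frames are roughly evenly spaced on the ground:

```python
def sample_frames(frames: list, vehicle_speed_mps: float,
                  frame_rate_hz: float, spacing_m: float = 0.05) -> list:
    """Keep every n-th camera frame so that kept frames are roughly
    `spacing_m` apart on the ground, given vehicle speed and camera rate."""
    metres_per_frame = vehicle_speed_mps / frame_rate_hz
    if metres_per_frame <= 0:          # stationary vehicle: keep one frame
        return frames[:1]
    step = max(1, round(spacing_m / metres_per_frame))
    return frames[::step]

# 30 fps camera on a vehicle moving at 1 m/s, one frame every ~0.1 m
kept = sample_frames(list(range(30)), 1.0, 30.0, spacing_m=0.1)
```

At 1 m/s and 30 fps the vehicle covers about 3.3 cm per frame, so every third frame is kept to approximate 10 cm ground spacing.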
[0028] The learning module 104a embedded in the processing unit 104 receives the monitored parameters from the sensor module 102. The learning module 104a uses a set of rules derived from a machine learning model to identify the crops and weeds based on the captured parameter received from the media acquisition device (102a). The machine learning model can include at least one of a supervised learning algorithm, an unsupervised learning algorithm and so on. For example, the learning module 104a hosts a set of deep learning methods which includes a list of trained data relating to the weed identification and removal process. The list of trained data is the set of data used to train the learning module 104a and to predict the desired outcome from the series of processed data. For example, the list of trained data comprises weed information and other related data such as weed type, weed nature, nature and type of soil, and the amount of force required to remove the weed; it may also include other data related to weed identification and removal. Further, the learning module 104a includes a large trained database along with an augmented data set for high accuracy of weed identification and removal. For example, the augmented data set may enable the learning module 104a to significantly increase the diversity of data available beyond the list of trained data. Also, the augmented data set improves the performance of the learning module 104a by increasing the training data related to identification and removal of the weed.
[0029] Also, the learning module 104a, along with the trained dataset, classifies the captured parameter from the sensor module 102. For example, the captured image or video may be analyzed by the learning module 104a based on the trained data to classify the crops and weeds. The classification result provided by the learning module 104a plays a vital role in identifying and removing weeds from the crops; the classification may be made based on the type of weed, the nature of the crop along the weed, the growth stage of the weed and so on.
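The classification step above can be pictured as a simple pipeline: candidate plant regions go into a trained model, and only confident weed detections are forwarded for removal. The patent does not disclose a model architecture, so the sketch below substitutes a stand-in callable for the learning module (104a); the feature names, threshold and toy rule are all hypothetical illustrations:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Detection:
    label: str        # "crop" or "weed"
    confidence: float
    x_m: float        # lateral offset from the row centre, metres
    y_m: float        # distance ahead of the implement, metres

def classify_regions(regions: List[dict],
                     model: Callable[[dict], Tuple[str, float]],
                     min_confidence: float = 0.8) -> List[Detection]:
    """Classify candidate plant regions and keep confident weed detections.

    `model` stands in for the trained learning module: any callable
    returning a (label, confidence) pair for one region's feature dict.
    """
    detections = [Detection(*model(r), r["x_m"], r["y_m"]) for r in regions]
    return [d for d in detections
            if d.label == "weed" and d.confidence >= min_confidence]

# Stand-in "trained" model: tall, narrow-leaved plants flagged as weeds.
def toy_model(region: dict) -> Tuple[str, float]:
    if region["leaf_width_cm"] < 1.0 and region["height_cm"] > 5.0:
        return ("weed", 0.9)
    return ("crop", 0.95)

regions = [
    {"x_m": 0.10, "y_m": 2.0, "leaf_width_cm": 0.5, "height_cm": 8.0},
    {"x_m": 0.40, "y_m": 2.1, "leaf_width_cm": 4.0, "height_cm": 20.0},
]
weeds = classify_regions(regions, toy_model)
```

In a real deployment the callable would wrap the deep learning model trained on the list of trained data; the confidence gate keeps misclassified crops from being forwarded to the blade control.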
[0030] The controller 104b can be configured to generate the control output signal corresponding to the measured parameters analyzed by the learning module 104a. The controller 104b provides the control output signal to the camera position adjusting system (108). Further, the controller 104b provides the control output signal to the control unit 106 for controlling the de-weeding blade position adjusting system (110). The control output signal indicates required optimum parameter(s) of the agricultural implement for operating the at least one of the agricultural implements and the agricultural vehicle. In an embodiment, examples of the required optimum parameters of the agricultural implement can be, but is not limited to, load, speed, hour count, weed identification and removal, and so on. In an embodiment, the controller 104b can include a processor, a microcontroller, a memory, a storage unit and so on.
[0031] The control unit 106 receives the control output signal from the controller 104b based on analysis of the learning module 104a. The control unit 106 instructs the de-weeding blade position adjusting system (110) to alter the position of the de-weeding blade (B) of the agricultural implement (D) thereby enabling the de-weeding blade (B) to remove the weeds.
[0032] Based on the analysis of the learning module 104a with respect to the captured parameters of the sensor module 102, the learning module 104a provides a signal to the de-weeding blade position adjusting system (110) through the control unit (106) to remove the weed at its exact position. For example, the learning module 104a classifies the weed and crop based on the captured image from the media acquisition device 102a. The learning module 104a provides the co-ordinates of the weed to the control unit (106), which instructs the de-weeding blade position adjusting system (110) to alter the position of the de-weeding blades (B) thereby enabling the de-weeding blade (B) to remove the weed at its exact position.
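The mapping from a detected weed's co-ordinates to actuator set-points is not spelled out in the patent. A minimal sketch, assuming a lateral travel limit for the first linear actuator (110F) and a maximum working depth for the second linear actuator (110S) (both figures are hypothetical), could look as follows:

```python
def lateral_set_point(weed_offset_m: float,
                      travel_limits_m=(-0.3, 0.3)) -> float:
    """First linear actuator (110F): shift the blade toward the weed's
    lateral offset from the tine's movement axis, clipped to actuator travel."""
    lo, hi = travel_limits_m
    return max(lo, min(hi, weed_offset_m))

def depth_set_point(weed_root_depth_m: float,
                    max_working_depth_m: float = 0.15) -> float:
    """Second linear actuator (110S): lower the blade to the weed root depth,
    but never beyond the permissible working depth of the implement."""
    return min(weed_root_depth_m, max_working_depth_m)
```

Clipping the lateral command protects the linkage from over-travel, and limiting the depth command respects the soil depth constraint that the optimization module accepts as a user-defined input.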
[0033] The learning module 104a calculates the speed required by the agricultural vehicle to reach the accurate geo-coordinates of the weeds, for effective removal of the weeds. The learning module 104a calculates the adjustments required to be configured on the media acquisition device 102a for effective identification of the weed. For example, the settings of the camera, including shutter speed adjustments, capture quality of the images, capture ratio of the image and video and the like, can be adjusted based on the control signal received from the learning module 104a. Also, the processing speed of the camera and other parameters related to the media acquisition device 102a can be adjusted based on the speed of the agricultural vehicle.
[0034] The learning module 104a controls the speed of the agricultural vehicle to remove the weeds. The speed of the agricultural vehicle is tracked by the speed sensor 102b, and the learning module 104a controls the speed of the agricultural vehicle accordingly. If the speed of the agricultural vehicle needs to be adjusted to account for different processing delays, this is controlled by the controller 104b. Therefore, the learning module 104a controls the dynamic position of the vehicle, the settings of the media acquisition device 102a, the position of the media acquisition device 102a through the camera position adjusting system 108, the position of the de-weeding blades through the de-weeding blade position adjusting system (110) and the like.
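One way to read the speed constraint implied above, as a minimal sketch (the function name and the delay figures are assumptions, not from the specification): the vehicle must not travel faster than the camera-to-blade distance divided by the total processing and actuation delay, or a weed would pass the blade before the blade can be positioned.

```python
def max_vehicle_speed(camera_to_blade_m, processing_delay_s, actuation_delay_s):
    """Upper bound on ground speed (m/s) such that a weed imaged by the
    camera can still be met by the blade after all delays have elapsed."""
    total_delay_s = processing_delay_s + actuation_delay_s
    return camera_to_blade_m / total_delay_s

# With the camera 1.5 m ahead of the blades, 0.2 s of image processing and
# 0.3 s of actuation, the vehicle should stay at or below 3.0 m/s.
limit = max_vehicle_speed(1.5, 0.2, 0.3)
```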
[0035] The de-weeding blade position adjusting system (110) is provided on the agricultural implement to alter the position of the de-weeding blades for easy removal of weeds along the crop. The de-weeding blade position adjusting system (110) may be employed to perform a multi-row weed removal operation based on the captured parameter and the control signal provided by the learning module 104a.
[0036] Further, the controller 104b transmits at least one control output signal to the control unit 106 and to multiple devices using at least one of a wired network (a LAN, a CAN network, an ISO bus, a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, GSM, a cellular network, a Wi-Fi hotspot, Bluetooth, Zigbee and so on) and so on. The multiple devices can include, but are not limited to, the media acquisition devices 102a, the camera position adjusting system (108), the de-weeding blade position adjusting system (110), the agricultural vehicle, a cloud server, an electronic device (a mobile, a smartphone, a laptop, a tablet and so on), and so on. In an embodiment, the multiple devices can present the control output signal according to requirements received from the sensor module 102.
[0037] The storage unit 107 can be configured to store the measured parameters and operating count values of the agricultural implement. The storage unit 107 includes at least one of a file server, a data server, a memory, a server, a cloud and so on. The memory may include one or more computer-readable storage media. The memory may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory may, in some examples, be considered a non-transitory storage medium.
[0038] The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted to mean that the memory is non-movable. In some examples, the memory can be configured to store larger amounts of information. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
[0039] FIG. 1 shows exemplary blocks of the de-weeding control system 100, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the de-weeding control system 100 may include fewer or more blocks. Further, the labels or names of the blocks are used only for illustrative purposes and do not limit the scope of the embodiments herein. One or more blocks can be combined together to perform the same or a substantially similar function in the de-weeding control system 100.

FIG. 2 illustrates an optimization module 202 required to achieve synchronous operation in identifying and removing the weed, according to embodiments. The optimization module 202 receives inputs from the speed sensor 102b, the learning module 104a, the load sensor 102c and a user interface unit 204. The speed sensor 102b measures the speed of the agricultural vehicle. The learning module 104a classifies the captured parameter from the sensor module 102. The classification result may provide the type of weed, the nature of the crop along the weed, the growth stage of the weed and so on. The classification is performed by comparing the captured parameter from the sensor module 102 to a list of trained data comprising weed information and other related data such as weed type, weed nature, the nature and type of soil, and the amount of weed to be removed; it may also include other weed removal related data. The load sensor 102c measures the operating load of the agricultural implement. In an embodiment, the operating load of the agricultural implement can include, but is not limited to, the load acting on the agricultural implement during operation, the growth of the weed, the depth of the weed to be cut, the depth of the soil, the nature of the soil, the nature of the weed along the crop and the like.
[0040] The user interface unit 204 may provide a communication interface between the user and the de-weeding control system (100). The user may provide user-defined inputs through an application of the user interface unit 204 to be analyzed by the optimization module 202. The input provided by the user can include, but is not limited to, the depth of the weed to be removed, the nature of the weed, the nature of the crop, crop height, weed height, crop growth stage, weed growth stage, depth of the soil, nature of the soil, weather condition, geo-coordinates, other weed identification inputs and the like. The user can include, but is not limited to, a farmer, cultivator, planter, botanist, gardener, reaper, weed remover and the like.
[0041] The optimization module 202 incorporates a procedure which is executed iteratively with the received inputs to obtain the most optimized results. The optimization module 202 processes the inputs received from the speed sensor 102b, the learning module 104a, the load sensor 102c and the user interface unit 204.
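The specification does not disclose the iterative procedure itself, so the following is only a generic sketch of the idea: repeatedly nudge one operating parameter (here, vehicle speed) until a measured quantity (here, implement load) reaches its target. All names and figures are invented for illustration.

```python
def optimize_speed(load_fn, target_load, v0=1.0, gain=0.1, iters=50):
    """Iteratively adjust vehicle speed until the implement load reported
    by load_fn matches target_load (simple proportional update)."""
    v = v0
    for _ in range(iters):
        v -= gain * (load_fn(v) - target_load)
    return v

# Toy load model: load grows linearly with speed, so a target load of
# 4.0 units is reached near v = 2.0 m/s.
v_opt = optimize_speed(lambda v: 2.0 * v, target_load=4.0)
```

In the disclosed system, such a loop would also fold in the classification result from the learning module 104a and the user-defined inputs from the user interface unit 204.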
[0042] The optimized results received from the optimization module 202 may be used to identify and remove the weed in an optimal manner without engaging the crop. The optimized result may include the accurate co-ordinates required to track the weed, the depth of the weed, the depth of the crop, the speed required by the agricultural vehicle to reach the co-ordinates, the processing speed of the media acquisition device 102a, the positioning of the media acquisition device 102a, the positioning of the de-weeding blade (B) and the like. In an embodiment, the optimized result from the optimization module 202 can include, but is not limited to, controlling the shutter speed of the camera, adjusting the position of the camera, controlling the processing speed of the camera, adjusting the position of the de-weeding blades, adjusting the operating speed required by the de-weeding blades and the like. The optimized result obtained from the optimization module 202 aims to provide the most optimized and efficient way of identifying and removing the weeds from the row crops.
[0043] The optimization module 202 also helps in achieving synchronous operations based on the inputs received from the various units. In an embodiment, the position of the camera can be adjusted based on the crop practice, and the processing speed of the camera is adjusted based on the vehicle speed through the control unit. The operating time of the camera, the time for processing at the system-level controller to generate the actuation signal and the like are considered while performing actuation. Also, if the speed of the agricultural vehicle needs to be adjusted to account for the different processing delays, that is also controlled at the controller. Thereby, the synchronous operations between the various units are achieved through the optimization module 202.
[0044] FIG. 2 shows exemplary blocks of the optimization module 202 required to achieve synchronous operation in identifying and removing the weed in the optimal manner, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the optimization module 202 may include fewer or more blocks. Further, the labels or names of the blocks are used only for illustrative purposes and do not limit the scope of the embodiments herein. One or more blocks can be combined together to perform the same or a substantially similar function in the optimization module 202.
[0045] FIG. 3 illustrates a processing module for identifying and removing weeds using agricultural implement(s) attached to an agricultural vehicle, according to an embodiment. The processing unit 104 includes a media processing module 302, a classification module 304, an actuation controlling module 306, an optimization module 202 and a camera position adjusting module 310. The learning module 104a, being embedded in the processing unit 104, receives the captured parameters from the sensor module 102. The processing unit 104 processes the received input and provides the output signal to the de-weeding blade position adjusting system (110) through the control unit (106).
[0046] The media processing module 302 may receive the captured parameters from the media acquisition device 102a and may process them. The captured parameter can include, but is not limited to, video of the agricultural field, images of row crops, weeds along the crops, images of the plant growth stages and so on. In an embodiment, the media processing module 302 may decompose the captured parameter into its basic elements to identify further features present in the parameters. For example, in an embodiment, the captured video or image is processed frame-wise to obtain the depth of weeds along the row crops, the nature of the soil around the weed, the crop growth stage and the like. Therefore, the media processing module 302 processes the received media and identifies the basic elements present in each frame.
[0047] The classification module 304 receives the processed elements from the media processing module 302, containing the basic elements present in the captured parameters of the media acquisition device 102a. On receiving the basic elements of the processed parameter, the classification module 304 identifies the weed and classifies it based on the type and nature of the weed.
[0048] Further, the classification of the weed involves comparing the received elements from the captured parameter to the trained data from the learning module 104a. The learning module 104a comprises a large set of trained data related to weed identification and classification of weeds based on their nature, type, depth and other related parameters.
[0049] The actuation controlling module 306 provides the output control signal to the de-weeding blade position adjusting system (110) through the control unit (106). Based on the classification result, the actuation controlling module 306 directs the de-weeding blade position adjusting system (110) to alter the position of the de-weeding blades accordingly to perform the de-weeding operation. The blade column (BC), (as shown in fig. 5), retracts on identification of plants and extends on identification of weeds. Therefore, the motion of the blade column (BC) is controlled by the actuation controlling module 306, which in turn moves the de-weeding blade (B) accordingly in or out of the plant row, thus allowing the de-weeding blade (B) of the agricultural implement to selectively cut weeds and to perform the multi-row weed removal operations based on the actuation controlling module 306.
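The retract-on-plant, extend-on-weed behaviour of the blade column (BC) can be summarized in a small decision sketch (the function name and return labels are illustrative and not taken from the specification):

```python
def blade_column_action(label):
    """Map a classification label to a blade column motion, mirroring
    the behaviour described for the blade column (BC)."""
    if label == "weed":
        return "extend"   # blade moves into the row to cut the weed
    if label == "plant":
        return "retract"  # blade withdraws to spare the crop
    return "hold"         # unrecognized label: keep the current position
```

A per-row instance of this rule is what allows the implement to perform selective, multi-row weed removal.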
[0050] The optimization module 202 optimizes the de-weeding operation to efficiently identify and remove the weeds, which helps in achieving synchronous operations between the various units. In an embodiment, the media acquisition device 102a, the other sensor units, the agricultural vehicle and other components are operated synchronously to achieve de-weeding. For example, the position of the camera can be adjusted based on the crop practice, and the processing speed of the camera is adjusted based on the vehicle speed through the control unit.
[0051] Also, the processing unit 104 considers the operating time of the camera, the processing time to generate the actuation signal and the like while performing actuation. If the speed of the agricultural vehicle needs to be adjusted to account for the different processing delays, that is also controlled at the controller. Thereby, the synchronous operation between the various units is achieved through the optimization module 202.
[0052] Based on the synchronous operation between the various units, optimization is achieved in identifying and removing the weeds from the crop. The identification and removal of weeds requires all the units of the de-weeding control system 100 to perform in an optimized way to achieve greater efficiency. In an embodiment, the position of the camera is adjusted based on the crop practice followed. Further, the position of the cameras can be altered based on the distance between the camera and the weed, and the co-ordinates of the weeds. The operating speed of the agricultural vehicle is adjusted based on feedback from the control unit, the delays of camera processing and the like. Therefore, all the units perform in a synchronous manner to achieve the optimized de-weeding operation.
[0053] The camera position adjusting module 310 instructs the camera position adjusting system (108) to adjust the position of the media acquisition devices (102a) along at least one of the lengthwise, widthwise and height wise directions of the agricultural vehicle to accurately capture at least one of images and videos of the weed, the crop and the weed along the crop corresponding to the identification of the weed. The camera position adjusting module 310 helps to identify the better position or co-ordinates of the weed grown along the crop.
[0054] FIG. 3 shows exemplary blocks of the processing module 104 for identifying and removing weeds using agricultural implement(s) attached to an agricultural vehicle, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the processing module may include fewer or more blocks. Further, the labels or names of the blocks are used only for illustrative purposes and do not limit the scope of the embodiments herein. One or more blocks can be combined together to perform the same or a substantially similar function in the processing module.
[0055] FIG. 4 depicts the media acquisition devices (102a) and the camera position adjusting system (108) mounted on the agricultural implement (D). The agricultural implement (D) is coupled to the three-point hitch. The three-point hitch includes an upper link and a pair of lower links. The agricultural implement (D) includes a main frame (F), a plurality of movable tines (T) and a plurality of de-weeding blades (B). Each movable tine (T) is pivotably connected to the main frame (F). Each de-weeding blade (B) defines a blade column (BC), (as shown in fig. 5), movably connected to the corresponding movable tine (T). The agricultural implement (D) is attached to one of the rear end, front end, side and mid of the agricultural vehicle. In an embodiment, the camera position adjusting system (108) is adapted to alter a position of the media acquisition devices (102a). In an embodiment, the camera position adjusting system (108) comprises a plurality of first linear actuators (108F), a plurality of second linear actuators (108S), a plurality of third linear actuators (not shown), at least one first movable member (108MF) and a plurality of second movable members (108MS).
[0056] Each first linear actuator (108F) is adapted to move the media acquisition devices (102a) through the first movable member (108MF) thereby altering the position of the media acquisition devices (102a) along a lengthwise direction of the agricultural vehicle based on the instructions received from the controller (104b). Each first linear actuator (108F) is mounted to the main frame (F) of the agricultural implement (D). Each second linear actuator (108S) is mounted onto the first movable member (108MF). Each second linear actuator (108S) is adapted to move corresponding media acquisition device (102a) through corresponding second movable member (108MS) thereby altering the position of the media acquisition device (102a) along a width wise direction of the agricultural vehicle based on the instructions received from the controller (104b). Each third linear actuator (not shown) is mounted onto the second movable member (108MS). Each third linear actuator (not shown) is adapted to move corresponding media acquisition device (102a) thereby altering the position of the media acquisition device (102a) along a height wise direction of the agricultural vehicle based on the instructions received from the controller (104b).
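The three actuator sets give each camera lengthwise (x), widthwise (y) and height wise (z) travel. A minimal sketch of commanding such a stage, with invented travel limits standing in for the limit switches mentioned later in the description:

```python
# Travel limits in metres are invented for illustration; the specification
# mentions limit switches but gives no numeric ranges.
LIMITS = {"x": (0.0, 2.0),   # lengthwise travel via first actuators (108F)
          "y": (0.0, 1.5),   # widthwise travel via second actuators (108S)
          "z": (0.2, 1.2)}   # height wise travel via third actuators

def clamp_target(axis, target_m):
    """Clamp a commanded camera position to the axis travel limits,
    as the limit switches would enforce mechanically."""
    lo, hi = LIMITS[axis]
    return min(max(target_m, lo), hi)
```

The controller (104b) would issue one clamped target per axis, and each axis is driven by its own electric linear actuator.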
[0057] The first movable member (108MF) is movably connected to the main frame (F) of the agricultural implement (D). For example, the first movable member (108MF) is slidably connected to cross members (FC), (as shown in fig. 4) of the main frame (F) of the agricultural implement (D). The first movable member (108MF) defines a plurality of guide portions (not shown) corresponding to the guide portion provided on each cross member (FC) of the main frame (F) of the agricultural implement (D). For the purpose of this description and ease of understanding, the guide portion (not shown) defined on the first movable member (108MF) is considered to be a groove and correspondingly the guide portions (not shown) defined on each cross member (FC) of the main frame (F) of the agricultural implement (D) are considered to be a protrusion, and vice versa.
[0058] Each second movable member (108MS) is movably connected to the first movable member (108MF) and spaced away from the other second movable member (108MS). For example, each second movable member (108MS) is slidably connected to the first movable member (108MF). Each second movable member (108MS) defines a plurality of guide portions (not shown) corresponding to the guide portion provided on the first movable member (108MF). For the purpose of this description and ease of understanding, the guide portion (not shown) defined on each second movable member (108MS) is considered to be a protrusion and correspondingly the guide portions (not shown) defined on the first movable member (108MF) are considered to be a groove, and vice versa.
[0059] For the purpose of this description and ease of understanding, each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) is considered to be an electric linear actuator. Each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) includes an electric motor (not shown), a plurality of gears (not shown), a leadscrew (not shown), a threaded member (not shown), a connecting rod (108FC, 108SC), a guide member (not shown) and a plurality of limit switches (not shown). The electric motor of each first, second and third linear actuator (108F, 108S) includes a controller unit (not shown) provided in communication with the controller (104b). The threaded member (not shown) is movably connected to the leadscrew (not shown).
[0060] One end of the connecting rod (108FC) of each first linear actuator (108F) is connected to the threaded member (not shown) and another end of the connecting rod (108FC), (as shown in fig. 4) of the first linear actuator (108F) is connected to the first movable member (108MF). One end of the connecting rod (108SC) of each second linear actuator (108S) is connected to the threaded member (not shown) and another end of the connecting rod (108SC), (as shown in fig. 4) of the second linear actuator (108S) is connected to corresponding second movable member (108MS). One end of the connecting rod (not shown) of each third linear actuator (not shown) is connected to the threaded member (not shown) and another end of the connecting rod (not shown) of the third linear actuator (not shown) is connected to corresponding media acquisition device (102a). For the purpose of this description and ease of understanding, the threaded member (not shown) is considered to be a nut. In an embodiment, the threaded member and the connecting rod (108FC, 108SC) are separate parts. In another embodiment, the threaded member and the connecting rod (108FC, 108SC) are considered to be a single integrated part. The guide member (not shown) is adapted to guide the movable connecting rod (108FC, 108SC) during operation of the linear actuator.
[0061] The electric motor of each first linear actuator (108F) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates corresponding leadscrew (not shown) through corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (108FC). The connecting rod (108FC) of each first linear actuator (108F) in turn moves the first movable member (108MF) thereby altering the position of media acquisition devices (102a) along the lengthwise direction of the agricultural vehicle.
[0062] The electric motor of each second linear actuator (108S) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates corresponding leadscrew (not shown) through corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (108SC). The connecting rod (108SC) of each second linear actuator (108S) in turn moves the second movable member (108MS) thereby altering the position of media acquisition devices (102a) along the widthwise direction of the agricultural vehicle.
[0063] The electric motor of each third linear actuator (not shown) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates corresponding leadscrew (not shown) through corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (not shown) of each third linear actuator. The connecting rod (not shown) of each third linear actuator in turn moves the media acquisition devices (102a) thereby altering the position of media acquisition devices (102a) along the height wise direction of the agricultural vehicle.
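For each of the actuators described above, the linear travel of the threaded member follows directly from the motor revolutions, the gear reduction and the leadscrew lead. A worked sketch (all figures are assumed for illustration; the specification gives no dimensions):

```python
def carriage_travel_mm(motor_revs, gear_ratio, screw_lead_mm):
    """Linear travel of the threaded member: the motor turns the leadscrew
    through the gear reduction, and each screw revolution advances the
    nut (and hence the connecting rod) by one lead."""
    return (motor_revs / gear_ratio) * screw_lead_mm

# 100 motor revolutions through a 5:1 reduction on a 4 mm lead screw
# move the connecting rod by 80 mm.
travel = carriage_travel_mm(100, 5, 4.0)
```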
[0064] In another embodiment, the plurality of gears (not shown) can be replaced by one of chain and sprockets, and belt and pulley or a combination of both. Further, it is also within the scope of the invention to consider each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) as one of a mechanical linear actuator, electro-pneumatic linear actuator, electro-hydraulic linear actuator, solenoid operated linear actuator, telescopic linear actuator, ball screw linear actuator, any other type of electric linear actuators and any other type of linear actuators.
[0065] In an embodiment, the de-weeding blade position adjusting system (110) is adapted to alter the position of the de-weeding blade (B) between one of a stowed position (initial position/ non-cutting position) in which the de-weeding blade (B) is disposed away from the crops and a de-weeding position in which the de-weeding blade (B) removes the weeds. The de-weeding blade position adjusting system (110) includes a plurality of first linear actuators (110F), a plurality of second linear actuators (110S) and a linkage system (110L). Each first linear actuator (110F) is mounted on the main frame (F) of the agricultural implement (D). Each first linear actuator (110F) is adapted to move the de-weeding blades (B) through the tines (T) and linkage system (110L) thereby altering the position of de-weeding blades (B) with respect to the movement axis (A) of the tines (T), (as shown in fig. 6) based on the instructions received from the control unit (106). In another embodiment, each first actuator (110F) is adapted to independently move corresponding de-weeding blade (B) through corresponding tine (T) thereby altering the position of corresponding de-weeding blade (B) along a widthwise direction of the agricultural vehicle based on the instructions received from the control unit (106). For the purpose of this description and ease of understanding, each first linear actuator (110F) of the de-weeding blade position adjusting system (110) is considered to be an electric linear actuator. Each first linear actuator (110F) includes an electric motor (not shown), a plurality of gears (not shown), a leadscrew (not shown), a threaded member (not shown), a connecting rod (110FC), (as shown in fig. 5), a guide member (not shown) and a plurality of limit switches (not shown). The electric motor of each first linear actuator (110F) includes a controller unit (not shown) provided in communication with the control unit (106).
The threaded member (not shown) is movably connected to the leadscrew (not shown). One end of the connecting rod (110FC) of each first linear actuator (110F) is connected to the threaded member (not shown) and another end of the connecting rod (110FC), (as shown in fig. 5) of the first linear actuator (110F) is connected to corresponding tine (T) through the linkage system (110L). For the purpose of this description and ease of understanding, the threaded member (not shown) is considered to be a nut. In an embodiment, the threaded member and the connecting rod (110FC) are separate parts. In another embodiment, the threaded member and the connecting rod (110FC) are considered to be a single integrated part. The electric motor is activated by the control unit (106) through the controller unit of the electric motor. On energization of the electric motor, an output shaft (not shown) of the electric motor rotates the leadscrew (not shown) through the plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (110FC). The connecting rod (110FC) in turn moves the de-weeding blades (B) through the tines (T) and linkage system (110L) thereby altering the position of de-weeding blades (B) with respect to the movement axis (A) of the tines (T). The guide member (not shown) is adapted to guide the movable connecting rod (110FC) during operation of the linear actuator (110F). In another embodiment, the plurality of gears (not shown) can be replaced by one of chain and sprockets, and belt and pulley or a combination of both.
Further, it is also within the scope of the invention to consider each first linear actuator (110F) of the de-weeding blade position adjusting system (110) as one of a mechanical linear actuator, electro-pneumatic linear actuator, electro-hydraulic linear actuator, solenoid operated linear actuator, telescopic linear actuator, ball screw linear actuator, any other type of electric linear actuators and any other type of linear actuators.
[0066] Each second linear actuator (110S) is adapted to move corresponding de-weeding blade (B) with respect to the corresponding tine (T) along a vertical direction thereby altering the position of de-weeding blade (B) along a height wise direction of the vehicle based on the instructions from the control unit (106). Each second linear actuator (110S) is mounted onto corresponding tine (T) of the agricultural implement (D). For the purpose of this description and ease of understanding, each second linear actuator (110S) of the de-weeding blade position adjusting system (110) is considered to be an electric linear actuator. Each second linear actuator (110S) includes an electric motor (not shown), a plurality of gears (not shown), a leadscrew (not shown), a threaded member (not shown), a connecting rod (110SC), (as shown in fig. 6), a guide member (not shown) and a plurality of limit switches (not shown). The electric motor of each second linear actuator (110S) includes a controller unit (not shown) provided in communication with the control unit (106). The threaded member (not shown) is movably connected to the leadscrew (not shown). One end of the connecting rod (110SC) of each second linear actuator (110S) is connected to the threaded member (not shown) and another end of the connecting rod (110SC) of the second linear actuator (110S) is connected to the blade column (BC). For the purpose of this description and ease of understanding, the threaded member (not shown) is considered to be a nut. In an embodiment, the threaded member and the connecting rod (110SC) are separate parts. In another embodiment, the threaded member and the connecting rod (110SC) are considered to be a single integrated part. The electric motor is activated by the control unit (106) through the controller unit of the electric motor.
On energization of the electric motor, an output shaft (not shown) of the electric motor rotates the leadscrew (not shown) through the plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (110SC). The connecting rod (110SC) in turn moves the de-weeding blade (B) with respect to the tine (T) along the vertical direction thereby altering the position of de-weeding blade (B) along the height wise direction of the vehicle. The guide member (not shown) is adapted to guide the movable connecting rod (110SC) during operation of the linear actuator (110S). In another embodiment, the plurality of gears (not shown) can be replaced by one of chain and sprockets, and belt and pulley or a combination of both. Further, it is also within the scope of the invention to consider each second linear actuator (110S) of the de-weeding blade position adjusting system (110) as one of a mechanical linear actuator, electro-pneumatic linear actuator, electro-hydraulic linear actuator, solenoid operated linear actuator, telescopic linear actuator, ball screw linear actuator, any other type of electric linear actuators and any other type of linear actuators.
[0067] The linkage system (110L) is adapted to couple corresponding tine (T) to corresponding first linear actuator (110F). The linkage system (110L) includes a plurality of first linkages (110LA), a plurality of second linkages (110LB) and an interconnecting linkage (110LC), (as shown in fig. 5). One end of each first linkage (110LA) is coupled to the connecting rod (110FC) of corresponding first linear actuator (110F) and another end of the first linkage (110LA) is connected to top end of corresponding tine (T). Each second linkage (110LB) is connected on top end of corresponding tine (T) in a direction opposite to the first linkage (110LA). One end of the interconnecting linkage (110LC) is connected to top end of the second linkage (110LB) and another end of the interconnecting linkage (110LC) is connected to the bottom end of another second linkage (110LB) thereby connecting each tine (T) with the other tine (T).
[0068] The controller (104b) is configured to receive information about the soil condition of the agricultural fields, geo-coordinates and weather condition from at least one of the user interface unit (204) and sensors (not shown) provided to the agricultural vehicle. In another embodiment, the media acquisition device (102a) can be movably mounted on at least one drone (not shown). The position of the media acquisition device (102a) provided on the drone can be altered automatically by the drone itself or manually by using a manually operated drone remote controller. The drone (not shown) is adapted to be provided in communication with the controller (104b) through a communication network such as, but not limited to, the Internet, a wired network (a Local Area Network (LAN), a Controller Area Network (CAN) network, a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, a cellular network, a Wi-Fi hotspot, Bluetooth, Zigbee and so on) and so on. The drone is provided with positioning modules such as, but not limited to, a Global Positioning System (GPS) unit, a Local Positioning System (LPS), a Global Navigation Satellite System (GNSS) and so on. It should be noted that the drone disclosed herein may use any type of positioning system without otherwise deterring the intended function of collecting the geo position of at least one of crops, weeds and agricultural field as can be deduced from this description and corresponding drawings. The drone is adapted to capture video or spectral imagery of the agricultural field and sends or downloads at least one item of information to the controller (104b). The information sent from the drone (not shown) to the controller (104b) includes agricultural field data such as, but not limited to, plant data, soil data, plant location data and weed location data.
[0069] In another embodiment, a docking station can be provided on board the agricultural vehicle (not shown) for the at least one drone, which collects visual and spectral field data. This data can be downloaded from the drone to the vehicle for analytics during docking, to update digital maps for autonomous guidance. The drone can be launched from the vehicle for multiple missions after charging and can communicate with multiple machines for synchronized operations using telemetry modules on the drone and the vehicle.
[0070] Further, the sensor module (102) includes motion sensors and positioning modules. The positioning module is configured to provide information about the geo-position of at least one of crops, weeds and the vehicle. Examples of the positioning modules can be, but are not limited to, a Global Positioning System (GPS) unit, a Local Positioning System (LPS), a Global Navigation Satellite System (GNSS) and so on. It should be noted that the embodiments disclosed herein may use any type of positioning system without otherwise deterring the intended function of collecting the geo-position of at least one of crops, weeds and the vehicle, as can be deduced from this description and the corresponding drawings. The controller (104b) is configured to receive inputs from the positioning modules and motion sensors of the sensor module (102). In another embodiment, the controller (104b) is configured to provide at least one of semi-autonomous and autonomous guidance to the vehicle based on the information received from the at least one drone (not shown), the positioning modules and so on. The controller (104b) is configured to generate and communicate at least one weed removal map to the control unit (106) based on the information from the drone and the positioning modules. Further, the controller (104b) is configured to guide the agricultural vehicle by selecting an optimum path to reach at least one target weed in the agricultural field.
[0071] Further, the system (100) may include a server and databases which include parameters or information corresponding to the removal of weeds. In an embodiment, the system (100) may include a cloud computing platform/system, where the cloud computing system can be part of a public cloud or a private cloud. The server may be a standalone server or a server on a cloud. Further, the server may be any kind of computing device such as, but not limited to, a personal computer, a notebook, a tablet, a desktop computer, a laptop, a handheld device, a mobile device and so on. Although not shown, some or all of the devices in the system (100) can be connected to the cloud computing platform via a gateway. Also, the cloud platform can be connected to devices (drones, vehicles, user interface units, remote control systems and so on) located in the same or different geographical locations.
[0072] In another embodiment, the control unit (106) can be configured to control another control unit provided on another agricultural vehicle for synchronizing the geographical locations of the first and second agricultural vehicles employed for identifying and removing weeds in the agricultural field, wherein the geographical locations of the first and second agricultural vehicles are synchronized to collect geo-position-synchronized captured parameters required for identifying and removing weeds at various locations of the agricultural field. In the same manner, a greater number of agricultural vehicles can be provided in communication with each other for synchronized and effective identification and removal of weeds at various locations of the agricultural field.
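The multi-vehicle coordination described above can be sketched as follows. This is a minimal illustration assuming a simple round-robin assignment of field sections to vehicle identifiers; the patent does not specify the synchronization scheme, and all names here are hypothetical.

```python
# Hypothetical sketch: a coordinating control unit assigns distinct field
# sections to each agricultural vehicle so their geo-positions and captured
# parameters cover different locations of the field without overlap.

def assign_sections(sections, vehicles):
    """Round-robin split of field sections among vehicle IDs."""
    plan = {v: [] for v in vehicles}
    for i, section in enumerate(sections):
        plan[vehicles[i % len(vehicles)]].append(section)
    return plan

plan = assign_sections(["row-1", "row-2", "row-3", "row-4"],
                       ["vehicle-A", "vehicle-B"])
print(plan)  # {'vehicle-A': ['row-1', 'row-3'], 'vehicle-B': ['row-2', 'row-4']}
```

In practice the assignment would be driven by the synchronized geo-positions rather than a fixed ordering; the round-robin rule only illustrates the division of work between communicating vehicles.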
[0073] FIG. 7 is a flow diagram 600 illustrating a method (600) for controlling the agricultural implement (D) for removing weeds, according to embodiments as disclosed herein. At step (602), the method (600) includes monitoring, by a sensor module (102), at least one parameter corresponding to identification and removal of at least one weed.
[0074] At step 604, the method (600) includes processing, by a processing unit (104), the at least one parameter received from the sensor module (102) to generate and communicate at least one control output signal to a control unit (106).
[0075] At step 606, the method (600) includes altering, by a de-weeding blade position adjusting system (110), a position of at least one de-weeding blade (B) of the agricultural implement (D) to a weed removing position in which the at least one de-weeding blade (B) removes at least one weed based on the at least one control output signal received from the control unit (106).
[0076] The method step (602) of monitoring, by a sensor module (102), at least one parameter corresponding to identification and removal of at least one weed includes capturing and communicating, by at least one media acquisition device (102a), images of crops and weeds, to the processing unit (104).
[0077] Further, the method step (602) of monitoring, by a sensor module (102), at least one parameter corresponding to identification and removal of at least one weed includes capturing and communicating, by at least one media acquisition device (102a), at least one of video of the agricultural field, image of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104).
[0078] The method step (602) of monitoring, by a sensor module (102), at least one parameter corresponding to identification and removal of at least one weed includes,
detecting and communicating, by a speed sensor (102b), a speed of an agricultural vehicle to the processing unit (104); and
detecting and communicating, by a load sensor (102c), a load of the agricultural implement (D) to the processing unit (104).
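The monitoring sub-steps above (the speed sensor (102b) and load sensor (102c) feeding readings to the processing unit (104)) can be sketched as a minimal Python flow. All class and attribute names are illustrative assumptions, and the stub lambdas stand in for real sensor hardware.

```python
from dataclasses import dataclass

# Hypothetical sketch of step (602): the sensor module samples the vehicle
# speed and implement load and forwards one bundled reading to the
# processing unit. Names and units are assumptions, not from the patent.

@dataclass
class SensorReading:
    vehicle_speed_kmph: float   # from speed sensor (102b)
    implement_load_kg: float    # from load sensor (102c)

class SensorModule:
    def __init__(self, speed_source, load_source):
        self.speed_source = speed_source
        self.load_source = load_source

    def monitor(self) -> SensorReading:
        """Sample both sensors and package a single reading."""
        return SensorReading(self.speed_source(), self.load_source())

class ProcessingUnit:
    def __init__(self):
        self.received = []

    def receive(self, reading: SensorReading):
        self.received.append(reading)

# Usage: fixed stub values stand in for live sensor hardware.
sensors = SensorModule(speed_source=lambda: 4.5, load_source=lambda: 120.0)
processor = ProcessingUnit()
processor.receive(sensors.monitor())
print(processor.received[0].vehicle_speed_kmph)  # 4.5
```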
[0079] The method (600) comprises altering, by a camera position adjusting system (108), a position of at least one media acquisition device (102a) based on a signal received from the controller (104b) of the processing unit (104), where the at least one media acquisition device (102a) is provided on at least one of the agricultural implement (D) and the vehicle.
[0080] The method step of altering, by the camera position adjusting system (108), the position of at least one media acquisition device (102a) comprises at least one of,
moving, by at least one first linear actuator (108F), the at least one media acquisition device (102a) through a first movable member (108MF), thereby altering the position of the at least one media acquisition device (102a) along a lengthwise direction of the agricultural vehicle based on an instruction received from the controller (104b) of the processing unit (104);
moving, by at least one second linear actuator (108S), the at least one media acquisition device (102a) through at least one second movable member (108MS), thereby altering the position of the at least one media acquisition device (102a) along a widthwise direction of the agricultural vehicle based on an instruction received from the controller (104b); and
moving, by at least one third linear actuator, the at least one media acquisition device (102a), thereby altering the position of the at least one media acquisition device (102a) along a heightwise direction of the agricultural vehicle based on an instruction received from the controller (104b).
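The three-actuator camera positioning described above can be illustrated with a small sketch: three independent axes (lengthwise, widthwise, heightwise), each clamped to an assumed travel limit. The limit values and class names are hypothetical, not taken from the patent.

```python
# Illustrative sketch: the first, second and third linear actuators move the
# media acquisition device along the x (lengthwise), y (widthwise) and
# z (heightwise) directions. Travel limits below are assumed values.

class CameraPositioner:
    LIMITS = {"x": (0.0, 2.0), "y": (0.0, 1.5), "z": (0.2, 1.2)}  # metres, assumed

    def __init__(self):
        self.position = {"x": 0.0, "y": 0.0, "z": 0.2}

    def move(self, axis: str, target: float) -> float:
        """Command one actuator, clamping the target to its travel limits."""
        lo, hi = self.LIMITS[axis]
        self.position[axis] = max(lo, min(hi, target))
        return self.position[axis]

cam = CameraPositioner()
cam.move("x", 1.0)   # first actuator: lengthwise
cam.move("y", 3.0)   # second actuator: clamped to the 1.5 m widthwise limit
cam.move("z", 0.8)   # third actuator: heightwise
print(cam.position)  # {'x': 1.0, 'y': 1.5, 'z': 0.8}
```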
[0081] The method step (606) of altering, by the de-weeding blade position adjusting system (110), the position of at least one de-weeding blade (B) of the agricultural implement (D) comprises at least one of,
moving, by at least one first linear actuator (110F), the at least one de-weeding blade (B) through the corresponding tine (T) and a linkage system (110L), thereby altering the position of the at least one de-weeding blade (B) with respect to a movement axis (A) of the corresponding tine (T) based on the control output signal received from the control unit (106); and
moving, by at least one second linear actuator (110S), the at least one de-weeding blade (B) with respect to the corresponding tine (T) along a vertical direction, thereby altering the position of the de-weeding blade (B) along a heightwise direction of the agricultural vehicle based on the control output signal received from the control unit (106).
[0082] Further, the method (600) comprises classifying, by a learning module (104a) of the processing unit (104), the captured at least one parameter from the at least one media acquisition device (102a) to a list of crops and weeds based on a list of trained data thereby identifying the at least one weed.
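The classification step performed by the learning module (104a) can be sketched with a toy stand-in. A real system would use a model trained on labelled crop and weed imagery; the nearest-centroid rule and the feature vectors below are illustrative assumptions only.

```python
# Hypothetical sketch of the learning module (104a): image-derived feature
# vectors are assigned to "crop" or "weed" based on a list of trained data,
# here reduced to two made-up class centroids.

def classify(feature, centroids):
    """Return the label of the nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(feature, centroids[label]))

# "Trained data": illustrative centroids for crop vs weed appearance features.
trained = {"crop": (0.2, 0.8), "weed": (0.7, 0.3)}

detections = [(0.25, 0.75), (0.65, 0.35), (0.7, 0.2)]
labels = [classify(f, trained) for f in detections]
print(labels)  # ['crop', 'weed', 'weed']
```

Only the detections labelled "weed" would be passed on for blade actuation; the crop detections are left untouched.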
[0083] Further, the method (600) comprises,
providing, by a user interface unit (204), at least one user defined input to an optimization module (202), where the at least one user defined input is at least one of depth of the weed to be removed, nature of the weed, nature of the crop, crop height, weed height, crop growth stage, weed growth stage, geo-coordinates, depth of the soil, nature of the soil and weather condition;
processing, by the optimization module (202), inputs received from the speed sensor (102b), the learning module (104a), the load sensor (102c) and the user interface unit (204); and
generating, by the optimization module (202), an optimized control signal to the control unit (106) to remove the identified weed in an optimal manner by at least one of controlling the shutter speed of the media acquisition device (102a), adjusting the position of the media acquisition device (102a), controlling the processing speed of the media acquisition device (102a), adjusting the position of the de-weeding blade (B) and controlling the speed of the agricultural vehicle through the controller (104b).
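A hedged sketch of the optimization step follows. The rules used (slow the vehicle when the implement load is high, use a faster shutter at higher speeds, follow the user-defined weed depth) are illustrative assumptions, not the patent's actual optimization logic; thresholds and names are hypothetical.

```python
# Hypothetical sketch of the optimization module (202): sensor inputs and
# user-defined inputs are combined into one optimized control signal for
# the control unit (106).

def optimize(speed_kmph, load_kg, weed_depth_cm, max_load_kg=150.0):
    control = {
        "blade_depth_cm": weed_depth_cm,       # follow the user-defined depth
        "vehicle_speed_kmph": speed_kmph,
        "camera_shutter": "normal",
    }
    if load_kg > max_load_kg:                  # implement overloaded: slow down
        control["vehicle_speed_kmph"] = speed_kmph * 0.5
    if speed_kmph > 6.0:                       # fast travel: faster shutter
        control["camera_shutter"] = "fast"
    return control

signal = optimize(speed_kmph=8.0, load_kg=180.0, weed_depth_cm=5.0)
print(signal)
# {'blade_depth_cm': 5.0, 'vehicle_speed_kmph': 4.0, 'camera_shutter': 'fast'}
```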
[0084] In another embodiment, the method step (602) of monitoring, by the sensor module (102), at least one parameter corresponding to identification and removal of at least one weed comprises,
capturing and communicating, by at least one media acquisition device (102a) of at least one drone, at least one of images of crops, images of weeds, video of the agricultural field, image of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104); and
communicating, by a positioning module of the at least one drone, the geographical position to the controller (104b), where the geographical position is the geographical position of at least one of the crops, the weeds and the agricultural field.
[0085] Furthermore, the method (600) comprises,
generating and communicating, by the controller (104b), at least one weed removal map to the control unit (106) based on the information received from the at least one drone;
communicating, by a positioning module of the sensor module (102), the geographical position to the controller (104b), where the geographical position is the geographical position of at least one of the crops, the weeds, the agricultural field and the agricultural vehicle; and
guiding, by the controller (104b), the agricultural vehicle to reach at least one target weed by selecting an optimum path to reach the at least one target weed in the agricultural field.
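The guidance step above can be sketched as a path-selection routine over the weed geo-positions from the weed removal map. A greedy nearest-neighbour tour is an assumed stand-in for the "optimum path"; the patent does not name an algorithm, and the coordinates below are made up.

```python
import math

# Hypothetical sketch: the controller (104b) orders the target weeds from
# the weed removal map so the vehicle visits the nearest remaining weed
# first (greedy nearest-neighbour route).

def plan_route(start, weeds):
    """Greedy nearest-neighbour ordering of weed coordinates."""
    remaining = list(weeds)
    route, pos = [], start
    while remaining:
        nxt = min(remaining, key=lambda w: math.dist(pos, w))
        route.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return route

weeds = [(4.0, 4.0), (1.0, 0.0), (2.0, 2.0)]
print(plan_route((0.0, 0.0), weeds))  # [(1.0, 0.0), (2.0, 2.0), (4.0, 4.0)]
```

A production guidance system would account for row constraints, turning radius and obstacles; the greedy tour only illustrates selecting a short path to successive target weeds.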
[0086] The various actions, acts, blocks, steps, or the like in the method and the flow diagram 600 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, modified, skipped, or the like without departing from the scope of the invention.
[0087] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 include blocks, which can be at least one of a hardware device, or a combination of hardware device and software module.
[0088] The embodiments disclosed herein describe methods and systems for removing a weed using an agricultural implement of an agricultural vehicle. Therefore, it is understood that the scope of the protection is extended to such a program and in addition to a computer readable means having a message therein, such computer readable storage means contain program code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device.
[0089] The method is implemented in at least one embodiment through or together with a software program written in e.g. Very high-speed integrated circuit Hardware Description Language (VHDL) another programming language or implemented by one or more VHDL or several software modules being executed on at least one hardware device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof, e.g. one processor and two FPGAs.
[0090] The device may also include means which could be e.g. hardware means like e.g. an ASIC, or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means and/or at least one software means. The method embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software. The device may also include only software means. Alternatively, the invention may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0091] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modifications within the spirit and scope of the embodiments as described herein.
REFERENCE NUMERALS
100 - De-weeding control system
102 - Sensor module
102a - Media acquisition device
102b - Speed sensor
102c - Load sensor
104 - Processing unit
104a - Learning module
104b - Controller
106 - Control unit
107 - Storage unit
108 - Camera position adjusting system
108F - First linear actuator of camera position adjusting system
108FC - Connecting rod of first linear actuator
108S - Second linear actuator of camera position adjusting system
108SC - Connecting rod of second linear actuator
108MF - First movable member of camera position adjusting system
108MS - Second movable member of camera position adjusting system
110 - De-weeding blade position adjusting system
110F - First linear actuator of De-weeding blade position adjusting system
110FC - Connecting rod of First linear actuator
110S - Second linear actuator of De-weeding blade position adjusting system
110SC - Connecting rod of second linear actuator
110L - Linkage system
110LA - First linkage
110LB - Second linkage
110LC - Interconnecting linkage
202 - Optimization module
204 - User interface unit
302 - Media processing module
304 - Classification module
306 - Actuation controlling module
310 - Camera position adjusting module
D - Agricultural implement
F - Main frame of agricultural implement
FC - Cross member of Main frame
T - Movable tine of agricultural implement
B - De-weeding blade of agricultural implement
BC - Blade column
A - Movement axis of tine
| # | Name | Date |
|---|---|---|
| 1 | 202041048166-STATEMENT OF UNDERTAKING (FORM 3) [04-11-2020(online)].pdf | 2020-11-04 |
| 2 | 202041048166-REQUEST FOR EXAMINATION (FORM-18) [04-11-2020(online)].pdf | 2020-11-04 |
| 3 | 202041048166-PROOF OF RIGHT [04-11-2020(online)].pdf | 2020-11-04 |
| 4 | 202041048166-POWER OF AUTHORITY [04-11-2020(online)].pdf | 2020-11-04 |
| 5 | 202041048166-FORM 18 [04-11-2020(online)].pdf | 2020-11-04 |
| 6 | 202041048166-FORM 1 [04-11-2020(online)].pdf | 2020-11-04 |
| 7 | 202041048166-DRAWINGS [04-11-2020(online)].pdf | 2020-11-04 |
| 8 | 202041048166-DECLARATION OF INVENTORSHIP (FORM 5) [04-11-2020(online)].pdf | 2020-11-04 |
| 9 | 202041048166-COMPLETE SPECIFICATION [04-11-2020(online)].pdf | 2020-11-04 |
| 10 | 202041048166-FER.pdf | 2022-05-19 |
| 11 | 202041048166-OTHERS [14-11-2022(online)].pdf | 2022-11-14 |
| 12 | 202041048166-FER_SER_REPLY [14-11-2022(online)].pdf | 2022-11-14 |
| 13 | 202041048166-CORRESPONDENCE [14-11-2022(online)].pdf | 2022-11-14 |
| 14 | 202041048166-CLAIMS [14-11-2022(online)].pdf | 2022-11-14 |
| 15 | 202041048166-US(14)-HearingNotice-(HearingDate-31-10-2023).pdf | 2023-09-22 |
| 16 | 202041048166-FORM-26 [04-10-2023(online)].pdf | 2023-10-04 |
| 17 | 202041048166-Correspondence to notify the Controller [04-10-2023(online)].pdf | 2023-10-04 |
| 18 | 202041048166-Written submissions and relevant documents [14-11-2023(online)].pdf | 2023-11-14 |
| 19 | 202041048166-PatentCertificate05-01-2024.pdf | 2024-01-05 |
| 20 | 202041048166-IntimationOfGrant05-01-2024.pdf | 2024-01-05 |
| 21 | 202041048166- Certificate of Inventorship-044000201( 06-03-2025 ).pdf | 2025-03-06 |
| 1 | 202041048166_SearchE_18-05-2022.pdf | |