Abstract: METHOD AND SYSTEM FOR IDENTIFICATION OF PLANT(S) Embodiments disclosed herein relate to agricultural vehicles and more particularly, to capturing media content related to identification of plant(s) using a plant identification system provided on agricultural implement(s) attached to an agricultural vehicle. The system (10) comprises a plurality of media acquisition devices (102a), a processing unit (104), a storage unit (107) and a camera position adjusting system (108). The system (10) precisely identifies at least one plant in an optimized way in a real-time scenario. FIG. 1
Claims: We claim:
1. A method (700) for identification of plant(s), the method (700) comprising:
capturing, by at least one media acquisition device (102a), at least one media parameter corresponding to identification of at least one plant, where the at least one media acquisition device (102a) is provided on at least one of an implement (D) and a vehicle;
processing, by a processing unit (104), the captured at least one media parameter received from the at least one media acquisition device (102a) to identify at least one plant; and
altering, by a camera position adjusting system (108), a position of the at least one media acquisition device (102a) to capture desired media parameter required for identification of the at least one plant based on instructions received from a controller (104b) of the processing unit (104).
2. The method as claimed in claim 1, wherein said capturing, by the at least one media acquisition device (102a), the at least one media parameter corresponding to identification of at least one plant comprises,
capturing and communicating, by the at least one media acquisition device (102a), at least one of images of weeds, images of crops, video of the agricultural field, images of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104).
3. The method (700) as claimed in claim 1, wherein said processing, by the processing unit (104), the captured at least one media parameter received from the at least one media acquisition device (102a) to identify at least one plant comprises,
classifying, by a learning module (104a) of the processing unit (104), the captured at least one media parameter received from the at least one media acquisition device (102a) to a list of crops, weeds, healthy crops, disease crops and off variety plants based on a list of trained data thereby identifying the at least one plant.
4. The method (700) as claimed in claim 1, wherein said altering, by the camera position adjusting system (108), the position of the at least one media acquisition device (102a) comprises at least one of,
moving, by at least one first linear actuator (108F), the at least one media acquisition device (102a) through a first movable member (108MF) thereby altering the position of the at least one media acquisition device (102a) along a lengthwise direction of the agricultural vehicle based on instruction received from the controller (104b) of the processing unit (104);
moving, by at least one second linear actuator (108S), the at least one media acquisition device (102a) through at least one second movable member (108MS) thereby altering the position of the at least one media acquisition device (102a) along a widthwise direction of the agricultural vehicle based on instruction received from the controller (104b); and
moving, by at least one third linear actuator, the at least one media acquisition device (102a) thereby altering the position of the at least one media acquisition device (102a) along a heightwise direction of the agricultural vehicle based on instruction received from the controller (104b).
5. The method (700) as claimed in claim 3, wherein said method (700) comprises,
geographically referencing, by a positioning module, the at least one plant to said processing unit (104);
providing, by a user interface unit (204), at least one user defined input corresponding to the identification of at least one plant, to an optimization module (202);
processing, by the optimization module (202), inputs received from the learning module (104a), the media acquisition device (102a), the positioning module and the user interface unit (204); and
generating, by the optimization module (202), an optimized control signal to the controller (104b) to identify the at least one plant in an optimal manner by at least one of controlling the shutter speed of the media acquisition device (102a), adjusting the position of the media acquisition device (102a), controlling the processing speed of the media acquisition device (102a) and the speed of the agricultural vehicle through the controller (104b).
6. A system (10) for identification of plant(s), said system (10) comprising:
at least one media acquisition device (102a) configured to capture at least one media parameter corresponding to identification of at least one plant, where the at least one media acquisition device (102a) is provided on at least one of an implement (D) and a vehicle;
a processing unit (104) configured to process the captured at least one media parameter received from said at least one media acquisition device (102a) to identify at least one plant; and
a camera position adjusting system (108) configured to alter a position of said at least one media acquisition device (102a) to capture desired media parameter required for identification of the at least one plant based on instructions received from a controller (104b) of the processing unit (104).
7. The system (10) as claimed in claim 6, wherein the at least one media acquisition device (102a) is adapted to capture and communicate at least one of images of weeds, images of crops, video of the agricultural field, images of row crops, images of weeds grown along the crop and images of crop growth stages, to said processing unit (104).
8. The system (10) as claimed in claim 6, wherein said processing unit (104) comprises a learning module (104a) configured to classify the captured at least one media parameter received from the at least one media acquisition device (102a) to a list of crops, weeds, healthy crops, disease crops and off variety plants based on a list of trained data thereby identifying the at least one plant.
9. The system (10) as claimed in claim 6, wherein said camera position adjusting system (108) comprises,
at least one first movable member (108MF) adapted to be movably connected to a main frame (F) of the agricultural implement (D);
at least one first linear actuator (108F) configured to move the at least one media acquisition device (102a) through said at least one first movable member (108MF) thereby altering the position of said at least one media acquisition device (102a) along a lengthwise direction of the agricultural vehicle based on instruction received from the controller (104b) of said processing unit (104);
at least one second movable member (108MS) adapted to be movably connected to said first movable member (108MF);
at least one second linear actuator (108S) configured to move said at least one media acquisition device (102a) through said at least one second movable member (108MS) thereby altering the position of said at least one media acquisition device (102a) along a widthwise direction of the agricultural vehicle based on instruction received from the controller (104b); and
at least one third linear actuator configured to move said at least one media acquisition device (102a) thereby altering the position of said at least one media acquisition device (102a) along a heightwise direction of the agricultural vehicle based on instruction received from the controller (104b).
10. The system (10) as claimed in claim 9, wherein each of said first, second and third linear actuators (108F, 108S) comprises,
an electric motor;
a plurality of gears adapted to be rotatably connected to said electric motor;
a leadscrew adapted to be rotatably connected to said plurality of gears;
a threaded member adapted to be linearly movable on said leadscrew; and
a connecting rod (108FC, 108SC) connected to said threaded member,
wherein
said first linear actuator (108F) is mounted on the main frame (F) of the agricultural implement (D);
said second linear actuator (108S) is mounted on said first movable member (108MF);
said third linear actuator is mounted on said second movable member (108MS);
one end of said connecting rod (108FC) of said first linear actuator (108F) is connected to said threaded member and another end of said connecting rod (108FC) is connected to said first movable member (108MF);
one end of said connecting rod (108SC) of said second linear actuator (108S) is connected to said threaded member and another end of said connecting rod (108SC) is connected to said second movable member (108MS); and
one end of said connecting rod of said third linear actuator is connected to said threaded member and another end of said connecting rod of said third linear actuator is connected to said media acquisition device (102a).
11. The system (10) as claimed in claim 8, wherein said system (10) comprises,
a positioning module configured to geographically reference the at least one plant to said processing unit (104); and
a user interface unit (204) configured to provide at least one user defined input corresponding to the identification of at least one plant to said processing unit (104),
wherein
said processing unit (104) comprises an optimization module (202) configured to receive inputs from said learning module (104a), said at least one media acquisition device (102a), said positioning module and said user interface unit (204); and
said optimization module (202) generates and communicates an optimized control signal to the controller (104b) to identify the at least one plant in an optimal manner by at least one of controlling the shutter speed of said at least one media acquisition device (102a), adjusting the position of said at least one media acquisition device (102a), controlling the processing speed of said at least one media acquisition device (102a) and the speed of the agricultural vehicle through the controller (104b).
Description: TECHNICAL FIELD
[001] Embodiments disclosed herein relate to agricultural vehicles and more particularly, to capturing media content related to identification of plants (crops, weeds, disease crops, off variety plants and the like) using a plant identification system provided on agricultural implement(s) attached to an agricultural vehicle.
BACKGROUND
[002] Weeds are unwanted plants that compete with healthy crops for essential resources such as space, water, nutrients, light and carbon dioxide. Weed distribution is heterogeneous in agricultural fields, and weed presence reduces overall agricultural output, thereby reducing total productivity. Identification of weeds is an important process in agricultural practice, as weeds tend to grow along the crops. Identification and removal of weeds is necessary because the unwanted plants competing with the crop plants may reduce the growth of the crops by consuming nutrients and other resources provided to the crops, thereby reducing the yield of the food crops. While detecting weeds, it is important to identify the species of weeds, the spread of weeds, the weed growth stages and the like. The evaluation and assessment of weeds among crops are typically performed by field personnel, either on foot or using a motor vehicle. However, the physical area of weed growth cannot be determined at a glance.
[003] In conventional approaches, identification of weeds between the crops may be a tedious task which involves identifying a large quantity of weeds between crops grown in rows. Further, manual identification of weeds is tedious and time consuming, especially in vast agricultural fields. Manually driven strategies for identifying weeds, disease crops, healthy crops and off variety crops can be error prone and inefficient due to their dependency on the skill set of the farmer, especially for a novice. Thus, manually driven strategies may result in incorrect agricultural practices.
[004] In addition, the conventional approaches may not use smart methods to precisely identify the weeds among the plants, resulting in improper or wrong removal of plants, which is undesirable, disrupts agricultural practice and also affects the production of crops.
[005] Also, crop cultivation practices differ across countries and regions, so it is difficult to manufacture a single system that accommodates all the subtle differences in crop practices.
OBJECTS
[006] The principal object of embodiments herein is to disclose methods and systems for identifying plants (crops, weeds, disease crops, off variety plants and the like) using a plant identification system provided on an agricultural implement.
[007] Another object of the embodiments relates to precise identification of plants (crops, weeds, disease crops, off variety plants and the like) in an optimized way in real-time scenario.
[008] These and other objects of embodiments herein will be better appreciated and understood when considered in conjunction with following description and accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[009] The embodiments are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0010] FIG. 1 illustrates a plant identification system for identifying plant(s), where media acquisition devices are provided on an agricultural implement attached to an agricultural vehicle, according to embodiments as disclosed herein;
[0011] FIG. 2 illustrates an optimization module required to achieve synchronous operation in identifying the plant(s) in an optimal manner, according to embodiments as disclosed herein;
[0012] FIG. 3 illustrates a processing module for identifying plant(s), according to embodiments as disclosed herein;
[0013] FIG. 4 depicts media acquisition device(s) and a camera position adjusting system mounted on the agricultural implement, according to embodiments as disclosed herein; and
[0014] FIG. 5 is a flow diagram illustrating a method for identifying the plant(s), according to embodiments as disclosed herein.
DETAILED DESCRIPTION
[0015] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0016] The embodiments herein disclose methods and systems for precisely identifying plant(s) in an optimized way in real-time scenario. Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0017] FIG. 1 illustrates a plant identification system 10 for identifying plant(s), where media acquisition devices are provided on agricultural implement(s) (D) attached to an agricultural vehicle, according to embodiments as disclosed herein. The agricultural vehicle herein refers to any vehicle or farm machinery that can be used for performing agricultural related operations in agricultural fields. An example of the agricultural vehicle can be, but not limited to, a tractor. Embodiments herein are further explained considering the tractor as the agricultural vehicle, but it may be obvious to a person of ordinary skill in the art that any suitable vehicle or agricultural machines can be considered.
[0018] The agricultural vehicle can include at least one agricultural implement (D) attached to the vehicle. In an embodiment, the agricultural implement (D) can be attached to the vehicle permanently. In an embodiment, the agricultural implement (D) can be attached to the vehicle using a detachable means, such as a three-point hitch/linkage, and so on. Examples of the agricultural implement can be, but is not limited to, sprayers, harrows, plows, planters, harvesters/reapers, fertilizer spreader and so on.
[0019] The plant identification system 10 can be mounted on at least one of the agricultural implements (D) and a mounting structure (not shown) attached to the agricultural vehicle. The plant identification system 10 is configured to classify the identified plant(s) as healthy crops, weeds, disease crops, off variety plants and the like. In an embodiment, the plant identification system 10 can be dust proof, leak proof and able to withstand dry land and wet land cultivation and vibration as per the farm requirements. The plant identification system 10 includes a plurality of media acquisition devices 102a, a processing unit 104, a storage unit 107 and a camera position adjusting system 108. Each media acquisition device 102a is coupled to the processing unit 104 through a communication network. The communication network can be, but is not limited to, the Internet, a wired network (a Local Area Network (LAN), a Controller Area Network (CAN), a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, a cellular network, a Wi-Fi hotspot, Bluetooth, Zigbee and so on) and so on.
[0020] The media acquisition device 102a is adapted to capture media related to identification of at least one plant. The captured media can include, but is not limited to, video of the agricultural field, images of row crops, images of weeds, images of crops, images of weeds along the crops, images of crop growth stages and so on. Each media acquisition device 102a can include, but is not limited to, cameras, RGB (red, green and blue) cameras, a thermal camera, an ultraviolet (UV) camera, a near-infrared (NIR) camera, multispectral cameras and so on. However, it is also within the scope of the embodiments disclosed herein to use any type of camera without otherwise deterring the intended function of the media acquisition device 102a as can be deduced from this description and corresponding drawings. Each media acquisition device 102a and the camera position adjusting system (108) can be mounted on the agricultural implement (D) such that the position of the media acquisition device 102a can be moved in a suitable direction to capture the best quality images of crops, images of weeds, images of off variety plants, crops grown in rows, different growth stages of the crops, the weeds grown along the crops, and so on. In another embodiment, the plurality of media acquisition devices (102a) and the camera position adjusting system (108) can be mounted on a mounting structure (not shown) which is to be attached at one of a front end, rear end, side and middle of the agricultural vehicle. The positions of plants (weeds, crops and off variety plants) are geographically referenced for identification of plants (weeds, crops and off variety plants) at respective locations.
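The geographical referencing mentioned above can be sketched as follows. This is an illustrative flat-earth, small-offset approximation only, not the positioning method of the embodiments; the function and parameter names (GPS fix, heading, camera-frame offsets) are assumptions introduced for illustration:

```python
import math

def georeference(lat_deg, lon_deg, heading_deg, forward_m, right_m):
    """Approximate world position of a detected plant.

    Combines the vehicle's GPS fix (lat_deg, lon_deg), its heading in
    degrees, and the plant's offset in the vehicle body frame (forward_m,
    right_m). Flat-earth approximation; adequate only for small offsets.
    """
    h = math.radians(heading_deg)
    # Rotate the body-frame offset into north/east components.
    north = forward_m * math.cos(h) - right_m * math.sin(h)
    east = forward_m * math.sin(h) + right_m * math.cos(h)
    # Roughly 111,320 m per degree of latitude; longitude scales with cos(lat).
    lat = lat_deg + north / 111_320.0
    lon = lon_deg + east / (111_320.0 * math.cos(math.radians(lat_deg)))
    return lat, lon
```

With such a helper, each classified plant can be tagged with an approximate coordinate so it can be revisited at its respective location.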
[0021] In an embodiment, for example, the media acquisition device 102a can be moved forward, backward or sideways in a horizontal plane parallel to the ground, and also in a vertical direction, by using the camera position adjusting system 108. Adjusting the position of the media acquisition device 102a allows capturing of the plant(s) (crops, weeds, disease crops, off variety plants and the like) for optimally identifying the plant(s). For example, weeds can be identified from crops.
[0022] The processing unit 104 can be configured to collect and process the captured media parameters from the media acquisition device 102a. In an embodiment, the media parameters can be collected for a pre-determined time period of operation of the agricultural vehicle and the agricultural implement. In an embodiment, the measured parameters can be collected on the occurrence of one or more pre-determined events. In an embodiment, the processing unit 104 may process the captured media parameters from the media acquisition device 102a. For example, the processing unit 104 may analyze the video or images captured by the media acquisition devices 102a and process the captured image or video frame-wise to obtain the basic images of the weeds, images of crops, images of disease crops, images of healthy crops and images of off variety plants. In another embodiment, the processing unit 104 can be configured to operate other components/devices associated with the agricultural vehicle. For example, the processing unit 104 may control the speed of the agricultural vehicle. The processing unit 104 includes a learning module 104a and a controller 104b.
[0023] The learning module 104a embedded in the processing unit 104 receives the captured parameters from the media acquisition device 102a. The learning module 104a uses a set of rules derived from a machine learning model to identify the plant(s) (crops, weeds, disease crops, healthy crops and off variety plants) based on the captured parameter received from the media acquisition device (102a). The machine learning model can include at least one of a supervised learning algorithm, an unsupervised learning algorithm and so on. For example, the learning module 104a hosts a set of deep learning methods which includes a list of trained data which relates to the plant identification process. The list of trained data relates to the set of data used to train the learning module 104a and to predict the desired outcome from the series of processing data. For example, the list of trained data comprises plant information and other related data such as weed type, weed nature, plant diseases, off variety plant nature and type of soil.
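The classification performed by the learning module 104a can be sketched as below. This is a toy rule-based stand-in, not the trained deep learning model of the embodiments; the feature names and thresholds are placeholder assumptions used only to show the mapping from captured parameters to the output classes:

```python
# Output classes the learning module 104a is described as emitting.
CLASSES = ("crop", "weed", "healthy crop", "disease crop", "off variety plant")

def classify(features):
    """Toy stand-in for the trained learning module.

    `features` is a dict of hypothetical per-plant measurements extracted
    from a captured frame; the thresholds are placeholders, not trained data.
    """
    if features.get("leaf_irregularity", 0.0) > 0.7:
        return "weed"
    if features.get("lesion_ratio", 0.0) > 0.2:
        return "disease crop"
    if features.get("variety_distance", 0.0) > 0.5:
        return "off variety plant"
    return "healthy crop"

# Example: a detection with a placeholder feature value.
label = classify({"leaf_irregularity": 0.9})
```

In a real deployment the rules above would be replaced by the trained model's inference, but the interface is the same: captured parameters in, one of the plant classes out.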
[0024] Also, the learning module 104a along with the trained dataset classifies the captured parameter from the media acquisition device 102a. For example, the captured image or video may be analyzed by the learning module 104a based on the trained data to classify the plant(s) (crops, weeds, disease crops, healthy crops and off variety plants). The classification result provided by the learning module 104a plays a vital role in identifying the plant(s) (crops, weeds, disease crops, healthy crops and off variety plants). For example, the classification result provided by the learning module 104a plays a vital role in identifying weeds from the crops; the classification may be made based on the type of weed, the nature of the crop along the weed, the growth stage of the weed and so on.
[0025] The controller 104b can be configured to generate the control output signal corresponding to the measured parameters analyzed by the learning module 104a. The controller 104b provides the control output signal to the camera position adjusting system (108) and other controller units of the agricultural vehicle. The control output signal indicates required optimum parameter(s) of the agricultural implement for operating at least one of the agricultural implements and the agricultural vehicle. In an embodiment, examples of the required optimum parameters of the agricultural implement can include, but are not limited to, adjusting the position of the media acquisition devices 102a along at least one of the lengthwise, widthwise and heightwise directions of the agricultural vehicle to capture the media content related to identification of at least one plant and so on. In an embodiment, the controller 104b can include a processor, a microcontroller, a memory, a storage unit and so on.
[0026] The media content related to the identification of at least one plant may include but not limited to the video of the crops along with the weed, the video of the growing stages of the weed, the video of the crop plant along the growing stages, the image of the weed along the crop plant, images of disease crops, images of off variety plants, the image of the crop plant in the field and the like.
[0027] The camera position adjusting system 108 receives the control output signal from the controller 104b based on instructions from the learning module 104a. The camera position adjusting system 108 instructs at least one media acquisition device 102a to adjust the position of the media acquisition device 102a for accurate capturing of at least one plant (weeds and/or crops).
[0028] Based on the analysis of the learning module 104a with respect to the captured parameters of the media acquisition devices 102a, the learning module 104a provides a signal to the camera position adjusting system 108 through the controller 104b to identify the at least one plant in its exact position. For example, the learning module 104a classifies the plants (weeds, healthy crops, disease crops and off variety plants) based on the captured image or video from the media acquisition devices 102a. The learning module 104a provides the co-ordinates of the at least one plant to the controller 104b, which instructs the camera position adjusting system 108 to adjust the position of the media acquisition devices 102a to aptly capture the at least one plant.
[0029] The learning module 104a calculates the position of the media acquisition devices 102a required to accurately capture the at least one plant based on the inputs received from the positioning modules to reach the accurate co-ordinates of the at least one plant. The learning module 104a calculates the adjustments required to be configured on the media acquisition device 102a for effective identification of the at least one plant. For example, camera settings including the shutter speed, the capture quality of the images, the capture ratio of image and video and the like can be adjusted based on the control signal received from the learning module 104a. Also, the processing speed of the camera and other parameters related to the media acquisition device 102a can be adjusted based on the set of trained data present in the learning module 104a.
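One shutter speed adjustment of the kind described above can be illustrated with a simple motion-blur budget. This is a sketch under stated assumptions, not the adjustment rule of the embodiments; the parameter names (ground resolution per pixel, allowed blur in pixels) are introduced here for illustration:

```python
def max_shutter_time(vehicle_speed_mps, ground_res_m_per_px, max_blur_px=1.0):
    """Longest exposure (seconds) that keeps motion blur under max_blur_px.

    Blur in pixels = vehicle speed * exposure time / ground resolution per
    pixel, so the exposure bound follows by rearranging that relation.
    """
    if vehicle_speed_mps <= 0:
        raise ValueError("vehicle speed must be positive")
    return max_blur_px * ground_res_m_per_px / vehicle_speed_mps

# e.g. 2 m/s vehicle speed, 1 mm ground sampling distance, <= 1 px blur
# -> exposure must be at most 0.0005 s.
t = max_shutter_time(2.0, 0.001)
```

The controller could use such a bound to shorten the exposure as the vehicle speeds up, keeping the captured frames sharp enough for classification.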
[0030] The learning module 104a controls the speed of the agricultural vehicle to identify the at least one plant. For example, the speed of the agricultural vehicle is controlled by the controller 104b based on different processing delays. Therefore, the learning module 104a controls the dynamic position of the vehicle and the settings of the media acquisition device 102a. Further, the learning module (104a) controls the position of the media acquisition device 102a through the camera position adjusting system 108.
[0031] The camera position adjusting system (108) is provided on the agricultural implement to adjust the position of the media acquisition device 102a for easy identification of the at least one plant. The camera position adjusting system (108) may be employed to perform the multi-row plant identification operation based on the captured parameter and the control signal provided by the learning module 104a.
[0032] Further, the controller 104b transmits at least one of the control output signals to the camera position adjusting system 108 and multiple devices using at least one of a wired network (a LAN, a CAN network, an ISO bus, a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, GSM, a cellular network, a Wi-Fi hotspot, Bluetooth, Zigbee and so on) and so on. The multiple devices can be, but are not limited to, the media acquisition devices 102a, the camera position adjusting system (108), the agricultural vehicle, a cloud server, an electronic device (mobile, smartphone, laptop, tablet and so on), and so on. In an embodiment, the multiple devices can present the control output signal according to requirements received from the media acquisition device 102a.
[0033] The storage unit 107 can be configured to store the measured parameters and operating count values of the agricultural implement. The storage unit 107 includes at least one of a file server, a data server, a memory, a server, a cloud and so on. The memory may include one or more computer-readable storage media. The memory may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory may, in some examples, be considered a non- transitory storage medium.
[0034] The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted to mean that the memory is non-movable. In some examples, the storage unit can be configured to store larger amounts of information than the memory. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
[0035] FIG. 1 shows exemplary blocks of the plant identification system 10, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the plant identification system 10 may include a lesser or greater number of blocks. Further, the labels or names of the blocks are used for illustrative purposes only and do not limit the scope of the embodiments herein. One or more blocks can be combined together to perform the same or a substantially similar function in the plant identification system.
[0036] FIG. 2 illustrates an optimization module required to achieve synchronous operation in identifying the at least one plant in an optimal manner, according to embodiments.
[0037] The user interface unit 204 may provide a communication interface between the user and the plant identification system 10. The user may provide user defined inputs through the user interface unit 204 to be analyzed by the optimization module 202. The input provided by the user can include, but is not limited to, co-ordinates of the weed, position of the weed along the crop plant, nature of the weed, type of crop disease, type of off variety plant, nature of the crop, crop height, weed height, crop growth stage, weed growth stage, nature of the soil and other plant and weed identification inputs and the like. The user can include, but is not limited to, a farmer, cultivator, planter, botanist, gardener, reaper, weed remover and the like.
[0038] The optimization module 202 receives input from the media acquisition device 102a, the learning module 104a and the user interface unit 204. The optimization module 202 incorporates a procedure which is executed iteratively with the received inputs to obtain the most optimized results.
[0039] The optimized results received from the optimization module 202 may be used to identify the at least one plant in an optimal manner. For example, the optimized results received from the optimization module 202 may be used to identify at least one of a weed, crop, disease crop, healthy crop and off variety plant. The optimized result may include the accurate co-ordinates required to track the at least one plant (weed, crop, disease crop and off variety plant), the depth of the weed, the depth of the crop, the speed required by the agricultural vehicle to reach the co-ordinates of the at least one plant, the processing speed of the media acquisition device 102a and the like. In an embodiment, the optimized result from the optimization module 202 can include, but is not limited to, controlling the shutter speed of the camera, adjusting the position of the media acquisition devices (102a) (cameras), controlling the processing speed of the camera, and the like. The optimized result obtained from the optimization module 202 aims at providing the most optimized and efficient way of identifying the at least one plant.
[0040] The optimization module 202 also helps in achieving synchronous operations based on the inputs received from the various units. In an embodiment, the position of the camera is adjusted based on the crop practice, and the processing speed of the camera is adjusted based on the vehicle speed through the control unit. The operation time of the camera and the time taken at the system level controller to generate the actuation signal are considered while performing actuation. Further, if the speed of the agricultural vehicle needs to be controlled based on the different processing delays, that too is master controlled at the controller. Thereby, the synchronous operations between the various units are achieved through the optimization module 202.
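By way of a non-limiting illustration only, the timing relationship described above (camera operation time plus controller processing time versus the time for the vehicle to carry the tool to the plant) can be sketched as follows. The function and parameter names are hypothetical and not part of the claimed subject matter; the sketch assumes a fixed camera-to-tool distance along the direction of travel.

```python
def actuation_delay_s(distance_m: float, vehicle_speed_mps: float,
                      processing_delay_s: float) -> float:
    """Time to wait after a frame is captured before firing the actuation
    signal, so actuation coincides with the tool reaching the plant."""
    if vehicle_speed_mps <= 0:
        raise ValueError("vehicle must be moving forward")
    travel_time = distance_m / vehicle_speed_mps
    delay = travel_time - processing_delay_s
    if delay < 0:
        # Processing is too slow at this speed; the controller would need
        # to reduce vehicle speed instead of merely delaying actuation.
        raise RuntimeError("reduce vehicle speed: processing exceeds travel time")
    return delay

def max_vehicle_speed_mps(distance_m: float, processing_delay_s: float) -> float:
    """Fastest vehicle speed at which actuation can still occur in time."""
    return distance_m / processing_delay_s
```

For example, with a 1 m camera-to-tool gap, 2 m/s ground speed and 0.1 s of processing, the controller would wait 0.4 s before actuating.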
[0041] FIG. 2 illustrates the system 10 with the optimization module 202 required to achieve synchronous operation in identifying the at least one plant in the optimal manner, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the system 10 may include fewer or more blocks. Further, the labels or names of the blocks are used only for illustrative purposes and do not limit the scope of the embodiments herein. One or more blocks can be combined to perform the same or a substantially similar function in the system 10.
[0042] FIG. 3 illustrates the processing unit 104 for identifying at least one plant, according to an embodiment. The processing unit 104 includes the learning module (104a), the controller (104b), a media processing module 302, a classification module 304, the optimization module 202 and a camera position adjusting module 310. The learning module 104a, being embedded in the processing unit 104, receives the captured parameters from the media acquisition device 102a. The processing unit 104 processes the received input and provides the output signal to the camera position adjusting system 108 through the controller 104b.
[0043] The media processing module 302 may receive the captured parameters from the media acquisition device 102a and process them. The captured parameters can include, but are not limited to, video of the agricultural field, images of row crops, images of weeds, images of disease crops, images of healthy crops, images of off variety plants, images of weeds along the crops, images of the plant growth stages and so on. In an embodiment, the media processing module 302 may decompose the captured parameter into its basic elements to identify the further features present in the parameters. For example, in an embodiment, the captured video or image is processed framewise to obtain the depth of weeds along the crops, the nature of the soil of the weed, the crop growth stage, the weed growth stage and the like. Therefore, the media processing module 302 processes the received media and identifies the basic elements present in each frame.
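As a non-limiting sketch of the framewise decomposition described above, the fragment below walks a captured video frame by frame and reduces each frame to a few simple elements. The element names (a crude green-pixel fraction as a vegetation cue and a mean brightness) are illustrative assumptions, not the specific features of the claimed system.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class FrameElements:
    frame_index: int
    green_fraction: float   # crude vegetation cue (illustrative only)
    mean_brightness: float

def process_frames(
    frames: Iterable[list[list[tuple[int, int, int]]]]
) -> Iterator[FrameElements]:
    """Reduce each RGB frame to the basic elements that later stages
    (classification, depth estimation) would consume."""
    for i, frame in enumerate(frames):
        pixels = [px for row in frame for px in row]
        green = sum(1 for r, g, b in pixels if g > r and g > b)
        brightness = sum((r + g + b) / 3 for r, g, b in pixels) / len(pixels)
        yield FrameElements(i, green / len(pixels), brightness)
```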
[0044] The classification module 304 receives, from the media processing module 302, the processed elements containing the basic elements present in the captured parameters of the media acquisition device 102a. On receiving the basic elements of the processed parameter, the classification module 304 identifies the at least one plant and classifies the plant based on the type and nature of the plant.
[0045] Further, the classification of plants involves comparing the elements derived from the captured parameter with the trained data from the learning module 104a. The learning module 104a comprises a large set of trained data related to plant identification and the classification of plants based on their nature, type, depth and other related parameters.
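The comparison against trained data can take many forms; as one minimal, non-limiting sketch, a nearest-neighbour lookup over labelled feature vectors captures the idea of matching processed elements to the learning module's examples. The feature vectors and labels here are hypothetical.

```python
import math

def classify(element: list[float],
             trained: dict[str, list[list[float]]]) -> str:
    """Label a processed element ('weed', 'crop', 'disease crop', ...)
    by the nearest trained example from the learning module."""
    best_label, best_dist = None, math.inf
    for label, examples in trained.items():
        for ex in examples:
            d = math.dist(element, ex)   # Euclidean distance
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label
```

A production system would more plausibly use a trained neural network, but the interface (elements in, plant class out) is the same.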
[0046] The optimization module 202 performs the plant identification operation to efficiently identify the at least one plant, which helps in achieving synchronous operations between the various units. In an embodiment, the media acquisition device 102a, the agricultural vehicle and other components are operated synchronously to achieve plant identification. For example, the position of the camera is adjusted based on the crop practice, and the processing speed of the camera is adjusted based on the vehicle speed through the control unit.
[0047] Also, the operating time of the camera and the processing time to generate the camera actuation signal are considered by the processing unit 104 while performing actuation. Further, if the speed of the agricultural vehicle needs to be controlled based on the different processing delays, that too is master controlled at the controller. Thereby, the synchronous operation between the various units is achieved through the optimization module 202.
[0048] Based on the synchronous operation between the various units, optimization is achieved in identifying the at least one plant. The identification of the at least one plant (weeds and/or crops) requires all the units of the plant identification system 10 to perform in an optimized way to achieve greater efficiency. In an embodiment, the position of the camera is adjusted based on the crop practice followed. Further, the position of the cameras can be altered based on the distance between the camera and the weed, and the co-ordinates of the at least one plant. The operating speed of the agricultural vehicle is adjusted based on feedback from the control unit, the delays of camera processing and the like. Therefore, all the units perform in a synchronous manner to achieve the optimized operation of identifying plants.
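The vehicle-speed feedback loop described above can be sketched, in a non-limiting way, as a simple governor that clamps the operator-requested speed to the fastest speed the worst-case processing delay still allows. The names and the safety margin are illustrative assumptions.

```python
def governed_speed(requested_mps: float, camera_to_tool_m: float,
                   worst_case_delay_s: float, margin: float = 0.8) -> float:
    """Clamp the requested vehicle speed so that, even on the slowest
    processing path, actuation can still occur before the tool passes
    the plant. `margin` < 1 leaves headroom for unmodelled delays."""
    limit = margin * camera_to_tool_m / worst_case_delay_s
    return min(requested_mps, limit)
```

With a 1 m camera-to-tool gap and a 0.5 s worst-case delay, a requested 5 m/s would be clamped to 1.6 m/s, while a requested 1 m/s passes through unchanged.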
[0049] The camera position adjusting module 310 instructs the camera position adjusting system (108) to adjust the position of the media acquisition devices (102a) along at least one of the lengthwise, widthwise and height wise directions of the agricultural vehicle to accurately capture at least one of images and videos of weeds and crops, disease crops, healthy crops, off variety plants, and weeds along the crop. The camera position adjusting module 310 helps to identify the better position or co-ordinates of the at least one plant (weed grown along the crop).
[0050] FIG. 3 shows exemplary blocks of the processing unit 104 for identifying plants, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the processing unit 104 may include fewer or more blocks. Further, the labels or names of the blocks are used only for illustrative purposes and do not limit the scope of the embodiments herein. One or more blocks can be combined to perform the same or a substantially similar function in the processing unit 104.
[0051] FIG. 4 depicts the media acquisition devices (102a) and the camera position adjusting system (108) mounted on the agricultural implement (D). The agricultural implement D is coupled to the three-point hitch. The three-point hitch includes an upper link and a pair of lower links. The agricultural implement D includes a main frame F. The agricultural implement D is attached to one of the rear end, front end, side and mid of the agricultural vehicle.
[0052] In an embodiment, the camera position adjusting system (108) is adapted to alter a position of the media acquisition devices (102a). In an embodiment, the camera position adjusting system (108) comprises a plurality of first linear actuators (108F), a plurality of second linear actuators (108S), a plurality of third linear actuators (not shown), at least one first movable member (108MF) and a plurality of second movable members (108MS).
[0053] Each first linear actuator (108F) is adapted to move the media acquisition devices (102a) through the first movable member (108MF) thereby altering the position of the media acquisition devices (102a) along a lengthwise direction of the agricultural vehicle based on the instructions received from the controller (104b). Each first linear actuator (108F) is mounted to the main frame (F) of the agricultural implement (D). Each second linear actuator (108S) is mounted onto the first movable member (108MF). Each second linear actuator (108S) is adapted to move corresponding media acquisition device (102a) through corresponding second movable member (108MS) thereby altering the position of the media acquisition device (102a) along a width wise direction of the agricultural vehicle based on the instructions received from the controller (104b). Each third linear actuator (not shown) is mounted onto the second movable member (108MS). Each third linear actuator (not shown) is adapted to move corresponding media acquisition device (102a) thereby altering the position of the media acquisition device (102a) along a height wise direction of the agricultural vehicle based on the instructions received from the controller (104b).
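As a non-limiting sketch of the three-axis arrangement just described, a camera repositioning request can be split into one signed travel command per actuator group (first: lengthwise, second: widthwise, third: heightwise). The names and the millimetre units are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ActuatorCommand:
    axis: str          # 'length', 'width' or 'height'
    travel_mm: float   # signed travel along that axis

def position_commands(current_mm: tuple[float, float, float],
                      target_mm: tuple[float, float, float]) -> list[ActuatorCommand]:
    """Split a camera repositioning request into per-axis commands for the
    first, second and third linear actuators respectively."""
    axes = ("length", "width", "height")
    return [ActuatorCommand(axis, t - c)
            for axis, c, t in zip(axes, current_mm, target_mm)
            if abs(t - c) > 1e-9]   # skip axes already in position
```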
[0054] The first movable member (108MF) is movably connected to the main frame (F) of the agricultural implement (D). For example, the first movable member (108MF) is slidably connected to the cross members (FC), (as shown in fig. 4) of the main frame (F) of the agricultural implement (D). The first movable member (108MF) defines a plurality of guide portions (not shown) corresponding to the guide portion provided on each cross member (FC) of the main frame (F) of the agricultural implement (D). For the purpose of this description and ease of understanding, the guide portion (not shown) defined on the first movable member (108MF) is considered to be a groove and correspondingly the guide portions (not shown) defined on each cross member (FC) of the main frame (F) of the agricultural implement (D) are considered to be protrusions, and vice versa.
[0055] Each second movable member (108MS) is movably connected to the first movable member (108MF) and spaced away from the other second movable member (108MS). For example, each second movable member (108MS) is slidably connected to the first movable member (108MF). Each second movable member (108MS) defines a plurality of guide portions (not shown) corresponding to the guide portion provided on the first movable member (108MF). For the purpose of this description and ease of understanding, the guide portion (not shown) defined on each second movable member (108MS) is considered to be a protrusion and correspondingly the guide portions (not shown) defined on the first movable member (108MF) are considered to be grooves, and vice versa.
[0056] For the purpose of this description and ease of understanding, each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) is considered to be an electric linear actuator. Each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) includes an electric motor (not shown), a plurality of gears (not shown), a leadscrew (not shown), a threaded member (not shown), a connecting rod (108FC, 108SC), a guide member (not shown) and a plurality of limit switches (not shown). The electric motor of each first, second and third linear actuator (108F, 108S) includes a controller unit (not shown) provided in communication with the controller (104b). The threaded member (not shown) is movably connected to the leadscrew (not shown).
[0057] One end of the connecting rod (108FC) of each first linear actuator (108F) is connected to the threaded member (not shown) and another end of the connecting rod (108FC), (as shown in fig. 4) of the first linear actuator (108F) is connected to the first movable member (108MF). One end of the connecting rod (108SC) of each second linear actuator (108S) is connected to the threaded member (not shown) and another end of the connecting rod (108SC), (as shown in fig. 4) of the second linear actuator (108S) is connected to the corresponding second movable member (108MS). One end of the connecting rod (not shown) of each third linear actuator (not shown) is connected to the threaded member (not shown) and another end of the connecting rod (not shown) of the third linear actuator (not shown) is connected to the corresponding media acquisition device (102a). For the purpose of this description and ease of understanding, the threaded member (not shown) is considered to be a nut. In an embodiment, the threaded member and the connecting rod (108FC, 108SC) are separate parts. In another embodiment, the threaded member and the connecting rod (108FC, 108SC) are considered to be a single integrated part. The guide member (not shown) is adapted to guide the movable connecting rod (108FC, 108SC) during operation of the linear actuator.
[0058] The electric motor of each first linear actuator (108F) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates corresponding leadscrew (not shown) through corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (108FC). The connecting rod (108FC) of each first linear actuator (108F) in turn moves the first movable member (108MF) thereby altering the position of media acquisition devices (102a) along the lengthwise direction of the agricultural vehicle.
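The motor-gears-leadscrew-nut chain above implies a simple kinematic relation: linear travel of the nut equals leadscrew turns times the screw lead, and leadscrew turns relate to motor turns through the gear ratio. As a non-limiting sketch (names and units are illustrative):

```python
def motor_revolutions(travel_mm: float, lead_mm: float,
                      gear_ratio: float) -> float:
    """Motor output-shaft revolutions needed for the nut (and hence the
    connecting rod) to travel `travel_mm` along the leadscrew.
    `lead_mm` is the leadscrew advance per screw turn; `gear_ratio` is
    motor turns per leadscrew turn."""
    screw_turns = travel_mm / lead_mm
    return screw_turns * gear_ratio
```

For example, moving the first movable member 40 mm on an 8 mm-lead screw through a 3:1 reduction needs 15 motor revolutions.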
[0059] The electric motor of each second linear actuator (108S) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates the corresponding leadscrew (not shown) through the corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (108SC). The connecting rod (108SC) of each second linear actuator (108S) in turn moves the second movable member (108MS) thereby altering the position of the media acquisition devices (102a) along the widthwise direction of the agricultural vehicle.
[0060] The electric motor of each third linear actuator (not shown) is activated by the controller (104b) through the controller unit of the electric motor. On energization of the electric motors, an output shaft (not shown) of each electric motor rotates the corresponding leadscrew (not shown) through the corresponding plurality of gears (not shown). As the leadscrew rotates, the threaded member (not shown) moves linearly on the leadscrew thereby moving the connecting rod (not shown) of each third linear actuator. The connecting rod (not shown) of each third linear actuator in turn moves the media acquisition devices (102a) thereby altering the position of the media acquisition devices (102a) along the height wise direction of the agricultural vehicle.
[0061] In another embodiment, the plurality of gears (not shown) can be replaced by one of chain and sprockets, and belt and pulley or a combination of both. Further, it is also within the scope of the invention to consider each first, second and third linear actuator (108F, 108S) of the camera position adjusting system (108) as one of a mechanical linear actuator, electro-pneumatic linear actuator, electro-hydraulic linear actuator, solenoid operated linear actuator, telescopic linear actuator, ball screw linear actuator, any other type of electric linear actuators and any other type of linear actuators.
[0062] The controller (104b) is configured to receive information about the soil condition of the agricultural fields and the weather condition from at least one of the user interface unit (204) and sensors (not shown) provided to the agricultural vehicle. In another embodiment, the media acquisition device (102a) can be movably mounted on at least one drone (not shown). The position of the media acquisition device (102a) provided on the drone can be altered automatically by the drone itself or manually by using a manually operated drone remote controller. The drone (not shown) is adapted to be provided in communication with the controller (104b) through a communication network such as, but not limited to, the Internet, a wired network (a Local Area Network (LAN), a Controller Area Network (CAN) network, a bus network, Ethernet and so on), a wireless network (a Wi-Fi network, a cellular network, a Wi-Fi Hotspot, Bluetooth, Zigbee and so on) and so on. The drone is provided with positioning modules, which can be, but are not limited to, a Global Positioning System (GPS) unit, a Local Positioning System (LPS), a Global Navigation Satellite System (GNSS) and so on. It should be noted that the drone disclosed herein may use any type of positioning system without otherwise deterring the intended function of collecting the geo position of at least one of crops, off variety plants, disease crops, weeds and the agricultural field, as can be deduced from this description and the corresponding drawings. The drone is adapted to capture video or spectral imagery of the agricultural field and sends or downloads at least one item of information to the controller (104b). The information sent from the drone (not shown) to the controller (104b) includes agricultural field data such as, but not limited to, plant data, soil data, plant location data and weed location data.
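As a non-limiting sketch of the drone-to-controller download described above, one field-data report could be carried as a JSON message and validated for the expected data categories before the controller consumes it. The message schema and key names here are hypothetical.

```python
import json

# Data categories the controller expects in each report (illustrative).
REQUIRED_KEYS = {"plant_data", "soil_data", "plant_locations", "weed_locations"}

def parse_drone_report(raw: str) -> dict:
    """Decode one field-data report downloaded from the drone and verify
    that all expected categories are present before use."""
    report = json.loads(raw)
    missing = REQUIRED_KEYS - report.keys()
    if missing:
        raise ValueError(f"incomplete drone report, missing: {sorted(missing)}")
    return report
```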
[0063] In another embodiment, a docking station can be provided on board the agricultural vehicle (not shown) for the at least one drone which collects visual and spectral field data. This data from the drone can be downloaded to the vehicle for analytics during docking of the drone to update digital maps for autonomous guidance. The drone can be launched from the vehicle for multiple missions after charging and can communicate with multiple machines for synchronized operations through telemetry modules on the drone and the vehicle.
[0064] Further, the sensor module (102) includes motion sensors and positioning modules. The positioning module is configured to provide information about the geo position of at least one of crops, off variety plants, disease crops, weeds and the vehicle. Examples of the positioning modules can be, but are not limited to, a Global Positioning System (GPS) unit, a Local Positioning System (LPS), a Global Navigation Satellite System (GNSS) and so on. It should be noted that the embodiments disclosed herein may use any type of positioning system without otherwise deterring the intended function of collecting the geo position of at least one of crops, weeds and the vehicle, as can be deduced from this description and the corresponding drawings. The controller (104b) is configured to receive inputs from the positioning modules and motion sensors of the sensor module (102). The controller (104b) is configured to generate at least one plant presence map (weed presence map, disease plant presence map, healthy crop presence map, off-variety plant presence map) based on the information from the drone.
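One non-limiting way to realise such a presence map is to bin geo-referenced detections into grid cells, yielding per-cell sets of plant classes. The cell size and the (lat, lon, label) tuple format are illustrative assumptions.

```python
from collections import defaultdict

def presence_map(detections: list[tuple[float, float, str]],
                 cell_deg: float = 0.0001) -> dict[tuple[int, int], set[str]]:
    """Bin geo-referenced detections (lat, lon, label) into grid cells,
    producing e.g. a weed presence map or disease plant presence map."""
    grid: dict[tuple[int, int], set[str]] = defaultdict(set)
    for lat, lon, label in detections:
        cell = (int(lat / cell_deg), int(lon / cell_deg))
        grid[cell].add(label)
    return dict(grid)
```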
[0065] Further, the plant identification system (10) may include a server and databases which include parameters or information corresponding to the identification of at least one plant. In an embodiment, the plant identification system (10) may include a cloud computing platform/system, where the cloud computing system can be part of a public cloud or a private cloud. The server may be a standalone server or a server on a cloud. Further, the server may be any kind of computing device such as, but not limited to, a personal computer, a notebook, a tablet, a desktop computer, a laptop, a handheld device, a mobile device, and so on. Although not shown, some or all of the devices in the plant identification system (10) can be connected to the cloud computing platform via a gateway. Also, the cloud platform can be connected to devices (drones, vehicles, user interface units, remote control systems and so on) located in the same or different geographical locations.
[0066] In another embodiment, the controller (104b) can be configured to control another control unit provided on another agricultural vehicle for synchronizing the geographical locations of the first and second agricultural vehicles employed for identification of various types of plants in the agricultural field, wherein the geographical locations of the first and second agricultural vehicles are synchronized to collect geo-position synchronized captured parameters required for identification of various plants at various locations of the agricultural field. In the same manner, a greater number of agricultural vehicles can be provided in communication with each other for synchronized and effective identification of various plants at various locations of the agricultural field.
[0067] FIG. 5 is a flow diagram 700 illustrating a method for identification of at least one plant, according to embodiments as disclosed herein. At step 702, the method (700) includes, capturing, by the at least one media acquisition device (102a), at least one media parameter corresponding to identification of at least one plant, where the at least one media acquisition device (102a) is provided on at least one of an agricultural implement (D) and an agricultural vehicle.
[0068] At step (704), the method (700) includes, processing, by a processing unit (104), the captured at least one media parameter received from the at least one media acquisition device (102a) to identify at least one plant.
[0069] Further, the method (700) includes altering, by a camera position adjusting system (108), a position of the at least one media acquisition device (102a) to capture desired media parameter required for identification of the at least one plant based on instructions received from a controller (104b) of the processing unit (104).
[0070] The method step (702) of capturing, by the at least one media acquisition device (102a), at least one media parameter corresponding to identification of at least one plant comprises,
capturing and communicating, by the at least one media acquisition device (102a), at least one of images of weeds, images of crops, images of off variety plants, video of the agricultural field, images of row crops, images of weeds grown along the crop and images of crop growth stages, to the processing unit (104).
[0071] The method step (704) of processing, by the processing unit (104), the captured at least one media parameter corresponding to the identification of at least one plant comprises,
classifying, by a learning module (104a) of the processing unit (104), the captured at least one media parameter received from the at least one media acquisition device (102a) to a list of crops, weeds, healthy crops, disease crops and off variety plants based on a list of trained data thereby identifying the at least one plant.
[0072] The method step of altering by, the camera position adjusting system (108), the position of at least one media acquisition device (102a) comprises at least one of,
moving, by at least one first linear actuator (108F), the at least one media acquisition device (102a) through a first movable member (108MF) thereby altering the position of the at least one media acquisition device (102a) along a lengthwise direction of the agricultural vehicle based on instruction received from the controller (104b) of the processing unit (104);
moving, by at least one second linear actuator (108S), the at least one media acquisition device (102a) through at least one second movable member (108MS) thereby altering the position of the at least one media acquisition device (102a) along a width wise direction of the agricultural vehicle based on instruction received from the controller (104b); and
moving by, at least one third linear actuator, the at least one media acquisition device (102a) thereby altering the position of the at least one media acquisition device (102a) along a height wise direction of the agricultural vehicle based on instruction received from the controller (104b).
[0073] Further, the method (700) comprises,
geographical referencing, by a positioning module, the at least one plant, to said processing unit (104);
providing, by a user interface unit (204), at least one user defined input corresponding to the identification of at least one plant, to an optimization module 202;
processing, by the optimization module 202, inputs received from the learning module (104a), the media acquisition device (102a), the positioning module and the user interface unit (204); and
generating, by the optimization module 202, an optimized control signal to the controller (104b) to identify the at least one plant in an optimal manner by at least one of controlling the shutter speed of the media acquisition device (102a), adjusting the position of the media acquisition device (102a), controlling the processing speed of the media acquisition device (102a) and speed of the agricultural vehicle through the controller (104b).
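The method steps above can be sketched end to end, in a non-limiting way, as a single pass combining classification against trained data with a camera repositioning decision. The feature vectors, labels, offsets and tolerance are hypothetical placeholders, not the claimed implementation.

```python
import math

def identify_plant(features: list[float],
                   trained: dict[str, list[list[float]]],
                   plant_offset_mm: float,
                   tolerance_mm: float = 5.0) -> tuple[str, float]:
    """One pass of the method (700): classify the captured features
    against the learning module's trained data (step 704), then report
    the camera travel, if any, needed to centre the identified plant
    in view (the altering step)."""
    # Nearest trained example decides the plant class.
    label = min((math.dist(features, ex), lbl)
                for lbl, exs in trained.items() for ex in exs)[1]
    # Only command a move if the plant is meaningfully off-centre.
    move = plant_offset_mm if abs(plant_offset_mm) > tolerance_mm else 0.0
    return label, move
```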
[0074] The various actions, acts, blocks, steps, or the like in the method and the flow diagram 700 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, modified, skipped, or the like without departing from the scope of the invention.
[0075] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 include blocks, which can be at least one of a hardware device, or a combination of hardware device and software module.
[0076] The embodiments disclosed herein describe methods and systems for identification of at least one plant using an agricultural implement of an agricultural vehicle. Therefore, it is understood that the scope of the protection is extended to such a program and, in addition, to a computer readable means having a message therein, such computer readable storage means containing program code means for implementation of one or more steps of the method, when the program runs on a server or a mobile device or any suitable programmable device.
[0077] The method is implemented in at least one embodiment through or together with a software program written in, e.g., Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL modules or several software modules being executed on at least one hardware device. The hardware device can be any kind of device which can be programmed, including, e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof, e.g. one processor and two FPGAs.
[0078] The device may also include means which could be e.g. hardware means like e.g. an ASIC, or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means and/or at least one software means. The method embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software. The device may also include only software means. Alternatively, the invention may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0079] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modifications within the spirit and scope of the embodiments as described herein.
REFERENCE NUMERALS
10 - Plant identification system
102a - Media acquisition device
104 - Processing unit
104a - Learning module
104b - Controller
107 - Storage unit
108 - Camera position adjusting system
108F - First linear actuator
108FC - Connecting rod of first linear actuator
108S - Second linear actuator
108SC - Connecting rod of second linear actuator
108MF - First movable member
108MS - Second movable member
202 - Optimization module
204 - User interface unit
302 - Media processing module
304 - Classification module
310 - Camera position adjusting module
D - Agricultural implement
F - Main frame of agricultural implement
FC - Cross member of Main frame
| # | Name | Date |
|---|---|---|
| 1 | 202041048169-STATEMENT OF UNDERTAKING (FORM 3) [04-11-2020(online)].pdf | 2020-11-04 |
| 2 | 202041048169-REQUEST FOR EXAMINATION (FORM-18) [04-11-2020(online)].pdf | 2020-11-04 |
| 3 | 202041048169-PROOF OF RIGHT [04-11-2020(online)].pdf | 2020-11-04 |
| 4 | 202041048169-POWER OF AUTHORITY [04-11-2020(online)].pdf | 2020-11-04 |
| 5 | 202041048169-FORM 18 [04-11-2020(online)].pdf | 2020-11-04 |
| 6 | 202041048169-FORM 1 [04-11-2020(online)].pdf | 2020-11-04 |
| 7 | 202041048169-DRAWINGS [04-11-2020(online)].pdf | 2020-11-04 |
| 8 | 202041048169-DECLARATION OF INVENTORSHIP (FORM 5) [04-11-2020(online)].pdf | 2020-11-04 |
| 9 | 202041048169-COMPLETE SPECIFICATION [04-11-2020(online)].pdf | 2020-11-04 |
| 10 | 202041048169-abstract.jpg | 2021-10-18 |
| 11 | 202041048169-FER.pdf | 2022-05-23 |
| 12 | 202041048169-CORRESPONDENCE [16-11-2022(online)].pdf | 2022-11-16 |
| 13 | 202041048169-OTHERS [16-11-2022(online)].pdf | 2022-11-16 |
| 14 | 202041048169-FER_SER_REPLY [16-11-2022(online)].pdf | 2022-11-16 |
| 15 | 202041048169-CLAIMS [16-11-2022(online)].pdf | 2022-11-16 |
| 16 | 202041048169-US(14)-HearingNotice-(HearingDate-21-06-2024).pdf | 2024-05-17 |
| 17 | 202041048169-Correspondence to notify the Controller [17-06-2024(online)].pdf | 2024-06-17 |
| 18 | 202041048169-FORM-26 [18-06-2024(online)].pdf | 2024-06-18 |
| 19 | 202041048169-Written submissions and relevant documents [03-07-2024(online)].pdf | 2024-07-03 |
| 20 | 202041048169-PatentCertificate20-12-2024.pdf | 2024-12-20 |
| 21 | 202041048169-IntimationOfGrant20-12-2024.pdf | 2024-12-20 |
| 22 | 202041048169- Certificate of Inventorship-044000200( 06-03-2025 ).pdf | 2025-03-06 |