Abstract: Disclosed herein is a spraying device having inbuilt image processing and deep learning mechanisms that detect diseases of two different crops cultivated at a time in the same field, and spray appropriate doses of selected pesticides on the diseased crops. The device comprises a carriable enclosure (100) having two compartments (102, 104) for filling with different pesticides; two spray nozzles (202, 302) coupled with two motor pumps (208, 308); a touch screen interface (400) mounted on the enclosure; an imaging sensor (500) disposed between the spray nozzles for capturing crop images; a microcontroller (600); and a power circuit (700). The motor pumps (208, 308) are adapted for pumping the pesticides from the compartments (102, 104) into the corresponding spray nozzles (202, 302), and spraying therefrom the pesticides onto two different target crops. The touch screen interface (400) provides access to an operator to select two crops out of a predefined set of crops, and tag each selected crop with one motor pump. The microcontroller (600) is configured to: detect whether the crops are healthy or affected by any disease type/class, estimate a severity factor of the disease, trigger appropriate signals for the motor driver (800) to turn on/off the corresponding motor pump (208, 308), and determine doses of the pesticides to be sprayed on the diseased crops by regulating the pumping pressure of the corresponding motor pump and the opening/closing of the corresponding nozzle. Fig. 1
Description:
FIELD OF THE INVENTION
The present invention broadly relates to agricultural crop sprayers. Particularly, the present invention relates to a portable, battery-operated, dual-compartment spraying device having an inbuilt image processing module and a deep neural network-based vision transformer model that detect diseases of two different crops cultivated at a time in the same field, and spray appropriate doses of selected pesticides on the diseased crops.
BACKGROUND OF THE INVENTION
Agricultural crops/plants/lands are very often infected by various diseases caused by insects/pests and microorganisms (bacteria, fungi, and viruses), thus affecting crop/plant growth and reducing overall agricultural yield. Therefore, pesticide spraying becomes inevitable to minimize crop damage by controlling insects, pests, and other microorganisms. Commercially available sprayers include backpack sprayers, hand sprayers, and truck-mounted spray tank systems. The backpack sprayer is the preferred choice of most farmers as it can be easily carried to regions/areas where large or stationary equipment cannot reach. However, such backpacks are either manual hand-piston operated, in which the farmer needs to pull/push the piston continuously, or motorized-pump operated, in which the farmer needs to squeeze the trigger valve (pumping lever) periodically; so continuous attention and labour are required throughout the spraying operation. Therefore, further automation is required in agricultural spraying operations to make the farmers' job easy and comfortable.
A reference may be made to US9079200B2, which discloses a multi-container, manual-pump-operated backpack sprayer in which a piston pump is used to pump chemicals from three containers into a user-actuated spray wand, and backflush valves are used to revert the excess chemical back into the containers. In particular, the operator has to reciprocate a handle bar in an up-and-down motion to cause a linkage assembly to reciprocate the piston in the piston pump, which in turn pressurizes the pressure chamber and pumps chemical out of a selected one of the containers. Therefore, operating this sprayer appears to be quite tedious and laborious, and the operator needs to manually identify the infected area of the crops before spraying the chemicals.
Another reference may be made to US10654068B2, which discloses a battery-operated induction electrostatic spray apparatus in which an electric centrifugal liquid pump is used to cause atomization of a fluid flow from a chemical reservoir into an electrostatic spray nozzle that further discharges the atomized fluid onto the crops. However, manual inspection of the diseased crops is required before initiating every spray.
Further, plant/crop diseases significantly harm global crop production, impacting food security and economies. With annual losses averaging 42% for key food crops, timely disease detection is crucial. Faster, more accurate detection methods are vital for improving agricultural productivity and combating these threats efficiently. Traditional disease monitoring methods relying on visual inspection and manual assessment are subjective and time-consuming. However, recent advancements in artificial intelligence (AI), the Internet of Things (IoT), and computer vision have opened up new opportunities for plant disease monitoring.
Moreover, it is very crucial to precisely identify the target infected area, the type of disease and its severity, and decide the appropriate amount/doses of pesticide to be applied thereon. Too little pesticide may not cure the crop disease, whereas excess pesticide may be poisonous for crop health and cause environmental pollution. Therefore, there is felt a need for an integrated control mechanism that can automatically detect crop diseases very precisely and activate the pump mechanism of the sprayer to precisely spray/sprinkle the pesticide on the crops at desired doses, thus curing the crop disease without causing environmental (soil/water) pollution.
Since conventional or commercially available agricultural sprayers have many limitations regarding pumping mechanism, sustainable design, application, and the complexity involved in manufacturing/configuration, it is required to devise an improved approach, especially a pesticide spraying device integrated with image processing/analysis and AI-based tools, which would in turn address the issues of crop disease detection/diagnosis and simultaneous pesticide application; thereby achieving automatic crop disease detection with severity classification, optimal utilization of pesticides, and minimized crop damage in a more simple, sustainable, eco-friendly, and cost-effective manner. Moreover, it is desired to develop a portable battery-operated agricultural spraying device integrated with image processing/analysis and AI-based mechanisms, which includes all the advantages of the conventional/existing techniques/methodologies and overcomes the deficiencies of such techniques/methodologies.
OBJECT OF THE INVENTION
It is an object of the present invention to develop an automatic portable battery-operated spraying device for relieving crop diseases in agricultural fields.
It is another object of the present invention to integrate mechanisms of crop/plant disease detection and simultaneous application of pesticides on infected crops, thus minimizing crop damages and enhancing crop yield.
It is one more object of the present invention to design an improved pesticide spray control mechanism for optimal utilization of pesticides in appropriate dosages, thus contributing towards plant/crop growth and environmental pollution control with sustainability.
It is a further object of the present invention to devise a cost-effective and user-friendly dual-compartment spraying device infused with an image processing module and a deep learning neural network-based vision transformer model for precisely detecting diseases of two different crops cultivated in the same land at a time, and simultaneously spraying pesticides from the corresponding compartments onto the infected crops at desired doses.
SUMMARY OF THE INVENTION
In one aspect, the present invention provides an automatic spraying device for crops. The device comprises a carriable enclosure having two compartments for filling with different pesticides; two spray nozzles coupled with two motor pumps; a touch screen interface mounted on the enclosure; an imaging sensor disposed between the spray nozzles; a microcontroller; and a power circuit. The motor pumps are adapted for pumping the pesticides from the compartments into the corresponding spray nozzles, and spraying therefrom the pesticides onto two different target crops cultivated at a time in one field. The touch screen interface provides access to an operator to select two crops out of a predefined set of crops, and tag each selected crop with one motor pump. The imaging sensor captures images of the cultivated crops before initiating a spray. The microcontroller is communicatively coupled with the motor pumps, the touch screen interface, and the imaging sensor. The power circuit is housed in the enclosure for supplying power to the motor pumps, the touch screen interface, the imaging sensor, and the microcontroller. The microcontroller is embedded with an image processing and deep learning neural network-based vision transformer model configured to: detect whether the selected crops are healthy or affected by any disease type/class, estimate a severity factor of the disease, trigger appropriate signals for the motor driver to turn on/off the corresponding motor pump, and determine doses of the pesticides to be sprayed on the diseased crops by regulating the pumping pressure of the corresponding motor pump and the opening/closing of the corresponding nozzle.
Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description, which delineate the present invention in different embodiments.
BRIEF DESCRIPTION OF DRAWINGS
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying figures.
Fig. 1 illustrates various components of the pesticide spraying device, in accordance with an embodiment of the present invention.
Fig. 2 illustrates pesticide release pipes/nozzles of the spraying device, in accordance with an embodiment of the present invention.
Fig. 3 illustrates power supply and signal/data transmission among various components of the pesticide spraying device, in accordance with an embodiment of the present invention.
Fig. 4 illustrates operation flow chart of the pesticide spraying device, in accordance with an embodiment of the present invention.
Fig. 5 illustrates vision transformer architecture as employed in the pesticide spraying device, in accordance with an embodiment of the present invention.
Fig. 6 illustrates canny edges and K-means clustering results of sample affected crop leaves for severity prediction as employed in the spraying device, in accordance with an embodiment of the present invention.
Fig. 7 illustrates disease severity percentage estimation technique as employed in the spraying device, in accordance with an embodiment of the present invention.
Fig. 8 illustrates the training accuracy vs validation accuracy graph of the improved trained vision transformer (proposed) model, in accordance with an embodiment of the present invention.
Fig. 9 illustrates training loss vs validation loss graph of the vision transformer model, in accordance with an embodiment of the present invention.
List of reference numerals
100 carriable enclosure
102 first compartment
102a top opening of first compartment
104 second compartment
104a top opening of second compartment
200 first pesticide release unit
202 first nozzle
204 first brass lance
206 first rubber pipe
208 first motor pump
300 second pesticide release unit
302 second nozzle
304 second brass lance
306 second rubber pipe
308 second motor pump
400 touch screen interface (display)
500 imaging sensor (camera)
600 microcontroller
700 power circuit
702 battery
704 voltage converter
800 motor driver
DETAILED DESCRIPTION OF THE INVENTION
Various embodiments described herein are intended only for illustrative purposes and subject to many variations. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but are intended to cover the application or implementation without departing from the scope of the present invention. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
The use of the terms "comprise", "include", or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the terms "an" and "a" herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. The term 'crop' used herein refers to all cultivated plants. The term 'pesticide' used herein refers to a liquid substance including chemicals, pest control solutions, fertilizers, biocontrol agents, and medicines which are used for disease control and plant/crop growth in agriculture/cultivation.
According to an embodiment of the present invention, as shown in Fig. 1, the spraying device for crops is depicted. The device comprises a carriable enclosure (100); a first pesticide release unit (200); a second pesticide release unit (300); a touch screen interface (400); an imaging sensor (500); a microcontroller (600); and a power circuit (700). The enclosure (100) has a first and a second compartment (102, 104) provided with pesticide filling top openings (102a, 104a). Two different pesticides are filled in the two compartments (102, 104) for application on two different target crops which are usually cultivated at the same time in the same field/land, thus saving the farmers' time, labour, and cost spent on pest control and crop disease management. The first pesticide release unit (200) includes a first nozzle (202), a first brass lance (204), a first rubber pipe (206), and a first motor pump (208), all being serially coupled together. The second pesticide release unit (300) includes a second nozzle (302), a second brass lance (304), a second rubber pipe (306), and a second motor pump (308), all being serially coupled together. The first motor pump (208) is coupled with the first compartment (102), and the second motor pump (308) is coupled with the second compartment (104). The motor pumps (208, 308) are adapted for pumping the pesticides from the corresponding compartments (102, 104) into the corresponding spray nozzles (202, 302) respectively, and spraying therefrom the pesticides onto the target crops. The microcontroller (600) is housed in the enclosure (100) and communicatively coupled with the motor pumps (208, 308), the touch screen interface (400), and the imaging sensor (500). The power circuit (700) is housed in the enclosure (100) for supplying power to the motor pumps (208, 308), the touch screen interface (400), the imaging sensor (500), and the microcontroller (600).
According to an embodiment of the present invention, as shown in Fig. 2, the imaging sensor (500) is a high-resolution camera that is preferably disposed/fitted between the first and the second spray nozzles (202, 302). The camera captures images of the cultivated crops and transmits the images to the microcontroller (600) for plant/crop disease detection through image processing and deep learning, based on which appropriate signals are generated for activating/deactivating the respective motor pumps to start/stop the pesticide pumping action. The spray nozzles (202, 302) have minute pores on their heads (having a pore size of 0.5 millimetre). The pores are opened or closed by an internally integrated solenoid-type valve that is electrically or electromagnetically operated based on the signal received from the microcontroller (600). The pumped pesticides pass through the pipes (206, 306), then the brass lances (204, 304), and finally release outside through the pores of the nozzles (202, 302).
According to an embodiment of the present invention, the carriable enclosure (100) is coupled with backpack straps, which offer the advantage of portability to the operator. The enclosure (100) is made of a light-weight yet high-strength plastic material (such as High-Density Polyethylene). The enclosure (100) has at least one transparent wall for ease of visibility of the pesticide filling in the compartments (102, 104).
According to an exemplary embodiment of the present invention, as shown in Fig. 3, the motor pumps (208, 308) are 12V DC (direct current) water pumps having 125 PSI pressure capacity, which are coupled to the pesticide outlets of the respective compartments (102, 104) of the enclosure (100) at one end, and an (L298N) motor driver (800) at the other end. The touch screen interface (400) is an LCD display that shows input images, disease detection results, and availability of the pesticides inside the compartments. The microcontroller (600) is a Raspberry Pi that is configured to receive inputs from the camera (500) and the LCD display (400), then generate appropriate signals for turning on/off the motor driver (800) with its pumping pressure regulation, and for the opening/closing valves of the nozzle heads of the corresponding pesticide release units (200, 300). The LCD display (400) is configured to show information associated with crop health status (healthy or diseased), battery charging status, and pesticide filling status. The power circuit (700) includes a 12V rechargeable battery (702) and a voltage converter (704). The voltage converter (704) converts 12V into 5V for running the Raspberry Pi (600).
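By way of non-limiting illustration, the following Python sketch shows how a Raspberry Pi could drive one pump channel of an L298N motor driver and one nozzle solenoid valve. The BCM pin numbers, the 1 kHz PWM frequency, and the spray() helper are assumptions made for illustration only, not features recited by the disclosure.

```python
import time
import RPi.GPIO as GPIO

# Hypothetical BCM pin assignments for one L298N channel and one valve relay
PUMP_IN1, PUMP_IN2, PUMP_ENA = 17, 27, 18
VALVE_PIN = 22

GPIO.setmode(GPIO.BCM)
GPIO.setup([PUMP_IN1, PUMP_IN2, PUMP_ENA, VALVE_PIN], GPIO.OUT)

pwm = GPIO.PWM(PUMP_ENA, 1000)  # 1 kHz PWM on the L298N enable pin
pwm.start(0)

def spray(duty_cycle, duration_s):
    """Run the pump at duty_cycle (0-100) with the nozzle valve open."""
    GPIO.output(PUMP_IN1, GPIO.HIGH)   # fixed pumping direction
    GPIO.output(PUMP_IN2, GPIO.LOW)
    GPIO.output(VALVE_PIN, GPIO.HIGH)  # energize solenoid: open nozzle pores
    pwm.ChangeDutyCycle(duty_cycle)    # higher duty cycle -> higher pressure
    time.sleep(duration_s)
    pwm.ChangeDutyCycle(0)             # stop pumping
    GPIO.output(VALVE_PIN, GPIO.LOW)   # de-energize solenoid: close pores
```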
According to an embodiment of the present invention, the microcontroller (600) is embedded with an image processing module and a deep learning model configured to: detect whether the crops are healthy or affected by any disease type/class, estimate a severity factor of the disease, trigger appropriate signals for the motor driver (800) to turn on/off the corresponding motor pump (208, 308) based on the disease type/class and the severity factor, and determine doses of the pesticides to be sprayed on the diseased crops by regulating the pumping pressure of the corresponding motor pump and the opening/closing of the corresponding nozzle heads.
In an exemplary embodiment, the raw images as captured by the camera undergo a few standardized operational steps such as preprocessing (resizing, noise removal, and grayscale conversion) and removal of background in the images; the resultant images are then fed into the deep neural network followed by the image processing module.
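A minimal sketch of such a preprocessing step is given below, assuming OpenCV; the HSV foliage range used for background removal is an illustrative assumption, not a disclosed calibration.

```python
import cv2

def preprocess(frame, size=(224, 224)):
    """Resize, denoise, convert to grayscale, and mask out the background."""
    img = cv2.resize(frame, size)                 # resizing
    img = cv2.GaussianBlur(img, (5, 5), 0)        # noise removal
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # grayscale conversion
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    # Crude background removal: keep foliage-coloured pixels (assumed HSV range)
    mask = cv2.inRange(hsv, (25, 40, 40), (95, 255, 255))
    foreground = cv2.bitwise_and(img, img, mask=mask)
    return foreground, gray
```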
According to an embodiment of the present invention, as shown in Fig. 4, the spraying operation is depicted. The operator carries the pesticide-filled spraying device on his/her back and switches on its power button. First, the operator gives the inputs by selecting two target crop/plant names (crop-1, crop-2) out of a predefined crop/plant list, then tags/assigns crop-1 and crop-2 to motor-1 and motor-2 respectively. In other words, motor-1 is assigned to spray the first pesticide from the first compartment to cure a disease associated with crop-1. Similarly, motor-2 is assigned to spray the second pesticide from the second compartment to cure a disease associated with crop-2. Now the device is ready for taking images and spraying the appropriate pesticides from the corresponding compartments. The images are transmitted to the microcontroller for analysis (disease detection). If no disease is found on the target body/part of the crops, then the motor pumps remain turned off, and no spraying happens. If a crop-1 disease is found, then the microcontroller activates motor-1 through the motor driver, resulting in spraying the first pesticide from the first compartment onto the target part of crop-1. Similarly, if a crop-2 disease is found, then the microcontroller activates motor-2 through the motor driver, resulting in spraying the second pesticide from the second compartment onto the target part of crop-2.
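Purely for illustration, this operation flow could be organized as the following loop; detect(), severity_percent(), dose_parameters(), and the per-pump spray() variant are hypothetical helpers (sketched elsewhere in this description), and the camera index is assumed.

```python
import cv2

CROP_TO_PUMP = {"corn": 1, "potato": 2}  # set via the touch-screen tagging step

cap = cv2.VideoCapture(0)                # camera fitted between the nozzles
while True:
    ok, frame = cap.read()
    if not ok:
        continue
    # detect() is a hypothetical wrapper around the trained vision transformer
    crop_name, disease = detect(frame)
    if disease is None or crop_name not in CROP_TO_PUMP:
        continue                         # healthy or untagged crop: no spray
    pump_id = CROP_TO_PUMP[crop_name]    # which of the two pumps to drive
    pct, _ = severity_percent(frame)     # equation (1), sketched further below
    duty, open_s = dose_parameters(pct)  # severity -> pressure/valve timing
    spray(pump_id, duty, open_s)         # hypothetical per-pump spray() variant
```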
According to an embodiment of the present invention, the deep learning neural network employs a vision transformer architecture/model that is trained using multiple large crop disease datasets containing various crops and diseases. The vision transformer architecture is selected for the present invention as it is flexible and can work well with different sizes of images; in particular, it breaks down the captured images (of leaves, stems, fruits, and flowers of crops) into patches, then processes them using transformers, and finally aggregates the information for crop disease detection/classification. For example, the trained model figures out what is in the images (a healthy leaf/crop, a diseased leaf/crop, or non-crop material), and accordingly the microcontroller triggers signals for the motor pumps.
According to an embodiment of the present invention, as shown in Fig. 5, the vision transformer processes an input image by dividing it into non-overlapping patches of a fixed size. These patches are then linearly transformed into flat vectors, serving as input for subsequent layers. Undergoing linear transformation, the patch embeddings are converted into feature vectors which are subsequently reshaped into a 2D grid, forming the input for the transformer encoder. To compensate for the lack of inherent spatial information, position embeddings are introduced; these embeddings provide the model with positional awareness, with options including learned or fixed sinusoidal functions. Comprising multiple identical layers, the transformer encoder incorporates self-attention mechanisms and feedforward neural networks. Each sub-layer in the transformer encoder, including attention and feedforward, is succeeded by layer normalization and connected through residual connections, which promotes training stability and gradient flow. The self-attention involves calculating attention scores between each position and determining the influence of other positions, which allows the model to focus on various parts of the input sequence, capturing dependencies over longer ranges. Post self-attention, the output traverses a feedforward neural network comprising fully connected layers, which introduces non-linearities, enabling the model to discern complex patterns. Following the transformer encoder, the output undergoes global average pooling to aggregate information. Subsequently, a Multi-Layer Perceptron (MLP) head, featuring one or more fully connected layers, generates the final output for classification. The operational/prediction efficiency is further improved by introducing pre-trained weights, such as Data-efficient Image Transformer weights, in the patch/position embedding stage. Incorporation of the Data-efficient Image Transformer into the vision transformer elevates its performance, rendering it more data-efficient and capable of achieving competitive outcomes with reduced computational resources.
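A condensed PyTorch sketch of the described pipeline (patch embedding, learned position embeddings, pre-norm transformer encoder, global average pooling, and an MLP head) follows. The embedding dimension, depth, and head count are illustrative assumptions; the 88 output classes match Table 1.

```python
import torch
import torch.nn as nn

class MiniViT(nn.Module):
    """Condensed vision-transformer classifier following the steps above."""
    def __init__(self, img=224, patch=16, dim=384, depth=6, heads=6, classes=88):
        super().__init__()
        n_patches = (img // patch) ** 2
        # Patch embedding as a strided convolution (linear projection of patches)
        self.embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        # Learned position embeddings supply the missing spatial information
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4,
                                           batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)
        self.head = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, classes))

    def forward(self, x):                             # x: (B, 3, 224, 224)
        x = self.embed(x).flatten(2).transpose(1, 2)  # (B, n_patches, dim)
        x = self.encoder(x + self.pos)                # self-attention + FFN stack
        return self.head(x.mean(dim=1))               # global average pooling -> MLP head
```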
According to an embodiment of the present invention, after distinguishing the diseased crops from the healthy crops based on visual characteristics, the model is configured to predict the type of the detected crop disease and its severity factor (quantifying the degree of damage). The diseased/damaged regions are outlined through contour drawing using the Canny edge detector, visually depicting the extent of damage as depicted in Fig. 6. Analyzing the damaged areas within the contours helps to quantify severity by measuring size and shape. Frame segmentation is preferred for improved accuracy in detecting disease-related features in individual frames. To enhance prediction, the image processing module employs an algorithm such as K-means clustering, in which similar data points are grouped to analyze severity patterns and predict disease levels. K-means clustering is selected for the present invention as it offers simplicity, computational efficiency, and high accuracy, and helps in estimating the severity percentage/factor as shown in Fig. 7. The severity percentage (factor) is estimated by dividing the damaged area (in pixels) of the crop appearing in the image by the total area (in pixels) of the corresponding image, followed by multiplication by one hundred, as shown in equation (1):

Severity factor (%) = (damaged area in pixels / total image area in pixels) × 100 … (1)
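A minimal sketch of this severity estimation is given below, assuming OpenCV K-means over pixel colours and the illustrative heuristic that the cluster centre with the lowest green component marks diseased tissue.

```python
import cv2
import numpy as np

def severity_percent(leaf_bgr, k=3):
    """Estimate damage severity per equation (1) via K-means colour clustering."""
    pixels = leaf_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    # Assumed heuristic: the least-green cluster centre marks diseased tissue
    diseased = int(np.argmin(centers[:, 1]))   # lowest green (BGR index 1)
    damaged_px = int(np.sum(labels == diseased))
    total_px = leaf_bgr.shape[0] * leaf_bgr.shape[1]
    # Canny edges can outline the damaged regions for display, as in Fig. 6
    edges = cv2.Canny(leaf_bgr, 100, 200)
    return 100.0 * damaged_px / total_px, edges
```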
Based on the severity factors of the diseased crops, the microcontroller further regulates the pumping pressure of the motor pump, and opens/closes the pores of the corresponding nozzle heads. For example, if the severity factor is high and the affected area is wide/large, then the pumping pressure is increased to release a larger amount of the pesticide on the damaged areas, and the pores of the corresponding nozzle head remain open for a longer duration. On the contrary, if the severity factor is low and the affected area is small, then the pumping pressure is decreased to release a smaller amount of the pesticide on the damaged areas, and the pores of the corresponding nozzle head are closed intermittently. In this way, the spray doses are regulated with optimal usage automatically based on the type of disease and its severity, without any human interference.
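As a hypothetical illustration of this regulation, the severity factor could be mapped linearly to a PWM duty cycle (pumping pressure) and a valve-open time; the numeric ranges below are assumptions, not disclosed calibration values.

```python
def dose_parameters(severity_pct):
    """Map severity (%) to pump duty cycle and valve-open time (assumed ranges)."""
    severity = max(0.0, min(100.0, severity_pct))
    duty = 40 + 0.6 * severity      # 40-100 % PWM: higher severity, more pressure
    open_s = 0.5 + 0.05 * severity  # 0.5-5.5 s of continuous valve opening
    return duty, open_s
```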
According to an embodiment of the present invention, the proposed image processing and deep learning modules are developed using Python 3.9, and analysed using a dataset of almost 80,000 images from 88 different crop disease classes procured from publicly available databases, whose details are shown in Table 1.
Table 1
| Sr. No | Class | No. of Images | Sr. No | Class | No. of Images |
|---|---|---|---|---|---|
| 1 | Apple rust | 357 | 28 | Grape healthy | 470 |
| 2 | Apple black rot | 621 | 29 | Grape leaf blight | 889 |
| 3 | Apple healthy | 1649 | 30 | Jamun diseased | 345 |
| 4 | Apple scab | 700 | 31 | Jamun healthy | 279 |
| 5 | Cassava bacterial blight | 486 | 32 | Lemon diseased | 77 |
| 6 | Cassava brown streak | 558 | 33 | Lemon healthy | 159 |
| 7 | Cassava green mottle | 471 | 34 | Mango diseased | 265 |
| 8 | Cassava healthy | 677 | 35 | Mango healthy | 170 |
| 9 | Cassava mosaic disease | 444 | 36 | Peach bacterial spot | 2297 |
| 10 | Cherry healthy | 906 | 37 | Peach healthy | 363 |
| 11 | Cherry powdery mildew | 1052 | 38 | Pepper bell bacterial spot | 1067 |
| 12 | Chili healthy | 100 | 39 | Pepper bell healthy | 1539 |
| 13 | Chili leaf curl | 100 | 40 | Pomegranate diseased | 272 |
| 14 | Chili leaf spot | 100 | 41 | Pomegranate healthy | 287 |
| 15 | Chili whitefly | 100 | 42 | Potato early blight | 1096 |
| 16 | Chili yellowish | 100 | 43 | Potato healthy | 152 |
| 17 | Coffee cercospora leaf spot | 55 | 44 | Potato late blight | 1093 |
| 18 | Coffee healthy | 439 | 45 | Rice brown spot | 653 |
| 19 | Coffee red spider mite | 167 | 46 | Rice healthy | 1488 |
| 20 | Coffee rust | 442 | 47 | Sugarcane red rot | 174 |
| 21 | Corn common rust | 1308 | 48 | Rice hispa | 565 |
| 22 | Corn gray leaf spot | 1094 | 49 | Rice leaf blast | 981 |
| 23 | Corn healthy | 1162 | 50 | Rice neck blast | 100 |
| 24 | Corn northern leaf blight | 1223 | 51 | Strawberry leaf scorch | 1109 |
| 25 | Cucumber diseased | 350 | 52 | Strawberry healthy | 456 |
| 26 | Cucumber healthy | 341 | 53 | Sugarcane bacterial blight | 100 |
| 27 | Guava diseased | 142 | 54 | Sugarcane healthy | 180 |
| 55 | Guava healthy | 277 | 72 | Soyabean caterpillar | 3309 |
| 56 | Grape black measles | 1383 | 73 | Soyabean diabrotica speciosa | 2205 |
| 57 | Grape black rot | 11390 | 74 | Soyabean downy mildew | 51 |
| 58 | Sugarcane red strip | 53 | 75 | Soyabean healthy | 5998 |
| 59 | Sugarcane rust | 93 | 76 | Soyabean mosaic virus | 22 |
| 60 | Tea algal leaf | 339 | 77 | Soyabean powdery mildew | 77 |
| 61 | Tea anthracnose | 300 | 78 | Soyabean rust | 65 |
| 62 | Tea bird eye spot | 300 | 79 | Soyabean southern blight | 62 |
| 63 | Tea brown blight | 339 | 80 | Tomato bacterial spot | 2136 |
| 64 | Tea healthy | 222 | 81 | Tomato healthy | 1599 |
| 65 | Tea red leaf spot | 429 | 82 | Tomato late blight | 1919 |
| 67 | Wheat brown rust | 916 | 83 | Tomato leaf mold | 957 |
| 68 | Wheat healthy | 1225 | 84 | Tomato mosaic virus | 382 |
| 69 | Wheat septoria | 97 | 85 | Tomato septoria leaf spot | 1782 |
| 70 | Wheat yellow rust | 1132 | 86 | Tomato spider mite | 1676 |
| 71 | Soyabean bacterial blight | 56 | 87 | Tomato target spot | 1404 |
| | | | 88 | Tomato yellow leaf curl virus | 3214 |
For real-time detection, bounding box annotation is performed using Roboflow to label the different classes. The time required for converting low-resolution images to high-resolution ones is an important factor to consider; through practical testing, the conversion process is found to take approximately 0.4 seconds. To expedite the labelling process, an auto-labelling tool is developed, allowing for data labelling based on prior learning in 1.2 seconds. The training and validation process utilizes the pre-processed high-resolution images. For real-time detection, the overall dataset is split in a 70-20-10 ratio: 70% of the images are used for training, 20% for model validation, and the remaining 10% for testing.
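A minimal sketch of such a 70-20-10 split is shown below, assuming scikit-learn and hypothetical paths/labels lists of image file paths and class labels.

```python
from sklearn.model_selection import train_test_split

# paths and labels are hypothetical, parallel lists of image files and classes.
# Stratifying by label preserves the per-class proportions in each split.
train_x, rest_x, train_y, rest_y = train_test_split(
    paths, labels, test_size=0.30, stratify=labels, random_state=42)
# One third of the remaining 30% -> 10% test; the other two thirds -> 20% val
val_x, test_x, val_y, test_y = train_test_split(
    rest_x, rest_y, test_size=1/3, stratify=rest_y, random_state=42)
```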
Referring to Figs. 8-9, the model achieves an accuracy of 99.12%. When the validation loss increases and the validation accuracy decreases, it indicates that the model is memorizing values rather than learning. An increase in validation loss accompanied by an increase in validation accuracy suggests overfitting or diverse probability values, indicating the need to employ the softmax function in the output layer. A decrease in validation loss and an increase in validation accuracy indicate that the model is learning and functioning effectively.
The proposed vision transformer model is compared with other published models (such as ResNet-50, DenseNet, and VGG-19) to evidence improved technical effects in terms of the highest real-time accuracy (99.12%) and severity factor computation parameters, as shown in Table 2.
Table 2
| Parameters | Amara et al. | Vin Gia Nhi et al. | Brahimi et al. | Wang et al. | Garg et al. | Present Invention Model |
|---|---|---|---|---|---|---|
| Dataset | Banana Dataset (Plant Village) | Plant Village Dataset | Plant Village Dataset | Apple Dataset (Plant Village) | Plant Village Dataset | Plant Merged Dataset |
| No. of Species | 1 | 14 | 14 | 1 | 14 | 27 |
| No. of Classes | 3 | N/A | 38 | 4 | 38 | 88 |
| Number of Images | 3700 | 54306 | 54323 | 2086 | 54305 | 79046 |
| Real Time | N/A | N/A | N/A | N/A | N/A | Vision Transformer |
| Learning Rate | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 |
| Activation Function | Sigmoid | N/A | N/A | SGD | Softmax | Softmax |
| Batch Size | 10 | N/A | 20 | N/A | 64 | 32 |
| Train-Test Ratio | 60%-40% | N/A | 80%-20% | 80%-20% | 80%-20% | 70%-20%-10% |
| Optimizer | SGD | N/A | N/A | N/A | Adam | Adam |
| No. of Iterations | 30 | N/A | N/A | N/A | 30 | 40 |
| Real Time Accuracy | N/A | N/A | N/A | N/A | N/A | 0.9912 |
According to an exemplary embodiment of the present invention, the spraying device becomes advantageous in terms of saving the farmers' time, money, energy, and labour, since it can detect diseases of two crops cultivated at a time in the same field and spray two different pesticides as needed in the required doses/quantity. For example, corn can be grown with potato at the same time in the same field. In that event, the operator can choose the 'corn' and 'potato' names in the LCD display, then assign motor-1 for 'corn' and motor-2 for 'potato'. The operator then points the nozzles/camera towards the crop leaves/stems/flowers (target parts) to initiate image capturing and the subsequent disease detection with spraying action. If diseased leaves of corn are detected in the images, then motor-1 is turned on to start spraying the corresponding pesticide on the corn leaves. If diseased leaves of potato are detected in the images, then motor-2 is turned on to start spraying the corresponding pesticide on the potato leaves. The appropriate/optimal doses/quantity (threshold range) of pesticides required for curing various crop diseases at different severity conditions are predetermined as per standard agricultural practices, and the same are coded in the microcontroller. As soon as the microcontroller detects any of the listed diseases as selected and estimates its severity percentage, signals are generated for activation/deactivation of the corresponding pump and nozzle for releasing the desired pesticide at the correct dose (for example, 5 ml-10 ml per plant/crop). Further, the doses of pesticide spraying automatically vary based on the severity factor as estimated by equation (1), thus achieving optimal and precise usage of pesticides for crop protection from various diseases, increasing agricultural yield and minimizing environmental pollution.
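Purely as an illustration of how such predetermined threshold ranges might be coded in the microcontroller, a hypothetical lookup table interpolated by the severity factor could look like this; the entries are placeholders, not the standard agricultural doses referred to above.

```python
# Hypothetical dose table: (crop, disease) -> (min_ml, max_ml) per plant.
# Actual values would come from standard agricultural practices.
DOSE_TABLE = {
    ("corn", "common rust"):   (5.0, 10.0),
    ("potato", "late blight"): (5.0, 10.0),
}

def dose_ml(crop, disease, severity_pct):
    """Interpolate the per-plant dose between the coded threshold range."""
    lo, hi = DOSE_TABLE[(crop, disease)]
    return lo + (hi - lo) * severity_pct / 100.0  # dose scales with equation (1)
```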
The foregoing descriptions of exemplary embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable persons skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but these are intended to cover the application or implementation without departing from the scope of the claims of the present invention.
Claims: We Claim:
1. A spraying device for crops, the device comprising:
a carriable enclosure (100) having a first and a second compartment (102, 104) dimensioned for filling with different pesticides;
a first and a second motor pump (208, 308) adapted for pumping the pesticides from the first and the second compartments (102, 104) into a first and a second spray nozzle (202, 302) respectively, and spraying therefrom the pesticides onto two different crops cultivated at the same time in the same field;
a touch screen interface (400) mounted on an outer surface of the enclosure (100) for providing access to an operator to select two crops out of a predefined set of crops, and tag one selected crop with the first motor pump (208) and another selected crop with the second motor pump (308);
an imaging sensor (500) disposed between the first and the second spray nozzles (202, 302) for capturing images of the cultivated crops;
a microcontroller (600) communicatively coupled with the motor pumps (208, 308), the touch screen interface (400), and the imaging sensor (500); and
a power circuit (700) housed in the enclosure (100) for supplying power to the motor pumps (208, 308), the touch screen interface (400), the imaging sensor (500), and the microcontroller (600),
wherein the microcontroller (600) is embedded with an image processing module and a deep neural network configured to:
detect if the selected crops are affected by any disease type/class through a trained vision transformer model deployed on the captured images,
estimate a severity factor of the disease as detected in the images through division of damaged area of the crop appearing in the image by total area of the corresponding image,
trigger signals to a motor driver (800) that turns on/off the corresponding motor pump (208, 308) based on the disease type/class and the severity factor; and
determine doses of the pesticide sprayable on the diseased crops based on the severity factor, thereby regulating pumping pressure of the corresponding motor pump (208, 308), and opening/closing of pores of the corresponding nozzle (202, 302).
2. The device as claimed in claim 1, wherein the carriable enclosure (100) is coupled with backpack straps.
3. The device as claimed in claim 1, wherein the carriable enclosure (100) has at least one transparent wall for ease of visibility of the pesticide filling in the compartments (102, 104).
4. The device as claimed in claim 1, wherein the spray nozzles (202, 302) are coupled with the motor pumps (208, 308) through brass lances (204, 304) and rubber pipes (206, 306) therebetween.
5. The device as claimed in claim 1, wherein the pores of the spray nozzles (202, 302) are opened/closed by electrically operated valves.
6. The device as claimed in claim 1, wherein the motor pumps (208, 308) are 12V DC (direct current) water pumps having 125 PSI pressure capacity.
7. The device as claimed in claim 1, wherein the power circuit (700) includes a 12V rechargeable battery (702), and a voltage converter (704).
8. The device as claimed in claim 1, wherein the touch screen interface (400) is configured to display information associated with crop health status, battery charging status, and pesticide filling status.
9. The device as claimed in claim 1, wherein the damaged area is measured by canny edge and K-means clustering.