Abstract: In our invention, an unmanned drone captures video of the field, and convolutional neural networks combined with the transfer learning technique identify the diseases and weeds affecting the crop; remedies for the problem are then suggested through the farmer's smartphone. The proposed model calculates the area over which a disease has spread using parameters from the drone, and also calculates the amount of chemicals necessary to neutralize the infection affecting the crop. The model comprises PIR sensors to address the unwanted-animal aspect of the problem by notifying the farmer, through the smartphone, of the area of movement whenever a sensor is triggered. The model proposed in our invention makes farming more efficient and profitable by decreasing the farmer's workload and increasing the yield. 4 Claims, 4 Figures
Description: Field of the Invention
The present invention pertains to monitoring crop health, detecting unwanted organisms on the field, and suggesting remedies when a disease or pest is detected.
Objective of the invention
The primary objective of this idea is a system that monitors agricultural fields and identifies diseases and animals affecting the crop as quickly as possible. The model uses Convolutional Neural Networks, a deep learning algorithm well suited to image classification. It classifies pictures of the crop as healthy or unhealthy and gives the remedies in precise measurements. PIR sensors are used to notify the farmer of unusual movement in the field, which is usually animals grazing. Hence it helps eliminate some of the biggest problems faced by farmers.
Background of the Invention
Agriculture plays a very important role in the development of a country, as it is responsible for feeding the growing population. Crop diseases and unwanted animals on the field are two of the main obstacles for agriculture: they harm the crops and lead to decreased or low-quality yield, which in turn causes food shortages and economic losses.
These problems are traditionally detected by manually observing weeds growing on the field or partially damaged crops and comparing the symptoms with those of known diseases. This method is laborious, and incomplete knowledge of the diseases that affect each crop leads the farmer to use excessive amounts of pesticides, insecticides, etc., which negatively affects the crop and costs more than necessary. Manually detecting animals on the field before they partially damage the crop is also not feasible, as it would require constant monitoring; detecting rodents in particular can be very difficult due to their small size.
Hence it is necessary to use technology to identify the problems that may affect the crop more accurately and faster than manual methods, so that the crops are treated at the right time with the least possible amount of chemicals, increasing the quality and yield of the crops.
Some of the patented technologies proposed to improve farming are:
IN202041032004 is an invention related to technology-driven and automated leaf disease detection for precision agriculture. The model collects leaf images through satellite imagery. To improve quality, noise removal is applied to control the quality of the input images. The images are then subjected to a segmentation process in order to identify regions of interest. Those regions are used for feature extraction and optimization to improve the quality of training. A deep-learning-based CNN is used as the machine learning algorithm. With the knowledge gained from pre-trained images, the deep learning approach yields a leaf disease detection model that is capable of discriminating healthy from infected leaves. The DAISY descriptor is used for feature extraction, as it represents features well and is suited to image matching. The proposed system is part of precision agriculture and can be integrated with any application.
CN110033015 is an invention that relates to a plant disease detection method based on a residual network. The method involves building a plant disease detection dataset from the new leaf disease image data set of AI Challenger, carrying out a statistical analysis of the varieties in the dataset, and augmenting the images using random-angle rotation and random horizontal and vertical mirroring, etc. A deep learning network (CNN) is applied to the pre-processed images, and ResNet is used to train the CNN, yielding a fast plant disease detection method in which 27 diseases across up to 10 crops can be accurately identified.
IN201621019775 is a system based on the application of the Internet of Things (IoT) in precision agriculture. It has a user-friendly mobile app through which the farmer controls an automated irrigation system, a farm security system and farm expense records. The system has a distributed wireless network of soil-moisture sensors placed in the root zone of the plants. In addition, a gateway unit processes the sensor information, transmits data to the mobile phone through a server and triggers actuators accordingly. The controlling action is taken with the authentication of the farmer, and the farmer can also control the water outlet irrespective of the soil conditions. Farm security is provided using wireless PIR sensors deployed at the boundaries of the field: when an animal crosses the boundary, a buzzer is activated to frighten it and a notification is sent to the smartphone. A farm expense record is provided to control the expenses of the farm, and past agricultural work data can also be stored in the mobile app.
IN202311013686 is an invention that relates to the field of agricultural monitoring and spraying systems: an agricultural drone for use in combination with agricultural equipment. The drone for monitoring and spraying includes a set of sensors to capture the various attributes of the surroundings; a frame with a thin, strong beam in the middle and two base structures at the two ends; four motors with propellers rotatably mounted to each of the two base structures of the frame; a tubing element with an array of nozzles suspended below the said beam by retractable wires; a memory for storing input data received from the one or more input sensors; a data log unit operatively coupled to the one or more processing units; a storage unit configured to store machine-readable instructions; one or more processing units for monitoring the agricultural activities; a communication module to communicate data to the connected device; and a power source positioned in one of the base structures.
IN202041036033 is also a plant disease detection system, which detects diseases from images of plant leaves. Its objective is to use feature extraction methods such as the Sobel operator and segmentation, with prediction done using computer vision. The method consists of the following steps: download the image from the server, convert the image to grayscale, show the defective part of the leaf, and predict the plant disease through mobile cloud computing.
With our invention we aim to improve agricultural field monitoring using technologies similar to those above.
Summary of the Invention
To solve the above problems, the present invention aims to make the disease and weed detection process more efficient using AI, and to use PIR sensors to detect unwanted animals such as rodents in the field. Disease detection is made easy through an application using convolutional neural networks, a deep learning technique for image classification and image segmentation, with the help of transfer learning.
Images of the crop are captured by a drone with a camera, and the pictures are processed by the deep learning model to check whether the crop is healthy. If the crop is affected by a disease, the model identifies the disease; if the model detects weeds or unnecessary plants growing, it determines the area over which they are growing.
The correct remedy is then suggested for the most efficient treatment.
PIR sensors connected to Arduinos are placed in the field; when a sensor detects movement, which may be an animal, it sends a signal to the Arduino, which in turn sends a notification to the system running the application. The farmer thus knows the part of the field and the time at which animals are damaging the crop, so the countermeasures taken can be more effective.
Brief Description of Drawings
The invention will be described in detail with reference to the exemplary embodiments shown in the figures wherein:
Figure 1: Picture showing healthy and infected leaves.
Figure 2: Flowchart of transfer of video from drone to app.
Figure 3: Flowchart of CNN algorithm with transfer learning.
Figure 4: Flowchart of working of PIR sensors and Arduino.
Detailed Description of the Invention
The drone is flown over the field once a day with the camera parallel to the ground, using an automated flight plan that follows a route at a fixed altitude (4 ft) and a fixed speed that does not affect the quality of the video, covering the field with the crop without deviation or intervention while the video of the crop is shot. The video is then transferred by wireless download to the Android smartphone containing the application running the CNN algorithm. The frame rate is decreased for faster processing of the video, and each frame is saved with its timecode value so that it can later be used to pinpoint the area over which that particular frame was shot.
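The frame-rate reduction and timecode bookkeeping described above can be sketched as follows (a minimal illustration; the function name and signature are ours, and a real pipeline would read frames from the video file with a library such as OpenCV):

```python
def select_frames(src_fps: float, target_fps: float, n_frames: int):
    """Pick which frames to keep when downsampling a video from
    src_fps to target_fps, and record each kept frame's timecode
    (seconds from the start of the video) for later area pinpointing."""
    if target_fps <= 0 or target_fps > src_fps:
        raise ValueError("target_fps must be in (0, src_fps]")
    step = src_fps / target_fps          # keep every `step`-th frame
    kept = []
    i = 0.0
    while i < n_frames:
        idx = int(i)
        kept.append((idx, idx / src_fps))  # (frame index, timecode in s)
        i += step
    return kept
```

For example, downsampling a 3-second clip shot at 30 fps to 5 fps keeps every sixth frame, each tagged with the time at which it was shot.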
A CNN is a deep learning algorithm that takes an image, assigns weights to components of the image, and classifies the entire image. It extracts features from the image automatically and combines them with learned parameters to detect patterns among the features. With enough training, the pre-processing needed on the images is significantly less than for other algorithms. In this model, the CNN classifies images to differentiate between healthy and diseased crops.
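The feature-extraction step a CNN performs can be illustrated with a single hand-written convolution (a toy NumPy sketch; real CNNs learn their kernels during training, whereas here a fixed vertical-edge kernel is used purely for illustration):

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A vertical-edge kernel responds strongly where intensity changes
# left-to-right, e.g. at the border of a lesion on a leaf.
edge_kernel = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]])
```

A trained CNN stacks many such learned kernels with nonlinearities, so that later layers respond to increasingly complex disease patterns rather than simple edges.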
Transfer learning is the process of applying a model created for one task to another. We first train the CNN model on a dataset similar to the images of the crop with its diseases and weeds. The same model is then trained again on a second dataset with a different class distribution; here, the images of crops with diseases and weeds are used. The names of the diseases and the remedies are saved along with the images in the dataset so that the farmer can be provided with the necessary information as fast as possible. As the model has already been trained on data similar to the crop disease images, a smaller dataset is sufficient to fine-tune the model to adequate accuracy.
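The fine-tuning step can be illustrated schematically: the feature extractor trained on the first (similar) dataset is frozen, and only a small classification head is re-trained on the second, smaller crop-disease dataset. The sketch below is a toy stand-in in NumPy (the frozen extractor is a fixed random projection rather than a real pre-trained CNN, and all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a CNN backbone pre-trained on a similar dataset:
# its weights are frozen and never updated during fine-tuning.
W_frozen = rng.normal(size=(16, 4))

def features(x: np.ndarray) -> np.ndarray:
    """Frozen feature extractor (random projection + ReLU)."""
    return np.maximum(x @ W_frozen, 0.0)

def fine_tune(X, y, epochs=500, lr=0.5):
    """Train only the logistic-regression head on the small dataset."""
    F = features(X)                    # backbone output, not updated
    w = np.zeros(F.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid
        g = p - y                                 # log-loss gradient
        w -= lr * F.T @ g / len(y)                # only head weights move
        b -= lr * g.mean()
    return w, b
```

Because only the small head is optimized, far fewer labelled crop images are needed than would be required to train the whole network from scratch.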
The application detects the type of disease of the crop, or the weed, from the frames of the video. It classifies each frame as infected or healthy and saves the infected frames with the name of the disease. The area infected by the disease or weed is calculated using the altitude and speed of the drone, the width of field covered by the camera, and the timecode values of the frames the application has classified as infected. The first and last frames classified as infected, with all frames in between also classified as infected, are taken; their timecode values give the time interval between them based on the video. The width of field covered by the camera (perpendicular to the direction of movement of the drone) is multiplied by this time interval and the speed of the drone to obtain the area covered, which is then pinpointed using the path followed by the drone. The application suggests the remedy and calculates the amount of pesticide or fungicide required using the following calculations:
Quantity of material required = (Rate of application / Active ingredient in %) × 100 × Area

Amount of pesticide = (Volume of spray solution (L) × % strength of pesticide solution to be sprayed) / % strength of pesticide given (L/kg)
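The two dosage formulas above translate directly into code (a sketch; the function and parameter names are ours, and units follow the formulas as stated):

```python
def material_quantity(rate_of_application: float,
                      active_ingredient_pct: float,
                      area: float) -> float:
    """Quantity of commercial product required:
    (rate of application / active ingredient %) x 100 x area."""
    return rate_of_application / active_ingredient_pct * 100.0 * area

def pesticide_amount(spray_volume_l: float,
                     target_strength_pct: float,
                     given_strength: float) -> float:
    """Amount of pesticide to mix into the spray solution:
    (spray volume (L) x target strength %) / strength of product (L/kg)."""
    return spray_volume_l * target_strength_pct / given_strength
```

For instance, applying a product with 50% active ingredient at a rate of 1 unit per unit area over 2 units of area requires 4 units of material.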
The farmer will use this information to spray the given type of chemicals, in the exact amount, on the infected area to ensure a good-quality yield.
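The area computation described earlier (swath width × drone speed × time between the first and last infected frames) reduces to simple arithmetic (a sketch; parameter names are illustrative):

```python
def infected_area(t_first: float, t_last: float,
                  drone_speed: float, swath_width: float) -> float:
    """Area covered between the first and last infected frame.

    t_first, t_last : timecodes of the two frames, in seconds
    drone_speed     : ground speed of the drone, in m/s
    swath_width     : width of field seen by the camera, perpendicular
                      to the flight direction, in metres
    Returns the infected area in square metres.
    """
    if t_last < t_first:
        raise ValueError("t_last must not precede t_first")
    return swath_width * drone_speed * (t_last - t_first)
```

For example, a drone flying at 2 m/s whose camera covers a 3 m swath, with infected frames spanning timecodes 10 s to 25 s, covers 90 m² of infected field.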
Solar-powered PIR sensors are placed in the field with a 20-metre gap between sensors, and each is connected to a solar-powered Arduino. The Arduino is connected to Wi-Fi. Whenever a PIR sensor is triggered, the Arduino sends a push notification with the location of the sensor to the Android or iOS smartphone of the farmer using the Firebase API. The farmer can use this information to scare the animals away or eliminate them.
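The notification step can be sketched as the message a gateway would post to Firebase Cloud Messaging when a sensor fires. This is a hedged sketch: the payload shape follows the FCM legacy HTTP API, the sensor-to-zone mapping is illustrative, and the actual HTTP send to the Firebase endpoint is deliberately omitted:

```python
import json

def pir_alert_payload(device_token: str, sensor_id: int,
                      zone: str, triggered_at: float) -> str:
    """Build the JSON body for an FCM push notification telling the
    farmer which part of the field a PIR sensor detected movement in."""
    body = {
        "to": device_token,                       # farmer's phone token
        "notification": {                         # shown on the phone
            "title": "Movement detected",
            "body": f"Sensor {sensor_id} triggered in {zone}",
        },
        "data": {                                 # machine-readable copy
            "sensor_id": str(sensor_id),
            "zone": zone,
            "timestamp": str(int(triggered_at)),
        },
    }
    return json.dumps(body)
```

The `data` section duplicates the alert in machine-readable form so the app can log the time and location of each trigger for the farmer's records.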
Advantages of the proposed model
The proposed model can detect obstacles the farmer may face before it is too late, which helps reduce loss of crop due to pests, weeds, animals, etc. The drone carrying the camera follows an automated flight plan, so the farmer does not need to learn how to fly a drone. All a person needs to use this model is the drone, a smartphone, the PIR sensors and the Arduinos.
The farmer does not need to manually monitor and check for abnormalities in the crop. The model provides the farmer with the right information and guidance required to maximize the yield resulting in high profits.
The farmer is notified any time abnormal movement is sensed by the PIR sensors, which may be an animal grazing the field, so the animal can be scared away or dealt with in time.
Claims: The scope of the invention is defined by the following claims:
1) The proposed model for agricultural field monitoring through smartphone using deep learning and IoT comprises:
a) An unmanned drone with a camera parallel to the ground for capturing the video of the field.
b) An image-segmentation and image-classification module to identify the disease of the crop from individual frames of the video using the CNN model with transfer learning.
c) PIR sensors connected to detect the motion of animals such as rodents, pigs, etc. and notify the farmer of the area over which the animals are damaging the crop.
2) As per claim 1, the image segmentation and image classification also detect whether any weeds or unnecessary plants are growing on the field.
3) As per claim 1, the model calculates and pinpoints the area over which the crop has been infected using information such as the speed and altitude of the drone, the width of the field covered by the camera, and the timecode values.
4) As per claim 1, the model identifies the type of disease affecting the crop, suggests the type of chemicals, and calculates the amount of chemicals required for the detected area.
| # | Name | Date |
|---|---|---|
| 1 | 202341067750-REQUEST FOR EARLY PUBLICATION(FORM-9) [10-10-2023(online)].pdf | 2023-10-10 |
| 2 | 202341067750-FORM FOR STARTUP [10-10-2023(online)].pdf | 2023-10-10 |
| 3 | 202341067750-FORM FOR SMALL ENTITY(FORM-28) [10-10-2023(online)].pdf | 2023-10-10 |
| 4 | 202341067750-FORM 1 [10-10-2023(online)].pdf | 2023-10-10 |
| 5 | 202341067750-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [10-10-2023(online)].pdf | 2023-10-10 |
| 6 | 202341067750-EVIDENCE FOR REGISTRATION UNDER SSI [10-10-2023(online)].pdf | 2023-10-10 |
| 7 | 202341067750-EDUCATIONAL INSTITUTION(S) [10-10-2023(online)].pdf | 2023-10-10 |
| 8 | 202341067750-DRAWINGS [10-10-2023(online)].pdf | 2023-10-10 |
| 9 | 202341067750-COMPLETE SPECIFICATION [10-10-2023(online)].pdf | 2023-10-10 |
| 10 | 202341067750-FORM-9 [28-10-2023(online)].pdf | 2023-10-28 |