Abstract: India is home to 1.4 billion people, and its agricultural industry must continually meet the food demand of this rapidly growing population. India is an agricultural powerhouse of the world; however, erratic weather and unfavourable environmental conditions lead to a myriad of crop diseases. Traditional agricultural methods make it challenging to keep track of crops and prevent disease, which can compromise the food security of the Indian population. Hence, deploying a plant disease detection system together with a surveillance system can enhance crop productivity by improving efficiency with minimal human intervention and assistance. This document describes in depth the model under development, which monitors plant growth and detects diseases at the early stages of crop development. The pre-trained model uses a line-tracking algorithm to move around the farm and inspect every crop in detail. Pictures of the crops are taken and stored for monitoring and analysis of plant health, enabling real-time detection and reporting. Convolutional Neural Networks and the Transfer Learning technique help scrutinize the images for diseases. When a disease is detected, the system outputs its name and an immediate, effective treatment for its cure. The growth and development trends of healthy plants help estimate the expected crop yield, and the challenges likely to arise, for the next planting season. Such techniques help refine agricultural practices and open opportunities for diverse innovations with effective and practical solutions.
Description: CROP MONITORING WITH AI BASED AUTONOMOUS FARM ROVER
Field of Invention
The proposed invention relates to agricultural surveillance for detecting anomalies in plant growth. It describes in detail a plant disease detection methodology implemented through a system that monitors and analyses the health trends of a crop during the cultivation season.
Background of the invention
Farming is a means of sustaining posterity. Meeting the demands of a large population with traditional methods of farming is arduous. Farmers put their hard work and endurance into cultivating their lands and producing a good yield, but the inadequacy of modern equipment directly or indirectly hinders their livelihoods. Putting a safe, sustainable, flexible, and reliable tool into the hands of a farmer can be a boon to modern agriculture. Many proposed models help refine conventional agricultural techniques, hence providing farmers with the security of a healthy, substantial yield.
For instance, US10698402B2 discloses an agricultural robot comprising a sensor module, a speaker operable to transmit a directional acoustic signal at an object, a microphone operable to register a reflection of the acoustic signal, and a sound analyzer. It distinguishes between plant and non-plant objects. The agricultural robot identifies characteristics of the object such as its variety, its fecundity and abundance on the plant, the ripeness of the fruit, health conditions, the abundance of foliage, and the level of pest infestation on the fruit. This system, however, relies on acoustic signals and determines only the health conditions of the fruit on the plant.
Similarly, US10539545B2 relates to systems and methods for monitoring fruit production, plant growth, and plant vitality, and also to a monitoring system for the transportation and delivery chains of agricultural products. The system uses a thermal imaging camera to determine plant quality based on thermal data. It assembles three-dimensional point cloud data for a plant with a laser scanner to create a plurality of three-dimensional vertices with three-dimensional coordinates. It provides various instructions, such as geo-registering the assembled points and associating them with GPS coordinates and pixels of photographic data. It also provides classification data for the plant comprising stem diameter, height, volume, and leaf density, and instructions to control a fruit tree sprayer considering the conditions and various parameters of the plant. However, the absence of a disease-detection system, and of a method that analyses health trends and growth patterns and predicts the next harvest, is a limitation.
US7854108B2 also relates to an agricultural robot system and a method of harvesting, pruning, culling, weeding, measuring, and managing crops. It uses autonomous and semi-autonomous robots comprising machine vision, using a camera that identifies and locates the fruit on each tree or the points on a vine to prune, and that may be utilized to measure agricultural parameters for managing agricultural resources. The scouting robot autonomously scouts the field without operator intervention: it moves to the grapevines, uses a camera to obtain data associated with agricultural elements, and finally geo-references the data.
WO2017194276A1 provides a system, method and computer program product for determining plant diseases. The system includes an interface module that receives an image of a plant, to which color normalization techniques are applied. Several other modules, such as an extractor module, a filtering module, and a plant disease diagnosis module, are used to evaluate multiple factors of various plant elements and probabilities to identify the presence of a particular disease. The extractor module extracts various portions of the normalized image, while the filtering module identifies one or more clusters by one or more visual features within the extracted image portions to show characteristics of a plant disease. It uses a predefined threshold with a Bayes classifier that models visual feature statistics.
US8577616B2 proposes methods of compiling a database of images of various plant species and describes the use of the database to identify plants of unknown genus. Images of the apical complexes of plants are obtained and stored in a database to allow a comparison of the apical complexes with those of the unknown plant species. The invention provides a facile method for the identification of unknown or unidentified plant species.
An unmanned aerial vehicle-based method and system for intelligent identification and precise pesticide application are covered by CN106585992A. The process begins with segmenting a crop area into a number of sub-regions, controlling an unmanned surveillance aerial vehicle to acquire crop images in each sub-region and feed the images to a control centre, automatically comparing the acquired crop images with prestored images of crops subjected to pest and disease damage to identify the presence of pest and disease damage in each sub-region, and controlling an unmanned spraying aerial vehicle to spray pesticide where pest and disease damage exists. Instead of spraying pesticides over a large area to protect the entire range of crops, the method sprays pesticides only on the sub-regions whose crops are subject to pest and disease damage, reducing pesticide waste and protecting the environment. Additionally, corresponding proportions or corresponding types of pesticides are sprayed according to the specific circumstances of the pest and disease damage; the same ratio of pesticides is not adopted for different pest and disease damage.
In the disclosures mentioned above, there is a dearth of a system that analyses the crops' health trends and growth patterns. There is also a need for a technique that provides efficient crop protection and management, better guidance for yield improvement, and a prediction of the next harvest of the same crop type. Inspired by these inventions, and to bridge these gaps, the idea of developing this model was put forward. The present invention focuses on detecting abnormalities in plant growth by combining various agricultural surveillance methodologies with other experimental propositions. The land rover-based agricultural monitoring system minimizes human intervention in plant growth surveillance. Monitoring and analyzing the plant to prevent diseases and pest attacks becomes accessible through the deployment of this model. The prototype is also developed in such a way that it helps farmers understand growth trends by providing a better guidance system for improving the yield. There is also a provision to forecast the following agricultural produce in order to manage resources and optimize performance. It helps relieve the burden on farmers and can be a partner in the journey towards a healthy yield.
Summary of the invention
Conventional crop protection and management methods are cumbersome and time-consuming. They require a great deal of human resources and effort to achieve a higher crop yield without compromising the quality and quantity of the crop or losing part of the harvest. Hence, a plant health surveillance system is proposed to enhance operational efficiency and management. The model focuses on detecting and identifying diseases. Suggestions and remedies are then given to reduce the level of pest infestation in the short run using natural and artificial methods, thereby helping restore the crop's health. The model also helps establish a connection between the farmer and nearby test centers for further benefits in the long run. The system can be used to analyse growth trends and patterns to predict the subsequent yield of the same crop. This way, one can track the harvest while bearing in mind benefits to the farmer, consumer preferences, and ecological responsibility.
The primary objective of this agricultural surveillance system is to quickly examine and study the trends of crop growth and development. Reliable Convolutional Neural Network algorithms are used to detect the presence of any disease at an early stage of development as well as during later stages. The system suggests to the farmer remedies for the recovery of the plants in the short and long run. Hence, it helps eliminate the manual effort of crop protection and management.
Brief Description of Drawings
The invention will be described in detail with reference to the exemplary embodiments shown in the figures wherein:
Figure-1: Flowgorithm flowchart representing the workflow of the agro-bot in the field
Figure-2: Diagrammatic representation of storage of the details of each crop in clusters
Figure-3: Flow chart representing the basic architecture and workflow of the developed prototype
Detailed description of the invention
The model of the agricultural surveillance system is built to minimize human intervention by automating the process of plant supervision. In the initial stages, the model is trained using a data set to identify, predict, and output the disease upon detection. The data set contains pictures of different types and levels of plant infestation, along with the names of the ailments. The corresponding suggestions and remedies are stored as key-value pairs, where the names of the various diseases become the keys and the remedies and suggestions are stored as the values. Such a scheme of data storage provides an easy way to represent real-world entities and enables fast lookup.
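As a minimal sketch of this key-value arrangement, the remedies could be held in a Python dictionary keyed by disease name. The disease names and remedy texts below are illustrative placeholders, not the actual contents of the trained model's database.

```python
# Minimal sketch of the key-value remedy store described above.
# Disease names and remedy texts are illustrative placeholders only.
remedy_store = {
    "tomato_early_blight": {
        "natural": "Remove affected leaves; apply a neem-oil spray weekly.",
        "artificial": "Apply a copper-based fungicide as per label directions.",
    },
    "wheat_leaf_rust": {
        "natural": "Improve air circulation; avoid overhead irrigation.",
        "artificial": "Apply a suitable fungicide at the first sign of pustules.",
    },
}

def lookup_remedy(disease_name: str) -> dict:
    """Return the stored suggestions for a detected disease (constant-time lookup)."""
    fallback = {"natural": "Consult a nearby agricultural lab.",
                "artificial": "Consult a nearby agricultural lab."}
    return remedy_store.get(disease_name, fallback)

print(lookup_remedy("tomato_early_blight")["natural"])
```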
The testing model consists of several components. A robot chassis is used as the base frame on which the other critical functional components are placed. Two gear motors are used for locomotion. A Raspberry Pi 4B serves as the central processing unit for image processing and other machine learning and analytics tasks; this single-board computer offers a 64-bit quad-core processor and 8 GB of RAM, which is sufficient to handle 480x360 video streams for image processing. A motor driver is an essential constituent that establishes an interface between the motors and the controller; hence, an L298 dual-channel H-bridge motor driver is used to control the DC gear motors remotely via the Wi-Fi module. It contains three headers: two headers are connected to the two DC motors, respectively, while the three-pin third header is connected to the power source. A 10,000 mAh power bank powers all the components and provides a backup time of at least 4 hours of continuous operation without a direct 240 V supply from the inbuilt power cable. An Arduino UNO R3 acts as a secondary device that receives instructions from the main processing unit and controls the motors and other embedded sensors. Two 8 MP camera modules with CSI cables are utilized, one as the primary fixed camera and one as the dynamic camera on the robot's mechanical arm; each is mounted and hooked up directly to a CSI port. Several jumpers and USB cables connect the peripherals to the central processing unit.
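The specification does not include driver code, so the following is only a minimal sketch of how the Raspberry Pi's GPIO pins could switch the two DC motors through the L298 H-bridge. It assumes the standard RPi.GPIO library, and the pin assignments are illustrative, not the prototype's actual wiring.

```python
import RPi.GPIO as GPIO          # standard Raspberry Pi GPIO library
import time

# Illustrative pin assignments for the L298 direction inputs.
LEFT_IN1, LEFT_IN2 = 17, 27      # left motor direction pins
RIGHT_IN1, RIGHT_IN2 = 23, 24    # right motor direction pins

GPIO.setmode(GPIO.BCM)
for pin in (LEFT_IN1, LEFT_IN2, RIGHT_IN1, RIGHT_IN2):
    GPIO.setup(pin, GPIO.OUT)

def stop() -> None:
    """De-energize all direction inputs so both motors stop."""
    for pin in (LEFT_IN1, LEFT_IN2, RIGHT_IN1, RIGHT_IN2):
        GPIO.output(pin, GPIO.LOW)

def forward(duration_s: float) -> None:
    """Drive both motors forward for duration_s seconds, then stop."""
    GPIO.output(LEFT_IN1, GPIO.HIGH); GPIO.output(LEFT_IN2, GPIO.LOW)
    GPIO.output(RIGHT_IN1, GPIO.HIGH); GPIO.output(RIGHT_IN2, GPIO.LOW)
    time.sleep(duration_s)
    stop()

try:
    forward(2.0)                 # move forward for two seconds
finally:
    GPIO.cleanup()
```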
The entire agricultural land is divided into several smaller clusters, each with a limited number of crops. As shown in Figure 2, the exemplar cropland is divided into six clusters, each consisting of four plants. Similarly, extensive farmlands can be divided into a suitable number of clusters. The cluster boundaries act as reference markers that help the user identify a particular crop during surveillance. The pre-trained model is deployed on the farm for real-time detection and identification. The autonomous robot is placed in the furrows adjacent to the planted crops. The camera-equipped agrobot uses a line-tracking algorithm to determine its path of travel; line tracking integrates sensing, actuation, and control algorithms to navigate a pre-marked path autonomously without human intervention. The robot moves around the agricultural land to identify the travelling paths and cluster boundaries. This data is stored in the database, and the cluster boundaries are mapped in memory accordingly.
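The line-tracking algorithm itself is not detailed in the specification. Shown below is a minimal sketch of one common approach: a two-sensor bang-bang line follower reading digital IR sensors on the Raspberry Pi. The sensor pins are illustrative assumptions, and the drive() helper stands in for the motor-driver commands sketched earlier.

```python
import RPi.GPIO as GPIO
import time

# Illustrative IR line-sensor pins; the prototype's actual wiring is not given.
LEFT_SENSOR, RIGHT_SENSOR = 5, 6

GPIO.setmode(GPIO.BCM)
GPIO.setup(LEFT_SENSOR, GPIO.IN)
GPIO.setup(RIGHT_SENSOR, GPIO.IN)

def drive(left_on: bool, right_on: bool) -> None:
    """Placeholder for the L298 motor commands shown in the earlier sketch."""
    pass

def follow_line_step() -> None:
    """One iteration of a simple bang-bang line follower.

    A HIGH reading is assumed to mean the sensor sees the dark guide line."""
    left = GPIO.input(LEFT_SENSOR)
    right = GPIO.input(RIGHT_SENSOR)
    if left and right:        # both sensors on the line: go straight
        drive(True, True)
    elif left:                # drifted right: run only the right motor to turn left
        drive(False, True)
    elif right:               # drifted left: run only the left motor to turn right
        drive(True, False)
    else:                     # line lost: stop
        drive(False, False)

try:
    while True:
        follow_line_step()
        time.sleep(0.02)      # roughly 50 Hz control loop
finally:
    GPIO.cleanup()
```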
After identifying the paths and boundaries, the robot navigates around the farmland from one crop to another. During its movement, the bot captures several pictures of each crop from different angles. These pictures are processed and stored in the database for later examination and prediction. In memory, the entire storage is divided into n clusters, called parent classes; each parent class consists of m child classes, and each child class stores the processed images of one crop in that cluster.
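As a minimal sketch of this parent-class/child-class layout, the images could be organized on disk as one directory per cluster containing one sub-directory per crop. The root path, directory names, and file-naming scheme below are illustrative assumptions.

```python
from pathlib import Path
from datetime import date

# Illustrative storage root; the prototype's actual paths are not specified.
STORAGE_ROOT = Path("farm_data")

def save_capture(cluster_id: int, crop_id: int, image_bytes: bytes) -> Path:
    """Store one processed image under cluster_<n>/crop_<m>/, named by date and count."""
    crop_dir = STORAGE_ROOT / f"cluster_{cluster_id}" / f"crop_{crop_id}"
    crop_dir.mkdir(parents=True, exist_ok=True)
    count = len(list(crop_dir.glob("*.jpg")))
    path = crop_dir / f"{date.today().isoformat()}_{count:03d}.jpg"
    path.write_bytes(image_bytes)
    return path

# Example: store a capture for the second crop of the first cluster.
# save_capture(1, 2, camera_frame_as_jpeg_bytes)
```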
Referring to Figure 3, after collecting and storing several images of the crops, the model analyses the captured photographs using Convolutional Neural Networks and Transfer Learning techniques. Convolutional neural networks help in analysing visual imagery, and the knowledge gained while learning to recognize the ailments of one plant can, through transfer learning, help recognize the ailments of another. While examining, if there is no plant infestation, the model moves on to the next plant for detection and identification. If a diseased plant exists, the entire cluster is declared unhealthy, and the details of the disease, such as its name, symptoms, and causes, are output. The model also provides the user with the necessary details, such as immediate remedies to implement for a cure in the short run. In the short run, the cure can be achieved using either natural or artificial remedies. Natural remedies are homemade remedies suggested to the user; they take more time to heal the plant but also hold the potential for strengthening and long-term betterment. The artificial cure, on the other hand, involves using chemicals to achieve immediate relief. The system also helps the user connect with nearby divisions of plant pathology, agricultural labs, and research and testing centers to acquire better solutions and more information on cultivation practices and techniques, which procures a benefit in the long run.
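The specification does not name the particular network architecture used. Below is a minimal transfer-learning sketch, assuming TensorFlow/Keras with a MobileNetV2 backbone pre-trained on ImageNet, an illustrative directory of labelled leaf images, and an illustrative class count.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 5          # illustrative: number of disease classes in the data set

# Illustrative training-data directory of labelled leaf images.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "dataset/train", image_size=IMG_SIZE, batch_size=32)

# Pre-trained backbone: keep the ImageNet weights frozen, drop the original classifier.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),   # scale pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # new disease head
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```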
An indirect connection between the user and the nearby divisions of plant pathology, agricultural labs, and research and testing centers is established via an AI assistant using a third-party API, GitHub. The AI assistant uses a GitHub account, managed through git commands, to receive insights and feedback from the diagnostic centers. New repositories are created weekly, while the images of crops collected every day are pushed under sub-branches labelled day1, day2, day3, and so on. The test centers periodically visit the GitHub repository to stay updated with the status. Based on the observations made, treatments for plant illnesses can be recommended by adding the relevant information and remedy to a new file and pushing it to that specific branch of the repository. This new file is read by the AI assistant and conveyed to the user.
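A minimal sketch of the daily upload and feedback steps is shown below, assuming a local clone of the weekly repository and plain git commands invoked from Python. The repository path, branch naming, and remedy-file location are illustrative assumptions.

```python
import subprocess
from pathlib import Path

# Illustrative local clone of the weekly repository; the real repository name
# and credentials are managed by the AI assistant's GitHub account.
REPO_DIR = Path("weekly_repo")

def git(*args: str) -> None:
    """Run a git command inside the repository clone."""
    subprocess.run(["git", *args], cwd=REPO_DIR, check=True)

def push_daily_images(day_number: int, image_paths: list[Path]) -> None:
    """Create or update the dayN branch and push the day's crop images."""
    branch = f"day{day_number}"
    git("checkout", "-B", branch)            # create or reset the day's branch
    for src in image_paths:
        dest = REPO_DIR / "images" / src.name
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(src.read_bytes())
        git("add", str(dest.relative_to(REPO_DIR)))
    git("commit", "-m", f"Add crop images for {branch}")
    git("push", "-u", "origin", branch)

def read_remedy_files() -> list[str]:
    """Read any remedy files the test centers have pushed to the current branch."""
    git("pull")
    return [p.read_text() for p in (REPO_DIR / "remedies").glob("*.txt")]
```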
A similar procedure is followed for the remaining plants in the various clusters. The data of the crops are collected and stored regularly in the fashion described above. The user interacts with an interface that displays the required information on the screen and is also informed about the daily status of the growth and development of the crops. Later, the stored data are used to analyse the crops' health trends and growth patterns; growth and health trends quantify a crop's growth rate and health over a given period. This analysis helps deduce the details needed to forecast the upcoming agricultural produce, as well as to manage resources and optimize performance. The current season's yield is compared with the previous produce, enabling the user to compare performance across seasons. If the current harvest is quantitatively less than the previous plantation's produce, the user can connect with the agricultural labs to learn and understand better procedures for producing a higher yield without damaging the environment.
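A minimal sketch of this trend computation is given below, assuming per-visit measurements for a crop (for example, an estimated plant height and a 0-1 health score derived from the classifier) and a simple least-squares slope as the growth trend. The sample values and the previous-season figure are illustrative.

```python
import numpy as np

# Illustrative per-visit measurements for one crop: day index since planting,
# estimated plant height (cm), and a 0-1 health score from the classifier.
days = np.array([0, 7, 14, 21, 28])
height_cm = np.array([12.0, 18.5, 26.0, 33.0, 41.5])
health = np.array([0.95, 0.93, 0.90, 0.92, 0.94])

# Growth trend: slope of a least-squares line through the height measurements.
growth_rate = np.polyfit(days, height_cm, 1)[0]        # cm per day
mean_health = health.mean()                             # average health score

# Season-over-season comparison against an illustrative previous-season figure.
previous_growth_rate = 1.15                             # cm per day, prior season
if growth_rate < previous_growth_rate:
    print("Growth is lagging behind the previous season; consider contacting a lab.")

print(f"growth rate: {growth_rate:.2f} cm/day, mean health: {mean_health:.2f}")
```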
Also, this collected and generated data can be reused as training data to train the model further and obtain better results and predictions during the next harvest. It can help in further management, analysis, visualization, and interpretation to obtain finer and enhanced results.
The proposed model provides the capability to detect any abnormalities or changes in crop growth at an early stage, which can help reduce losses due to pests, diseases, floods, and other environmental factors. The system uses a camera mounted on the rover to quickly scan for any crop anomalies. The cameras are specifically tuned to detect crop health and identify any potential problems that could arise. This provides an efficient and reliable way of crop protection and management.
The use of eco-friendly surveillance as a new technology in agricultural monitoring has been gaining traction in recent years. Eco-friendly surveillance is an innovative approach to monitoring crops that uses low-impact techniques. Use of this method and system helps reduce the environmental impact while providing detailed insights into crop growth and performance.
A better guidance system for yield enhancement can be leveraged using this model. The AI collects data from multiple sources, which is later processed and analyzed, providing farmers with valuable insights into how certain conditions affect their crops' growth. Such insights help farmers make informed decisions and improve yields by optimizing their processes while reducing costs, hence leading to improved profitability.
Forecasting the next agricultural produce using AI can provide a range of benefits for managing resources and optimizing performance. Predicting the harvest using historical data and suitable algorithms benefits the farmer's planning. Planning and management of resources bring many benefits, such as reduced operational costs and maximized yield.
This model also holds the potential to increase labor competency through data-driven decision making and analytics, which can help identify areas for improvement and suggest potential solutions. It also assists with the preparation and harvesting of crops when there is a shortfall in farm labor.
5 Claims & 3 Figures

Claims: The scope of the invention is defined by the following claims:
1. A crop monitoring system with an AI-based autonomous farm rover, comprising:
a) A mechanical arm with a dynamic camera mounted to capture dynamic shots of the plant. In order to capture the entire crop and determine its height and other necessary characteristics, the camera physically moves and keeps changing its position to collect data.
b) A static camera that helps identify the travelling paths and the cluster boundaries, and that also helps discover diseases at the lower levels of a plant in a particular cluster. After a blight is identified, the user is alerted with details such as the name of the disease, its symptoms, and its causes.
c) The user is given specific remedies for reducing the intensity and severity of the disease. In the short run, the remedies suggested are either natural or artificial, and the user can take the necessary action based on these suggestions to prevent damage to the yield.
d) The user is also provided with the facility to establish a connection with the nearest divisions of plant pathology, research and testing centers to learn more about ailments, the condition of the crop, and other information needed to make informed decisions.
2. As mentioned in claim 1, the plant's health trends and growth patterns are recorded and saved in a database for future reference. The user is also informed about the daily status of the growth and development of the crop.
3. As mentioned in claim 1, based on the data collected from the previous plantation season, the trends of the next cultivation season can be predicted. The yield is predicted and compared with the previous produce.
4. According to claim 1, the farmer can also switch to healthier, safer options for providing adequate nutrition to the soil by considering the recommendations given by the system, for example, gradually switching to compost manure by reducing the usage of chemically synthesized fertilizers, along with tips to improve manure quality.
5. As mentioned in claim 1, pest attacks can also be visualized with the help of this model. Appropriate measures, suggested by the pathology division of the agricultural research centers, can then be taken to avoid such pest attacks.
| # | Name | Date |
|---|---|---|
| 1 | 202341038965-REQUEST FOR EARLY PUBLICATION(FORM-9) [07-06-2023(online)].pdf | 2023-06-07 |
| 2 | 202341038965-FORM-9 [07-06-2023(online)].pdf | 2023-06-07 |
| 3 | 202341038965-FORM FOR SMALL ENTITY(FORM-28) [07-06-2023(online)].pdf | 2023-06-07 |
| 4 | 202341038965-FORM 1 [07-06-2023(online)].pdf | 2023-06-07 |
| 5 | 202341038965-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [07-06-2023(online)].pdf | 2023-06-07 |
| 6 | 202341038965-EVIDENCE FOR REGISTRATION UNDER SSI [07-06-2023(online)].pdf | 2023-06-07 |
| 7 | 202341038965-EDUCATIONAL INSTITUTION(S) [07-06-2023(online)].pdf | 2023-06-07 |
| 8 | 202341038965-DRAWINGS [07-06-2023(online)].pdf | 2023-06-07 |
| 9 | 202341038965-COMPLETE SPECIFICATION [07-06-2023(online)].pdf | 2023-06-07 |
| 10 | 202341038965-PA [18-03-2024(online)].pdf | 2024-03-18 |
| 11 | 202341038965-FORM28 [18-03-2024(online)].pdf | 2024-03-18 |
| 12 | 202341038965-ASSIGNMENT DOCUMENTS [18-03-2024(online)].pdf | 2024-03-18 |
| 13 | 202341038965-8(i)-Substitution-Change Of Applicant - Form 6 [18-03-2024(online)].pdf | 2024-03-18 |