Abstract: A SYSTEM AND METHOD FOR DETECTING, MONITORING, AND MITIGATING PLANT PESTS AND DISEASES. The present invention relates to a system for detecting, monitoring, and mitigating plant pests and diseases. The system (100) includes an imaging unit (102) to capture images of plants; a single board computer (104) to receive captured images from the imaging unit (102) and transmit the received images to a server; a wireless modem (106) to connect to the single board computer to transmit the images of plants to the server; a gantry system (108) to facilitate placement of the imaging unit (102), the single board computer (104), the wireless modem (106), and a power supply unit (110), and to enable movement to capture images of several plants; the power supply unit (110); and a server (112) to receive and perform analysis of the images to detect and classify pests and diseases, identify pests or diseased areas, determine the shape of the diseased areas, and provide recommendations to a user. FIG. 1
Description: TECHNICAL FIELD
[0001] The present disclosure relates generally to pest and disease detection, and more particularly, to a system and method for detecting, monitoring, and mitigating plant pests and diseases by utilizing deep learning (DL), machine learning (ML), and computer vision techniques.
BACKGROUND
[0002] Pests are undesirable organisms that cause significant damage to plants, resulting in reduced yields, financial losses, and disruptions in ecological systems. These organisms, including insects, bugs, pathogens, and other pests, generally damage plants directly or indirectly. Direct damage occurs when pests feed on plants, such as by chewing leaves or burrowing into stems, fruits, or roots, and indirect damage occurs when pests transmit bacterial, viral, or fungal infections to plants, causing deteriorated plant health and increased susceptibility to diseases.
[0003] Further, pest infestations cause significant economic repercussions leading to substantial financial losses for farmers and stakeholders in the agricultural sector. These losses typically result from reduced crop yields, increased expenses for pest control measures, damage to agricultural infrastructure, and decreased market value of affected crops. The financial impact of pest infestations can be severe, affecting the livelihoods and profitability of farmers, as well as the overall economic stability of the agricultural industry. The impact is not limited to economic aspects as it can extend to ecological imbalances and potentially disrupt delicate balance among organisms within ecosystems.
[0004] Therefore, it is crucial to implement effective pest management strategies to mitigate these losses. Additionally, it is essential to focus on early detection and prevention of diseases in crops to ensure their health and vitality.
[0005] At present, disease detection is performed through manual observation, which requires a significant number of experts and continuous monitoring of plants. However, this method can be costly, particularly for large farms. Additionally, relying solely on human observation leaves room for human error and may not always provide timely detection of diseases.
[0006] Furthermore, pheromone traps are also utilized to trap pests and prevent plant damage. These traps typically use historical data on pest infestation during specific growth periods of crops/plants and treat the entire crops or plants with pesticides. However, this method lacks efficiency as it fails to detect pests and diseases at an early stage.
[0007] There exist numerous prior arts that disclose different types of systems and methods for detecting plant pests and diseases.
[0008] The existing prior art discloses a system and method to detect crop diseases using a UAV. The system and method to detect and identify diseases and insects for suitable control comprise an unmanned aerial vehicle with a first set of sensors, a sensor grid, a communication module that includes a transceiver 108-1, GPS 108-2 and GSM 108-3, and a computing unit in communication with the unmanned aerial vehicle. In an embodiment, the computing unit is configured to receive images of the crop to extract and identify any or a combination of diseases and insects using a machine learning algorithm and further configured to transmit to a mobile device through a network. Further, the electrical power supply to the system is coupled to a power module.
[0009] However, the existing prior art is silent about a system and method for detecting, monitoring, and mitigating plant pests and diseases using deep learning (DL), machine learning (ML), and computer vision techniques. Further, there is no mention of utilizing an ML model for detecting and identifying the shape of the diseased area. In addition, the prior art is silent about a system and method configured to provide feedback on fertilizers and preventive measures to users based on the severity and type of pests and diseases.
[0010] Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks, shortcomings, and limitations associated with the conventional plant pests and diseases detection system and method.
OBJECTIVES OF THE PRESENT DISCLOSURE
[0011] It is an objective of the present disclosure to provide an efficient system for detecting, monitoring, and mitigating plant pests and diseases.
[0012] It is yet another objective of the present disclosure to provide a system that utilizes deep learning, machine learning, and computer vision based technology for automatic detection and classification of pests and plant diseases at an early stage.
[0013] It is yet another objective of the present disclosure to provide the system that is capable of identifying pests or diseased areas and determining shape of the diseased areas by utilizing a machine learning model.
[0014] It is another objective of the present disclosure to provide recommendations to users on fertilizers and treatment based on the type and severity of pests and diseases.
[0015] It is yet another objective of the present disclosure to provide a system that has a Linux operating system for ease of building applications for future upgrades.
[0016] The other objectives, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0017] The present invention relates to a system for detecting, monitoring, and mitigating plant pests and diseases. The system includes an imaging unit to capture images of plants. In an embodiment, the imaging unit is a multi-lens camera. The system further includes a single board computer to receive captured images from the imaging unit and transmit the received images to a server through a wireless communication network. In an embodiment, the single board computer (104) is further configured to interface and receive information from various sensors and imaging units and transmit consolidated data securely to the server through the wireless modem. In an exemplary embodiment, the single board computer (104) is capable of interfacing with various sensors and imaging units via standard communication protocols such as USB, RS232, RS485, and Ethernet.
[0018] The system further includes a wireless modem to connect to the single board computer to transmit the images of plants to the server. In an embodiment, the wireless modem is a Long Term Evolution or LTE modem and is configured to encode the plant images into digital data and package them into packets suitable for wireless transmission.
[0019] The system further includes a gantry system to facilitate placement of the imaging unit, the single board computer, the wireless modem, and a power supply unit, and further configured to enable movement to capture images of several plants from different angles and location. In an embodiment, the gantry system is configured to be operated using stepper motor drivers.
[0020] The system further includes a power supply unit to provide power to the imaging unit, the single board computer, the wireless modem, and the gantry system. In an embodiment, the power supply unit includes a solar panel that captures sunlight and converts it into electrical energy and a high capacity lithium battery to store the electrical energy converted by the solar panel.
[0021] Thereafter, the system includes a server to receive and perform analysis of the images to detect and classify pests and diseases, identify pests or diseased areas, determine the shape of the diseased areas, and provide recommendations to a user to mitigate the impact of pests and diseases on the plants. In an embodiment, the server includes a detection module to detect pests and diseases, a classification module to classify pests and diseases, and a labeling module to identify pests or diseased areas and determine the shape of the diseased areas to provide recommendations to the user for fertilizers and treatment based on the severity and type of pests and diseases detected. It is important to note that all the modules in the server utilize deep learning (DL), machine learning (ML), and computer vision techniques. The server may be a cloud server, a remote server, or a nearby server.
[0022] In an embodiment, the system is configured to establish a private wireless network to connect to other single board computers in the same area.
[0023] The present invention further relates to a method for performing analysis of the images and providing recommendations to a user. The method includes receiving images of the plant. In an embodiment, the images include images of several plants captured at different angles and locations.
[0024] The method further includes preprocessing and segmenting the received images.
[0025] The method further includes extracting a plurality of characteristics from the segmented images.
[0026] The method further includes identifying and analyzing characteristics of the segmented images to detect and classify pests and diseases.
[0027] The method further includes identifying pests or diseased areas and determining the shape of the diseased areas to provide recommendations to the user for fertilizers and treatment based on the severity and type of pests and diseases detected. In an embodiment, detection and classification of pests and diseases, identification of pests or diseased areas, and determination of the shape of the diseased areas are performed by utilizing deep learning (DL), machine learning (ML), and computer vision techniques.
[0028] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described earlier, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] The accompanying drawings, which are incorporated herein and constitute a part of this disclosure, illustrate exemplary embodiments, and together with the description, serve to explain the disclosed principles. The same numbers are used throughout the figures to reference like features and components, wherein:
[0030] FIG. 1 depicts a block diagram of a system for detecting, monitoring, and mitigating plant pests and diseases, in accordance with one or more exemplary embodiments of the present disclosure;
[0031] FIG. 2 depicts a block diagram representing different components of a server, in accordance with one or more exemplary embodiments of the present disclosure; and
[0032] FIG. 3 depicts a flow diagram showing a method for performing analysis of the images and providing recommendations to the user, in accordance with one or more exemplary embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
[0033] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that these specific details are only exemplary and not intended to be limiting. Additionally, it may be noted that the apparatus and/or methods are shown in block diagram form only in order to avoid obscuring the present disclosure. It is to be understood that various omissions and substitutions of equivalents may be made as circumstances may suggest or render expedient to cover various applications or implementations without departing from the spirit or the scope of the present disclosure. Further, it is to be understood that the phraseology and terminology employed herein are for clarity of the description and should not be regarded as limiting.
[0034] Furthermore, in the present description, references to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearance of the phrase “in one embodiment” in various places in the specification does not necessarily refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
[0035] Further, the terms “a” and “an” used herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described, which may be requirements for some embodiments but not for other embodiments.
[0036] Referring to FIG. 1, a system for detecting, monitoring, and mitigating plant pests and diseases is disclosed, according to an embodiment. The objective of the system is to detect and classify pests and diseases, identify pests or diseased areas, and determine the shape of the diseased areas to provide early detection of pest infestations in crops such as tomato, pomegranate, cotton, and coffee. Additionally, the system provides recommendations to users to mitigate the impact of pests and diseases on these plants.
[0037] As depicted, the system (100) includes an imaging unit (102). In an embodiment, the imaging unit (102) is a multi-lens camera which is configured to capture images of plants. It is important to note that the multi-lens camera captures detailed and high-quality images of plants from different angles and perspectives. By utilizing the multi-lens camera, the system (100) is capable of capturing detailed and comprehensive images of the plants effectively. In another embodiment, the imaging unit (102) is a super-resolution camera or a multi-camera array.
[0038] The system (100) further includes a single board computer (104). The single board computer (104) is configured to receive captured images from the imaging unit (102) and transmit the received images to a server through a wireless communication network. In an embodiment, the single board computer (104) is further configured to interface and receive information from various sensors and imaging units and transmit consolidated data securely to the server (112) through the wireless modem (106).
[0039] It is important to note that the single board computer is a complete computer system built on a single circuit board which typically includes a microprocessor, memory, input/output (I/O) interfaces, and other necessary components for a functional computer. In an exemplary embodiment, the single board computer (104) may be connected to one or more sensors to collect data related to pests or their activities. These sensors may include the imaging unit (102), a Hall sensor for detecting magnetic fields, an IR sensor for infrared detection, a microphone (MIC) for audio data, a gyro sensor for measuring orientation, a light sensor for ambient light detection, and a compass. Once the data is collected, the single board computer processes and analyzes the data using its computing capabilities and then transmits it to the server for further analysis or storage.
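By way of a non-limiting illustrative sketch, the consolidation of sensor readings described above may be modeled as follows. The reader functions and field names below are assumptions for illustration only; actual drivers would read the Hall, IR, light, and compass peripherals over GPIO or I2C interfaces rather than return fixed samples.

```python
import json
import time

# Hypothetical sensor readers -- stand-ins for the Hall, IR, light,
# and compass peripherals named in the description. Each returns a
# fixed sample here; real drivers would query hardware.
def read_hall():    return {"field_mT": 0.12}
def read_ir():      return {"ir_level": 0.37}
def read_light():   return {"lux": 5400}
def read_compass(): return {"heading_deg": 182.5}

def consolidate_readings():
    """Collect one sample from every sensor into a single payload that
    the single board computer would forward to the server."""
    payload = {
        "timestamp": time.time(),
        "hall": read_hall(),
        "ir": read_ir(),
        "light": read_light(),
        "compass": read_compass(),
    }
    return json.dumps(payload)

packet = consolidate_readings()
```

In practice, the consolidated payload would be transmitted to the server (112) through the wireless modem (106) over a secure channel.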
[0040] In an exemplary embodiment, the single board computer (104) is equipped with a Hexa-core processor that operates at a speed of up to 2 GHz. This processor provides ample computing power to handle data processing tasks efficiently. Additionally, the single board computer supports up to 4GB DDR3 RAM, which enhances its multitasking capabilities and allows for efficient data handling. Furthermore, it offers a storage capacity of 64GB for storing collected data and other relevant files.
[0041] The single board computer (104) is also capable of interfacing with a General Purpose Input/Output (GPIO) that allows for customization of hardware and communication ports. This capability of the single board computer (104) enables the users to connect and interface with a wide range of external devices or components, further expanding the functionality and adaptability of the pest detection system.
[0042] The system (100) further includes a wireless modem (106). The wireless modem (106) is configured to connect to the single board computer (104) to transmit the images of plants to the server. In an embodiment, the wireless modem (106) is a Long Term Evolution or LTE modem and is configured to encode the plant images into digital data and package them into packets suitable for wireless transmission.
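The packaging of encoded image data into transmission-sized packets, as described for the LTE modem, may be sketched as below. The packet layout (sequence number, total count, chunk size) is an illustrative assumption, not the specific framing used by the claimed modem.

```python
def packetize(image_bytes: bytes, mtu: int = 1400):
    """Split encoded image data into numbered packets sized for
    wireless transmission; each packet carries a sequence number and
    total count so the receiver can reassemble out-of-order data."""
    chunks = [image_bytes[i:i + mtu] for i in range(0, len(image_bytes), mtu)]
    total = len(chunks)
    return [{"seq": i, "total": total, "data": c} for i, c in enumerate(chunks)]

def reassemble(packets):
    """Sort packets by sequence number and rebuild the original bytes."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["data"] for p in ordered)

image = bytes(range(256)) * 20      # 5120-byte stand-in for an encoded image
packets = packetize(image)
```

With a 1400-byte payload per packet, the 5120-byte stand-in image yields four packets, and reassembly recovers the original byte stream exactly.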
[0043] The system (100) further includes a gantry system (108), which is configured to facilitate placement of the imaging unit (102), the single board computer (104), the wireless modem (106), and a power supply unit (110), and further configured to enable movement to capture images of several plants. In an embodiment, the gantry system (108) is configured to be operated using stepper motor drivers.
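A minimal sketch of one stepper-driven gantry axis follows. The calibration constant (steps per millimeter) is a hypothetical value; a real stepper motor driver would emit STEP pulses with a DIR pin set from the sign of the displacement, rather than counting positions in software.

```python
class GantryAxis:
    """Minimal model of one stepper-driven gantry axis.

    steps_per_mm is an assumed calibration constant relating motor
    steps to linear travel along the axis.
    """
    def __init__(self, steps_per_mm=80):
        self.steps_per_mm = steps_per_mm
        self.position_steps = 0

    def move_to(self, target_mm):
        """Move the carriage to an absolute position in millimetres
        and return the signed number of step pulses required."""
        target_steps = round(target_mm * self.steps_per_mm)
        delta = target_steps - self.position_steps
        # A real driver would emit abs(delta) pulses here, with the
        # direction pin set from the sign of delta.
        self.position_steps = target_steps
        return delta

x = GantryAxis()
pulses = x.move_to(150.0)   # move the imaging unit 150 mm along the row
```

Traversing a sequence of such waypoints would let the gantry position the imaging unit (102) over several plants in turn.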
[0044] The system (100) further includes a power supply unit (110), which is configured to provide power to the imaging unit (102), the single board computer (104), the wireless modem (106), and the gantry system (108). In an embodiment, the power supply unit (110) comprises a solar panel that captures sunlight and converts it into electrical energy and a high capacity lithium battery to store the electrical energy converted by the solar panel.
[0045] The system (100) further includes a server (112) that receives and performs analysis of the images to detect and classify pests and diseases, identify pests or diseased areas, and determine shape of the diseased areas, and provide recommendations to a user to mitigate impact of pests and diseases on the plants, which is explained in detail in FIG. 2.
[0046] Referring to FIG. 2, a block diagram representing different components of the server is disclosed, in accordance with one or more exemplary embodiments of the present disclosure.
[0047] As depicted, the server (112) includes a detection module (202). The detection module (202) is configured to receive images of the plants to detect pests and diseases. The server (112) further includes a classification module (204) to classify pests and diseases. Thereafter, the server (112) includes a labeling module (206) to identify pests or diseased areas and determine the shape of the diseased areas to provide recommendations to the user for fertilizers and treatment based on severity and type of pests and diseases detected.
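The chaining of the three server modules described above may be sketched as the following pipeline. The stand-in functions, threshold, and labels are illustrative assumptions only; the claimed modules employ trained DL/ML and computer vision models rather than the toy logic shown.

```python
def detection_module(image):
    """Stand-in for the detection module (202): flag the image if any
    pixel intensity exceeds an assumed threshold."""
    return {"image": image, "pest_found": max(max(r) for r in image) > 128}

def classification_module(result):
    """Stand-in for the classification module (204): attach a class
    label to the detection result."""
    result["label"] = "leaf_spot" if result["pest_found"] else "healthy"
    return result

def labeling_module(result):
    """Stand-in for the labeling module (206): map the class label to
    a treatment recommendation for the user."""
    result["recommendation"] = ("apply fungicide"
                                if result["label"] == "leaf_spot"
                                else "no action")
    return result

report = labeling_module(classification_module(detection_module(
    [[10, 200], [30, 40]])))
```

The three stages mirror the detection, classification, and labeling flow of FIG. 2, each consuming the output of the previous stage.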
[0048] In an embodiment, the server (112) is a cloud server or a remote server or a nearby server.
[0049] Referring to FIG. 3, a flow diagram showing a method for performing analysis of the images and providing recommendations to the user is disclosed, in accordance with one or more exemplary embodiments of the present disclosure.
[0050] The method may be explained in conjunction with the system disclosed in FIG.1 and FIG.2. In the flow diagram, each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings.
[0051] For example, two blocks shown in succession in FIG. 3 may be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
[0052] Any process descriptions or blocks in flowcharts should be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the example embodiments in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. In addition, the process descriptions or blocks in flow charts should be understood as representing decisions made by a hardware structure such as a state machine. The flow diagram starts at step (302) and proceeds to step (308).
[0053] At step 302, images of the plants are received. In an embodiment, the received images include images of several plants captured at different angles and locations.
[0054] Successively, the received images are preprocessed and segmented, at step 304. It is important to note that preprocessing refers to a set of techniques applied to the received images to enhance their quality or extract relevant features. This may involve operations such as noise reduction, image resizing, contrast adjustment, or color normalization. The aim of preprocessing is to improve the accuracy and effectiveness of subsequent analysis and processing steps.
[0055] Further, segmentation involves dividing the preprocessed images into meaningful regions or segments. The segmentation is generally performed to separate the different components or objects present in the images. By segmenting the images, it becomes easier to isolate and analyze specific areas of interest, such as diseased area or pests.
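The preprocessing and segmentation steps of paragraphs [0054] and [0055] may be illustrated with the following sketch, which applies contrast stretching (one of the normalization operations mentioned) followed by a simple threshold segmentation. The threshold value and the use of nested lists as a grayscale image are illustrative assumptions, not the specific algorithms of the claimed method.

```python
def preprocess(image, lo=0, hi=255):
    """Contrast-stretch a grayscale image (nested lists of 0-255 ints)
    so its intensities span the full range -- a simple example of the
    contrast adjustment described in the preprocessing step."""
    flat = [p for row in image for p in row]
    mn, mx = min(flat), max(flat)
    scale = (hi - lo) / max(mx - mn, 1)
    return [[int(lo + (p - mn) * scale) for p in row] for row in image]

def segment(image, threshold=128):
    """Binary-threshold segmentation: pixels above the (assumed)
    threshold are marked 1 as candidate diseased areas, the rest 0."""
    return [[1 if p > threshold else 0 for p in row] for row in image]

img = [[10, 20, 30],
       [40, 200, 60],
       [70, 80, 90]]
mask = segment(preprocess(img))
```

After contrast stretching, only the single bright pixel crosses the threshold, so the resulting mask isolates one candidate region for the later analysis steps.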
[0056] Successively, a plurality of characteristics of the segmented images is identified and analyzed, at step 306, to detect and classify pests and diseases.
[0057] Thereafter, pests or diseased areas are identified and shape of the diseased areas is determined to provide recommendations to the user, at step 308, for fertilizers and treatment based on severity and type of pests and diseases detected. In an embodiment, detection and classification of pests and diseases, identification of pests or diseased areas, and determination of shape of the diseased areas are performed by utilizing deep learning (DL), machine learning (ML), and computer vision techniques.
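Determining the shape of a segmented diseased area, as in step 308, may be sketched as follows. This simplified stand-in computes the pixel area, bounding box, and an "extent" ratio that distinguishes compact spots from elongated streaks; the actual system would use the ML model described above, and the descriptor names are assumptions for illustration.

```python
def region_shape(mask):
    """Describe the shape of a segmented diseased area in a binary
    mask: pixel area, bounding box (row, col, height, width), and a
    crude extent ratio (area / bounding-box area)."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    h = max(rows) - min(rows) + 1
    w = max(cols) - min(cols) + 1
    area = len(coords)
    return {"area": area,
            "bbox": (min(rows), min(cols), h, w),
            "extent": area / (h * w)}

mask = [[0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
shape = region_shape(mask)
```

Shape descriptors of this kind could feed the severity assessment on which the fertilizer and treatment recommendations are based.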
[0058] In an embodiment, the system (100) may be configured to establish a private wireless network to connect to other single board computers in the same area.
[0059] In an embodiment, the system (100) may be configured to allow execution of machine learning (ML) algorithms locally. This means that the users may run ML models directly on the Linux-based system without the need for a cloud server or external infrastructure.
[0060] In an embodiment, the system (100) may be built on a Linux operating system (OS) which provides advantages in terms of flexibility and ease of building applications for future upgrades. It is important to note that Linux is known for its robustness, scalability, and compatibility with various software development tools and frameworks.
[0061] In an embodiment, the system (100) may include an inbuilt HTTP server and support REST API key features. This implies that the system (100) may handle HTTP requests and responses, allowing users to interact with the system (100) through REST APIs.
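The REST-style interaction described above may be sketched with a minimal route dispatcher. The route paths, payload fields, and handler names below are illustrative assumptions, not endpoints defined by the specification; a deployed system would serve them through its inbuilt HTTP server.

```python
# Registry mapping REST paths to handler functions.
ROUTES = {}

def route(path):
    """Decorator registering a handler for a REST path."""
    def register(fn):
        ROUTES[path] = fn
        return fn
    return register

@route("/api/v1/status")
def status(_body=None):
    # Hypothetical status endpoint reporting system state.
    return {"status": "ok", "gantry": "idle"}

@route("/api/v1/detections")
def detections(_body=None):
    # In the real system this would query the server's analysis results;
    # the sample detection here is a placeholder.
    return {"detections": [{"pest": "aphid", "severity": "low"}]}

def handle(path, body=None):
    """Dispatch a request path to its registered handler."""
    handler = ROUTES.get(path)
    if handler is None:
        return {"error": 404}
    return handler(body)

resp = handle("/api/v1/status")
```

Users could thus poll the system's state or retrieve detection results over HTTP without interacting with the hardware directly.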
[0062] It has thus been seen that the system for detecting, monitoring, and mitigating plant pests and diseases according to the present invention achieves the purposes highlighted earlier. Such an apparatus can in any case undergo numerous modifications and variants, all of which are covered by the same innovative concept, moreover, all of the details can be replaced by technically equivalent elements. The scope of protection of the invention is therefore defined by the attached claims.
Dated this 29th day of May, 2024
Ankush Mahajan
Agent for the Applicant (IN/PA-1523)
OF CoreIP Legal Services Pvt. Ltd.
Claims: We Claim:
1. A system for detecting, monitoring, and mitigating plant pests and diseases, the system (100) comprising:
an imaging unit (102), wherein the imaging unit (102) is configured to capture images of plants;
a single board computer (104), wherein the single board computer (104) is configured to receive captured images from the imaging unit (102) and transmit the received images to a server through a wireless communication network;
a wireless modem (106), wherein the wireless modem (106) is configured to connect to the single board computer to transmit the images of plants to the server;
a gantry system (108), wherein the gantry system (108) is configured to facilitate placement of the imaging unit (102), the single board computer (104), the wireless modem (106), and a power supply unit (110), and further configured to enable movement to capture images of several plants;
the power supply unit (110), wherein the power supply unit (110) is configured to provide power to the imaging unit (102), the single board computer (104), the wireless modem (106), and the gantry system (108); and
the server (112), wherein the server (112) receives and performs analysis of the images to detect and classify pests and diseases, identify pests or diseased areas, and determine shape of the diseased areas, and provides recommendations to a user to mitigate impact of pests and diseases on the plants.
2. The system (100) as claimed in claim 1, wherein the server (112) comprises:
a detection module (202), which is configured to detect pests and diseases;
a classification module (204), which is configured to classify the detected pests and diseases; and
a labeling module (206) to identify pests or diseased areas and determine shape of the diseased areas to provide recommendations to the user for fertilizers and treatment based on severity and type of pests and diseases detected, wherein the detection module (202), classification module (204), and the labeling module (206) utilize deep learning (DL), machine learning (ML), and computer vision techniques.
3. The system (100) as claimed in claim 1, wherein the imaging unit (102) is a multi-lens camera.
4. The system (100) as claimed in claim 1, wherein the single board computer (104) is further configured to interface and receive information from various sensors and imaging units and transmit consolidated data securely to the server (112) through the wireless modem (106).
5. The system (100) as claimed in claim 1, wherein the wireless modem (106) is a Long Term Evolution or LTE modem and is configured to encode the plant images into digital data and package them into packets suitable for wireless transmission.
6. The system (100) as claimed in claim 1, wherein the gantry system (108) is configured to be operated using stepper motor drivers.
7. The system (100) as claimed in claim 1, wherein the power supply unit (110) comprises a solar panel that captures sunlight and converts it into electrical energy and a high capacity lithium battery to store the electrical energy converted by the solar panel.
8. The system (100) as claimed in claim 1, wherein the server (112) is a cloud server or a remote server or a nearby server.
9. The system (100) as claimed in claim 1, wherein the system (100) is configured to establish a private wireless network to connect to other single board computers in the same area.
10. A method (200) for performing analysis of the images and providing recommendations to the user, comprising the steps of:
receiving images of the plants;
preprocessing and segmenting the received images;
identifying and analyzing a plurality of characteristics of the segmented images to detect and classify pests and diseases; and
identifying pests or diseased areas and determining shape of the diseased areas to provide recommendations to the user for fertilizers and treatment based on severity and type of pests and diseases detected.
Dated this 29th day of May, 2024
Ankush Mahajan
Agent for the Applicant (IN/PA-1523)
OF CoreIP Legal Services Pvt. Ltd.
| # | Name | Date |
|---|---|---|
| 1 | 202441041636-STATEMENT OF UNDERTAKING (FORM 3) [29-05-2024(online)].pdf | 2024-05-29 |
| 2 | 202441041636-PROOF OF RIGHT [29-05-2024(online)].pdf | 2024-05-29 |
| 3 | 202441041636-POWER OF AUTHORITY [29-05-2024(online)].pdf | 2024-05-29 |
| 4 | 202441041636-FORM FOR STARTUP [29-05-2024(online)].pdf | 2024-05-29 |
| 5 | 202441041636-FORM FOR SMALL ENTITY(FORM-28) [29-05-2024(online)].pdf | 2024-05-29 |
| 6 | 202441041636-FORM 1 [29-05-2024(online)].pdf | 2024-05-29 |
| 7 | 202441041636-FIGURE OF ABSTRACT [29-05-2024(online)].pdf | 2024-05-29 |
| 8 | 202441041636-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-05-2024(online)].pdf | 2024-05-29 |
| 9 | 202441041636-EVIDENCE FOR REGISTRATION UNDER SSI [29-05-2024(online)].pdf | 2024-05-29 |
| 10 | 202441041636-DRAWINGS [29-05-2024(online)].pdf | 2024-05-29 |
| 11 | 202441041636-DECLARATION OF INVENTORSHIP (FORM 5) [29-05-2024(online)].pdf | 2024-05-29 |
| 12 | 202441041636-COMPLETE SPECIFICATION [29-05-2024(online)].pdf | 2024-05-29 |
| 13 | 202441041636-FORM 18 [03-06-2024(online)].pdf | 2024-06-03 |
| 14 | 202441041636-FORM-9 [16-07-2025(online)].pdf | 2025-07-16 |