Abstract: HANDHELD DEVICE FOR DIAGNOSING PLANT DISEASES
ABSTRACT
A handheld device (100) for diagnosing plant diseases is disclosed. The device (100) comprises: an image acquisition unit (104) adapted to receive plant images captured from an image capturing unit (102); and a processing unit (106) configured to: analyze the received plant images, using a computer vision algorithm (108), to locate an infested section in a plant; compare the infested section in the plant with disease identification data stored in an offline database (110) for prognosis of a disease, wherein the offline database (110) stores organic treatment recommendations for the corresponding prognosed disease; deploy a natural language processing (NLP) engine (112) to process the treatment recommendations for generating context-specific information in a farmer-friendly fashion; and activate an output unit (114) for disbursing the generated context-specific information. The device (100) embodies a lightweight design for ease of carrying and use in the field.
Claims: 10, Figures: 4
Description:
BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to diagnosis of plant infestations and particularly to a handheld device for diagnosing plant diseases.
Description of Related Art
[002] Agricultural productivity depends on maintaining plant health, yet plant diseases remain a significant challenge for farmers. Traditional methods for disease identification rely on expert knowledge, visual inspection, and laboratory analysis. These approaches often require substantial resources, making them inaccessible for small-scale farmers in remote areas. Early detection is essential for preventing large-scale crop damage, but farmers frequently lack reliable tools that enable timely diagnosis.
[003] Technological advancements have introduced digital solutions such as mobile applications and AI-driven models for plant disease identification. Some of these rely on image recognition algorithms that compare plant symptoms with known disease patterns. While such systems provide automated diagnosis, many require internet connectivity, limiting their usability in regions with poor network access. Furthermore, existing solutions primarily focus on disease detection, often neglecting the need for treatment recommendations tailored to specific farming conditions.
[004] Language and literacy barriers further complicate farmers' ability to access agricultural knowledge. Most available tools present information in text-based formats, making them difficult to use for individuals with limited literacy. Moreover, conventional treatment recommendations frequently emphasize chemical-based solutions, raising concerns about environmental sustainability and long-term soil health. A comprehensive system addressing offline accessibility, multilingual support, and sustainable treatment approaches remains absent in the existing technological landscape.
[005] There is thus a need for an improved and advanced handheld device for diagnosing plant diseases that can overcome the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a handheld device for diagnosing plant diseases. The device comprises an image acquisition unit adapted to receive plant images captured from an image capturing unit. The device further comprises a processing unit in communication with the image acquisition unit. The processing unit is configured to: analyze the received plant images, using a computer vision algorithm, to locate an infested section in a plant; compare the infested section in the plant with disease identification data stored in an offline database for prognosis of a disease, wherein the offline database stores organic treatment recommendations for the corresponding prognosed disease; deploy a natural language processing (NLP) engine to process the treatment recommendations for generating context-specific information in a farmer-friendly fashion; and activate an output unit for disbursing the generated context-specific information.
[007] Embodiments in accordance with the present invention further provide a method for diagnosing plant diseases. The method comprises the steps of: receiving plant images captured from an image capturing unit; analyzing the received plant images, using a computer vision algorithm, to locate an infested section in a plant; comparing the infested section in the plant with disease identification data stored in an offline database for prognosis of a disease, wherein the offline database stores organic treatment recommendations for the corresponding prognosed disease; deploying a natural language processing (NLP) engine to process the treatment recommendations for generating context-specific information in a farmer-friendly fashion; and activating an output unit for disbursing the generated context-specific information.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a handheld device for diagnosing plant diseases.
[009] Next, embodiments of the present application may provide a handheld device that operates without requiring an internet connection. The capability of the present application to work without internet may make it highly suitable for farmers in remote areas where connectivity is unreliable or unavailable.
[0010] Next, embodiments of the present application may provide a handheld device that provides recommendations in both audio and visual formats, allowing farmers with different literacy levels to understand and follow instructions easily.
[0011] Next, embodiments of the present application may provide a handheld device that promotes sustainable agriculture by suggesting organic, eco-friendly treatments for plant diseases.
[0012] Next, embodiments of the present application may provide a handheld device that ensures that farmers can quickly capture images of affected plants and receive clear, actionable recommendations without requiring technical expertise.
[0013] Next, embodiments of the present application may provide a handheld device that embodies lightweight design making the device easy to carry and use in the field, while its affordability ensures accessibility for small-scale farmers who may not have the resources for expensive disease detection systems.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a handheld device for diagnosing plant diseases, according to an embodiment of the present invention;
[0018] FIG. 2A illustrates plant diseases able to be identified using the handheld device, according to an embodiment of the present invention;
[0019] FIG. 2B illustrates the plant diseases able to be identified using the handheld device; and
[0020] FIG. 3 depicts a flowchart of a method for diagnosing plant diseases using the handheld device, according to an embodiment of the present invention.
[0021] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0022] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0023] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0024] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0025] FIG. 1 illustrates a schematic block diagram of a handheld device 100 (hereinafter referred to as the device 100) for diagnosing plant diseases, according to an embodiment of the present invention. The device 100 may be adapted to capture images of a plant. The plant may be, but is not limited to, potato, maize (corn), rice, a pulse crop, and so forth. Embodiments of the present invention are intended to include or otherwise cover any plant, including known, related art, and/or later developed technologies. Further, the device 100 may check for the presence of an infestation in the plant. Upon detection of an infestation, the device 100 may recommend a treatment for the detected infestation. The device 100 may further render the recommended treatment in a language easily understood by farmers. Moreover, the recommended treatment may either be displayed on the device 100, or the device 100 may narrate/announce the recommended treatment. Furthermore, the device 100 may operate without an internet connection, making it suitable for rural and remote agricultural areas.
[0026] In an embodiment of the present invention, the device 100 may be portable. The portability of the device 100 may enable a provision for being handheld. The device 100 may further feature means such as, straps, belts, bands, and so forth, for enhancing a security and portability of the device 100.
[0027] According to the embodiments of the present invention, the device 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the device 100 may comprise an image capturing unit 102, an image acquisition unit 104, a processing unit 106, a computer vision algorithm 108, an offline database 110, a natural language processing (NLP) engine 112, and an output unit 114. In an embodiment of the present invention, the hardware components of the device 100 may be integrated with computer-executable instructions for overcoming the challenges and limitations of existing devices.
[0028] In an embodiment of the present invention, the image capturing unit 102 may be inbuilt and integrated in the device 100. The image capturing unit 102 may be adapted to capture plant images. The image capturing unit 102 may be, but not limited to, a camera, a video camera, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the image capturing unit 102, including known, related art, and/or later developed technologies.
[0029] In an embodiment of the present invention, the image acquisition unit 104 may be inbuilt and integrated in the device 100. The image acquisition unit 104 may be adapted to receive the plant images captured by the image capturing unit 102.
[0030] In an embodiment of the present invention, the processing unit 106 may be in communication with the image acquisition unit 104. The processing unit 106 may be configured to analyze the received plant images. The analysis of the plant images may be carried out using the computer vision algorithm 108. The analysis of the plant images may locate an infested section in the plant.
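The specification does not prescribe a particular computer vision algorithm for locating the infested section. As a non-limiting sketch, assuming that healthy leaf tissue is predominantly green, pixels with a low green share can be flagged and bounded; the threshold value and all names here are illustrative assumptions, not part of the disclosed embodiment:

```python
def locate_infested_section(image, green_threshold=0.5):
    """Flag pixels whose green share falls below a threshold.

    `image` is a list of rows of (r, g, b) tuples. Pixels whose green
    channel contributes less than `green_threshold` of the total
    intensity are treated as candidate infested pixels. Returns the
    bounding box (top, left, bottom, right) of the flagged region,
    or None when no pixel is flagged.
    """
    flagged = []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            total = r + g + b
            if total == 0 or g / total < green_threshold:
                flagged.append((y, x))
    if not flagged:
        return None
    ys = [y for y, _ in flagged]
    xs = [x for _, x in flagged]
    return (min(ys), min(xs), max(ys), max(xs))
```

A production device would more plausibly use a trained segmentation model or HSV-space thresholding, but the bounding-box output shown here matches the role the infested section plays in the subsequent comparison step.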
[0031] The processing unit 106 may be configured to compare the infested section in the plant with disease identification data stored in the offline database 110. The comparison may enable the processing unit 106 to prognose a disease in the plant. The offline database 110 may further store organic treatment recommendations for the corresponding prognosed disease. The treatment recommendations may be, but are not limited to, verified organic farming solutions, bio-pesticides, composting techniques, companion planting suggestions, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the treatment recommendations, including known, related art, and/or later developed technologies. The offline database 110 may further store region-specific disease profiles and organic treatment solutions to ensure location-relevant recommendations.
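The comparison against the offline database can be sketched, in a non-limiting way, as a symptom-overlap lookup. The disease profiles and treatment strings below are illustrative stand-ins for the offline database 110 (drawn from the pearl millet example later in this description), not a disclosed data format:

```python
# Hypothetical disease profiles standing in for the offline database 110.
DISEASE_PROFILES = {
    "downy mildew": {"yellow streaks", "white downy growth", "stunted panicle"},
    "pearl millet rust": {"reddish-brown pustules"},
    "blast disease": {"grayish lesions", "dark margins"},
}

# Hypothetical organic treatment recommendations keyed by disease.
ORGANIC_TREATMENTS = {
    "downy mildew": "Hot-water seed treatment (50 C, 30 min); apply Pseudomonas fluorescens.",
    "pearl millet rust": "Compost tea spray; marigold companion planting.",
    "blast disease": "Neem oil bio-pesticide; intercrop with cowpea.",
}

def prognose(observed_symptoms):
    """Return (disease, treatment) with the best symptom overlap, else None."""
    best, best_score = None, 0
    for disease, profile in DISEASE_PROFILES.items():
        score = len(profile & observed_symptoms)  # count matching symptoms
        if score > best_score:
            best, best_score = disease, score
    if best is None:
        return None
    return best, ORGANIC_TREATMENTS[best]
```

An actual embodiment might instead store feature vectors or classifier outputs in an embedded database such as SQLite; the overlap score merely illustrates the compare-and-prognose step.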
[0032] The processing unit 106 may be configured to deploy the natural language processing (NLP) engine 112 to process the treatment recommendations. The processing of the treatment recommendations may generate context-specific information, relating to the treatment recommendations, in a farmer-friendly fashion. The generation of the context-specific information in the farmer-friendly fashion may include measures such as, but not limited to, the use of local terminology, the use of layman's language, the specification of brands, and so forth. Embodiments of the present invention are intended to include or otherwise cover any solutions for generating the context-specific information in the farmer-friendly fashion, including known, related art, and/or later developed technologies.
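One minimal, non-limiting way to realize the farmer-friendly rewriting is a glossary substitution pass, replacing technical terms with local or layman's equivalents. The glossary entries below are illustrative assumptions; a fuller NLP engine could apply translation or text simplification models on top of this:

```python
# Hypothetical glossary: technical term -> layman's/local term.
LOCAL_TERMS = {
    "bio-pesticide": "natural pest spray",
    "Pseudomonas fluorescens": "helpful soil bacteria",
}

def farmer_friendly(recommendation, glossary=LOCAL_TERMS):
    """Rewrite a treatment recommendation using layman's terminology."""
    text = recommendation
    for technical, local in glossary.items():
        text = text.replace(technical, local)
    return text
```

For example, `farmer_friendly("Spray a neem oil bio-pesticide on affected leaves.")` substitutes the glossary term to yield plainer phrasing suitable for narration or display.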
[0033] The processing unit 106 may be configured to activate the output unit 114 for disbursing the generated context-specific information. The output unit 114 may be, but is not limited to, a display screen for displaying the generated context-specific information, a speaker for narrating the generated context-specific information, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the output unit 114, including known, related art, and/or later developed technologies. The processing unit 106 may further be configured to translate the generated context-specific information into a language of the farmer's choice.
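The activation of the output unit can be sketched as routing the generated information to whichever output channels the device provides. The channel names and callables below are illustrative assumptions; real hardware drivers (screen, text-to-speech) would replace the stand-in callables:

```python
def disburse(info, channels):
    """Route generated information to the available output channels.

    `channels` maps channel names ("display", "speaker") to callables
    that deliver the text; returns the list of channels actually used,
    so the caller can fall back or combine display and narration.
    """
    delivered = []
    if "display" in channels:
        channels["display"](info)
        delivered.append("display")
    if "speaker" in channels:
        channels["speaker"](info)
        delivered.append("speaker")
    return delivered
```

Using both channels at once mirrors the combined display-and-narration mode described for farmers with limited literacy.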
[0034] In an exemplary embodiment of the present invention, the device 100 may be used for diagnosing diseases in pearl millet (Pennisetum glaucum) plants and providing organic treatment recommendations. A farmer may capture images of the millet crop using the image capturing unit 102, particularly focusing on symptomatic areas such as leaves, stems, and panicles. The device 100 may process these images using the computer vision algorithm 108 to detect abnormalities, such as yellow streaks, white downy growth on the underside of leaves, and stunted panicle formation, which may be indicative of downy mildew (Sclerospora graminicola).
[0035] The device 100 may then compare the detected symptoms with region-specific disease profiles stored in the offline database 110. For instance, in semi-arid regions, the database may identify pearl millet rust caused by Puccinia substriata based on observed reddish-brown pustules on leaves, while in humid regions, the device 100 may diagnose blast disease (Pyricularia grisea) from grayish lesions with dark margins on leaves and nodes. The offline database 110 may also include location-relevant organic treatment solutions. For downy mildew, the database may suggest treating seeds with hot water (50°C for 30 minutes) before sowing and applying Pseudomonas fluorescens as a biocontrol agent to infected plants.
[0036] To ensure accessibility, the processing unit 106 may translate the generated context-specific information into a language of the farmer's choice, making the recommendations easier to understand. The output unit 114 may display the recommendations on a screen, narrate them through a speaker, or use both for better comprehension. This feature ensures that even farmers with limited literacy may receive the guidance effectively. Since the device 100 operates without an internet connection, the device 100 may be suitable for farmers in rural and remote agricultural areas where connectivity is limited. This ensures uninterrupted access to disease diagnosis and treatment recommendations even in isolated regions. The treatment recommendations provided by the device 100 may be selected from verified organic farming solutions, including bio-pesticides, composting techniques, and companion planting suggestions. For example, in the case of pearl millet blast disease, the device may recommend spraying a neem oil-based bio-pesticide and intercropping pearl millet with cowpea to naturally reduce disease incidence. For rust disease, the device may suggest applying compost tea spray to boost plant immunity and introducing marigold as a companion plant to deter pests.
[0037] By integrating computer vision, NLP-based translation, and offline disease profiling, the device 100 may provide an effective, farmer-friendly, and region-specific solution for millet disease management, promoting sustainable agricultural practices.
[0038] FIG. 2A illustrates plant diseases 200a able to be identified using the device 100, according to an embodiment of the present invention. In an embodiment of the present invention, the images illustrated in FIG. 2A may be exemplary plant images captured by the image capturing unit 102 of the device 100. In an embodiment of the present invention, the plant diseases may be, but are not limited to, an apple black rot, an apple cedar rust, an apple scab, a bean angular leaf spot, a bean rust, a citrus black spot, a citrus canker, a citrus greening, a potato early blight, a potato late blight, a rice bacterial leaf blight, a rice brown spot, a rice leaf smut, a tomato bacterial spot, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the plant diseases that may be identified using the device 100.
[0039] In an exemplary embodiment of the present invention, the device 100 may confirm a correct disease by analyzing the captured plant images using the computer vision algorithm 108, which identifies disease-specific patterns such as lesions, discoloration, or fungal growth. The detected symptoms may then be compared with the disease identification data stored in the offline database 110, ensuring an accurate prognosis. Additionally, the device 100 may cross-check multiple visual indicators to differentiate between diseases with similar symptoms, such as distinguishing potato early blight from late blight based on lesion shape and progression.
[0040] FIG. 2B illustrates the plant diseases 200b able to be identified using the device 100. In an embodiment of the present invention, the images illustrated in FIG. 2B may be exemplary plant images captured by the image capturing unit 102 of the device 100. In an embodiment of the present invention, the plant diseases may be, but are not limited to, a potato early blight, a potato late blight, a maize late blight, a maize common rust, a maize gray leaf spot, a black spot, a canker, a greening, a scab, a melanosis, a stem end rot, an anthracnose, a chilling injury, a green mold, a greasy spot, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the plant diseases that may be identified using the device 100.
[0041] In an exemplary embodiment of the present invention, the device 100 may confirm the correct disease by utilizing the computer vision algorithm 108 to analyze key visual symptoms, such as leaf discoloration, fungal growth, necrotic spots, or chlorotic lesions. The detected abnormalities may then be compared against the disease identification data stored in the offline database 110 to ensure an accurate diagnosis. Furthermore, the device may evaluate multiple distinguishing factors, such as lesion shape, size, and distribution, to differentiate between similar diseases—for instance, distinguishing maize gray leaf spot from maize common rust based on lesion texture and color.
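The multi-factor differentiation described above (lesion shape, size, texture, color) can be sketched, non-limitingly, as scoring candidate diseases on how many observed indicators match their expected values; the indicator names and candidate profiles below are illustrative assumptions:

```python
def differentiate(observed, candidates):
    """Score candidate diseases on several visual indicators.

    `observed` maps indicator names (e.g. "lesion_texture") to values;
    `candidates` maps disease names to their expected indicator values.
    The disease matching the most indicators wins; ties return None so
    the device can request another image rather than guess.
    """
    scores = {
        disease: sum(1 for k, v in expected.items() if observed.get(k) == v)
        for disease, expected in candidates.items()
    }
    top = max(scores.values())
    winners = [d for d, s in scores.items() if s == top]
    return winners[0] if len(winners) == 1 else None
```

For instance, matte gray lesions would single out maize gray leaf spot over maize common rust under profiles that encode the lesion texture and color cues mentioned above.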
[0042] FIG. 3 depicts a flowchart of a method 300 for diagnosing plant diseases using the device 100, according to an embodiment of the present invention.
[0043] At step 302, the device 100 may receive the plant images captured from the image capturing unit 102.
[0044] At step 304, the device 100 may analyze the received plant images, using the computer vision algorithm 108, to locate the infested section in the plant.
[0045] At step 306, the device 100 may compare the infested section in the plant with the disease identification data stored in the offline database 110 for prognosis of the disease. The offline database 110 may further store the organic treatment recommendations for the corresponding prognosed disease.
[0046] At step 308, the device 100 may deploy the natural language processing (NLP) engine 112 to process the treatment recommendations for generating the context-specific information in the farmer-friendly fashion.
[0047] At step 310, the device 100 may activate the output unit 114 for disbursing the generated context-specific information.
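Steps 302 through 310 above can be chained as one pipeline. The sketch below is a non-limiting illustration in which each stage is a pluggable callable (stage implementations are assumptions, not disclosed components):

```python
def diagnose(image, acquire, analyze, compare, generate, disburse_fn):
    """Chain the five method steps (302-310) of method 300 as stages."""
    plant_image = acquire(image)              # step 302: receive image
    infested = analyze(plant_image)           # step 304: locate infested section
    disease, treatment = compare(infested)    # step 306: prognose via offline DB
    info = generate(treatment)                # step 308: NLP farmer-friendly text
    disburse_fn(info)                         # step 310: activate output unit
    return disease, info
```

Keeping each stage pluggable mirrors the modular units of the device 100 (image acquisition unit 104, processing unit 106, NLP engine 112, output unit 114) and makes each step testable in isolation.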
[0048] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0049] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims:
CLAIMS
I/We Claim:
1. A handheld device (100) for diagnosing plant diseases, the device (100) comprising:
an image acquisition unit (104) adapted to receive plant images captured from an image capturing unit (102);
a processing unit (106) in communication with the image acquisition unit (104), characterized in that the processing unit (106) is configured to:
analyze the received plant images, using a computer vision algorithm (108), to locate an infested section in a plant;
compare the infested section in the plant with disease identification data stored in an offline database (110) for prognosis of a disease, wherein the offline database (110) stores organic treatment recommendations for the corresponding prognosed disease;
deploy a natural language processing (NLP) engine (112) to process the treatment recommendations for generating context-specific information in a farmer-friendly fashion; and
activate an output unit (114) for disbursing the generated context-specific information.
2. The device (100) as claimed in claim 1, wherein the processing unit (106) is configured to translate the generated context-specific information into a language of the farmer's choice.
3. The device (100) as claimed in claim 1, wherein the offline database (110) includes region-specific disease profiles and organic treatment solutions to ensure location-relevant recommendations.
4. The device (100) as claimed in claim 1, wherein the output unit (114) is selected from a display screen for displaying the generated context-specific information, a speaker for narrating the generated context-specific information, or a combination thereof.
5. The device (100) as claimed in claim 1, wherein the device (100) operates without an internet connection, being suitable for rural and remote agricultural areas.
6. The device (100) as claimed in claim 1, wherein the treatment recommendations are selected from verified organic farming solutions, bio-pesticides, composting techniques, companion planting suggestions, or a combination thereof.
7. A method (300) for diagnosing plant diseases, the method (300) characterized by the steps of:
receiving plant images captured from an image capturing unit (102);
analyzing the received plant images, using a computer vision algorithm (108), to locate an infested section in a plant;
comparing the infested section in the plant with disease identification data stored in an offline database (110) for prognosis of a disease, wherein the offline database (110) stores organic treatment recommendations for the corresponding prognosed disease;
deploying a natural language processing (NLP) engine (112) to process the treatment recommendations for generating context-specific information in a farmer-friendly fashion; and
activating an output unit (114) for disbursing the generated context-specific information.
8. The method (300) as claimed in claim 7, wherein the treatment recommendations are selected from verified organic farming solutions, bio-pesticides, composting techniques, companion planting suggestions, or a combination thereof.
9. The method (300) as claimed in claim 7, wherein the offline database (110) includes region-specific disease profiles and organic treatment solutions to ensure location-relevant recommendations.
10. The method (300) as claimed in claim 7, wherein the output unit (114) is selected from a display screen for displaying the generated context-specific information, a speaker for narrating the generated context-specific information, or a combination thereof.
Date: March 27, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202541030226-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2025(online)].pdf | 2025-03-28 |
| 2 | 202541030226-REQUEST FOR EARLY PUBLICATION(FORM-9) [28-03-2025(online)].pdf | 2025-03-28 |
| 3 | 202541030226-POWER OF AUTHORITY [28-03-2025(online)].pdf | 2025-03-28 |
| 4 | 202541030226-OTHERS [28-03-2025(online)].pdf | 2025-03-28 |
| 5 | 202541030226-FORM-9 [28-03-2025(online)].pdf | 2025-03-28 |
| 6 | 202541030226-FORM FOR SMALL ENTITY(FORM-28) [28-03-2025(online)].pdf | 2025-03-28 |
| 7 | 202541030226-FORM 1 [28-03-2025(online)].pdf | 2025-03-28 |
| 8 | 202541030226-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-03-2025(online)].pdf | 2025-03-28 |
| 9 | 202541030226-EDUCATIONAL INSTITUTION(S) [28-03-2025(online)].pdf | 2025-03-28 |
| 10 | 202541030226-DRAWINGS [28-03-2025(online)].pdf | 2025-03-28 |
| 11 | 202541030226-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2025(online)].pdf | 2025-03-28 |
| 12 | 202541030226-COMPLETE SPECIFICATION [28-03-2025(online)].pdf | 2025-03-28 |