Abstract: HYBRID FRAMEWORK AND METHOD FOR AUTOMATED CLASSIFICATION OF ORYZA LEAF DISEASES A hybrid framework (100) for automated classification of Oryza leaf diseases. The framework (100) comprises an image acquisition unit (106) adapted to receive leaf images from a computing device (102), and a processing unit (108) configured to: pre-process the leaf images received by the image acquisition unit (106); isolate segments depicting leaves from the pre-processed leaf images; extract high-level features from the isolated segments; deploy a machine learning model (112) adapted to identify and classify discriminative physiology from the extracted high-level features; compare the identified discriminative physiology with a training dataset (114) comprising pretrained leaf images; and classify the corresponding leaf images into one of predefined categories, such as healthy or diseased, based on the learned discriminative physiology. The framework (100) allows farmers to upload leaf images for immediate diagnosis. This user-centric and cost-effective solution improves accessibility and encourages timely action in crop management. Claims: 10, Figures: 3
Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to plant disease detection, and particularly to a hybrid framework for automated classification of Oryza leaf diseases.
Description of Related Art
[002] Rice remains a vital crop for global food security, serving as the primary food source for a significant portion of the world’s population. The productivity of rice fields often suffers from various leaf diseases such as bacterial leaf blight, brown spot, and rice blast, which reduce yield and affect grain quality. Early identification and proper management of these diseases remain essential for minimizing crop losses. In many regions, particularly in developing countries, farmers continue to rely on manual visual inspection to identify symptoms. This method introduces delays, causes inconsistency in diagnosis, and increases dependence on human expertise.
[003] Image analysis has gained attention in agricultural disease detection. Computational models can extract complex features from leaf images and provide high classification accuracy when trained on large datasets. However, these models demand significant computational capacity and hardware support, which limits their adoption in rural or small-scale farming setups.
[004] Several commercial tools claim to offer AI-based solutions for plant disease recognition, yet most of them remain generalized and do not tailor their frameworks to specific crops like rice. Solutions such as Plantix and Agrio demonstrate the use of AI in agriculture but fall short in addressing the nuanced features of rice-specific diseases. Additionally, many existing systems apply single-model approaches that lack adaptability across different environmental conditions, crop varieties, and image qualities.
[005] There is thus a need for an improved and advanced hybrid framework for automated classification of Oryza leaf diseases that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a hybrid framework for automated classification of Oryza leaf diseases. The framework comprises an image acquisition unit adapted to receive leaf images from a computing device, and a processing unit in communication with the image acquisition unit. The processing unit is configured to pre-process the leaf images received by the image acquisition unit; isolate segments depicting leaves from the pre-processed leaf images; extract high-level features from the isolated segments using a deep learning computational technique selected from a pretrained Convolutional Neural Network (CNN), a ResNet model, or a combination thereof; deploy a machine learning model adapted to identify and classify discriminative physiology from the extracted high-level features, the machine learning model being selected from a Support Vector Machine (SVM) algorithm, a Random Forest (RF) algorithm, or a combination thereof; compare the identified discriminative physiology with a training dataset comprising pretrained leaf images; and classify the corresponding leaf images into one of predefined categories, such as healthy or diseased, based on the learned discriminative physiology and classification confidence thresholds derived from the training dataset.
[007] Embodiments in accordance with the present invention further provide a method for automated classification of Oryza leaf diseases. The method comprises the steps of: receiving leaf images from a computing device; pre-processing the received leaf images by an image acquisition unit; isolating segments depicting leaves from the pre-processed leaf images; extracting high-level features from the isolated segments using a deep learning computational technique selected from a pretrained Convolutional Neural Network (CNN), a ResNet model, or a combination thereof; deploying a machine learning model adapted to identify and classify discriminative physiology from the extracted high-level features, the machine learning model being selected from a Support Vector Machine (SVM) algorithm, a Random Forest (RF) algorithm, or a combination thereof; comparing the identified discriminative physiology with a training dataset comprising pretrained leaf images; and classifying the corresponding leaf images into one of predefined categories, such as healthy or diseased, based on the learned discriminative physiology and classification confidence thresholds derived from the training dataset.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a hybrid framework for automated classification of Oryza leaf diseases.
[009] Next, embodiments of the present application may provide a hybrid framework that combines the feature extraction strength of deep learning with the classification precision of traditional machine learning. This integration enhances the system’s ability to correctly identify and differentiate between various rice leaf diseases, even under challenging image conditions.
[0010] Next, embodiments of the present application may provide a hybrid framework that reduces the overall computational requirements. This enables effective deployment on devices with limited resources, such as smartphones or low-cost edge devices.
[0011] Next, embodiments of the present application may provide a hybrid framework that allows farmers to simply upload leaf images for immediate diagnosis. This user-centric design improves accessibility and encourages timely action in crop management.
[0012] Next, embodiments of the present application may provide a hybrid framework that allows scaling across different geographies and farm sizes, from smallholder farms to commercial agricultural operations.
[0013] Next, embodiments of the present application may provide a hybrid framework that exhibits robustness against variations in lighting, image quality, and background noise. This makes it well-suited for real-world agricultural environments, in which imaging conditions can fluctuate significantly.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a hybrid framework for automated classification of Oryza leaf diseases, according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a block diagram of a processing unit, according to an embodiment of the present invention; and
[0019] FIG. 3 depicts a flowchart of a method for automated classification of Oryza leaf diseases, according to an embodiment of the present invention.
[0020] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0021] The following description includes the best mode of one embodiment of the present invention. It will be clear from this description that the invention is not limited to these illustrated embodiments, but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0022] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0023] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0024] FIG. 1 illustrates a schematic block diagram of a hybrid framework 100 (hereinafter referred to as the framework 100) for automated classification of Oryza leaf diseases, according to an embodiment of the present invention. In an embodiment of the present invention, Oryza leaves may be grass leaves derived from plants belonging to genus Oryza, including but not limited to Oryza sativa (commonly known as Asian rice), which are susceptible to various foliar diseases that impact crop yield and quality. The framework 100 may be configured to classify the Oryza leaves into healthy or diseased categories using image-based analysis, according to an embodiment of the present invention.
[0025] In an embodiment of the present invention, the framework 100 may be adapted to detect a presence of disease in received leaf images. Moreover, the framework 100 may classify and evaluate a stage of the detected disease in the received leaf images. Furthermore, the framework 100 may train a computational model for adaptive learning and disease progression prediction. Further, the training may be driven by real-time updates based on emerging disease trends. The framework 100 may utilize advanced feature extraction techniques to analyze progressive changes in leaves in correlation with emerging diseases and/or past infestations.
[0026] According to the embodiments of the present invention, the framework 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the framework 100 may comprise a computing device 102, a computer application 104, an image acquisition unit 106, a processing unit 108, a deep learning computational technique 110, a machine learning model 112, and a training dataset 114. In an embodiment of the present invention, the hardware components of the framework 100 may be integrated with computer-executable instructions for overcoming the challenges and limitations of the existing frameworks.
[0027] In an embodiment of the present invention, the computing device 102 may be adapted to upload the leaf images to the framework 100. The computing device 102 may be, but not limited to, a laptop, a mobile phone, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the computing device 102, including known, related art, and/or later developed technologies. The computing device 102 may comprise the computer application 104 adapted to display the categorization of the leaf images conducted by the framework 100. The categorization of the leaf images conducted by the framework 100 may be healthy or diseased. The computer application 104 may be, but not limited to, a web application, a standalone application, an Unstructured Supplementary Service Data (USSD) application, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the computer application 104, including known, related art, and/or later developed technologies.
[0028] In an embodiment of the present invention, the image acquisition unit 106 may be adapted to receive the leaf images from the computing device 102.
[0029] In an embodiment of the present invention, the processing unit 108 may be in communication with the image acquisition unit 106. The processing unit 108 may further be configured to execute computer-executable instructions to generate an output relating to the framework 100. According to embodiments of the present invention, the processing unit 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 108 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 108 may further be explained in conjunction with FIG. 2.
[0030] FIG. 2 illustrates a block diagram of the processing unit 108 of the framework 100, according to an embodiment of the present invention. The processing unit 108 may comprise the computer-executable instructions in form of programming modules such as a data receiving module 200, a data preprocessing module 202, a data extraction module 204, a data comparison module 206, and a data classification module 208.
[0031] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the leaf images from a computing device 102. The data receiving module 200 may be configured to transmit the received leaf images to the data preprocessing module 202.
[0032] The data preprocessing module 202 may be activated upon receipt of the leaf images from the data receiving module 200. In an embodiment of the present invention, the data preprocessing module 202 may be configured to pre-process the leaf images. The preprocessing of the received leaf images may be carried out by resizing input images to a fixed dimension, normalizing pixel values, applying data augmentation, applying data normalization, flipping, rotating, brightness adjustment, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means for preprocessing of the received leaf images, including known, related art, and/or later developed technologies. The data preprocessing module 202 may be configured to transmit the pre-processed leaf images to the data extraction module 204.
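As an illustrative, non-limiting sketch of the preprocessing described above, the operations of resizing to a fixed dimension, normalizing pixel values, and applying simple augmentations could look as follows in Python with NumPy. The 224x224 target size, nearest-neighbour resampling, and the specific augmentations shown are assumptions for illustration only, not part of the specification:

```python
import numpy as np

def preprocess(image, size=(224, 224)):
    """Resize an H x W x 3 image to a fixed dimension (nearest-neighbour)
    and normalize pixel values to the [0, 1] range."""
    h, w = image.shape[:2]
    rows = (np.arange(size[0]) * h // size[0]).clip(0, h - 1)
    cols = (np.arange(size[1]) * w // size[1]).clip(0, w - 1)
    resized = image[rows][:, cols]
    return resized.astype(np.float32) / 255.0

def augment(image):
    """Simple augmentations: horizontal flip, 90-degree rotation,
    and a brightness adjustment (clipped to keep values valid)."""
    return [
        np.flip(image, axis=1),          # horizontal flip
        np.rot90(image),                 # rotation
        np.clip(image * 1.2, 0.0, 1.0),  # brightness adjustment
    ]

leaf = np.random.randint(0, 256, (300, 400, 3), dtype=np.uint8)
x = preprocess(leaf)
print(x.shape)          # (224, 224, 3)
print(len(augment(x)))  # 3
```

In a deployed system, library routines (e.g. OpenCV or Pillow resizing with proper interpolation) would replace the nearest-neighbour sketch above.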
[0033] The data extraction module 204 may be activated upon receipt of the pre-processed leaf images from the data preprocessing module 202. In an embodiment of the present invention, the data extraction module 204 may be configured to isolate segments depicting the leaves from the pre-processed leaf images.
[0034] Further, the data extraction module 204 may be configured to extract high-level features from the isolated segments. The high-level features may be extracted using the deep learning computational technique 110. The deep learning computational technique 110 may be, but not limited to, a pretrained Convolutional Neural Network (CNN), a ResNet model, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the deep learning computational technique 110, including known, related art, and/or later developed technologies.
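The feature-extraction step can be illustrated with a toy convolution, ReLU, and global-average-pooling pipeline. This is only a NumPy stand-in for the pretrained CNN or ResNet backbone named above; in practice the learned filters would come from a pretrained network, whereas the random kernels and input sizes here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    """Valid 2D convolution of a single-channel image with a bank of
    k x k kernels; returns one feature map per kernel."""
    k = kernels.shape[1]
    h, w = image.shape
    out = np.empty((kernels.shape[0], h - k + 1, w - k + 1))
    for i, ker in enumerate(kernels):
        for r in range(h - k + 1):
            for c in range(w - k + 1):
                out[i, r, c] = np.sum(image[r:r + k, c:c + k] * ker)
    return out

def extract_features(image, kernels):
    """High-level feature vector: convolve, apply ReLU, then
    global-average-pool each feature map down to one value."""
    maps = np.maximum(conv2d(image, kernels), 0.0)  # ReLU
    return maps.mean(axis=(1, 2))                   # global average pooling

kernels = rng.standard_normal((8, 3, 3))  # 8 random 3x3 filters (stand-in)
segment = rng.random((32, 32))            # isolated leaf segment (grayscale)
features = extract_features(segment, kernels)
print(features.shape)  # (8,)
```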
[0035] Further, the data extraction module 204 may be configured to deploy the machine learning model 112 adapted to identify and classify the discriminative physiology from the extracted high-level features. The discriminative physiology may be, but not limited to, a bacterial leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the discriminative physiology, including known, related art, and/or later developed technologies. The machine learning model 112 may be, but not limited to, a Support Vector Machine (SVM) algorithm, a Random Forest (RF) algorithm, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the machine learning model 112, including known, related art, and/or later developed technologies. The machine learning model 112 may provide higher accuracy in detecting the disease. Further, the deep learning computational technique 110 and the machine learning model 112 may together enable handling of a diverse and complex set of the leaf images. The data extraction module 204 may be configured to transmit the discriminative physiology to the data comparison module 206.
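The classification stage operating on extracted feature vectors can be sketched with a nearest-centroid classifier standing in for the SVM or Random Forest named above (plainly: this is a simpler substitute, not the claimed model). The 8-dimensional features and the synthetic "healthy" versus "diseased" clusters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

class NearestCentroid:
    """Stand-in for the SVM/RF stage: classify a feature vector by the
    closest class centroid in feature space."""
    def fit(self, X, y):
        self.labels = np.unique(y)
        self.centroids = np.array([X[y == l].mean(axis=0) for l in self.labels])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids[None, :, :], axis=2)
        return self.labels[d.argmin(axis=1)]

# Hypothetical feature vectors: class 0 = healthy, class 1 = diseased.
X_train = np.vstack([rng.normal(0.0, 0.3, (20, 8)),
                     rng.normal(2.0, 0.3, (20, 8))])
y_train = np.array([0] * 20 + [1] * 20)

clf = NearestCentroid().fit(X_train, y_train)
pred = clf.predict(np.array([[0.1] * 8, [1.9] * 8]))
print(pred)  # [0 1]
```

With scikit-learn available, `sklearn.svm.SVC` or `sklearn.ensemble.RandomForestClassifier` would slot into the same fit/predict interface.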
[0036] The data comparison module 206 may be activated upon receipt of the discriminative physiology from the data extraction module 204. In an embodiment of the present invention, the data comparison module 206 may be configured to compare the identified discriminative physiology with the training dataset 114 comprising pretrained leaf images. Upon comparison, if the discriminative physiology matches the physiologies of the pretrained leaf images, the data comparison module 206 may transmit a classification signal to the data classification module 208. Otherwise, the data comparison module 206 may reactivate the data extraction module 204 for re-identification and re-extraction of the discriminative physiology.
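The match-or-retry behaviour above can be sketched as a similarity check against reference vectors, where a score below a confidence threshold signals the extraction module to re-identify features. The cosine-similarity measure and the 0.9 threshold are illustrative assumptions, not values given in the specification:

```python
import numpy as np

def compare(physiology, dataset, threshold=0.9):
    """Compare an identified physiology vector against pretrained
    reference vectors; return (matched, best_index).  A best score below
    the threshold signals re-identification and re-extraction."""
    sims = dataset @ physiology / (
        np.linalg.norm(dataset, axis=1) * np.linalg.norm(physiology))
    best = int(sims.argmax())
    return bool(sims[best] >= threshold), best

refs = np.eye(3)  # three hypothetical reference physiologies
ok, idx = compare(np.array([0.95, 0.05, 0.0]), refs)   # close to ref 0
bad, _ = compare(np.array([0.5, 0.5, 0.5]), refs)      # ambiguous -> retry
print(ok, idx, bad)  # True 0 False
```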
[0037] The data classification module 208 may be activated upon receipt of the classification signal from the data comparison module 206. The data classification module 208 may be configured to classify the corresponding leaf images into one of the predefined categories selected from the healthy category or the diseased category, based on the learned discriminative physiology and classification confidence thresholds derived from the training dataset 114.
[0038] The data classification module 208 may be configured to resupply the corresponding leaf images, classified into the healthy category or the diseased category, to the training dataset 114 for strengthening the effectiveness and reliability of the framework 100. Further, the data classification module 208 may be configured to retrain the machine learning model 112 and adjust hyperparameters to maximize the effectiveness and reliability of the training dataset 114. In an embodiment of the present invention, the hyperparameters may be selected from a learning rate, a batch size, a number of training epochs, regularization factors, network architecture parameters such as the number of layers or neurons per layer, and so forth. The adjustment of the hyperparameters may optimize training performance, minimize overfitting, and enhance classification accuracy, according to an embodiment of the present invention.
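Hyperparameter adjustment of the kind described (e.g. learning rate and number of training epochs) can be sketched as a small grid search over a toy classifier. The logistic-regression stand-in, the grid values, and the hold-out split are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def train_eval(X, y, Xv, yv, lr, epochs):
    """Toy logistic-regression stand-in: train with the given
    hyperparameters and return hold-out accuracy."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))       # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)          # gradient step
    return float(((Xv @ w > 0).astype(int) == yv).mean())

# Synthetic two-class feature data (hypothetical).
X = np.vstack([rng.normal(-1, 1, (40, 4)), rng.normal(1, 1, (40, 4))])
y = np.array([0] * 40 + [1] * 40)
Xv, yv = X[::4], y[::4]  # simple hold-out slice for the sketch

grid = [(lr, ep) for lr in (0.01, 0.1, 1.0) for ep in (10, 100)]
best = max(grid, key=lambda hp: train_eval(X, y, Xv, yv, *hp))
print("best (learning rate, epochs):", best)
```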
[0039] FIG. 3 depicts a flowchart of a method 300 for the automated classification of the Oryza leaf diseases, according to an embodiment of the present invention.
[0040] At step 302, the framework 100 may receive the leaf images from the computing device 102.
[0041] At step 304, the framework 100 may pre-process the leaf images.
[0042] At step 306, the framework 100 may isolate the segments depicting the leaves from the pre-processed leaf images.
[0043] At step 308, the framework 100 may extract the high-level features from the isolated segments using the deep learning computational technique 110.
[0044] At step 310, the framework 100 may deploy the machine learning model 112 to identify and classify the discriminative physiology from the extracted high-level features.
[0045] At step 312, the framework 100 may compare the identified discriminative physiology with the training dataset 114 comprising the pretrained leaf images.
[0046] At step 314, the framework 100 may classify the corresponding leaf images into one of predefined categories selected from the healthy category or the diseased category, based on the learned discriminative physiology and the classification confidence thresholds derived from the training dataset 114.
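Steps 302 through 314 above can be sketched as one toy end-to-end pipeline in which every stage is a hypothetical stand-in; the stub functions, reference vectors, and crop coordinates are assumptions, not the claimed implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Each step of method 300 as a toy stand-in:
def receive():           return rng.random((64, 64))                       # step 302: receive image
def preprocess(img):     return (img - img.mean()) / (img.std() + 1e-8)    # step 304: normalize
def isolate(img):        return img[16:48, 16:48]                          # step 306: crop leaf segment
def extract(seg):        return np.array([seg.mean(), seg.std(),
                                          seg.max(), seg.min()])           # step 308: features
def identify(feat):      return feat                                        # step 310: pass-through stand-in
def compare(feat, refs): return int(np.linalg.norm(refs - feat,
                                                   axis=1).argmin())        # step 312: nearest reference

refs = np.array([[0.0, 1.0, 2.0, -2.0],   # hypothetical healthy reference
                 [5.0, 5.0, 5.0, 5.0]])   # hypothetical diseased reference

feat = identify(extract(isolate(preprocess(receive()))))
label = ["healthy", "diseased"][compare(feat, refs)]   # step 314: classify
print(label)
```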
[0047] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0048] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims: CLAIMS
I/We Claim:
1. A hybrid framework (100) for automated classification of Oryza leaf diseases, the framework (100) comprising:
an image acquisition unit (106) adapted to receive leaf images from a computing device (102); and
a processing unit (108) in communication with the image acquisition unit (106), characterized in that the processing unit (108) is configured to:
pre-process the leaf images received by the image acquisition unit (106);
isolate segments depicting leaves from the pre-processed leaf images;
extract high-level features from the isolated segments, wherein the high-level features are extracted using a deep learning computational technique (110), wherein the deep learning computational technique (110) is selected from a pretrained Convolutional Neural Network (CNN), a ResNet model, or a combination thereof;
deploy a machine learning model (112) adapted to identify and classify discriminative physiology from the extracted high-level features, wherein the machine learning model (112) is selected from a Support Vector Machine (SVM) algorithm, a Random Forest (RF) algorithm, or a combination thereof;
compare the identified discriminative physiology with a training dataset (114) comprising pretrained leaf images; and
classify the corresponding leaf images into one of predefined categories selected from healthy or diseased based on the learned discriminative physiology and classification confidence thresholds derived from the training dataset (114).
2. The framework (100) as claimed in claim 1, wherein the computing device (102) comprises a computer application (104) adapted to display the categorization of the leaf images.
3. The framework (100) as claimed in claim 1, wherein the corresponding leaf images, classified into a healthy category or a diseased category, are resupplied to the training dataset (114) for strengthening the effectiveness and reliability of the framework (100).
4. The framework (100) as claimed in claim 1, wherein the preprocessing of the received leaf images is carried out by resizing input images to a fixed dimension, normalizing pixel values, applying data augmentation, applying data normalization, flipping, rotating, brightness adjustment, or a combination thereof.
5. The framework (100) as claimed in claim 1, wherein the discriminative physiology is selected from a bacterial leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, or a combination thereof.
6. The framework (100) as claimed in claim 1, wherein the processing unit (108) is configured to retrain the machine learning model (112) and adjust hyperparameters to maximize the effectiveness and reliability of the training dataset (114).
7. A method (300) for an automated classification of Oryza leaf diseases, the method (300) is characterized by steps of:
receiving leaf images from a computing device (102);
pre-processing the received leaf images by an image acquisition unit (106);
isolating segments depicting leaves from the pre-processed leaf images;
extracting high-level features from the isolated segments, wherein the high-level features are extracted using a deep learning computational technique (110), wherein the deep learning computational technique (110) is selected from a pretrained Convolutional Neural Network (CNN), a ResNet model, or a combination thereof;
deploying a machine learning model (112) adapted to identify and classify discriminative physiology from the extracted high-level features, wherein the machine learning model (112) is selected from a Support Vector Machine (SVM) algorithm, a Random Forest (RF) algorithm, or a combination thereof;
comparing the identified discriminative physiology with a training dataset (114) comprising pretrained leaf images; and
classifying the corresponding leaf images into one of predefined categories selected from healthy or diseased based on the learned discriminative physiology and classification confidence thresholds derived from the training dataset (114).
8. The method (300) as claimed in claim 7, wherein the corresponding leaf images, classified into a healthy category or a diseased category, are resupplied to the training dataset (114) for strengthening the effectiveness and reliability of the framework (100).
9. The method (300) as claimed in claim 7, wherein the preprocessing of the received leaf images is carried out by resizing input images to a fixed dimension, normalizing pixel values, applying data augmentation, applying data normalization, flipping, rotating, brightness adjustment, or a combination thereof.
10. The method (300) as claimed in claim 7, wherein the discriminative physiology is selected from a bacterial leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, or a combination thereof.
Date: April 25, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant