
System And Method For Prediction Of Rice Plant Diseases

Abstract: A system (100) for prediction of rice plant diseases is disclosed. The system (100) comprises an image acquisition unit (106) to receive images of rice leaves from an image capturing device (102). The system (100) is configured to pre-process the received images of the rice leaves; isolate segments depicting the rice leaves from the pre-processed images; extract features from the isolated segments using Convolutional Neural Network (CNN) models (110); deploy advanced deep learning architectures (112) to identify and classify key features from the extracted features; and compare the identified key features with a training dataset (114). The system (100) is further configured to classify the corresponding images of the rice leaves into one of predefined categories, such as a healthy category or a diseased category. The system (100) ensures flexibility in application across different scales of farming, from smallholdings to large commercial operations. Claims: 10, Figures: 3


Patent Information

Application #
Filing Date
17 April 2025
Publication Number
20/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR University
SR University, Ananthasagar, Warangal Telangana India 506371 patent@sru.edu.in 08702818333

Inventors

1. Namala Shiva Prasad
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
2. Dr. Vishwanath Bijalwan
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
3. Dr. Sridhar Chintala
SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.

Specification

Description:BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to systems for prediction of plant diseases, and particularly to a system for prediction of rice plant diseases.
Description of Related Art
[002] Rice serves as a staple food for more than half of the world’s population and plays a vital role in global food security. However, rice crops often suffer from a wide range of diseases that adversely affect yield and quality. These diseases include fungal, bacterial, and viral infections that appear under various climatic and agronomic conditions. Traditional approaches to managing rice leaf diseases depend heavily on manual inspection and expert knowledge, both of which introduce delays and inconsistencies in disease identification. In remote or under-resourced areas, farmers often lack access to timely and accurate diagnostic support, leading to unchecked disease progression and significant crop losses.
[003] Over the past decade, researchers and companies have explored the use of digital technologies in agriculture to support disease detection. Mobile-based platforms and AI-powered tools have emerged to offer disease prediction services, some of which rely on photographs captured by smartphones. These tools aim to classify the visual symptoms present on rice leaves and provide diagnostic feedback. Systems such as Plantix, Agrio, and Cropin represent notable efforts in this domain. However, these solutions vary in performance depending on image quality, lighting conditions, and environmental factors. In addition, many models struggle to generalize across different rice varieties, geographical zones, and disease severities, which limits their effectiveness in real-world scenarios.
[004] Moreover, commercial platforms that integrate artificial intelligence, such as Microsoft FarmBeats and IBM Watson Decision Platform for Agriculture, have incorporated crop health monitoring features. These platforms rely on a combination of satellite imagery, sensor data, and advanced analytics to manage large-scale agricultural operations. Although powerful, such systems often remain inaccessible to small-scale farmers due to their cost, complexity, and infrastructure demands. In academic settings, advanced deep learning architectures trained for rice disease detection often remain confined to controlled environments and lack commercial deployment.
[005] There is thus a need for an improved and advanced system for prediction of rice plant diseases that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a system for prediction of rice plant diseases. The system comprises an image acquisition unit adapted to receive images of rice leaves from an image capturing device, and a processing unit in communication with the image acquisition unit. The processing unit is configured to: pre-process the images of the rice leaves received by the image acquisition unit; isolate segments depicting the rice leaves from the pre-processed images of the rice leaves; extract features from the isolated segments using Convolutional Neural Network (CNN) models, wherein the CNN models are selected from a Visual Geometry Group 19-layer (VGG19), a ResNet50, an InceptionV3, or a combination thereof; identify and classify key features from the extracted features by deploying advanced deep learning architectures, wherein the advanced deep learning architectures are selected from a You Only Look Once version 8 (YOLOv8), a MobileNet, a Vision Transformer (ViT), a Convolutional Block Attention Module (CBAM), or a combination thereof, and are adapted to assign weights to the identified and classified key features; compare the identified key features with a training dataset comprising pretrained images of the rice leaves; and classify the corresponding images of the rice leaves into one of predefined categories selected from a healthy category or a diseased category, based on the learned key features and the assigned weights.
[007] Embodiments in accordance with the present invention further provide a method for predicting rice plant diseases. The method comprises the steps of: receiving images of rice leaves from an image capturing device; pre-processing the received images of the rice leaves by an image acquisition unit; isolating segments depicting the rice leaves from the pre-processed images of the rice leaves; extracting features from the isolated segments using Convolutional Neural Network (CNN) models, wherein the CNN models are selected from a Visual Geometry Group 19-layer (VGG19), a ResNet50, an InceptionV3, or a combination thereof; identifying and classifying key features from the extracted features by deploying advanced deep learning architectures, wherein the advanced deep learning architectures are adapted to assign weights to the identified and classified key features; comparing the identified key features with a training dataset comprising pretrained images of the rice leaves; and classifying the corresponding images of the rice leaves into one of predefined categories selected from a healthy category or a diseased category, based on the learned key features and the assigned weights.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a system for prediction of rice plant diseases.
[009] Next, embodiments of the present application may provide a system for prediction of rice plant diseases that achieves an accuracy of 94.65% by using advanced deep learning architectures such as VGG19, ResNet50, and InceptionV3, combined with attention modules like CBAM. This significantly improves the reliability of disease detection compared to traditional or basic machine learning methods.
[0010] Next, embodiments of the present application may provide a system for prediction of rice plant diseases that utilizes a comprehensive dataset of pre-stored images of the rice leaves. The dataset may represent 10 different disease types captured under various lighting conditions and orientations. This enhances an ability of the system to generalize and perform well in real-world farming environments.
[0011] Next, embodiments of the present application may provide a system for prediction of rice plant diseases that allows farmers to upload images easily and receive instant feedback on disease type, severity, and recommended treatments, making it highly accessible even to those with minimal technical expertise.
[0012] Next, embodiments of the present application may provide a system for prediction of rice plant diseases that ensures the model emphasizes critical disease features, reducing false positives and enhancing classification of visually similar disease types.
[0013] Next, embodiments of the present application may provide a system for prediction of rice plant diseases that supports deployment on cloud-based platforms and edge devices for flexibility in application across different scales of farming, from smallholdings to large commercial operations. The system supports continuous improvement through ongoing data collection and model updates.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a system for prediction of rice plant diseases, according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a block diagram of a processing unit, according to an embodiment of the present invention; and
[0019] FIG. 3 depicts a flowchart of a method for predicting rice plant diseases, according to an embodiment of the present invention.
[0020] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0021] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0022] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0023] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0024] FIG. 1 illustrates a schematic block diagram of a system 100 for prediction of rice plant diseases, according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may be adapted to detect a presence of disease in received images of rice leaves. Moreover, the system 100 may classify and evaluate a stage of the detected disease in the received images of the rice leaves. Furthermore, the system 100 may train an artificially computable model for adaptive learning and disease progression prediction. Further, the training may be driven by real-time updates based on emerging disease trends. The system 100 may utilize advanced feature extraction techniques to analyze progressive changes in rice leaves in correlation with emerging disease and/or past infestations.
[0025] According to the embodiments of the present invention, the system 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. The system 100 may comprise an image capturing device 102, a computer application 104, an image acquisition unit 106, a processing unit 108, Convolutional Neural Network (CNN) models 110, advanced deep learning architectures 112, and a training dataset 114. In an embodiment of the present invention, the hardware components of the system 100 may be integrated with computer-executable instructions for overcoming the challenges and the limitations of the existing systems.
[0026] In an embodiment of the present invention, the image capturing device 102 may be adapted to capture and upload the images of the rice leaves to the system 100. The images of the rice leaves may be captured under various conditions such as, but not limited to, different angles, disproportionate lighting, several rotations, and so forth to ensure diversity. The image capturing device 102 may be, but not limited to, a camera, a laptop, a mobile, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the image capturing device 102, including known, related art, and/or later developed technologies. The image capturing device 102 may comprise the computer application 104 adapted to display the categorization of the images of the rice leaves conducted by the system 100. The categorization of the images of the rice leaves conducted by the system 100 may be healthy or diseased. The computer application 104 may be, but not limited to, a web application, a standalone application, an Unstructured Supplementary Service Data (USSD) application, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the computer application 104, including known, related art, and/or later developed technologies.
[0027] In an embodiment of the present invention, the image acquisition unit 106 may be adapted to receive the images of the rice leaves from the image capturing device 102.
[0028] In an embodiment of the present invention, the processing unit 108 may be in communication with the image acquisition unit 106. The processing unit 108 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. According to embodiments of the present invention, the processing unit 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 108 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 108 may further be explained in conjunction with FIG. 2.
[0029] FIG. 2 illustrates a block diagram of the processing unit 108 of the system 100, according to an embodiment of the present invention. The processing unit 108 may comprise the computer-executable instructions in the form of programming modules such as a data receiving module 200, a data preprocessing module 202, a data extraction module 204, a data comparison module 206, and a data classification module 208.
[0030] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the images of the rice leaves from the image capturing device 102. The data receiving module 200 may be configured to transmit the received images of the rice leaves to the data preprocessing module 202.
[0031] The data preprocessing module 202 may be activated upon receipt of the images of the rice leaves from the data receiving module 200. In an embodiment of the present invention, the data preprocessing module 202 may be configured to pre-process the images of the rice leaves. The preprocessing of the received images of the rice leaves may be carried out by resizing input images to a fixed dimension, normalizing pixel values, applying data augmentation, applying data normalization, flipping, rotating, brightness adjustment, contrast enhancement, noise reduction, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means for preprocessing of the received images of the rice leaves, including known, related art, and/or later developed technologies. The data preprocessing module 202 may be configured to transmit the pre-processed images of the rice leaves to the data extraction module 204.
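By way of non-limiting illustration, the preprocessing operations described in paragraph [0031] may be sketched as follows. This is a minimal Python/NumPy sketch only; the function names, the fixed dimension of 224 pixels, and the nearest-neighbour resize are illustrative assumptions and do not limit the preprocessing means covered by the embodiments:

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 224) -> np.ndarray:
    """Resize an H x W x 3 image to (size, size) by nearest-neighbour
    index sampling, then normalize pixel values to the range [0, 1]."""
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = image[rows][:, cols]
    return resized.astype(np.float32) / 255.0

def augment(image: np.ndarray) -> list:
    """Simple augmentations mentioned in the description:
    the original image plus horizontal and vertical flips."""
    return [image, image[:, ::-1], image[::-1, :]]
```

In practice the same steps would typically be performed with an image library, but the sketch shows the fixed-dimension resize, pixel normalization, and flip augmentation in one place.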
[0032] The data extraction module 204 may be activated upon receipt of the pre-processed images of the rice leaves from the data preprocessing module 202. In an embodiment of the present invention, the data extraction module 204 may be configured to isolate segments depicting the rice leaves from the pre-processed images of the rice leaves.
[0033] Further, the data extraction module 204 may be configured to extract features from the isolated segments. The features may be extracted using the Convolutional Neural Network (CNN) models 110. The Convolutional Neural Network (CNN) models 110 may be, but not limited to, a Visual Geometry Group 19-layer (VGG19), a ResNet50, an InceptionV3, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the Convolutional Neural Network (CNN) models 110, including known, related art, and/or later developed technologies.
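The feature extraction of paragraph [0033] relies on pretrained CNN backbones such as VGG19 or ResNet50. As a hedged, self-contained stand-in for such a backbone, the sketch below shows the underlying operation (convolution, ReLU activation, global average pooling) in plain NumPy; the kernels and helper names are illustrative assumptions, not the actual pretrained weights:

```python
import numpy as np

def conv2d_valid(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Single-channel 'valid' cross-correlation via sliding windows."""
    kh, kw = k.shape
    windows = np.lib.stride_tricks.sliding_window_view(x, (kh, kw))
    return np.einsum('ijkl,kl->ij', windows, k)

def extract_features(gray: np.ndarray, kernels: list) -> np.ndarray:
    """One feature per kernel: ReLU of the response map,
    then global average pooling, as in a CNN's final stage."""
    maps = [np.maximum(conv2d_valid(gray, k), 0.0) for k in kernels]
    return np.array([m.mean() for m in maps])
```

A real deployment would load VGG19/ResNet50/InceptionV3 weights and take the penultimate-layer activations as the feature vector; the sketch only makes the conv-pool mechanism concrete.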
[0034] Further, the data extraction module 204 may be configured to deploy the advanced deep learning architectures 112. The advanced deep learning architectures 112 may be adapted to identify and classify key features from the extracted features. The advanced deep learning architectures 112 may be adapted to assign weights to the identified and classified key features.
[0035] The key features may be, but not limited to, a bacterial rice leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, a hips spot, a fungal infection, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the key features, including known, related art, and/or later developed technologies.
[0036] The advanced deep learning architectures 112 may be, but not limited to, a You Only Look Once version 8 (YOLOv8), a MobileNet, a Vision Transformer (ViT), a Convolutional Block Attention Module (CBAM), and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the advanced deep learning architectures 112, including known, related art, and/or later developed technologies.
[0037] The advanced deep learning architectures 112 may provide higher accuracy in detecting the disease. Further, the Convolutional Neural Network (CNN) models 110 and the advanced deep learning architectures 112 may be capable of handling a diverse set of the images of the rice leaves. The Convolutional Neural Network (CNN) models 110 and the advanced deep learning architectures 112 may be configured to generate a confusion matrix to evaluate classification performance, ensuring reduced false positives and improved detection rates. The system 100 may undergo extensive testing using diverse test samples to assess accuracy, precision, recall, and F1-scores.
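The confusion-matrix evaluation referred to in paragraph [0037] can be made concrete with a short sketch. The following is an illustrative Python/NumPy implementation of a confusion matrix and the per-class precision, recall, and F1-score computed from it; the function names are assumptions for illustration only:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes: int = 2) -> np.ndarray:
    """cm[t, p] counts samples whose true class is t and predicted class is p."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def precision_recall_f1(cm: np.ndarray, cls: int = 1):
    """Precision, recall, and F1-score for one class read off the matrix."""
    tp = cm[cls, cls]
    fp = cm[:, cls].sum() - tp   # predicted cls, but true class differs
    fn = cm[cls, :].sum() - tp   # true cls, but predicted class differs
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1
```

With class 1 denoting the diseased category, a low off-diagonal count `cm[0, 1]` corresponds to the reduced false positives described above.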
[0038] The data extraction module 204 may be configured to transmit the key features to the data comparison module 206.
[0039] The data comparison module 206 may be activated upon receipt of the key features from the data extraction module 204. In an embodiment of the present invention, the data comparison module 206 may be configured to compare the identified key features with the training dataset 114 comprising pretrained images of the rice leaves. The training dataset 114 may include 100,000 images, representing 10 distinct key features. The training dataset 114 may be refined by implementation of Grid Search CV and hyperparameter tuning techniques. The models may be trained on the training dataset 114 on Google Colab to leverage capabilities of a Graphics Processing Unit (GPU) (not shown) for faster processing of the system 100.
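The grid-search refinement mentioned in paragraph [0039] is conventionally provided by scikit-learn's `GridSearchCV`; the self-contained sketch below shows the underlying exhaustive search over a hyperparameter grid in plain Python. The `train_fn`/`score_fn` callables and the example parameters are illustrative assumptions:

```python
from itertools import product

def grid_search(train_fn, score_fn, param_grid: dict):
    """Exhaustively try every combination in param_grid;
    return the best parameter dict and its score."""
    best_params, best_score = None, float('-inf')
    keys = sorted(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        model = train_fn(**params)          # fit a model with these hyperparameters
        score = score_fn(model)             # evaluate it (e.g. validation accuracy)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

In the described system the score would be validation accuracy on held-out rice-leaf images, and the winning hyperparameters would be used for the final training run.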
[0040] Upon comparison, if the identified key features match the physiologies depicted in the pretrained images of the rice leaves, the data comparison module 206 may transmit a classification signal to the data classification module 208. Otherwise, the data comparison module 206 may reactivate the data extraction module 204 for re-identification and re-extraction of the key features.
[0041] The data classification module 208 may be activated upon receipt of the classification signal from the data comparison module 206. The data classification module 208 may be configured to classify the corresponding images of the rice leaves into one of predefined categories selected from the healthy category or the diseased category, based on the learned key features and classification confidence thresholds derived from the training dataset 114.
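The compare-then-classify behaviour of paragraphs [0040] and [0041] can be sketched as a similarity match against reference feature vectors with a confidence threshold. This is an illustrative assumption about the comparison metric (cosine similarity) and the threshold value, not a statement of the claimed implementation:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(features: np.ndarray, references: dict, threshold: float = 0.8):
    """Compare extracted features against per-category reference vectors.
    Returns (label, score) when the best match clears the confidence
    threshold, or (None, score) to signal re-extraction, mirroring the
    reactivation of the data extraction module."""
    best_label, best_score = None, -1.0
    for label, ref in references.items():
        s = cosine_similarity(features, ref)
        if s > best_score:
            best_label, best_score = label, s
    if best_score >= threshold:
        return best_label, best_score
    return None, best_score
```

Here the dictionary keys play the role of the predefined categories (healthy or diseased), and a `None` result corresponds to the comparison module sending the image back for re-extraction.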
[0042] The data classification module 208 may be configured to assess a severity level of the disease in the corresponding images of the rice leaves. Further, the data classification module 208 may be configured to recommend treatment strategies based on the classification of the corresponding images of the rice leaves.
[0043] The data classification module 208 may be configured to reintroduce the corresponding images of the rice leaves, classified into the healthy category or the diseased category, to the training dataset 114 to strengthen the effectiveness and reliability of the system 100. Further, the data classification module 208 may be configured to retrain the models and adjust hyperparameters to maximize an effectiveness and a reliability of the training dataset 114.
[0044] FIG. 3 depicts a flowchart of a method 300 for the prediction of rice plant diseases using the system 100, according to an embodiment of the present invention.
[0045] At step 302, the system 100 may receive the images of the rice leaves from the image capturing device 102.
[0046] At step 304, the system 100 may pre-process the images of the rice leaves.
[0047] At step 306, the system 100 may isolate the segments depicting the rice leaves from the pre-processed images of the rice leaves.
[0048] At step 308, the system 100 may extract the features from the isolated segments using the Convolutional Neural Network (CNN) models 110.
[0049] At step 310, the system 100 may identify and classify key features from the extracted features by deploying advanced deep learning architectures 112.
[0050] At step 312, the system 100 may compare the identified key features with the training dataset 114 comprising the pretrained images of the rice leaves.
[0051] At step 314, the system 100 may classify the corresponding images of the rice leaves into one of predefined categories selected from the healthy category or the diseased category, based on the learned key features and the assigned weights.
[0052] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0053] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
CLAIMS
I/We Claim:
1. A system (100) for prediction of rice plant diseases, the system (100) comprising:
an image acquisition unit (106) adapted to receive images of rice leaves from an image capturing device (102); and
a processing unit (108) in communication with the image acquisition unit (106), characterized in that the processing unit (108) is configured to:
pre-process the images of the rice leaves received by the image acquisition unit (106);
isolate segments depicting rice leaves from the pre-processed images of the rice leaves;
extract features from the isolated segments using Convolutional Neural Network (CNN) models (110), wherein the Convolutional Neural Network (CNN) models (110) are selected from a Visual Geometry Group 19-layer (VGG19), a ResNet50, an InceptionV3, or a combination thereof;
identify and classify key features from the extracted features by deploying advanced deep learning architectures (112), wherein the advanced deep learning architectures (112) are selected from a You Only Look Once version 8 (YOLOv8), a MobileNet, a Vision Transformer (ViT), a Convolutional Block Attention Module (CBAM), or a combination thereof;
compare the identified key features with a training dataset (114) comprising pretrained images of the rice leaves; and
classify the corresponding images of the rice leaves into one of predefined categories selected from a healthy category or a diseased category based on the learned key features and the assigned weights.
2. The system (100) as claimed in claim 1, wherein the training dataset (114) is refined by implementation of Grid Search CV and hyperparameter tuning techniques.
3. The system (100) as claimed in claim 1, wherein the preprocessing of the received images of the rice leaves is carried out by resizing input images to a fixed dimension, normalizing pixel values, applying data augmentation, applying data normalization, flipping, rotating, brightness adjustment, contrast enhancement, noise reduction, or a combination thereof.
4. The system (100) as claimed in claim 1, wherein the image capturing device (102) comprises a computer application (104) adapted to display the categorization of the images of the rice leaves.
5. The system (100) as claimed in claim 1, wherein the key features are selected from a bacterial rice leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, a hips spot, a fungal infection, or a combination thereof.
6. The system (100) as claimed in claim 1, wherein the processing unit (108) is configured to generate a confusion matrix to evaluate classification performance.
7. The system (100) as claimed in claim 1, wherein the training dataset (114) is trained on Google Colab for faster processing of the system (100).
8. A method (300) for predicting rice plant diseases, the method (300) characterized by steps of:
receiving images of rice leaves from an image capturing device (102);
pre-processing the received images of the rice leaves by an image acquisition unit (106);
isolating segments depicting the rice leaves from the pre-processed images of the rice leaves;
extracting features from the isolated segments using Convolutional Neural Network (CNN) models (110), wherein the Convolutional Neural Network (CNN) models (110) are selected from a Visual Geometry Group 19-layer (VGG19), a ResNet50, an InceptionV3, or a combination thereof;
identifying and classifying key features from the extracted features by deploying advanced deep learning architectures (112), wherein the advanced deep learning architectures (112) are adapted to assign weights to the identified and classified key features;
comparing the identified key features with a training dataset (114) comprising pretrained images of the rice leaves; and
classifying the corresponding images of the rice leaves into one of predefined categories selected from a healthy category or a diseased category based on the learned key features and the assigned weights.
9. The method (300) as claimed in claim 8, wherein the training dataset (114) is refined by implementation of Grid Search CV and hyperparameter tuning techniques.
10. The method (300) as claimed in claim 8, wherein the key features are selected from a bacterial rice leaf blight, a brown spot, a blast spot, a terminal blast, a blister, a blight infestation, a hips spot, a fungal infection, or a combination thereof.
Date: April 16, 2025
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202541037107-STATEMENT OF UNDERTAKING (FORM 3) [17-04-2025(online)].pdf 2025-04-17
2 202541037107-REQUEST FOR EARLY PUBLICATION(FORM-9) [17-04-2025(online)].pdf 2025-04-17
3 202541037107-POWER OF AUTHORITY [17-04-2025(online)].pdf 2025-04-17
4 202541037107-OTHERS [17-04-2025(online)].pdf 2025-04-17
5 202541037107-FORM-9 [17-04-2025(online)].pdf 2025-04-17
6 202541037107-FORM FOR SMALL ENTITY(FORM-28) [17-04-2025(online)].pdf 2025-04-17
7 202541037107-FORM 1 [17-04-2025(online)].pdf 2025-04-17
8 202541037107-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [17-04-2025(online)].pdf 2025-04-17
9 202541037107-EDUCATIONAL INSTITUTION(S) [17-04-2025(online)].pdf 2025-04-17
10 202541037107-DRAWINGS [17-04-2025(online)].pdf 2025-04-17
11 202541037107-DECLARATION OF INVENTORSHIP (FORM 5) [17-04-2025(online)].pdf 2025-04-17
12 202541037107-COMPLETE SPECIFICATION [17-04-2025(online)].pdf 2025-04-17