
System And Method For Multi Stage Disease Classification Using Neural Network

Abstract: SYSTEM AND METHOD FOR MULTI-STAGE DISEASE CLASSIFICATION USING NEURAL NETWORK. A system (100) for multi-stage disease classification in agricultural imagery using a neural network is disclosed. The system (100) comprises an image acquisition unit (106) to receive images of plants from an image capturing device (102), and a processing unit (108) to: pre-process the images of the plants; isolate spatial patterns depicting plants from the pre-processed images; extract features from the isolated spatial patterns using Convolutional Neural Network (CNN) models (110); superimpose a quantum enhancement layer (112) and a classical processing layer (114) on the features extracted from the isolated spatial patterns; compare the extracted features with a dataset (116) for detecting a disease in the images of the plants; classify a stage of the detected disease; and generate a confidence score and recommendations. The system (100) achieves faster convergence during training and quicker decision-making during inference, reducing the overall time required for model deployment in real-time agricultural scenarios. Claims: 10, Figures: 4


Patent Information

Application #:
Filing Date: 18 April 2025
Publication Number: 19/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:

Applicants

SR University
SR University, Ananthasagar, Warangal, Telangana, India - 506371. Email: patent@sru.edu.in. Phone: 08702818333.

Inventors

1. Rapelly Nandini
Department of Computer Science, SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.
2. K. Deepa
Department of Computer Science, SR University, Ananthasagar, Hasanparthy (PO), Warangal, Telangana, India-506371.

Specification

Description:
BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to disease classification in agricultural imagery, and particularly to a system for multi-stage disease classification from agricultural imagery using a neural network.
Description of Related Art
[002] Accurate identification of plant diseases remains a critical challenge in precision agriculture. The ability to detect diseases at various stages plays a vital role in safeguarding crop health and ensuring food security. Conventional image-based diagnostic approaches often depend on visual inspection or standard machine learning models, which face limitations when exposed to subtle visual cues or overlapping symptoms across disease stages. These challenges contribute to late diagnoses, reduced yield, and increased use of agrochemicals.
[003] Several advancements in computer vision detection techniques have been introduced into agricultural diagnostics. These techniques extract features from images and use them for classification. However, they frequently suffer from constraints in handling high-dimensional features and differentiating between stages of disease. This limitation arises from the inherent structure of classical networks, which require extensive training data and high computational resources to achieve marginal improvements in classification performance.
[004] Existing approaches have also explored data augmentation and transfer learning to improve model robustness and accuracy. Despite these efforts, the scalability and efficiency of such models remain suboptimal. The complexity of agricultural datasets, coupled with the need for real-time intervention, demands solutions that support faster learning, higher precision, and better generalization without incurring prohibitive computational costs.
[005] There is thus a need for an improved and advanced system for multi-stage disease classification in agricultural imagery using a neural network that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a system for multi-stage disease classification in agricultural imagery using a neural network. The system comprises an image acquisition unit adapted to receive images of plants from an image capturing device. The system further comprises a processing unit in communication with the image acquisition unit. The processing unit is configured to pre-process the images of the plants received by the image acquisition unit, wherein the pre-processing of the images is carried out using noise reduction, data augmentation, segmentation, or a combination thereof.
[007] The processing unit is further configured to isolate spatial patterns depicting plants from the pre-processed images of the plants, and to extract features from the isolated spatial patterns using Convolutional Neural Network (CNN) models, wherein the features are extracted using convolutional layers, pooling layers, or a combination thereof. The processing unit is further configured to superimpose a quantum enhancement layer and a classical processing layer on the features extracted from the isolated spatial patterns, wherein the quantum enhancement layer is superimposed using quantum feature encoding, quantum gates, Quantum Neural Network (QNN) layers, or a combination thereof; compare the extracted features, superimposed with the quantum enhancement layer and the classical processing layer, with a dataset for detecting a disease in the images of the plants; and classify a stage of the detected disease into preset categories, wherein the classification is carried out by a SoftMax activation algorithm. The processing unit is further configured to generate a confidence score for the corresponding detected disease by conducting post-processing on the detected disease and the classified stage of the detected disease, and to generate recommendations, based on the generated confidence score, for treatment of the detected disease.
[008] Embodiments in accordance with the present invention further provide a method for multi-stage disease classification in agricultural imagery using a neural network. The method comprises the steps of capturing images of plants using an image capturing device; pre-processing the images of the plants, wherein the pre-processing of the images is carried out using noise reduction, data augmentation, segmentation, or a combination thereof; isolating spatial patterns depicting plants from the pre-processed images of the plants; and extracting features from the isolated spatial patterns using Convolutional Neural Network (CNN) models, wherein the features are extracted using convolutional layers, pooling layers, or a combination thereof.
[009] The method further comprises the steps of superimposing a quantum enhancement layer and a classical processing layer on the features extracted from the isolated spatial patterns, wherein the quantum enhancement layer is superimposed using quantum feature encoding, quantum gates, QNN layers, or a combination thereof; comparing the extracted features, superimposed with the quantum enhancement layer and the classical processing layer, with a dataset for detecting a disease in the images of the plants; and classifying a stage of the detected disease into preset categories, wherein the classification is carried out by a SoftMax activation algorithm. The method further comprises the steps of generating a confidence score for the corresponding detected disease by conducting post-processing on the detected disease and the classified stage of the detected disease; and generating recommendations, based on the generated confidence score, for treatment of the detected disease.
[0010] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a system for multi-stage disease classification in agricultural imagery using a neural network.
[0011] Next, embodiments of the present application may provide a system for multi-stage disease classification that features enhanced precision in multi-stage disease classification compared to traditional convolutional neural networks (CNNs) based systems.
[0012] Next, embodiments of the present application may provide a system for multi-stage disease classification that achieves faster convergence during training and quicker decision-making during inference. This reduces the overall time required for model deployment in real-time agricultural scenarios.
[0013] Next, embodiments of the present application may provide a system for multi-stage disease classification that performs better on previously unseen disease patterns, supporting more reliable diagnostics in diverse field conditions.
[0014] Next, embodiments of the present application may provide a system for multi-stage disease classification that offers significant energy and resource savings without compromising performance.
[0015] Next, embodiments of the present application may provide a system for multi-stage disease classification that leads to a more nuanced and robust representation of disease symptoms, improving the effectiveness of the classification process.
[0016] These and other advantages will be apparent from the present application of the embodiments described herein.
[0017] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0019] FIG. 1A illustrates a schematic block diagram of a system for multi-stage disease classification in agricultural imagery using a neural network, according to an embodiment of the present invention;
[0020] FIG. 1B illustrates an exemplary scenario of the system for multi-stage disease classification, according to an embodiment of the present invention;
[0021] FIG. 2 illustrates a block diagram of a processing unit, according to an embodiment of the present invention; and
[0022] FIG. 3 depicts a flowchart of a method for multi-stage disease classification in agricultural imagery using a neural network, according to an embodiment of the present invention.
[0023] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0024] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description that the invention is not limited to the illustrated embodiments but also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0025] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or by the respective closed phrases "consisting of", "consists of", and the like.
[0026] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0027] FIG. 1A illustrates a schematic block diagram of a system 100 for multi-stage disease classification in agricultural imagery using a neural network, according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may be adapted to detect a presence of disease in received images of plants. Moreover, the system 100 may classify and evaluate a stage of the detected disease in the received images of the plants. Furthermore, the system 100 may train an artificially computable model for adaptive learning and disease progression prediction. Further, the training may be driven by real-time updates based on emerging disease trends. The system 100 may utilize advanced feature extraction techniques to analyze progressive changes in plants in correlation with emerging disease and/or past infestations.
[0028] According to the embodiments of the present invention, the system 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the system 100 may comprise an image capturing device 102, a computer application 104, an image acquisition unit 106, a processing unit 108, Convolutional Neural Network (CNN) models 110, a quantum enhancement layer 112, a classical processing layer 114, and a dataset 116. In an embodiment of the present invention, the hardware components of the system 100 may be integrated with computer-executable instructions for overcoming the challenges and limitations of the existing systems.
[0029] In an embodiment of the present invention, the image capturing device 102 may be adapted to capture and upload the images of the plants to the system 100. The images of the plants may be captured under various conditions such as, but not limited to, different angles, disproportionate lighting, several rotations, and so forth to ensure diversity. In an embodiment of the present invention, a resolution of the images of the plants may be in a range from 200 pixels by 200 pixels to 300 pixels by 300 pixels. In a preferred embodiment of the present invention, the resolution of the images of the plants may be 224 pixels by 224 pixels. Embodiments of the present invention are intended to include or otherwise cover any resolution of the images.
[0030] The image capturing device 102 may be, but not limited to, a camera, a laptop, a mobile, a drone, a flood illuminator, an infrared emitter, a Raspberry Pi, a smartphone, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the image capturing device 102, including known, related art, and/or later developed technologies.
[0031] The image capturing device 102 may comprise the computer application 104. The computer application 104 may be adapted to display a detected disease, a classified stage of the detected disease, recommendations, and so forth. The classified stage of the detected disease may be, but not limited to, an early symptom, a moderate symptom, a severe symptom, an early lesion, a moderate decay, and so forth. The computer application 104 may be, but not limited to, a web application, a standalone application, an Unstructured Supplementary Service Data (USSD) application, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the computer application 104, including known, related art, and/or later developed technologies.
[0032] In an embodiment of the present invention, the image acquisition unit 106 may be adapted to receive the images of the plants from the image capturing device 102.
[0033] In an embodiment of the present invention, the processing unit 108 may be in communication with the image acquisition unit 106. The processing unit 108 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. According to embodiments of the present invention, the processing unit 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 108 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 108 may further be explained in conjunction with FIG. 2.
[0034] FIG. 1B illustrates an exemplary scenario of the system 100 for multi-stage disease classification, according to an embodiment of the present invention. In the exemplary scenario of the present invention, the system 100 may receive a photograph 118 of a leaf. The photograph 118 may be preprocessed. Upon preprocessing, a set of classical features may be extracted using the Convolutional Neural Network (CNN) models 110.
[0035] Further, the photograph 118 may be processed using a Maxpooling algorithm. The Maxpooling algorithm may be added to the Convolutional Neural Network (CNN) models 110 to reduce dimensionality of the photograph 118. The dimensionality may be reduced by reducing a number of pixels in the photograph 118. Along with reduction of dimensionality, a Rectified Linear Unit (ReLU) algorithm may be applied to the photograph 118. The Rectified Linear Unit (ReLU) algorithm may provide a weightage to the pixels in the photograph 118. The weightage may be provided on the basis of factors such as, but not limited to, application of filters, clarity, condensation, pixilation, and so forth.
[0036] Further, the Rectified Linear Unit (ReLU) algorithm may provide a negative or zero weightage to the pixels exhibiting bad and/or unusable factors. Moreover, the pixels exhibiting bad and/or unusable factors may be dropped out. Additionally, the Rectified Linear Unit (ReLU) algorithm may provide a positive weightage to the pixels exhibiting good and/or usable factors. The pixels exhibiting good and/or usable factors may be passed through a Hybrid Fusion Dense (HFD) layer and, thus, may be carried forward.
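By way of a non-limiting illustrative sketch (not part of the claimed subject matter), the ReLU and Maxpooling operations described above may be expressed in a few lines of numpy; the real CNN models 110 would apply them layer-wise over many feature maps:

```python
import numpy as np

def relu(x):
    # Zero out negative responses: pixels with "bad/unusable" factors
    # receive zero weightage and are effectively dropped
    return np.maximum(x, 0.0)

def maxpool2d(x, k=2):
    # Keep only the strongest response in each k x k window,
    # reducing the number of pixels (dimensionality) by a factor of k*k
    h = x.shape[0] // k * k
    w = x.shape[1] // k * k
    x = x[:h, :w]
    return x.reshape(h // k, k, w // k, k).max(axis=(1, 3))
```

For example, a 4 x 4 feature map pooled with k=2 yields a 2 x 2 map holding the maximum of each window.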
[0037] Further, as the photograph 118 may now be dimensionally reduced and may comprise the pixels exhibiting good and/or usable factors, the photograph 118 may be executed on a quantum layer and on an entanglement layer.
[0038] The execution of the photograph 118 on the quantum layer may enable an angle encoding for generation of a quantum feature map. Further, the execution of the photograph 118 on the entanglement layer may enable the detection of the disease and classification of the stage of the detected disease into preset categories using a SoftMax activation algorithm.
[0039] FIG. 2 illustrates a block diagram of the processing unit 108, according to an embodiment of the present invention. The processing unit 108 may comprise the computer-executable instructions in form of programming modules such as a data receiving module 200, a data preprocessing module 202, a data extraction module 204, a data comparison module 206, and a data classification module 208.
[0040] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the images of the plants from the image capturing device 102. The data receiving module 200 may be configured to transmit the received images of the plants to the data preprocessing module 202.
[0041] The data preprocessing module 202 may be activated upon receipt of the images of the plants from the data receiving module 200. In an embodiment of the present invention, the data preprocessing module 202 may be configured to pre-process the images of the plants. The preprocessing of the received images of the plants may be carried out by noise reduction, data augmentation, segmentation, resizing input images to a fixed dimension, normalizing pixel values, flipping, rotating, brightness adjustment, contrast enhancement, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means for preprocessing of the received images of the plants, including known, related art, and/or later developed technologies.
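As a non-limiting illustrative sketch of the preprocessing steps named above (resizing to the preferred 224 x 224 resolution, normalizing pixel values, and flip/rotation augmentation), using nearest-neighbour resizing as a stand-in for whatever resampling the deployed system would use:

```python
import numpy as np

def preprocess(image, size=224):
    # Resize to a fixed size x size resolution (nearest-neighbour sampling)
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = image[rows][:, cols]
    # Normalize pixel values from [0, 255] to [0, 1]
    return resized.astype(np.float32) / 255.0

def augment(image):
    # Two augmentations named in the text: horizontal flip and 90-degree rotation
    return [image, image[:, ::-1], np.rot90(image)]
```

Any image, e.g. 300 x 280 pixels, is thereby mapped to the fixed 224 x 224 input expected by the CNN models 110.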
[0042] In an embodiment of the present invention, the data preprocessing module 202 may be configured to isolate spatial patterns, depicting plants, from the pre-processed images of the plants. The isolated spatial patterns may be, but not limited to, local patterns, global patterns, and so forth. The data preprocessing module 202 may be configured to transmit the pre-processed images and the isolated spatial patterns of the plants to the data extraction module 204.
[0043] The data extraction module 204 may be activated upon receipt of the pre-processed images and the isolated spatial patterns of the plants from the data preprocessing module 202. In an embodiment of the present invention, the data extraction module 204 may be configured to engage the Convolutional Neural Network (CNN) models 110 to extract features from the isolated spatial patterns. In an embodiment of the present invention, the Convolutional Neural Network (CNN) models 110 may encompass a set of computational algorithms such as, but not limited to, the Maxpooling algorithm, a Conv algorithm with 64 filters, the Rectified Linear Unit (ReLU) algorithm, and so forth. Embodiments of the present invention are intended to include or otherwise cover any computational algorithms, including known, related art, and/or later developed technologies, encompassed in the Convolutional Neural Network (CNN) models 110.
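A non-limiting illustrative sketch of the Conv operation referenced above: a single "valid" 2-D convolution (computed as cross-correlation, as CNN frameworks do). A real Conv layer would hold a bank of, e.g., 64 such learned kernels:

```python
import numpy as np

def conv2d(x, kernel):
    # 'Valid' 2-D convolution of one feature map with one kernel;
    # output shrinks by (kernel size - 1) in each dimension
    kh, kw = kernel.shape
    oh = x.shape[0] - kh + 1
    ow = x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out
```

With a simple [-1, 1] kernel, the output peaks at intensity edges, illustrating how such kernels extract features like edges and texture boundaries.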
[0044] The features may be extracted by utilization of layers such as, but not limited to, convolutional layers, pooling layers, and so forth. Embodiments of the present invention are intended to include or otherwise cover any layers for feature extraction, including known, related art, and/or later developed technologies.
[0045] The extracted features may undergo a quantum feature encoding. The quantum feature encoding may transform the extracted features into a higher-dimensional space for improved pattern recognition. The extracted features may be encoded into quantum states using encoding techniques such as, but not limited to, the angle encoding, an amplitude encoding, a basis encoding, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the encoding techniques, including known, related art, and/or later developed technologies.
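As a non-limiting illustrative sketch of the angle encoding technique named above: each scaled classical feature value is used as a rotation angle for an RY gate acting on |0>, giving RY(theta)|0> = [cos(theta/2), sin(theta/2)]. (Amplitude and basis encoding, also named above, would map features differently.)

```python
import numpy as np

def angle_encode(features):
    # Encode each classical feature as one single-qubit state via an RY rotation:
    # theta = 0 leaves |0>, theta = pi rotates fully to |1>
    thetas = np.asarray(features, dtype=float)
    return np.stack([np.cos(thetas / 2), np.sin(thetas / 2)], axis=-1)
```

Each returned row is a valid unit-norm quantum state, so downstream quantum gates can act on it directly.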
[0046] The extracted features may be, but not limited to, edges, textures, color variations, and so forth. Embodiments of the present invention are intended to include or otherwise cover features, including known, related art, and/or later developed technologies, that may be extracted from the isolated spatial patterns.
[0047] In an embodiment of the present invention, the data extraction module 204 may be configured to superimpose the quantum enhancement layer 112 and the classical processing layer 114 on the features extracted from the isolated spatial patterns. The superimposition of the quantum enhancement layer 112 and the classical processing layer 114 on the extracted features may enable identification of contextual dependencies.
[0048] The identification of contextual dependencies may enable a quantum entanglement. The quantum entanglement may allow the data extraction module 204 to explore and compare complex relationships between the extracted features. The quantum entanglement may enable identification of contextual dependencies across distant extracted features, thus improving stage-wise differentiation of the disease. The quantum entanglement may further be validated and processed using quantum gates. The quantum gates may be, but not limited to, Hadamard, Pauli-X, CNOT, RX, RY, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the quantum gates, including known, related art, and/or later developed technologies. The quantum enhancement layer 112 may further employ Quantum Neural Networks (QNNs) utilizing the quantum gates to enhance classification efficiency by leveraging quantum parallelism. The Quantum Neural Networks (QNNs) may capture nonlinear patterns with fewer layers and trainable parameters.
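A non-limiting illustrative sketch of entanglement with two of the gates named above: a Hadamard on the first qubit followed by a CNOT turns |00> into the Bell state (|00> + |11>)/sqrt(2), in which measuring one qubit fixes the other, the mechanism by which distant features become correlated:

```python
import numpy as np

# Matrix forms of two of the named gates
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)            # flips qubit 1 if qubit 0 is |1>

def entangle():
    # Start in |00>, apply H to the first qubit, then CNOT across both
    psi = np.zeros(4)
    psi[0] = 1.0
    psi = np.kron(H, np.eye(2)) @ psi
    return CNOT @ psi
```

The resulting state vector has equal amplitude on |00> and |11> and none on |01> or |10>: the two qubits are no longer independent.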
[0049] The quantum enhancement layer 112 may be superimposed using means such as, but not limited to, quantum feature encoding, quantum gates, Quantum Neural Network (QNN) layers, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means, including known, related art, and/or later developed technologies, for superimposing the quantum enhancement layer 112 on the features extracted from the isolated spatial patterns.
[0050] The classical processing layer 114 may be superimposed using means such as, but not limited to, fully connected layers, a quantum classical attention mechanism, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means, including known, related art, and/or later developed technologies, for superimposing the classical processing layer 114 on the features extracted from the isolated spatial patterns.
[0051] The data extraction module 204 may be configured to transmit the extracted features, superimposed with the quantum enhancement layer 112 and the classical processing layer 114 to the data comparison module 206.
[0052] The data comparison module 206 may be activated upon receipt of the extracted features, superimposed with the quantum enhancement layer 112 and the classical processing layer 114. In an embodiment of the present invention, the data comparison module 206 may be configured to compare the extracted features, superimposed with the quantum enhancement layer 112 and the classical processing layer 114, with the dataset 116 for detecting the disease in the images of the plants. The dataset 116 may comprise pretrained images of the plants contoured with the features.
[0053] The data comparison module 206 may be configured to transmit the detected disease to the data classification module 208.
[0054] The data classification module 208 may be activated upon receipt of the detected disease from the data comparison module 206. In an embodiment of the present invention, the data classification module 208 may be configured to classify the stage of the detected disease into the preset categories. The classification into the preset categories may be carried out by the SoftMax activation algorithm. The preset categories may be, but not limited to, an early symptom, a moderate symptom, a severe symptom, an early lesion, a moderate decay, and so forth. Embodiments of the present invention are intended to include or otherwise cover any preset categories, including known, related art, and/or later developed technologies, for classification of the detected disease.
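A non-limiting illustrative sketch of the SoftMax classification into the preset categories listed above: raw per-category scores are mapped to probabilities that sum to one, and the highest-probability category is selected.

```python
import numpy as np

# The preset categories named in the description
STAGES = ["early symptom", "moderate symptom", "severe symptom",
          "early lesion", "moderate decay"]

def softmax(logits):
    # Shift by the maximum for numerical stability; probabilities sum to 1
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def classify_stage(logits):
    # Pick the preset category with the highest SoftMax probability
    probs = softmax(np.asarray(logits, dtype=float))
    i = int(probs.argmax())
    return STAGES[i], float(probs[i])
```

The returned probability doubles as raw material for the confidence score discussed below; the score vector itself would come from the preceding dense layers.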
[0055] The data comparison module 206 may further be configured to employ Quantum-Classical Attention Mechanisms (qCAM) to refine differentiation of the stage of the detected disease. Quantum-enhanced learning accelerates training, improves generalization, and enhances accuracy, making real-time disease detection and intervention more effective. The SoftMax activation algorithm deployed by the data comparison module 206 may be configured to deploy a hybrid fusion dense layer to assign probabilities to the disease and to the progression stages of the disease.
[0056] In an embodiment of the present invention, the data classification module 208 may be configured to generate a confidence score and the recommendations for the corresponding detected disease. The confidence score may be generated by conducting post-processing on the detected disease and the classified stage of the detected disease. The post-processing may be conducted by an imposition of an output layer. Based on the generated confidence score, the data classification module 208 may be configured to generate recommendations for treatment of the disease, according to an embodiment of the present invention. The recommendations may be, but not limited to, usage and dosage of pesticides, usage and dosage of insecticides, exposure to sunlight, administration of salt water, trimming of infested areas, and so forth. Embodiments of the present invention are intended to include or otherwise cover any recommendations that may be generated for treatment of the disease, including known, related art, and/or later developed technologies.
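A non-limiting illustrative sketch of gating recommendations on the confidence score. The stage-to-treatment mapping below is hypothetical, assembled from the example recommendations listed above purely for illustration:

```python
def recommend(stage, confidence, threshold=0.6):
    # Hypothetical mapping for illustration only, drawn from the
    # example recommendations in the description
    treatments = {
        "early symptom": "increase exposure to sunlight and monitor the plant",
        "severe symptom": "trim infested areas and apply pesticide per dosage guidance",
    }
    # Only recommend treatment when the confidence score clears the threshold
    if confidence < threshold:
        return "confidence too low; re-image the plant before treatment"
    return treatments.get(stage, "consult an agronomist for this stage")
```

Gating on the score prevents low-confidence classifications from triggering, for example, unnecessary agrochemical use.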
[0057] In an embodiment of the present invention, the data classification module 208 may be configured to generate refined feature maps for training of the dataset 116. The training of the dataset 116 may increase sensitivity to stage transitions and improve classification granularity of the images of the plants.
[0058] FIG. 3 depicts a flowchart of a method 300 for the multi-stage disease classification in the agricultural imagery using the neural network, according to an embodiment of the present invention.
[0059] At step 302, the system 100 may capture the images of the plants using the image capturing device 102.
[0060] At step 304, the system 100 may pre-process the captured images of the plants.
[0061] At step 306, the system 100 may isolate spatial patterns depicting plants from the pre-processed images of the plants.
[0062] At step 308, the system 100 may extract features from the isolated spatial patterns using the Convolutional Neural Network (CNN) models 110.
[0063] At step 310, the system 100 may superimpose the quantum enhancement layer 112 and the classical processing layer 114 on the features extracted from the isolated spatial patterns.
[0064] At step 312, the system 100 may compare the extracted features with the dataset 116 for detecting the disease in the images of the plants.
[0065] At step 314, the system 100 may classify the stage of the detected disease into the preset categories.
[0066] At step 316, the system 100 may generate the confidence score for the corresponding detected disease by conducting the post processing on the detected disease and the classified stage of the detected disease.
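Steps 302-316 above can be summarised, as a non-limiting toy sketch only, in a single pipeline. The per-channel-mean "features" and nearest-neighbour comparison below are deliberate simplifications standing in for the CNN models 110 and the quantum-enhanced comparison of the real system:

```python
import numpy as np

def method_300(image, dataset_features, labels):
    # Step 304: pre-process (normalize pixel values to [0, 1])
    x = image.astype(np.float32) / 255.0
    # Step 308: stand-in feature extraction (per-channel means instead of a CNN)
    features = x.mean(axis=(0, 1))
    # Step 312: compare the extracted features with the dataset (nearest neighbour)
    dists = np.linalg.norm(dataset_features - features, axis=1)
    i = int(dists.argmin())
    # Steps 314-316: classified label plus a distance-based confidence score in (0, 1]
    return labels[i], float(np.exp(-dists.min()))
```

A perfect match (distance zero) yields a confidence of 1.0; confidence decays as the match worsens.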
[0067] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0068] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

CLAIMS
I/We Claim:
1. A system (100) for multi-stage disease classification in agricultural imagery using neural network, the system (100) comprising:
an image acquisition unit (106) adapted to receive images of plants from an image capturing device (102); and
a processing unit (108) in communication with the image acquisition unit (106), characterized in that the processing unit (108) is configured to:
pre-process the images of the plants received by the image acquisition unit (106), wherein the pre-processing of the images is carried out using noise reduction, data augmentation, segmentation, or a combination thereof;
isolate spatial patterns depicting plants from the pre-processed images of the plants;
extract features from the isolated spatial patterns using Convolutional Neural Network (CNN) models (110), wherein the features are extracted using convolutional layers, pooling layers, or a combination thereof;
superimpose a quantum enhancement layer (112) and a classical processing layer (114) on the features extracted from the isolated spatial patterns, wherein the quantum enhancement layer (112) is superimposed using quantum feature encoding, quantum gates, Quantum Neural Network (QNN) layers, or a combination thereof;
compare the extracted features, superimposed with the quantum enhancement layer (112) and the classical processing layer (114), with a dataset (116) for detecting a disease in the images of the plants;
classify a stage of the detected disease into preset categories, wherein the classification is carried out by a SoftMax activation algorithm; and
generate a confidence score for the corresponding detected disease by conducting post-processing on the detected disease and the classified stage of the detected disease.
2. The system (100) as claimed in claim 1, wherein the classical processing layer (114) is superimposed using fully connected layers, a quantum classical attention mechanism, or a combination thereof.
3. The system (100) as claimed in claim 1, wherein the superimposition of the quantum enhancement layer (112) and the classical processing layer (114) on the extracted features enables identification of contextual dependencies.
4. The system (100) as claimed in claim 1, wherein the dataset (116) comprises pretrained images of the plants contoured with features.
5. The system (100) as claimed in claim 1, wherein the processing unit (108) is configured to generate refined feature maps for training of the dataset (116).
6. The system (100) as claimed in claim 1, wherein the image capturing device (102) comprises a computer application (104) adapted to display the detected disease, the classified stage of the detected disease, recommendations, or a combination thereof.
7. The system (100) as claimed in claim 1, wherein the features extracted from the isolated spatial patterns are selected from edges, textures, color variations, or a combination thereof.
8. The system (100) as claimed in claim 1, wherein a resolution of the images of the plants is in a range from 200 pixels by 200 pixels to 300 pixels by 300 pixels, preferably 224 pixels by 224 pixels.
9. The system (100) as claimed in claim 1, wherein the preset categories are selected from an early symptom, a moderate symptom, a severe symptom, an early lesion, a moderate decay, or a combination thereof.
10. A method (300) for multi-stage disease classification in agricultural imagery using neural network, the method (300) characterized by steps of:
capturing images of plants using an image capturing device (102);
pre-processing the images of the plants, wherein the pre-processing of the images is carried out using noise reduction, data augmentation, segmentation, or a combination thereof;
isolating spatial patterns depicting plants from the pre-processed images of the plants;
extracting features from the isolated spatial patterns using Convolutional Neural Network (CNN) models (110), wherein the features are extracted using convolutional layers, pooling layers, or a combination thereof;
superimposing a quantum enhancement layer (112) and a classical processing layer (114) on the features extracted from the isolated spatial patterns, wherein the quantum enhancement layer (112) is superimposed using quantum feature encoding, quantum gates, QNN layers, or a combination thereof;
comparing the extracted features, superimposed with the quantum enhancement layer (112) and the classical processing layer (114), with a dataset (116) for detecting a disease in the images of the plants;
classifying a stage of the detected disease into preset categories, wherein the classification is carried out by a SoftMax activation algorithm; and
generating a confidence score for the corresponding detected disease by conducting post-processing on the detected disease and the classified stage of the detected disease.
Date: April 17, 2025
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202541037579-STATEMENT OF UNDERTAKING (FORM 3) [18-04-2025(online)].pdf 2025-04-18
2 202541037579-REQUEST FOR EARLY PUBLICATION(FORM-9) [18-04-2025(online)].pdf 2025-04-18
3 202541037579-POWER OF AUTHORITY [18-04-2025(online)].pdf 2025-04-18
4 202541037579-OTHERS [18-04-2025(online)].pdf 2025-04-18
5 202541037579-FORM-9 [18-04-2025(online)].pdf 2025-04-18
6 202541037579-FORM FOR SMALL ENTITY(FORM-28) [18-04-2025(online)].pdf 2025-04-18
7 202541037579-FORM 1 [18-04-2025(online)].pdf 2025-04-18
8 202541037579-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-04-2025(online)].pdf 2025-04-18
9 202541037579-EDUCATIONAL INSTITUTION(S) [18-04-2025(online)].pdf 2025-04-18
10 202541037579-DRAWINGS [18-04-2025(online)].pdf 2025-04-18
11 202541037579-DECLARATION OF INVENTORSHIP (FORM 5) [18-04-2025(online)].pdf 2025-04-18
12 202541037579-COMPLETE SPECIFICATION [18-04-2025(online)].pdf 2025-04-18
13 202541037579-Proof of Right [13-05-2025(online)].pdf 2025-05-13