Abstract:
AN ATTENTION-GUIDED DUAL-BRANCH CONVOLUTIONAL NEURAL NETWORK SYSTEM FOR COTTON LEAF DISEASE DETECTION
The invention discloses an Attention-Guided Dual-Branch Convolutional Neural Network (AG-DB-CNN) for cotton leaf disease detection. The system comprises a dual-branch architecture, in which one branch extracts color features and the other extracts texture features from cotton leaf images. An attention-based fusion mechanism dynamically combines the extracted features, focusing on the patterns most relevant to disease recognition. The model is lightweight and optimized for deployment on mobile and edge devices, enabling real-time field-level diagnosis. Compared to traditional single-branch CNNs, the invention achieves higher accuracy, robustness, and adaptability under diverse environmental conditions, thereby providing an effective tool for precision agriculture.
Description:
FIELD OF THE INVENTION
The invention relates to an Attention-Guided Dual-Branch Convolutional Neural Network (AG-DB-CNN) for cotton leaf disease detection. The system extracts color and texture features separately through dual branches and combines them using an attention-based fusion mechanism for improved accuracy. The lightweight design enables real-time deployment on mobile and edge devices, making it practical for field applications in agriculture.
BACKGROUND OF THE INVENTION
Cotton leaf diseases are difficult to identify accurately due to overlapping symptoms, varying environmental conditions, and the limitations of manual inspection and standard CNN models. Existing methods often cannot distinguish fine color and texture details and perform poorly in real-world settings. A more intelligent, robust, and lightweight system is needed: one that can dynamically focus on relevant features and operate effectively on mobile or edge devices for real-time detection in the field.
US9535563B2: An Internet appliance, comprising, within a single housing, packet data network interfaces, adapted for communicating with the Internet and a local area network, at least one data interface selected from the group consisting of a universal serial bus, an IEEE-1394 interface, a voice telephony interface, an audio program interface, a video program interface, an audiovisual program interface, a camera interface, a physical security system interface, a wireless networking interface; a device control interface, smart home interface, an environmental sensing interface, and an environmental control interface, and a processor, for controlling a data transfer between the local area network and the Internet, and defining a markup language interface communicated through a packet data network interface, to control a data transfer or control a remote device.
US7987003B2: A network media appliance, comprising at least one packet data network interface, adapted for communicating data packets with a data network according to an Internet Protocol; a media data interface, and a processor, having an associated memory for storing executable code, said code defining at least a remote virtual interface function, and a data transfer function for controlling transfer of data through said media data interface.
OBJECTS OF THE INVENTION
The primary objective of this invention is to develop a robust and intelligent deep learning model for detecting cotton leaf diseases with high accuracy under diverse real-world conditions.
Another objective is to introduce a dual-branch CNN architecture that separately processes color and texture features, ensuring that the model captures fine-grained symptom details more effectively than single-branch networks.
A further objective is to incorporate an attention-guided fusion mechanism that dynamically emphasizes the most relevant features for disease classification, thereby improving accuracy and reliability.
The invention also aims to design a lightweight and computationally efficient model capable of being deployed on mobile and edge devices, enabling farmers to detect diseases in real time without relying on cloud-based processing.
Finally, an objective of the invention is to create a system that is scalable and adaptable to other crops, thereby extending its utility across agricultural domains beyond cotton.
SUMMARY OF THE INVENTION
This summary is provided to introduce a selection of concepts, in a simplified form, that are further described in the detailed description of the invention. This summary is not intended to identify key or essential inventive concepts of the invention, nor is it intended to determine the scope of the invention.
The invention introduces an Attention-Guided Dual-Branch Convolutional Neural Network (AG-DB-CNN) for accurate detection of cotton leaf diseases. Unlike traditional single-branch CNNs, the proposed model uses two dedicated branches: one for extracting color features and the other for texture features. These are combined through an attention-based fusion mechanism, which selectively emphasizes the most relevant patterns for disease classification.
The system is designed to be lightweight and optimized for mobile and edge devices, enabling real-time field-level detection without reliance on cloud processing. This makes it highly practical for farmers and agricultural experts. The invention improves accuracy, robustness, and adaptability under diverse environmental conditions, offering a scalable solution that can also be extended to other crops beyond cotton.
To further clarify advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
The proposed invention introduces a novel deep learning model designed to detect and classify cotton leaf diseases. It uses a dual-branch CNN architecture, where one branch extracts color features and the other extracts texture features from input leaf images. These features are then combined using an attention-based fusion mechanism, which intelligently focuses on the most important patterns in the image.
BRIEF DESCRIPTION OF THE DRAWINGS
The illustrated embodiments of the subject matter will be understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and methods that are consistent with the subject matter as claimed herein, wherein:
FIGURE 1: SYSTEM ARCHITECTURE
The figures depict embodiments of the present subject matter for the purposes of illustration only. A person skilled in the art will easily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION OF THE INVENTION
The detailed description of various exemplary embodiments of the disclosure is described herein with reference to the accompanying drawings. It should be noted that the embodiments are described herein in such details as to clearly communicate the disclosure. However, the amount of details provided herein is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present disclosure. Moreover, all statements herein reciting principles, aspects, and embodiments of the present disclosure, as well as specific examples, are intended to encompass equivalents thereof.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
In addition, the descriptions of “first,” “second,” “third,” and the like in the present invention are used for the purpose of description only, and are not to be construed as indicating or implying their relative importance or implicitly indicating the number of technical features indicated. Thus, features defined by “first” and “second” may include at least one of the features, either explicitly or implicitly.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The proposed invention introduces a novel deep learning model designed to detect and classify cotton leaf diseases. It uses a dual-branch CNN architecture, where one branch extracts color features and the other extracts texture features from input leaf images. These features are then combined using an attention-based fusion mechanism, which intelligently focuses on the most important patterns in the image.
The model is optimized for real-time, high-accuracy performance, even under challenging conditions such as varying backgrounds and lighting. It is suitable for deployment on mobile and edge devices, making it highly practical for field use. Unlike traditional single-branch models, this design enhances robustness, scalability, and precision in disease detection for agricultural applications under real-world conditions.
The present invention relates to the field of agricultural image processing and plant disease detection using deep learning. More particularly, it discloses an Attention-Guided Dual-Branch Convolutional Neural Network (AG-DB-CNN) designed for accurate and real-time cotton leaf disease detection.
Conventional CNN-based approaches typically employ a single-path architecture that processes all visual features uniformly. These models often fail to distinguish subtle disease symptoms that manifest as fine variations in color and texture, especially under noisy field conditions with variable lighting and complex backgrounds. This limitation reduces their effectiveness for real-world agricultural use.
The proposed invention addresses these challenges by employing a dual-branch CNN architecture. One branch is dedicated to extracting color-specific features such as discoloration, chlorosis, and pigmentation anomalies, while the other branch specializes in extracting texture-specific features such as roughness, lesions, and surface irregularities.
The outputs of these two branches are combined using an attention-based fusion mechanism that intelligently assigns higher weights to the most relevant features. This fusion strategy ensures that the model dynamically focuses on critical patterns associated with specific cotton leaf diseases while suppressing irrelevant background noise.
The architecture is lightweight and optimized for real-time operation on mobile and edge devices, making it suitable for deployment directly in the field by farmers and agricultural practitioners. Through this design, the invention provides enhanced robustness, accuracy, and scalability compared to traditional CNN-based disease detection systems.
The best method of implementing the invention involves collecting a dataset of cotton leaf images representing multiple disease categories under varying conditions. The dataset is preprocessed through normalization, resizing, and data augmentation techniques to enhance robustness against environmental variability.
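The preprocessing step described above can be sketched as follows. This is a minimal, dependency-free illustration in NumPy, not the disclosed implementation: the input size (224), normalization constants, nearest-neighbour resizing, and the horizontal-flip augmentation are all illustrative assumptions standing in for whatever pipeline an embodiment would use.

```python
import numpy as np

def preprocess(image, size=224, mean=0.5, std=0.25, rng=None):
    """Resize, normalize, and optionally augment a leaf image.

    `image` is an H x W x 3 uint8 array. Nearest-neighbour resizing is used
    here only to keep the sketch free of external dependencies.
    """
    # Nearest-neighbour resize to a square network input
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = image[rows][:, cols]

    # Scale to [0, 1], then standardize with illustrative constants
    x = resized.astype(np.float32) / 255.0
    x = (x - mean) / std

    # Simple augmentation: random horizontal flip
    if rng is not None and rng.random() < 0.5:
        x = x[:, ::-1, :]
    return x
```

In practice the same transform (minus augmentation) would be applied at inference time so that field-captured images match the training distribution.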
The dual-branch CNN is then constructed, with the first branch configured for color feature extraction and the second branch configured for texture feature extraction. Both branches consist of convolutional and pooling layers tuned for their specific feature domains.
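To make the division of labor between the two branches concrete, the toy sketch below replaces the learned convolutional stacks with hand-crafted stand-ins: per-channel color statistics for the color branch and gradient-magnitude statistics for the texture branch. Both function names and feature choices are illustrative assumptions, not the disclosed layers.

```python
import numpy as np

def color_branch(x):
    """Toy stand-in for the color branch: per-channel mean and std.

    In the full model this is a stack of conv/pool layers; channel-wise
    statistics mimic the chromatic cues (chlorosis, discoloration) it targets.
    """
    return np.concatenate([x.mean(axis=(0, 1)), x.std(axis=(0, 1))])

def texture_branch(x):
    """Toy stand-in for the texture branch: gradient-magnitude statistics.

    Finite differences approximate the local roughness and lesion-edge
    information a learned texture branch would respond to.
    """
    gray = x.mean(axis=2)                      # collapse color to intensity
    gy = np.diff(gray, axis=0)                 # vertical gradients
    gx = np.diff(gray, axis=1)                 # horizontal gradients
    mag = np.sqrt(gy[:, :-1] ** 2 + gx[:-1, :] ** 2)
    return np.array([mag.mean(), mag.std(), mag.max(), np.median(mag)])
```

The key design point survives the simplification: each branch sees the same image but produces a feature vector specialized for one symptom modality.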
The outputs of both branches are directed to an attention-based fusion module, which applies channel-spatial attention to assign appropriate weights to features. This ensures that disease-relevant features are amplified while irrelevant features are suppressed.
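A minimal sketch of such a channel-spatial attention fusion is given below, assuming squeeze-and-excitation-style channel gating and a mean/max-based spatial gate; the branch-weighting softmax over pooled "energy" is likewise an illustrative choice, not the disclosed weighting rule.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(feat, w1, w2):
    """Squeeze-and-excitation style channel gate over a C x H x W feature map."""
    pooled = feat.mean(axis=(1, 2))            # squeeze: global average pool -> (C,)
    gate = sigmoid(w2 @ np.tanh(w1 @ pooled))  # excitation MLP -> per-channel weight
    return feat * gate[:, None, None]

def spatial_attention(feat):
    """Spatial gate built from channel-wise mean and max maps."""
    mean_map = feat.mean(axis=0)
    max_map = feat.max(axis=0)
    gate = sigmoid(mean_map + max_map)         # (H, W) spatial weights
    return feat * gate[None, :, :]

def attention_fuse(color_feat, texture_feat, w1, w2):
    """Gate each branch, then blend branches by softmax over their mean activation."""
    gated = [spatial_attention(channel_attention(f, w1, w2))
             for f in (color_feat, texture_feat)]
    energy = np.array([g.mean() for g in gated])
    alpha = np.exp(energy) / np.exp(energy).sum()  # branch weights sum to 1
    return alpha[0] * gated[0] + alpha[1] * gated[1]
```

The effect matches the description above: channels and spatial locations carrying disease-relevant signal are amplified, and the blend between the color and texture branches is decided per image rather than fixed.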
The fused features are passed into a classification layer that outputs the probability distribution over predefined disease classes. The model is trained using supervised learning with cross-entropy loss and optimized with gradient descent methods.
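The classification and training step can be illustrated with the sketch below, which optimizes only a final softmax layer on pre-computed fused features; the full model would train both branches and the fusion module end-to-end, and the learning rate and epoch count here are arbitrary assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)       # subtract row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_classifier(X, y, num_classes, lr=0.1, epochs=200, seed=0):
    """Fit a linear softmax classifier with cross-entropy loss and gradient descent.

    X: (N, D) fused feature vectors; y: (N,) integer disease labels.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], num_classes)) * 0.01
    b = np.zeros(num_classes)
    onehot = np.eye(num_classes)[y]
    for _ in range(epochs):
        probs = softmax(X @ W + b)
        grad = probs - onehot                  # d(cross-entropy)/d(logits)
        W -= lr * X.T @ grad / len(X)          # averaged gradient-descent step
        b -= lr * grad.mean(axis=0)
    return W, b
```

After training, `softmax(X @ W + b)` yields the probability distribution over the predefined disease classes described above.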
Once trained, the model can be deployed on mobile or edge devices using frameworks such as TensorFlow Lite or PyTorch Mobile. This allows real-time inference in the field, where farmers can capture leaf images through a smartphone camera and instantly receive disease classification results.
Claims:
1. An attention-guided dual-branch convolutional neural network for cotton leaf disease detection comprising a first branch configured to extract color features from input images, a second branch configured to extract texture features from the same input images, and an attention-based fusion module adapted to combine the outputs of both branches by selectively emphasizing relevant features, thereby improving classification accuracy under real-world conditions.
2. The system as claimed in claim 1, wherein the first branch is configured with convolutional filters optimized for capturing chromatic and spectral variations in leaf images.
3. The system as claimed in claim 1, wherein the second branch is configured with convolutional filters designed to capture fine-grained texture and structural details of leaf images.
4. The system as claimed in claim 1, wherein the attention-based fusion module applies channel-wise and spatial attention mechanisms to dynamically weight the extracted features.
5. The system as claimed in claim 1, wherein the dual-branch outputs are fused prior to the final classification layer for disease identification.
6. The system as claimed in claim 1, wherein the model is trained using a cotton leaf disease dataset including variations in background, illumination, and environmental noise.
7. The system as claimed in claim 1, wherein the architecture is lightweight and optimized for deployment on mobile or edge devices for real-time disease detection.
8. The system as claimed in claim 1, wherein the attention-based fusion enhances robustness against overlapping disease symptoms and environmental variability.
9. The system as claimed in claim 1, wherein the classifier outputs multiple disease categories including but not limited to bacterial blight, leaf curl, grey mildew, and healthy leaf.
10. The system as claimed in claim 1, wherein the architecture is scalable and adaptable for extension to other crop disease detection tasks beyond cotton.
| # | Name | Date |
|---|---|---|
| 1 | 202541089030-STATEMENT OF UNDERTAKING (FORM 3) [18-09-2025(online)].pdf | 2025-09-18 |
| 2 | 202541089030-REQUEST FOR EARLY PUBLICATION(FORM-9) [18-09-2025(online)].pdf | 2025-09-18 |
| 3 | 202541089030-POWER OF AUTHORITY [18-09-2025(online)].pdf | 2025-09-18 |
| 4 | 202541089030-FORM-9 [18-09-2025(online)].pdf | 2025-09-18 |
| 5 | 202541089030-FORM FOR SMALL ENTITY(FORM-28) [18-09-2025(online)].pdf | 2025-09-18 |
| 6 | 202541089030-FORM 1 [18-09-2025(online)].pdf | 2025-09-18 |
| 7 | 202541089030-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-09-2025(online)].pdf | 2025-09-18 |
| 8 | 202541089030-EVIDENCE FOR REGISTRATION UNDER SSI [18-09-2025(online)].pdf | 2025-09-18 |
| 9 | 202541089030-EDUCATIONAL INSTITUTION(S) [18-09-2025(online)].pdf | 2025-09-18 |
| 10 | 202541089030-DRAWINGS [18-09-2025(online)].pdf | 2025-09-18 |
| 11 | 202541089030-DECLARATION OF INVENTORSHIP (FORM 5) [18-09-2025(online)].pdf | 2025-09-18 |
| 12 | 202541089030-COMPLETE SPECIFICATION [18-09-2025(online)].pdf | 2025-09-18 |