
System Of Plant Leaf Disease Detection Based On Transfer Learning And Deep Learning

Abstract: The present invention introduces a plant leaf disease detection system using deep learning and transfer learning techniques. The system captures plant leaf images, preprocesses them for quality enhancement, and applies U-Net-based feature extraction for segmenting disease-affected regions. Pre-trained CNN models, fine-tuned using transfer learning, classify diseases with high accuracy. The system offers real-time disease detection through mobile and IoT applications, providing farmers with actionable insights. A cloud-based analytics module enables long-term disease monitoring and model refinement. The invention enhances precision agriculture by delivering scalable, real-time, and accurate plant disease identification.


Patent Information

Application #
Filing Date
20 February 2025
Publication Number
10/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. J. PHANI PRASAD
SR UNIVERSITY, ANANTHASAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
2. DR. VIJAYA CHANDRA JADALA
SR UNIVERSITY, ANANTHASAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Specification

Description:FIELD OF THE INVENTION
This invention relates to a deep learning based system for plant leaf disease identification that uses Convolutional Neural Networks (CNNs) to analyse plant leaf images and identify the types of diseases affecting plants, particularly in applications demanding high accuracy in image search and retrieval. More specifically, it concentrates on plant leaf disease detection using CNNs and EfficientNet, together with U-Net-based feature extraction. The invention is applicable across various fruit and plant image datasets where robust and precise identification of plant leaf diseases is crucial and significant.
BACKGROUND OF THE INVENTION
The agricultural sector faces significant challenges in disease management, where early identification of plant diseases is critical to minimizing crop failures. Traditional methods, often based on visual inspection by experts, are not only time consuming but also prone to error and inefficiency. Recent developments in deep learning have paved the way for more precise and scalable solutions, utilizing image datasets to train algorithms that can autonomously identify and classify diseases in plants. Despite this progress, existing systems often suffer from issues such as limited dataset availability, reduced accuracy in real-world conditions, and the need for efficient real-time operation.
This invention addresses these limitations by utilizing sophisticated CNN architectures, along with transfer learning and data augmentation techniques such as rotation, scaling, zooming, shearing, and shifting, to increase the efficiency of plant disease identification models, especially in mobile environments where farmers require fast, reliable results.
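The augmentations named above can be sketched in plain NumPy; real pipelines would typically use a library such as torchvision or Keras' ImageDataGenerator, and the `augment` helper and its choices below are purely illustrative:

```python
import numpy as np

def augment(image, seed=0):
    """Return simple augmented variants of an H x W x C image array.

    Illustrative only: rotation by a multiple of 90 degrees, a
    horizontal flip, and a wrap-around vertical shift.
    """
    rng = np.random.default_rng(seed)
    variants = [
        np.rot90(image, k=int(rng.integers(1, 4))),  # rotate 90/180/270 degrees
        np.fliplr(image),                            # horizontal flip
        np.roll(image, shift=4, axis=0),             # vertical shift (wrap-around)
    ]
    return variants

leaf = np.zeros((64, 64, 3), dtype=np.uint8)  # dummy leaf image
variants = augment(leaf)
```

Each variant keeps the original shape, so augmented images can be fed to the same network without resizing.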
Deep learning has significantly advanced the field of image analysis, predominantly through the use of CNNs. Models such as ResNet, VGG, and AlexNet have shown tremendous success in extracting rich, hierarchical features from images, surpassing traditional handcrafted features.
U-Net is a type of convolutional neural network initially designed for image segmentation tasks. It is especially well suited to tasks where the aim is to classify each pixel in an image as belonging to a specific class, such as identifying disease spots on a plant leaf.
Key Features of U-Net:
• Encoder-Decoder Structure:
o The encoder (contracting path) gradually reduces the spatial dimensions of the input image, using convolutional layers and pooling operations, while capturing high-level features.
o The decoder (expansive path) restores the spatial dimensions using transposed convolutions (up-sampling), allowing the network to create pixel-wise predictions that correspond to the original input size.
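The effect of the two paths on spatial dimensions can be sketched in plain NumPy; a real U-Net interleaves these operations with learned convolutions and skip connections, and the `max_pool2x2` and `upsample2x2` helpers below are illustrative stand-ins:

```python
import numpy as np

def max_pool2x2(x):
    """Downsample an H x W feature map by taking 2x2 block maxima (the encoder's pooling)."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def upsample2x2(x):
    """Nearest-neighbour upsampling: each value becomes a 2x2 block (stand-in for the decoder's up-sampling)."""
    return np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)

# The encoder halves the spatial dimensions; the decoder restores them,
# so the network can emit a pixel-wise mask the same size as the input.
x = np.arange(64, dtype=float).reshape(8, 8)
encoded = max_pool2x2(x)        # shape (4, 4)
decoded = upsample2x2(encoded)  # shape (8, 8), matching the input
```

This size-preserving round trip is what lets U-Net assign a class to every pixel of the original image.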
Despite the advantages of deep learning models like U-Net, existing deep learning systems still face limitations in handling feature extraction in a way that is adaptive to different environmental conditions. Once features are extracted, most systems treat all features with equal preference.
In existing deep learning methodologies, as well as in traditional leaf disease detection methods, the identification of leaf diseases in plants is often found to be inaccurate, and in some cases it is not feasible to detect the disease reliably.
Traditional Convolutional Neural Networks (CNNs), which are typically used for image classification tasks, can also be effective for plant leaf disease identification, especially when the task is to classify an entire leaf as diseased or healthy.
Key Features of Traditional CNNs:
• Feature Extraction: CNNs learn a hierarchy of features from the raw image data, capturing low-level features (e.g., edges, textures) in the initial layers and more abstract, complex patterns in the deeper layers.
• Fully Connected Layers: After the convolutional and pooling layers, CNNs typically use fully connected layers to produce the final classification, in which the network assigns a label to the image (e.g., "healthy" vs. "diseased").
• End-to-End Learning: CNNs can learn end-to-end from raw pixel data to the final output without the need for manual feature extraction, making them highly adaptable and powerful for complex image recognition tasks.
The present invention addresses these challenges by focusing on tomato plants: tomato leaf diseases are detected using the PlantVillage dataset obtained from Kaggle, supplemented with additional custom datasets, and images of both healthy and unhealthy plant leaves are taken into account.
U-Net models of various types were used separately on PlantVillage dataset leaf images and the leaf masks of the tomato dataset to improve segmentation. Five-fold cross-validation was used: 80% of the 18,161 tomato leaf images and their respective ground truth masks were randomly selected for training, and the remaining 20% were reserved for testing. The class distribution in the test set is identical to that of the training set. Of the 80% training portion, 90% was used for actual training and 10% for validation, which helps avoid overfitting.
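The split arithmetic described above can be reproduced directly (counts here are rounded down with integer division; the exact per-fold counts in the original experiments may differ by a few images):

```python
# 80/20 train/test split of the tomato leaf dataset, with the training
# portion further divided 90/10 into actual-training and validation sets.
total = 18_161                 # tomato leaf images in the dataset
test = total // 5              # 20% held out for testing
train = total - test           # remaining 80% used for training
val = train // 10              # 10% of the training split for validation
actual_train = train - val     # 90% of the training split for actual training
```

So each fold trains on roughly 13,000 images, validates on roughly 1,450, and tests on roughly 3,600.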
COMPARISON OF EXISTING VS PROPOSED METHODOLOGY
Existing Methodologies in identification of plant leaf disease using Deep Learning
In traditional plant leaf disease identification methods, the process generally involves the extraction of low-level features from various plant leaf images, followed by the comparison of these features with a query image to retrieve similar images from the database. These methods, while effective in simple and small-scale retrieval tasks, struggle with large-scale and complex datasets. Some key elements and limitations of existing methodologies are outlined below:
CNN-Based Feature Extraction:
CNNs began to be employed for feature extraction in CBIR systems. CNNs, such as ResNet, VGG, and AlexNet, have shown superior performance in capturing complex hierarchical features compared to handcrafted methods. These models automatically learn high-level, discriminative features from large image datasets. However, even with CNNs, several challenges persist:
• Limited and Imbalanced Datasets: Deep learning models require large, high-quality labeled datasets to perform well. However, in plant disease detection, datasets are often limited, especially for specific crops or rare diseases. In addition, class imbalance is common in many datasets, where certain diseases are underrepresented. This can bias the model, causing it to perform poorly on minority classes.
• Noise and Variability in Data: Leaf images may contain noise due to lighting conditions, background clutter, or image quality. Inconsistent image capture settings (e.g., changing lighting, angles, or leaf orientations) lead to poor model generalization, affecting model performance in real-world applications.
Proposed Methodology
The proposed methodology introduces a novel approach that overcomes many of the limitations found in existing methodologies. Its key innovations are:
U-Net for Feature Extraction:
Instead of relying solely on traditional handcrafted features or standard CNNs, the proposed system utilizes a U-Net designed for image segmentation tasks. U-Net has several advantages:
EfficientNet for classification of tomato leaf diseases:
Comparison Summary
• Task: U-Net performs image segmentation (pixel-wise); traditional CNNs perform image classification (entire image).
• Output: U-Net produces pixel-level class labels (a segmentation mask); traditional CNNs produce a single class label for the whole image.
• Use Case: U-Net detects and segments disease spots on leaves; traditional CNNs classify overall leaf health or identify specific diseases.
• Model Complexity: U-Net is more complex due to its encoder-decoder structure; traditional CNNs are simpler, mainly involving convolutional and fully connected layers.
• Precision: U-Net offers high precision in identifying fine details such as small lesions; traditional CNNs might miss subtle details, particularly if the disease manifests in small patches.
• Training Data: U-Net requires labeled segmentation data (mask annotations); traditional CNNs require labeled classification data (overall labels).

The primary objective is to detect plant leaf diseases effectively using deep learning techniques. The system is designed to meet the challenges posed by large-scale and diverse image datasets. The detailed objectives of the proposed methodology are as follows:
1. Accurate Disease Classification
To build a deep learning model that can categorize different types of leaf diseases automatically and accurately. The model must distinguish between healthy leaves and leaves infected with various diseases (e.g., rust, powdery mildew, blight, and bacterial leaf spot). It should also be able to distinguish between similar-looking diseases, reducing misclassification. The goal is high accuracy, precision, and recall in classifying leaf diseases, even with small variations in image quality or disease symptoms.
2. Automation of Disease Detection
To automate the process of disease detection so as to reduce manual intervention and speed up diagnosis. Instead of depending on manual visual inspection to identify diseases in leaves, the system should automatically process leaf images, detect diseases, and provide actionable insights without human involvement. This yields a quick, scalable solution usable by farmers, agricultural companies, and researchers, especially in large-scale farming operations or when dealing with numerous plant species.
3. Scalability and Real-World Applicability
• To build a system that can scale to handle large datasets and diverse plant species, making it applicable in real-world farming environments. The model should be capable of handling a variety of plant species and diseases, making it suitable for commercial agriculture where multiple crops and varieties are cultivated. The result is a system that generalizes across different crops and geographic regions and adapts to various growing conditions and disease types.
4. Robustness to Variations in Environmental Factors
To ensure the model is robust to varying environmental conditions such as lighting, image quality, and leaf orientation. The deep learning model should be trained on diverse datasets that encompass different environmental factors such as differences in light, leaf color, weather conditions, and camera angles. This will help improve the model's robustness. Even when images are captured under non-ideal conditions, the model must remain accurate, making it suitable for use in field settings, where conditions may change.
5. Real-Time Disease Detection
To make it possible to identify leaf diseases in the field using cell phones or other portable devices. During field inspections with smartphones or drones, farmers should be able to promptly identify leaf diseases thanks to the system's real-time optimization. A real-time, mobile-friendly solution gives farmers instant feedback so they can act quickly and implement targeted interventions (such as pruning or pesticide application).
6. Sustainability and Precision Agriculture
To support sustainable farming practices through the use of precision agriculture techniques. Such a system should enable farmers to implement disease control measures only when needed, optimizing pesticide use and decreasing environmental impact. It facilitates farming by encouraging effective resource usage, improving crop health, and reducing the ecological footprint of agricultural practices.
7. Evaluate Performance Through Standard Metrics (Precision, Recall, F1-Score)
An important objective of the proposed methodology is to achieve major improvements in standard performance metrics, particularly in how well the system identifies diseased plant leaf images and how it balances false positives and false negatives.
The proposed system is designed to outperform traditional Plant Leaf disease detection with deep learning methods across these metrics. By using U-Net-based feature extraction, which captures more detailed and high-level features, the system aims to enhance precision (reducing false positives), improve recall (increasing true positives), and achieve a higher F1-score (balancing precision and recall). These metrics will be used to benchmark the proposed system against existing Plant leaf disease detection methodologies, demonstrating its superior performance.
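These metrics follow directly from confusion counts, as the following self-contained sketch shows; the counts used in the example are hypothetical and for illustration only:

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall and F1 from true-positive, false-positive
    and false-negative counts, guarding against division by zero."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical counts for a single "diseased" class:
precision, recall, f1 = precision_recall_f1(tp=90, fp=10, fn=30)
```

Precision penalizes false positives, recall penalizes false negatives, and F1 is their harmonic mean, which is why the document treats it as the balance between the two.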

SUMMARY OF THE INVENTION
This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention.
This summary is neither intended to identify key or essential inventive concepts of the invention, nor is it intended to determine the scope of the invention.
The present invention introduces an advanced system for plant leaf disease detection using deep learning models with transfer learning techniques. The system leverages Convolutional Neural Networks (CNNs) and U-Net-based feature extraction models to identify diseases from plant leaf images.
The invention begins with an image acquisition module that captures high-resolution images of plant leaves using smartphones, drones, or specialized imaging devices. These images undergo preprocessing steps, including normalization and augmentation, to enhance the quality and variability of the dataset.
A deep learning-based feature extraction module is then applied, where U-Net architecture is used to segment disease-affected areas from the leaf images. CNN models, pre-trained on large-scale datasets such as PlantVillage, are fine-tuned using transfer learning techniques to classify the extracted features into disease categories.
The system provides real-time disease detection, enabling farmers to take immediate action. The trained model can be deployed on mobile applications, cloud-based platforms, or embedded devices for in-field diagnosis. The invention also includes a feedback mechanism that allows users to update and refine the model over time, improving accuracy through continuous learning.
The proposed methodology significantly improves disease detection accuracy compared to traditional CNN-based classification models. By integrating segmentation-based feature extraction and transfer learning, the system offers a scalable, real-time, and user-friendly solution for plant disease identification.
To further clarify advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
The invention focuses on utilizing deep learning to self-detect and diagnose diseases in plant leaves through analysing the plant leaf images. By leveraging the power of Convolutional Neural Networks (CNNs) and other advanced deep learning algorithms, the system helps to assist farmers and agricultural professionals in early detection of plant diseases, which can significantly reduce crop loss and improve agricultural productivity.
Key Features and Techniques:
1. Image Acquisition: The system captures good-quality images of plant leaves, typically using smartphones, cameras, or specialized imaging systems. These images are used as inputs for disease detection.
2. Data Preprocessing:
o Normalization: Raw images are pre-processed to ensure consistent color representation and scaling, making the deep learning model more accurate.
o Augmentation: Data augmentation techniques like rotation, flipping, and scaling are applied to create a robust training dataset, improving model generalization.
3. Deep Learning Models:
o Convolutional Neural Networks (CNNs): CNNs are used to extract important features from the images (such as color patterns, shapes, and textures) that are indicative of specific plant diseases.
o Transfer Learning: Pre-trained models (like VGGNet, ResNet, or Inception) are fine-tuned for specific plant disease detection tasks. This helps reduce the need for large datasets and shortens the training time.
o Object Detection Networks: Some systems may incorporate specialized object detection frameworks (e.g., YOLO, Faster R-CNN) to identify disease-affected regions on the leaves.
4. Disease Classification:
o The system classifies the detected disease based on the learned features, identifying the specific condition affecting the plant. Common diseases such as powdery mildew, rust, blight, and bacterial spots can be classified using the trained model.
o Multi-Class Classification: The model can distinguish between various plant diseases or even healthy leaves, offering multiple disease labels for comprehensive diagnosis.
5. Real-Time Diagnosis:
o Many implementations aim to provide real-time feedback, enabling farmers to take immediate action to control or treat the disease, such as applying the appropriate pesticide, changing cultivation practices, or isolating affected plants.
6. Performance and Accuracy:
o The system is trained on large datasets consisting of leaf images labeled with known diseases, ensuring high accuracy in diagnosis.
o The model's performance is assessed using metrics such as accuracy, precision, recall, and F1-score, with continuous improvement as more data is collected.
Benefits:
• Early Disease Detection: Allows farmers to detect plant diseases at an early stage, reducing the risk of widespread crop damage.
• Cost-Effective: With smartphone apps or low-cost imaging tools, the technology is affordable and accessible for smallholder farmers.
• Non-Destructive: The system analyzes leaves without causing any harm to the plants, providing a non-invasive method of diagnosis.
• Scalability: The solution can be applied to a broad range of crops and environments, helping farmers globally.
• Reduction in Chemical Use: Early detection can lead to targeted pesticide use, reducing unnecessary chemical treatments and their environmental impact.
Challenges:
• Dataset Quality: Ensuring high-quality, diverse datasets with various plant species, disease types, and environmental conditions is critical for the model’s robustness.
• Environmental Variability: Factors like lighting, background, and leaf position may affect the accuracy of image-based disease detection, requiring advanced techniques to mitigate such issues.
• Interpretability: While deep learning models achieve high accuracy, the decision-making process is often opaque, and more research is needed to make models more interpretable for farmers.
BRIEF DESCRIPTION OF THE DRAWINGS
The illustrated embodiments of the subject matter will be understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and methods that are consistent with the subject matter as claimed herein, wherein:
FIGURE 1: BLOCK DIAGRAM OF THE PROPOSED MODEL
DETAILED DESCRIPTION OF THE INVENTION
The detailed description of various exemplary embodiments of the disclosure is described herein with reference to the accompanying drawings. It should be noted that the embodiments are described herein in such details as to clearly communicate the disclosure. However, the amount of details provided herein is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present disclosure. Moreover, all statements herein reciting principles, aspects, and embodiments of the present disclosure, as well as specific examples, are intended to encompass equivalents thereof.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
In addition, the descriptions of "first", "second", “third”, and the like in the present invention are used for the purpose of description only, and are not to be construed as indicating or implying their relative importance or implicitly indicating the number of technical features indicated. Thus, features defining "first" and "second" may include at least one of the features, either explicitly or implicitly.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The invention comprises multiple components working together to ensure accurate and efficient plant leaf disease detection.
The image acquisition module is responsible for capturing plant leaf images from various sources, such as smartphones, drones, and imaging sensors. The system supports different image formats and resolutions, ensuring compatibility with diverse hardware.
Next, the data preprocessing module applies several image enhancement techniques, including normalization, contrast adjustments, and augmentation methods such as rotation, flipping, and zooming. These techniques help improve the model’s ability to generalize across different environmental conditions.
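A minimal sketch of the normalization step, assuming 8-bit RGB input (augmentation and contrast adjustment would be layered on top, typically via a library; the `normalize` helper is illustrative):

```python
import numpy as np

def normalize(image):
    """Scale an 8-bit image to float32 values in [0, 1], as is commonly
    done before feeding leaf images to a CNN."""
    return image.astype(np.float32) / 255.0

img = np.full((32, 32, 3), 128, dtype=np.uint8)  # dummy mid-grey leaf image
out = normalize(img)
```

Consistent scaling like this keeps inputs in the range the pre-trained models expect, which matters when fine-tuning with transfer learning.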
The feature extraction module utilizes a U-Net-based architecture designed for pixel-wise segmentation of disease-affected areas. The U-Net model consists of an encoder-decoder structure, where the encoder extracts hierarchical features and the decoder reconstructs the segmented disease regions.
After feature extraction, the classification module employs deep learning models such as EfficientNet, ResNet, and VGGNet. These models are pre-trained on large datasets and fine-tuned using transfer learning to classify plant diseases with high accuracy. The system is trained using supervised learning techniques, with labeled datasets containing healthy and diseased leaf images.
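The transfer-learning idea of freezing pre-trained weights while training only a new classification head can be illustrated with a toy NumPy model; in a real framework such as PyTorch this corresponds to setting `requires_grad=False` on backbone parameters, and all names, sizes, and the "gradient" step below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
backbone = rng.standard_normal((4, 8))      # stands in for pre-trained weights (frozen)
head = np.zeros((8, 3))                     # new 3-class disease head (trainable)
backbone_before = backbone.copy()

x = rng.standard_normal((2, 4))             # a tiny batch of 2 inputs
features = np.maximum(x @ backbone, 0)      # forward pass through the frozen backbone
head += 0.1 * features.T @ np.ones((2, 3))  # toy "gradient" step updates the head only
logits = features @ head                    # backbone weights remain untouched
```

Because only the small head is updated, fine-tuning needs far less labeled data and training time than learning the backbone from scratch, which is the advantage the invention relies on.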
To ensure real-time performance, the invention integrates an optimized inference engine that accelerates model predictions on edge devices, including mobile phones and IoT-enabled agricultural tools. The inference engine optimizes memory usage and computational efficiency for seamless deployment in field conditions.
The real-time notification system provides instant alerts to farmers through a mobile application. When a disease is detected, the system offers actionable insights, such as recommended treatments, pesticide usage, and preventive measures. The notifications are generated based on a combination of disease severity and historical trends.
The system also features a cloud-based analytics module that stores disease detection results and trends over time. This module allows agricultural experts to analyze historical data, refine disease classification models, and provide predictive insights to farmers.
Claims:
1. A plant leaf disease detection system using deep learning and transfer learning, comprising:
o an image acquisition module for capturing plant leaf images,
o a preprocessing module for normalizing and augmenting the acquired images,
o a feature extraction module utilizing a U-Net-based architecture for segmenting disease-affected regions,
o a classification module incorporating CNN models pre-trained using transfer learning, and
o a real-time notification system for alerting users about detected plant diseases.
2. The system as claimed in claim 1, wherein the image acquisition module supports smartphone cameras, drones, and specialized imaging devices.
3. The system as claimed in claim 1, wherein the preprocessing module applies normalization, contrast adjustments, and data augmentation techniques to improve model robustness.
4. The system as claimed in claim 1, wherein the feature extraction module uses a U-Net encoder-decoder structure to identify disease-affected regions in plant leaf images.
5. The system as claimed in claim 1, wherein the classification module uses CNN architectures, including ResNet, VGGNet, and EfficientNet, to classify plant diseases.
6. The system as claimed in claim 1, wherein the real-time notification system provides farmers with disease-specific treatment recommendations.
7. The system as claimed in claim 1, wherein the cloud-based analytics module stores disease detection results for long-term analysis and predictive modeling.
8. The system as claimed in claim 1, wherein transfer learning techniques are employed to fine-tune CNN models for improved accuracy.
9. The system as claimed in claim 1, wherein an optimized inference engine ensures real-time performance on mobile and IoT devices.
10. The system as claimed in claim 1, wherein continuous model updates are enabled through user feedback and cloud-based learning mechanisms.

Documents

Application Documents

# Name Date
1 202541014660-STATEMENT OF UNDERTAKING (FORM 3) [20-02-2025(online)].pdf 2025-02-20
2 202541014660-REQUEST FOR EARLY PUBLICATION(FORM-9) [20-02-2025(online)].pdf 2025-02-20
3 202541014660-POWER OF AUTHORITY [20-02-2025(online)].pdf 2025-02-20
4 202541014660-FORM-9 [20-02-2025(online)].pdf 2025-02-20
5 202541014660-FORM FOR SMALL ENTITY(FORM-28) [20-02-2025(online)].pdf 2025-02-20
6 202541014660-FORM 1 [20-02-2025(online)].pdf 2025-02-20
7 202541014660-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-02-2025(online)].pdf 2025-02-20
8 202541014660-EVIDENCE FOR REGISTRATION UNDER SSI [20-02-2025(online)].pdf 2025-02-20
9 202541014660-EDUCATIONAL INSTITUTION(S) [20-02-2025(online)].pdf 2025-02-20
10 202541014660-DRAWINGS [20-02-2025(online)].pdf 2025-02-20
11 202541014660-DECLARATION OF INVENTORSHIP (FORM 5) [20-02-2025(online)].pdf 2025-02-20
12 202541014660-COMPLETE SPECIFICATION [20-02-2025(online)].pdf 2025-02-20