
An Adaptive Neuro-Fuzzy Based ResNeXt System for Crop Disease and Pest Detection

Abstract: Disclosed herein is an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection (100) that integrates multiple intelligent modules to deliver accurate and real-time agricultural diagnostics. The system begins with an image acquisition module (102) that captures crop-related data, including leaf images and environmental parameters. The captured data is analyzed by a ResNeXt-based deep convolutional neural network module (104), which performs multi-scale feature extraction to detect patterns, textures, and structural anomalies linked to diseases and pests. These extracted features are processed by an Adaptive Neuro-Fuzzy Inference System (106), which applies adaptive fuzzy rules and neuro-fuzzy reasoning to handle uncertain or overlapping symptoms. A hybrid diagnostic engine (108) combines these outputs to predict the type and severity of infestations. For efficient deployment, an edge computing interface (110) supports real-time, local processing, while a cloud storage and analytics unit (112) maintains historical records and updates inference rules. Finally, an alert generation module (114) issues proactive notifications to farmers.


Patent Information

Application #
Filing Date
30 September 2025
Publication Number
44/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. K. MAMATHA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
2. DR RUPESH KUMAR MISHRA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Specification

Description: FIELD OF DISCLOSURE
[0001] The present disclosure relates generally to the field of agricultural informatics, precision farming, and computer vision-based plant health monitoring. More specifically, it pertains to an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection.
BACKGROUND OF THE DISCLOSURE
[0002] Agriculture has historically formed the backbone of human civilization, sustaining societies through the production of food, raw materials, and employment. Over centuries, farming practices have undergone significant transformations, from traditional manual cultivation to mechanized approaches and, more recently, the integration of digital technologies. Despite these advancements, agriculture continues to face persistent challenges that compromise crop yield and quality. Among the most critical threats to agricultural productivity are crop diseases and pest infestations. These biotic stresses are responsible for significant yield losses globally and continue to be a primary source of economic, social, and food security concerns. For smallholder farmers, in particular, crop loss caused by pathogens and pests directly translates into diminished income and heightened vulnerability to poverty. In large-scale commercial agriculture, such issues not only result in financial losses but also trigger wider supply chain disruptions.
[0003] Plant diseases are caused by a wide range of pathogens, including fungi, bacteria, viruses, and nematodes, whereas pests include insects, mites, and other organisms that feed on or damage crops. The identification and management of these threats traditionally depend on human expertise, such as agronomists and plant pathologists who visually inspect crop leaves, stems, or fruits for symptoms. While this expertise-driven approach has long been relied upon, it is subject to human error, fatigue, and inconsistency, especially when dealing with large-scale fields. Moreover, many diseases manifest through subtle visual cues that can easily be overlooked during early stages of infection, reducing the effectiveness of control measures if intervention is delayed. Consequently, there is a growing demand for automated, reliable, and scalable detection systems that can accurately identify crop diseases and pest damage at early stages.
[0004] The history of crop protection reveals a gradual progression in methods of disease and pest detection. In earlier decades, farmers primarily depended on empirical observations, using their experience and local knowledge to recognize anomalies in plant appearance or growth patterns. As agricultural science advanced, microscopy and biochemical assays became prominent tools for diagnosing plant diseases. However, these techniques, though accurate, were resource-intensive, requiring laboratory infrastructure, skilled personnel, and substantial time to generate results. In a global agricultural environment that increasingly demands efficiency and real-time decision-making, traditional laboratory-based approaches fall short of providing timely interventions.
[0005] Digital image processing emerged as an early attempt to automate disease detection. By capturing images of crop leaves or fruits and applying classical computer vision algorithms such as thresholding, edge detection, and texture analysis, researchers sought to identify disease symptoms. While these approaches represented a significant step forward, they struggled with the variability inherent in real agricultural environments. Variations in lighting conditions, crop species, growth stages, and overlapping disease symptoms often resulted in poor classification accuracy. Moreover, the handcrafted features used in classical image processing lacked adaptability, limiting their robustness when confronted with new datasets or environmental conditions.
[0006] The advent of machine learning provided a new dimension to crop disease and pest detection. Traditional machine learning models, including support vector machines, decision trees, and k-nearest neighbors, were employed to classify crop images into diseased or healthy categories. These models leveraged feature extraction techniques such as color histograms, texture descriptors, and shape features to generate input data for classifiers. While this approach improved detection accuracy over purely manual or rule-based systems, it still relied heavily on domain expertise for feature engineering. Designing features that could generalize across diverse disease types and crop varieties remained a challenge, thereby constraining the scalability of such solutions.
[0007] A revolutionary breakthrough arrived with the introduction of deep learning, particularly convolutional neural networks (CNNs), which demonstrated remarkable capabilities in automatically learning hierarchical representations of image data. CNNs bypassed the limitations of manual feature engineering by enabling models to learn features directly from raw pixel data. Early research in applying CNNs to agriculture showed impressive results in distinguishing between healthy and diseased plant leaves. Large publicly available datasets, such as PlantVillage, further accelerated progress by providing benchmark images for training and validating CNN models. These developments brought the vision of real-time, field-deployable crop disease detection systems closer to reality.
[0008] However, the success of CNNs was accompanied by several challenges. Deep networks require vast amounts of labeled training data, which are not always available for every crop-disease-pest combination. Additionally, CNN-based models often function as black boxes, offering limited interpretability for farmers or agronomists who require transparent decision-making processes. In real-world agricultural contexts, factors such as varying illumination, occlusions, and the presence of multiple overlapping disease symptoms often reduce the performance of CNNs trained in controlled environments. Furthermore, while CNNs are powerful in identifying visual features, they are less effective in incorporating approximate reasoning or handling uncertainty, both of which are intrinsic to agricultural environments where symptoms are not always distinct or clear-cut.
[0009] In parallel with advancements in deep learning, researchers explored the integration of fuzzy logic systems into agricultural applications. Fuzzy logic, inspired by the way humans reason under uncertainty, provides a mathematical framework for handling imprecise or ambiguous information. Unlike traditional binary logic, which operates on rigid true/false values, fuzzy systems allow for intermediate degrees of truth. This makes them particularly well-suited for modeling complex agricultural scenarios where symptoms of disease or pest damage may not be clearly defined. For example, leaf discoloration may indicate a fungal infection, nutrient deficiency, or pest infestation, and fuzzy logic enables systems to represent this ambiguity in a way that more closely resembles human reasoning.
[0010] Over time, hybrid systems combining machine learning with fuzzy logic gained traction in crop health monitoring. Neuro-fuzzy systems, in particular, emerged as promising approaches that combined the adaptability of neural networks with the interpretability of fuzzy systems. These systems could learn fuzzy rules automatically from data while maintaining the capacity to reason with uncertainty. The integration of neuro-fuzzy frameworks into agriculture promised to enhance robustness and transparency in disease and pest detection. Nevertheless, conventional neuro-fuzzy systems often struggled with scalability when applied to high-dimensional data such as crop images, limiting their effectiveness in large-scale agricultural deployments.
[0011] While fuzzy systems brought interpretability, the evolution of deep learning continued to provide architectural innovations aimed at enhancing accuracy and efficiency. Among these innovations, ResNet (Residual Network) and its extensions such as ResNeXt gained attention for their superior performance in computer vision tasks. ResNet introduced the concept of residual connections, enabling the training of deeper networks without the vanishing gradient problem. ResNeXt built upon this by incorporating a split-transform-merge strategy that allowed for more expressive representations while maintaining computational efficiency. In agriculture, ResNeXt demonstrated strong performance in image classification tasks involving crop disease datasets, proving its capacity to handle complex visual features.
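The split-transform-merge strategy mentioned above can be sketched in miniature. The following pure-Python toy (the vector length, cardinality value, and doubling transform are illustrative assumptions, not part of the disclosure) splits a feature vector into cardinality-many groups, transforms each group independently, merges the results, and adds a residual connection:

```python
# Illustrative sketch of ResNeXt's split-transform-merge idea on a
# 1-D feature vector. Dimensions and the toy transform are assumptions.

def split_transform_merge(x, cardinality=4, transform=None):
    """Split x into `cardinality` groups, transform each group
    independently, merge by concatenation, then add the residual."""
    if transform is None:
        transform = lambda group: [2.0 * v for v in group]  # toy per-group transform
    size = len(x) // cardinality
    groups = [x[i * size:(i + 1) * size] for i in range(cardinality)]
    transformed = [transform(g) for g in groups]            # parallel branches
    merged = [v for g in transformed for v in g]            # merge step
    return [m + r for m, r in zip(merged, x)]               # residual (identity) connection

features = [1.0] * 8
print(split_transform_merge(features, cardinality=4))
# each element: 2*1.0 (transform) + 1.0 (residual) = 3.0
```

In the real architecture each branch is a small convolutional pipeline and the residual sum is what lets very deep stacks train without vanishing gradients.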
[0012] Despite these developments, agricultural systems face practical challenges that hinder widespread adoption of automated disease and pest detection models. Data scarcity remains a major issue, especially for crops grown in diverse geographical and climatic regions. Variations in disease manifestation across regions make it difficult to develop universal models. Moreover, many smallholder farmers lack access to high-performance computing infrastructure required to run deep learning models in real time. Thus, there is a pressing need for systems that not only achieve high accuracy but also balance efficiency, interpretability, and adaptability to real-world farming conditions.
[0013] In addition to technical considerations, socioeconomic factors also influence the adoption of digital agricultural systems. Farmers may be reluctant to trust automated recommendations without clear explanations, underscoring the importance of interpretability in detection systems. Furthermore, solutions must be affordable, user-friendly, and compatible with existing farming practices to ensure widespread implementation. The growing emphasis on sustainable agriculture and precision farming highlights the need for systems that reduce chemical usage by enabling targeted interventions based on accurate disease and pest detection.
[0014] As global food demand continues to rise due to population growth and climate change, the pressure on agriculture intensifies. Crop diseases and pests are projected to cause even greater losses if not managed effectively. Traditional methods, though useful, cannot scale to meet the demands of modern agriculture. The convergence of computer vision, deep learning architectures such as ResNeXt, and intelligent reasoning frameworks such as adaptive neuro-fuzzy systems offers a promising direction for addressing these challenges. However, prior systems have struggled to fully reconcile the strengths of these diverse approaches into a unified solution capable of operating effectively under real-world agricultural conditions.
[0015] Thus, in light of the above-stated discussion, there exists a need for an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection.
SUMMARY OF THE DISCLOSURE
[0016] The following is a summary description of illustrative embodiments of the invention. It is provided as a preface to assist those skilled in the art to more rapidly assimilate the detailed design discussion which ensues and is not intended in any way to limit the scope of the claims which are appended hereto in order to particularly point out the invention.
[0017] According to illustrative embodiments, the present disclosure focuses on an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection which overcomes the above-mentioned disadvantages or provides users with a useful or commercial choice.
[0018] An objective of the present disclosure is to achieve early and precise diagnosis of crop diseases and pests in real-time, thereby reducing farmers’ reliance on manual visual inspection and subjective judgments.
[0019] Another objective of the present disclosure is to design an adaptive neuro-fuzzy framework integrated with ResNeXt architecture that enhances robustness in detecting crop diseases and pests under variable environmental conditions.
[0020] Another objective of the present disclosure is to develop a system capable of handling heterogeneous field conditions such as variable lighting, mixed infections, overlapping symptoms, and diverse crop types.
[0021] Another objective of the present disclosure is to improve reliability and accuracy of AI-driven detection models beyond laboratory settings by integrating fuzzy reasoning with deep convolutional feature extraction.
[0022] Another objective of the present disclosure is to minimize pesticide misuse and over-application by providing farmers with accurate and timely detection results, thereby lowering production costs and promoting sustainable agriculture.
[0023] Another objective of the present disclosure is to create an edge-compatible system architecture that enables local, offline processing without depending entirely on cloud connectivity, making it suitable for rural and low-connectivity regions.
[0024] Another objective of the present disclosure is to reduce false positives and false negatives in disease and pest classification by incorporating adaptive neuro-fuzzy decision-making mechanisms into the ResNeXt backbone.
[0025] Another objective of the present disclosure is to support smallholder farmers with limited resources by developing a lightweight, practical, and user-friendly detection system deployable on mobile and edge devices.
[0026] Another objective of the present disclosure is to enhance generalization ability of AI models so that the system can handle diverse crop species, different disease stages, and pest variations across regions.
[0027] Yet another objective of the present disclosure is to contribute to precision agriculture practices by providing a scalable, efficient, and adaptive AI system that boosts crop yield, ensures food security, and reduces environmental impact.
[0028] In light of the above, an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection comprises an image acquisition module configured to capture crop-related data. The system also includes a ResNeXt-based deep convolutional neural network module configured to perform multi-scale feature extraction from the acquired crop images to identify patterns, textures, and structural anomalies associated with diseases and pests. The system also includes an Adaptive Neuro-Fuzzy Inference System (ANFIS) module configured to apply adaptive fuzzy rules and neuro-fuzzy reasoning to handle uncertain, overlapping, or ambiguous symptoms in the extracted features. The system also includes a hybrid diagnostic engine configured to generate predictive outputs indicating the presence, type, and severity of crop diseases or pest infestations. The system also includes an edge computing interface configured to perform real-time processing of crop data locally at farm-level devices to enable timely detection and reduce dependency on remote servers. The system also includes a cloud storage and analytics unit configured to securely store diagnostic results, environmental data, and historical crop health records. The system also includes an alert generation module configured to transmit proactive notifications to farmers regarding detected crop diseases or pest infestations.
[0029] In one embodiment, the image acquisition module further comprises IoT-enabled sensors configured to capture multimodal data including leaf images, weather parameters, soil moisture, and nutrient levels.
[0030] In one embodiment, the ResNeXt-based deep convolutional neural network module is configured to perform multi-scale feature extraction using cardinality-based grouped convolutions to improve robustness under varied lighting conditions and across diverse crop species.
[0031] In one embodiment, the Adaptive Neuro-Fuzzy Inference System (ANFIS) module is configured to dynamically update fuzzy inference rules based on continuous data streams received from the image acquisition module and the cloud storage and analytics unit.
[0032] In one embodiment, the Adaptive Neuro-Fuzzy Inference System is further configured to incorporate expert-defined agricultural rules combined with data-driven updates, thereby allowing both expert knowledge and machine learning to guide inference.
[0033] In one embodiment, the hybrid diagnostic engine is configured to continuously self-improve by retraining the ResNeXt-based convolutional neural network and updating fuzzy rules in the ANFIS module based on feedback from the cloud storage and analytics unit.
[0034] In one embodiment, the hybrid diagnostic engine is configured to integrate outputs from both the ResNeXt module and the ANFIS module using a weighted decision fusion mechanism to enhance classification accuracy in cases of overlapping or ambiguous disease symptoms.
[0035] In one embodiment, the edge computing interface is further configured to reduce latency by performing localized pre-processing and real-time inference, thereby enabling deployment in rural or low-connectivity agricultural environments.
[0036] In one embodiment, the cloud storage and analytics unit is further configured to aggregate data from multiple farms to enable large-scale disease trend analysis and predictive modeling across geographical regions.
[0037] In one embodiment, the alert generation module is configured to provide actionable recommendations including pesticide usage levels, irrigation requirements, and preventive crop management strategies in addition to disease and pest alerts.
[0038] These and other advantages will be apparent from the present application of the embodiments described herein.
[0039] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0040] These elements, together with the other aspects of the present disclosure and various features are pointed out with particularity in the claims annexed hereto and form a part of the present disclosure. For a better understanding of the present disclosure, its operating advantages, and the specified object attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated exemplary embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0041] To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the following briefly describes the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description merely show some embodiments of the present disclosure, and a person of ordinary skill in the art can derive other implementations from these accompanying drawings without creative efforts. All of the embodiments or the implementations shall fall within the protection scope of the present disclosure.
[0042] The advantages and features of the present disclosure will become better understood with reference to the following detailed description taken in conjunction with the accompanying drawing, in which:
[0043] FIG. 1 illustrates a flowchart outlining the sequential steps involved in an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection, in accordance with an exemplary embodiment of the present disclosure;
[0044] FIG. 2 illustrates a block diagram of an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection, in accordance with an exemplary embodiment of the present disclosure.
[0045] Like reference numerals refer to like parts throughout the description of the several views of the drawings;
[0046] The figures depict the adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection, in which like reference letters indicate corresponding parts in the various figures. It should be noted that the accompanying figures are intended to present illustrations of exemplary embodiments of the present disclosure. These figures are not intended to limit the scope of the present disclosure. It should also be noted that the accompanying figures are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE DISCLOSURE
[0047] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure.
[0048] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be apparent to one skilled in the art that embodiments of the present disclosure may be practiced without some of these specific details.
[0049] Various terms as used herein are shown below. To the extent a term is used, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0050] The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items.
[0051] The terms “having”, “comprising”, “including”, and variations thereof signify the presence of a component.
[0052] Reference is now made to FIG. 1 and FIG. 2 to describe various exemplary embodiments of the present disclosure. FIG. 1 illustrates a flowchart outlining the sequential steps involved in an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection, in accordance with an exemplary embodiment of the present disclosure.
[0053] An adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection 100 comprises an image acquisition module 102 configured to capture crop-related data. The image acquisition module 102 further comprises IoT-enabled sensors configured to capture multimodal data including leaf images, weather parameters, soil moisture, and nutrient levels.
[0054] The system also includes a ResNeXt-based deep convolutional neural network module 104 configured to perform multi-scale feature extraction from the acquired crop images to identify patterns, textures, and structural anomalies associated with diseases and pests. The ResNeXt-based deep convolutional neural network module 104 is configured to perform multi-scale feature extraction using cardinality-based grouped convolutions to improve robustness under varied lighting conditions and across diverse crop species.
[0055] The system also includes an Adaptive Neuro-Fuzzy Inference System (ANFIS) module 106 configured to apply adaptive fuzzy rules and neuro-fuzzy reasoning to handle uncertain, overlapping, or ambiguous symptoms in the extracted features. The Adaptive Neuro-Fuzzy Inference System (ANFIS) module 106 is configured to dynamically update fuzzy inference rules based on continuous data streams received from the image acquisition module and the cloud storage and analytics unit. The Adaptive Neuro-Fuzzy Inference System 106 is further configured to incorporate expert-defined agricultural rules combined with data-driven updates, thereby allowing both expert knowledge and machine learning to guide inference.
[0056] The system also includes a hybrid diagnostic engine 108 configured to generate predictive outputs indicating the presence, type, and severity of crop diseases or pest infestations. The hybrid diagnostic engine 108 is configured to continuously self-improve by retraining the ResNeXt-based convolutional neural network and updating fuzzy rules in the ANFIS module based on feedback from the cloud storage and analytics unit. The hybrid diagnostic engine 108 is configured to integrate outputs from both the ResNeXt module and the ANFIS module using a weighted decision fusion mechanism to enhance classification accuracy in cases of overlapping or ambiguous disease symptoms.
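One possible reading of the weighted decision fusion mechanism of the hybrid diagnostic engine 108 is sketched below; the fusion weight of 0.6 and the class labels are assumptions for illustration only:

```python
# Hedged sketch of weighted decision fusion: combine per-class scores
# from the ResNeXt branch and the ANFIS branch. Weight and labels are
# illustrative assumptions, not from the specification.

def fuse_predictions(cnn_scores, anfis_scores, cnn_weight=0.6):
    """Weighted average of two per-class score dictionaries; returns the
    winning label and the fused score map."""
    fused = {
        label: cnn_weight * cnn_scores[label]
               + (1.0 - cnn_weight) * anfis_scores[label]
        for label in cnn_scores
    }
    return max(fused, key=fused.get), fused

cnn = {"healthy": 0.20, "fungal_blight": 0.70, "pest_damage": 0.10}
anfis = {"healthy": 0.10, "fungal_blight": 0.40, "pest_damage": 0.50}
label, scores = fuse_predictions(cnn, anfis)
print(label)  # fungal_blight: 0.6*0.70 + 0.4*0.40 = 0.58 beats pest_damage at 0.26
```

Raising the ANFIS weight in ambiguous cases would let the fuzzy branch override the CNN when symptoms overlap, which is the stated motivation for the fusion step.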
[0057] The system also includes an edge computing interface 110 configured to perform real-time processing of crop data locally at farm-level devices to enable timely detection and reduce dependency on remote servers. The edge computing interface 110 is further configured to reduce latency by performing localized pre-processing and real-time inference, thereby enabling deployment in rural or low-connectivity agricultural environments.
[0058] The system also includes a cloud storage and analytics unit 112 configured to securely store diagnostic results, environmental data, and historical crop health records. The cloud storage and analytics unit 112 is further configured to aggregate data from multiple farms to enable large-scale disease trend analysis and predictive modeling across geographical regions.
[0059] The system also includes an alert generation module 114 configured to transmit proactive notifications to farmers regarding detected crop diseases or pest infestations. The alert generation module 114 is configured to provide actionable recommendations including pesticide usage levels, irrigation requirements, and preventive crop management strategies in addition to disease and pest alerts.
[0060] FIG. 1 illustrates a flowchart outlining the sequential steps involved in an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection.
[0061] At 102, the process commences with the image acquisition module, which is designed to capture crop-related data in the field. This includes high-resolution leaf images, weather information, and soil condition parameters. Leaf images provide visible indicators such as discoloration, spots, or unusual growths, while weather data and soil parameters contextualize crop health by adding external environmental influences that may contribute to disease or pest susceptibility. By collecting multimodal data rather than relying solely on visual information, this stage ensures that the system has a holistic view of the crop’s health environment. The acquisition module acts as the foundation of the system, providing raw inputs for all subsequent analytical stages.
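A multimodal record of the kind the acquisition module 102 might emit can be sketched as follows; the field names and units are hypothetical, chosen only to illustrate combining image references with environmental context:

```python
# Hedged sketch of a multimodal crop observation record.
# All field names and units are assumptions for illustration.
from dataclasses import dataclass, field
import time

@dataclass
class CropObservation:
    leaf_image_path: str          # high-resolution leaf image
    air_temperature_c: float      # weather parameter
    relative_humidity_pct: float  # weather parameter
    soil_moisture_pct: float      # soil condition parameter
    captured_at: float = field(default_factory=time.time)

obs = CropObservation("leaf_0001.jpg", 31.5, 78.0, 42.0)
print(obs.soil_moisture_pct)  # 42.0
```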
[0062] At 104, the captured data is then processed by the ResNeXt-based deep convolutional neural network module, which specializes in extracting features at multiple scales from the acquired crop images. ResNeXt, as a deep learning architecture, brings an advanced capability of parallelized transformations, allowing it to analyze crop images in great detail. It identifies patterns in color variations, textures of leaves, and microstructural anomalies that might not be visible to the human eye. For instance, it can distinguish between leaf yellowing caused by nitrogen deficiency and similar discoloration induced by pest attacks, which traditional methods often misinterpret. The ResNeXt module’s role is essentially to provide a high-dimensional, feature-rich representation of the raw input data, laying the groundwork for the fuzzy reasoning process that follows.
[0063] At 106, the extracted features are subsequently passed into the Adaptive Neuro-Fuzzy Inference System (ANFIS) module. Unlike pure deep learning models that treat every feature as deterministic, the ANFIS module introduces interpretability and flexibility by applying fuzzy rules that can adapt over time. Agricultural diseases and pest infestations often present overlapping or ambiguous symptoms, where one disease may mimic another, or where environmental stress may manifest similarly to pest damage. The neuro-fuzzy reasoning mechanism handles such uncertainties by applying linguistic rules and continuously adjusting them through adaptive learning. This adaptive reasoning allows the system to make sense of ambiguous inputs and produce outputs that are both accurate and robust across varying field conditions, crop species, and environmental variations.
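A zero-order Sugeno-style inference step gives a minimal picture of the neuro-fuzzy reasoning described above. The membership shapes, rule consequents, and the single "discoloration" input are assumptions for demonstration; a deployed ANFIS would learn these parameters from data:

```python
# Minimal zero-order Sugeno-style fuzzy inference sketch.
# Membership functions and rule consequents are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer_disease_risk(discoloration):
    """Two linguistic rules: IF discoloration is LOW THEN risk = 0.2;
    IF discoloration is HIGH THEN risk = 0.9. The output is the
    firing-strength-weighted average of the rule consequents."""
    low = tri(discoloration, -0.5, 0.0, 0.6)
    high = tri(discoloration, 0.4, 1.0, 1.5)
    strengths = [(low, 0.2), (high, 0.9)]
    total = sum(w for w, _ in strengths)
    return sum(w * z for w, z in strengths) / total if total else 0.0

print(infer_disease_risk(0.5))  # both rules fire equally -> (0.2 + 0.9) / 2 = 0.55
```

The "adaptive" part of ANFIS corresponds to tuning the membership parameters and consequents by gradient descent or least squares as labeled data arrives.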
[0064] At 108, the integration of the ResNeXt module and ANFIS module occurs within the hybrid diagnostic engine. This engine is responsible for synthesizing the precise feature extraction capabilities of ResNeXt with the adaptive reasoning capabilities of ANFIS. The hybrid diagnostic engine generates predictive outputs that specify whether a crop is healthy or afflicted, and if afflicted, it classifies the disease or pest type while also estimating its severity level. Such detailed outputs go beyond simple classification by offering a layered diagnostic perspective. For example, the engine can not only identify a fungal infection but also assess whether the infection is mild, moderate, or severe, thereby providing more actionable insights for farm management.
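The mild/moderate/severe grading mentioned above could reduce, in its simplest form, to thresholding a fused infection score; the thresholds here are hypothetical:

```python
# Hypothetical severity grading over a fused infection score in [0, 1].
# The 0.33 / 0.66 cut points are illustrative assumptions.

def grade_severity(infection_score):
    """Map a fused infection score to a severity label."""
    if infection_score < 0.33:
        return "mild"
    if infection_score < 0.66:
        return "moderate"
    return "severe"

print(grade_severity(0.58))  # moderate
```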
[0065] At 110, to ensure that farmers can benefit from timely insights, the system incorporates an edge computing interface. This module enables real-time processing of the acquired data locally at farm-level devices, such as edge-enabled cameras or IoT hubs, without depending entirely on remote cloud servers. Edge computing is particularly crucial in rural or connectivity-limited regions where internet access may be sporadic. By performing computations locally, the edge interface ensures that disease or pest detections can be delivered instantly, allowing farmers to take prompt remedial actions. It also reduces latency, lowers bandwidth consumption, and provides resilience against connectivity disruptions, making the system highly practical for real-world agricultural environments.
[0066] At 112, while local decision-making is critical, the system also ensures long-term knowledge accumulation and large-scale analytics through the cloud storage and analytics unit. This unit securely stores diagnostic results, environmental metadata, and historical crop health records. The cloud storage serves not only as a repository for farmer-specific data but also as a platform where aggregated data from multiple farms can be analyzed collectively. The analytics performed in the cloud further enhance the adaptive fuzzy rules by feeding back improved parameters into the ANFIS module. This iterative feedback loop ensures that the system evolves continuously, learning from new diseases, emerging pest patterns, and changing climate influences. Security protocols in the cloud storage unit guarantee that farmer data is protected while still enabling scalable, collective intelligence.
[0067] At 114, the final stage of the flow is the alert generation module. Once the hybrid diagnostic engine produces results, and once these results are stored and refined, the alert generation module ensures that farmers receive timely, actionable notifications. These alerts can be transmitted through SMS, mobile applications, or IoT dashboards depending on the deployment environment. They contain not just a diagnosis of the disease or pest but also recommendations on the most effective interventions, such as optimized pesticide usage or preventive measures. By providing proactive notifications, the system empowers farmers to prevent small-scale infestations from escalating into large-scale crop losses, thereby supporting sustainability and productivity in agriculture.
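An alert that pairs the diagnosis with a severity-dependent recommendation, as described above, might be composed as follows. The channel names and recommendation strings are illustrative placeholders, not agronomic advice from the disclosure.

```python
def build_alert(diagnosis, severity, channel="sms"):
    """Compose an actionable notification for a farmer. The
    recommendation table is an illustrative placeholder."""
    actions = {
        "mild": "Monitor daily; spot-treat affected plants.",
        "moderate": "Apply targeted treatment to the affected zone.",
        "severe": "Isolate the area and treat the full field promptly.",
    }
    return {
        "channel": channel,
        "message": (f"Detected {diagnosis} (severity: {severity}). "
                    + actions.get(severity, "No action required.")),
    }

alert = build_alert("fungal_blight", "moderate")
```

The same payload could be routed to SMS, a mobile application, or an IoT dashboard by switching the `channel` field.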
[0068] FIG. 2 illustrates a block diagram of an adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection.
[0069] The process begins with the image capture stage, where raw agricultural data is gathered from crop fields. This includes leaf images, weather conditions, and soil parameters. These inputs serve as the foundational data sources for the system. To ensure quality and consistency, the raw inputs undergo a data preprocessing step, where noise is removed, formats are standardized, and key features are normalized before further analysis. This preprocessing stage ensures that the subsequent models receive optimized and clean data.
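One of the normalization steps mentioned above can be sketched as a min-max rescaling of pixel intensities to [0, 1]; the nested-list image representation is an illustrative simplification of the actual image tensors.

```python
def normalize_image(pixels):
    """Min-max normalize pixel intensities to [0, 1] so that
    downstream models receive inputs on a consistent scale."""
    flat = [v for row in pixels for v in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:                      # flat image: avoid divide-by-zero
        return [[0.0 for _ in row] for row in pixels]
    return [[(v - lo) / (hi - lo) for v in row] for row in pixels]

# A 2x2 toy "image" with raw intensities in [10, 50].
img = normalize_image([[10, 20], [30, 50]])
```

In practice this step would sit alongside denoising and format standardization, but the principle (mapping heterogeneous raw inputs onto a common range) is the same.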
[0070] Once preprocessed, the data is fed into the ResNeXt deep learning model. ResNeXt, being a powerful convolutional neural network, specializes in feature extraction, identifying patterns such as leaf discoloration, texture irregularities, or growth anomalies that may indicate the presence of diseases or pests. At the same time, additional contextual information such as soil moisture and weather trends can also be factored into the analysis to enhance accuracy.
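The distinguishing feature of ResNeXt is its split-transform-merge structure: channels are split into `cardinality` groups, each group is transformed independently, and the results are aggregated by summation. The toy 1-D sketch below illustrates that arithmetic only; real ResNeXt blocks operate on convolutional feature maps with learned weights.

```python
def grouped_transform(x, weights, cardinality=4):
    """ResNeXt-style split-transform-merge on a feature vector:
    split channels into `cardinality` groups, apply an independent
    linear transform to each group, then aggregate by summation."""
    assert len(x) % cardinality == 0, "channels must divide evenly"
    size = len(x) // cardinality
    out = 0.0
    for g in range(cardinality):
        group = x[g * size:(g + 1) * size]   # this group's channels
        w = weights[g]                       # this group's weights
        out += sum(v * wi for v, wi in zip(group, w))
    return out

# Eight "channels" split into four groups of two, with toy weights.
y = grouped_transform([1, 2, 3, 4, 5, 6, 7, 8],
                      [[1, 0], [0, 1], [1, 1], [0.5, 0.5]],
                      cardinality=4)
```

Increasing cardinality (more parallel transform paths) rather than depth or width is the design choice the claim's "cardinality-based grouped convolutions" refers to.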
[0071] The extracted features from ResNeXt are then passed into the Adaptive Neuro-Fuzzy Inference System (ANFIS). This module brings the reasoning capability of fuzzy logic into the system. Unlike conventional deep learning models that might struggle with ambiguous or overlapping symptoms, the ANFIS layer applies adaptive fuzzy rules that continuously evolve based on incoming data. This allows the system to handle uncertain cases such as diseases that share visual similarities or pest infestations with inconsistent patterns. The adaptability ensures that as environmental conditions shift or new crop variants are introduced, the system remains accurate and reliable.
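A minimal Sugeno-type fuzzy inference step, of the kind an ANFIS layer learns, can be sketched as follows. Each rule pairs a Gaussian membership function with a crisp consequent, and the output is the firing-strength-weighted average of consequents. The rule parameters and the lesion-coverage feature are illustrative assumptions.

```python
import math

def gaussian(x, c, s):
    """Gaussian membership: degree to which x matches center c."""
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def sugeno_infer(x, rules):
    """Zero-order Sugeno inference: each rule is (center, sigma,
    consequent); output is the firing-strength-weighted average of
    consequents (a minimal ANFIS-style sketch)."""
    strengths = [gaussian(x, c, s) for c, s, _ in rules]
    total = sum(strengths)
    if total == 0:
        return 0.0
    return sum(w * out for w, (_, _, out) in zip(strengths, rules)) / total

# Toy rules mapping a lesion-coverage ratio to a severity score.
rules = [(0.1, 0.10, 0.2),   # low coverage  -> mild
         (0.5, 0.15, 0.6),   # mid coverage  -> moderate
         (0.9, 0.10, 1.0)]   # high coverage -> severe
score = sugeno_infer(0.5, rules)
```

Because every rule fires to a degree, overlapping or ambiguous symptoms produce graded outputs rather than brittle hard classifications; in ANFIS the centers, widths, and consequents are the parameters tuned from data.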
[0072] The edge device plays a critical role by enabling localized, real-time processing. Instead of relying solely on cloud servers, the edge device processes incoming data on-site, which is essential for farmers working in areas with limited connectivity. This ensures that detections and recommendations can be delivered without delays, making the system practical for real-world agricultural use.
[0073] Following analysis, the system produces detection results, which provide a clear indication of whether a disease or pest infestation is present, along with details of its type and severity. These results are not only stored for future reference but are also used to generate alerts and recommendations. Farmers receive actionable insights, such as suggested pesticide usage, crop treatment strategies, or preventive measures, thereby directly supporting agricultural decision-making.
[0074] The cloud database complements the edge processing by acting as a centralized repository for storing historical records, environmental parameters, and diagnostic outcomes. The cloud also facilitates long-term learning by continuously updating the fuzzy inference rules with aggregated data, thereby enhancing system performance over time. Moreover, it enables scalability, where data from multiple farms or regions can be collectively analyzed for broader agricultural insights.
[0075] While the invention has been described in connection with what are presently considered to be the most practical and preferred embodiments, it will be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0076] A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware, computer software, or a combination thereof.
[0077] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the present disclosure and its practical application, and to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the scope of the present disclosure.
[0078] Disjunctive language such as the phrase “at least one of X, Y, Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
[0079] In a case that no conflict occurs, the embodiments in the present disclosure and the features in the embodiments may be mutually combined. The foregoing descriptions are merely specific implementations of the present disclosure, but are not intended to limit the protection scope of the present disclosure. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims:
I/We Claim:
1. An adaptive neuro-fuzzy based ResNeXt system for crop disease and pest detection (100) comprising:
an image acquisition module (102) configured to capture crop-related data;
a ResNeXt-based deep convolutional neural network module (104) configured to perform multi-scale feature extraction from the acquired crop images to identify patterns, textures, and structural anomalies associated with diseases and pests;
an Adaptive Neuro-Fuzzy Inference System (ANFIS) module (106) configured to apply adaptive fuzzy rules and neuro-fuzzy reasoning to handle uncertain, overlapping, or ambiguous symptoms in the extracted features;
a hybrid diagnostic engine (108) configured to generate predictive outputs indicating the presence, type, and severity of crop diseases or pest infestations;
an edge computing interface (110) configured to perform real-time processing of crop data locally at farm-level devices to enable timely detection and reduce dependency on remote servers;
a cloud storage and analytics unit (112) configured to securely store diagnostic results, environmental data, and historical crop health records;
an alert generation module (114) configured to transmit proactive notifications to farmers regarding detected crop diseases or pest infestations.
2. The system (100) as claimed in claim 1, wherein the image acquisition module (102) further comprises IoT-enabled sensors configured to capture multimodal data including leaf images, weather parameters, soil moisture, and nutrient levels.
3. The system (100) as claimed in claim 1, wherein the ResNeXt-based deep convolutional neural network module (104) is configured to perform multi-scale feature extraction using cardinality-based grouped convolutions to improve robustness under varied lighting conditions and across diverse crop species.
4. The system (100) as claimed in claim 1, wherein the Adaptive Neuro-Fuzzy Inference System (ANFIS) module (106) is configured to dynamically update fuzzy inference rules based on continuous data streams received from the image acquisition module and the cloud storage and analytics unit.
5. The system (100) as claimed in claim 1, wherein the Adaptive Neuro-Fuzzy Inference System (106) is further configured to incorporate expert-defined agricultural rules combined with data-driven updates, thereby allowing both expert knowledge and machine learning to guide inference.
6. The system (100) as claimed in claim 1, wherein the hybrid diagnostic engine (108) is configured to continuously self-improve by retraining the ResNeXt-based convolutional neural network and updating fuzzy rules in the ANFIS module based on feedback from the cloud storage and analytics unit.
7. The system (100) as claimed in claim 1, wherein the hybrid diagnostic engine (108) is configured to integrate outputs from both the ResNeXt module and the ANFIS module using a weighted decision fusion mechanism to enhance classification accuracy in cases of overlapping or ambiguous disease symptoms.
8. The system (100) as claimed in claim 1, wherein the edge computing interface (110) is further configured to reduce latency by performing localized pre-processing and real-time inference, thereby enabling deployment in rural or low-connectivity agricultural environments.
9. The system (100) as claimed in claim 1, wherein the cloud storage and analytics unit (112) is further configured to aggregate data from multiple farms to enable large-scale disease trend analysis and predictive modeling across geographical regions.
10. The system (100) as claimed in claim 1, wherein the alert generation module (114) is configured to provide actionable recommendations including pesticide usage levels, irrigation requirements, and preventive crop management strategies in addition to disease and pest alerts.

Documents

Application Documents

# Name Date
1 202541094058-STATEMENT OF UNDERTAKING (FORM 3) [30-09-2025(online)].pdf 2025-09-30
2 202541094058-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-09-2025(online)].pdf 2025-09-30
3 202541094058-POWER OF AUTHORITY [30-09-2025(online)].pdf 2025-09-30
4 202541094058-FORM-9 [30-09-2025(online)].pdf 2025-09-30
5 202541094058-FORM FOR SMALL ENTITY(FORM-28) [30-09-2025(online)].pdf 2025-09-30
6 202541094058-FORM 1 [30-09-2025(online)].pdf 2025-09-30
7 202541094058-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-09-2025(online)].pdf 2025-09-30
8 202541094058-DRAWINGS [30-09-2025(online)].pdf 2025-09-30
9 202541094058-DECLARATION OF INVENTORSHIP (FORM 5) [30-09-2025(online)].pdf 2025-09-30
10 202541094058-COMPLETE SPECIFICATION [30-09-2025(online)].pdf 2025-09-30