
Method And System For Automatically Identifying Crop Infestation

Abstract: Disclosed herein is a method and system for automatically identifying a crop infestation. The method comprises receiving, by an identification system (107), a user selection (211) on at least one option from a plurality of options provided to the user (101), wherein each of the plurality of options relates to a type of the crop infestation (109). Further, the method comprises receiving one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Further, the method comprises verifying each of the one or more images using a pre-trained neural network for determining the usability of each of the one or more images. Furthermore, the method comprises predicting an infected region in the one or more images according to the option selected by the user. Thereafter, the method comprises analyzing the infected region using the pre-trained neural network for identifying the crop infestation in the infected region. FIG. 1


Patent Information

Application #
Filing Date
24 November 2021
Publication Number
21/2023
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
IPO@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-08-27
Renewal Date

Applicants

UPL LIMITED
UPL House, 610 B/2, Bandra Village, Off Western Express Highway, Bandra East, Mumbai 400051, India

Inventors

1. Prajakta Aher
UPL House, 610 B/2, Bandra Village, Off Western Express Highway, Bandra East, Mumbai 400051, India
2. Mohammad Shahbaz Hussain
UPL House, 610 B/2, Bandra Village, Off Western Express Highway, Bandra East, Mumbai 400051, India
3. Vedansh Kedia
UPL House, 610 B/2, Bandra Village, Off Western Express Highway, Bandra East, Mumbai 400051, India

Specification

FORM 2
THE PATENTS ACT 1970
[39 OF 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10; Rule 13]
TITLE: “METHOD AND SYSTEM FOR AUTOMATICALLY IDENTIFYING CROP
INFESTATION”
Name and Address of the Applicant:
UPL LIMITED, UPL House, 610 B/2, Bandra Village, Off Western Express Highway, Bandra
East, Mumbai 400051, India.
Nationality: INDIAN
The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD
The present subject matter is, in general, related to identifying crop infestation in the agricultural crops and more particularly, but not exclusively, to a method and system for automatically identifying crop infestation in a real-time agricultural environment.
BACKGROUND
In the existing methods, diseases in the agricultural crops are identified and studied with the support of agricultural organizations and research institutions. However, due to lack of exposure and excessive dependence on human expertise, the conventional approaches are often considered as slow and ineffective in the current scenario. With the increase in internet usage, the modern approaches for diagnosing diseases in agricultural crops focus on providing various online services that enable farmers to easily access information related to the agricultural crops and associated diseases.
Further, with the advent of Artificial Intelligence (AI) and particularly, computer vision techniques, the recent approaches for diagnosing diseases in the agricultural crops leverage these techniques to design learning models that can automatically detect, analyze, and diagnose the diseases affecting the crops.
However, the existing AI based approaches use data collected from a controlled environment to train the learning models. Consequently, the existing AI based approaches fail to accurately diagnose the diseases that occur in a real-world agricultural environment. Also, the existing approaches are limited to diagnosis of the diseases, and do not assist farmers in managing the diseases.
Hence, in view of the above limitations in the existing approaches, there is a need for a diagnosis method that can accurately detect and analyze diseases in the real-world agricultural environment and further provide useful recommendations for controlling the diagnosed diseases.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.

SUMMARY
Disclosed herein is a method for automatically identifying a crop infestation in a real-time environment. The method comprises receiving, by an identification system, a user selection on at least one option from a plurality of options provided to the user, wherein each of the plurality of options relate to a type of the crop infestation. Further, the method comprises receiving one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Further, the method comprises verifying each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images. Furthermore, the method comprises predicting an infected region in the one or more images according to the option selected by the user. Thereafter, the method comprises analyzing the infected region using the pre-trained neural network for identifying the crop infestation in the infected region.
Further, the present disclosure relates to an identification system for automatically identifying a crop infestation in a real-time environment. The identification system comprises a processor and a memory. The memory is communicatively coupled to the processor and stores processor-executable instructions, which on execution, cause the processor to receive a user selection on at least one option from a plurality of options provided to the user. Each of the plurality of options relate to a type of the crop infestation. Further, the instructions cause the processor to receive one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Further, the instructions cause the processor to verify each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images. Furthermore, the instructions cause the processor to predict an infected region in the one or more images according to the option selected by the user. Thereafter, the instructions cause the processor to analyze the infected region using the pre-trained neural network for identifying the crop infestation in the infected region.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
FIG. 1 provides an overview of the proposed solution for automatically identifying a crop infestation in a real-time environment in accordance with some embodiments of the present disclosure.
FIG. 2 shows a detailed block diagram of an identification system for automatically identifying the crop infestation in accordance with some embodiments of the present disclosure.
FIG. 3 shows a flowchart illustrating a method for automatically identifying the crop infestation in accordance with some embodiments of the present disclosure.
FIG. 4 shows a flowchart indicating various steps involved in identifying the crop infestation and recommending products to a user in accordance with some embodiments of the present disclosure.
FIG. 5 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word “exemplary” is used herein to mean “serving as an example, instance, or illustration”. Any embodiment or implementation of the present subject matter described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.

While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprise”, “comprising”, “includes”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
FIG. 1 provides an overview of automatically identifying a crop infestation in a real-time environment in accordance with some embodiments of the present disclosure.
In an embodiment, referring to FIG. 1, the user 101 may include, without limiting to, a farmer, a researcher, a landlord, a worker and the like. In an embodiment, the identification system 107 may be any computing system, which may be configured for automatically identifying a crop infestation 109 in a real-time environment of the crop, in accordance with embodiments of the present disclosure. As an example, the identification system 107 may include a dedicated computing unit such as, without limiting to, a smartphone, a laptop, a computer and the like. Alternatively, the proposed solution may be provided as an application platform, which may be downloaded and installed on the user device 103 and/or may be installed on a remote server and accessed through the user device 103 via the Internet.

In an embodiment, when the user 101 is using the proposed solution or the application for the first time, the user 101 may be instructed to register with the identification system 107 and create a personalized user account. After successful registration, the user 101 may log in to the application using the pre-registered login credentials. After login, the identification system 107 may provide a list of options to the user on a User Interface (UI) of the user device 103. As an example, each of the plurality of options provided to the user 101 may be related to a type of the crop infestation 109 that the user intends to identify in the crops. As an example, the crop infestation 109 may comprise at least one of diseases, weeds, and insects. The user 101 may select one of the options among the plurality of options displayed on the UI of the application. As an example, if the user 101 intends to analyze the crops for the presence of any diseases, then the user may select the ‘diseases’ option on the UI. Similarly, if the user intends to assess the crops for the presence of weeds or other unwanted plants, then the user 101 may select the ‘weeds’ option to determine if the crop contains any weeds. In an embodiment, receiving the user selection on one of the plurality of options related to the type of the crop infestation helps in optimizing both the accuracy and speed of predicting the crop infestation.
In an embodiment, after receiving the user selection on at least one option from the plurality of options, the identification system 107 receives one or more images 105 of an affected crop from the user 101 through the user device 103. In one implementation, the user 101 may capture the one or more images 105 of the affected crop in real-time using a camera of the user device 103. In such an instance, the identification system 107 may activate an on-device camera of the user device 103 and prompt the user to capture the one or more real-time images 105 of the affected crop. Alternatively, the user 101 may choose to upload the one or more images 105 of the affected crop from a storage space associated with the user device 103. In this instance, the identification system 107 may allow the user to upload the one or more images 105 of the crop from a local storage of the user device 103.
In an embodiment, the identification system 107 may also collect metadata relating to each of the one or more images 105. In an implementation, the metadata may include, without limitation, at least one of information related to a location of the affected crop, a resolution of the one or more images, a time and date of capturing the one or more images, and a label associated with each of the one or more images 105. As an example, the label associated with the images 105 may include, without limitation, a name and/or stage of the crop infestation 109. As an example, the stage of the crop infestation may be one of, without limitation, egg, larvae, nymph, adult, and/or damage to the crop.
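The metadata fields described above can be sketched as a simple record type. This is only an illustrative data model; the field names and types are assumptions, not part of the disclosed system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class ImageMetadata:
    """Metadata collected alongside each crop image (illustrative fields only)."""
    location: str                        # e.g. geo-coordinates of the affected crop
    resolution: Tuple[int, int]          # (width, height) in pixels
    captured_at: datetime                # time and date the image was captured
    infestation_name: Optional[str] = None   # label: name of the crop infestation, if known
    infestation_stage: Optional[str] = None  # label: e.g. "egg", "larvae", "nymph", "adult", "damage"

# Hypothetical example record for one uploaded image.
meta = ImageMetadata(
    location="19.06N,72.84E",
    resolution=(1920, 1080),
    captured_at=datetime(2021, 11, 24, 10, 30),
    infestation_name="leaf blight",
    infestation_stage="damage",
)
```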
In an embodiment, after receiving the one or more images 105 and the related metadata, the identification system 107 may verify each of the one or more images 105 using a pre-trained neural network for determining a usability of each of the one or more images 105. In an embodiment, determining the usability of the images 105 may include, without limitation, checking whether the captured image is blurry, distorted, or skewed, to ensure that the captured image can be used for further analysis. If the captured image is found to be unusable, then the identification system 107 may prompt the user 101 to capture fresh images 105 of the infested crop and/or upload a different image 105 of the infested crop. On the other hand, if the captured image 105 is found to be usable, then the same may be considered for further analysis. As an example, the pre-trained neural network used for verifying and determining the usability of the images 105 may include, without limitation, at least one of a Convolution Neural Network (CNN), a Recurrent Neural Network (RNN), a Long Short-Term Memory (LSTM) network, a recursive neural network, a graph convolutional network, and a sequential neural network.
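The usability check above can be illustrated with a classical blur heuristic. The patent describes a pre-trained neural network for this step; the variance-of-Laplacian rule below is a simplified stand-in for illustration only, and the threshold value is an assumption.

```python
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance of the Laplacian response; low values suggest a blurry image."""
    # 4-neighbour Laplacian applied via shifted-array arithmetic (no SciPy needed).
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def is_usable(gray: np.ndarray, blur_threshold: float = 100.0) -> bool:
    """Accept an image for further analysis only if it is sharp enough."""
    return laplacian_variance(gray) >= blur_threshold

# A high-contrast random pattern scores high; a flat (featureless) image scores zero.
rng = np.random.default_rng(0)
sharp = (rng.integers(0, 2, size=(64, 64)) * 255).astype(float)
flat = np.full((64, 64), 128.0)
```

An unusable image (blurry or featureless) would trigger the re-capture prompt described above.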
In an embodiment, after verifying each of the one or more images 105, the identification system 107 may predict an infected region in the verified one or more images 105 according to the option selected by the user 101. As an example, if the option selected by the user 101 is to analyze the crop for the presence of a disease, then the identification system 107 may predict the infected region that is likely to be affected by a disease. In an embodiment, the neural network is trained with a plurality of infected regions derived from a plurality of training images to identify the infected region in the one or more images 105. As an example, the plurality of training images, comprising the infected regions, are selected from the images of the crop based on the metadata associated with the one or more images 105.
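The region-prediction step can be sketched with a colour-thresholding heuristic. The disclosed system uses a trained neural network for this; the rule below (brown/red pixels dominating green suggests damaged tissue) is purely an illustrative assumption.

```python
from typing import Optional, Tuple
import numpy as np

def predict_infected_region(rgb: np.ndarray) -> Optional[Tuple[int, int, int, int]]:
    """Return a (top, left, bottom, right) bounding box around pixels that look
    discoloured, or None if no candidate region is found."""
    r, g = rgb[..., 0].astype(int), rgb[..., 1].astype(int)
    mask = r > g + 30          # assumed rule: browning tissue has a dominant red channel
    if not mask.any():
        return None
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    return int(rows[0]), int(cols[0]), int(rows[-1]) + 1, int(cols[-1]) + 1

# Synthetic example: a healthy green leaf image with a brown patch.
img = np.zeros((32, 32, 3), dtype=np.uint8)
img[..., 1] = 180                  # green everywhere
img[4:10, 5:12, 0] = 220           # red/brown patch
img[4:10, 5:12, 1] = 60
healthy = np.zeros_like(img)
healthy[..., 1] = 180              # fully healthy image, no discoloured pixels
```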
In an embodiment, after predicting the infected region in the one or more images 105, the identification system 107 may analyze the infected region using the pre-trained neural network for identifying the crop infestation 109 in the infected region. In an embodiment, to identify the crop infestation 109, the identification system 107 may train the neural network with a plurality of crop infestations derived from a plurality of crop infestation images. In an embodiment, the plurality of crop infestation images for training may be selected from the images of the crop, based on the metadata comprising the name of the crop infestation. In an embodiment, the name and other details relating to the identified crop infestation 109 may be displayed to the user through the UI of the user device 103. Further, the identification system 107 may receive a user input or user rating on the predicted crop infestation 109 and assign a confidence score to the pre-trained neural network based on the user 101 input. As an example, on a scale of 0-10, where 0 is the lowest rating and 10 is the highest rating, if the user has provided a rating of more than 8 to the predicted crop infestation 109, then the pre-trained neural network may be assigned a higher confidence score. Alternatively, if the user rating is a value less than 3, then the pre-trained neural network may be assigned a low confidence score and subjected to further training. In an implementation, the identification system 107 identifies the crop infestation 109 based on a threshold value of a confidence score for the crop infestation. As an example, the threshold value for the crop infestation may be at least 30%.
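The rating-driven feedback loop above can be sketched as a small decision function. The >8 and <3 cut-offs follow the example thresholds in the passage; the returned action labels are illustrative assumptions.

```python
def update_confidence(user_rating: int) -> str:
    """Map a 0-10 user rating of a prediction onto an action for the model."""
    if not 0 <= user_rating <= 10:
        raise ValueError("rating must be between 0 and 10")
    if user_rating > 8:
        # High user agreement: the network earns a higher confidence score.
        return "raise confidence score"
    if user_rating < 3:
        # Low user agreement: lower the score and queue the model for retraining.
        return "lower confidence score and retrain"
    return "keep confidence score unchanged"
```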
In an embodiment, after identifying the crop infestation 109 by analyzing the infected region, the identification system 107 may recommend one or more products 111 for controlling and/or preventing the crop infestation 109 identified in the affected crop. As an example, the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like. Additionally, the identification system 107 may provide other information related to the one or more recommended products such as, without limitation, name of the product, product description, contact details of the sellers or distributors of the product, dosage information of the product, Stock Keeping Units (SKU) information of the product and the like. In an implementation, the one or more products may be recommended based on the metadata associated with the one or more images 105 used for identifying the crop infestation 109. Alternatively, the one or more products may be recommended based on the location of the crop and/or the location of the user 101.
In an embodiment, the identification system 107 may also be configured to map the one or more products with the identified crop infestation 109, and subsequently, provide a comparative analysis of the prices of the one or more recommended products. Also, the identification system 107 may provide information related to a selling price of the one or more products across different markets. Additionally, the identification system 107 may indicate historical prices of the same product along with a graphical representation of the percentage increase or decrease in the price of the recommended product over a period of time. Also, the identification system 107 may provide average price information of the recommended product for a predefined number of days.
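The price analytics described above reduce to two simple computations: percentage change over a period and a rolling average over a predefined number of days. A minimal sketch, with hypothetical price data:

```python
from typing import List

def price_change_percent(history: List[float]) -> float:
    """Percentage increase (+) or decrease (-) from the oldest to the latest price."""
    oldest, latest = history[0], history[-1]
    return round((latest - oldest) / oldest * 100.0, 2)

def average_price(history: List[float], days: int) -> float:
    """Average price over the most recent `days` entries."""
    window = history[-days:]
    return round(sum(window) / len(window), 2)

# Hypothetical daily selling prices for one recommended product.
prices = [100.0, 104.0, 110.0, 108.0, 120.0]
```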

In an implementation, the identification system 107 may provide details of the one or more sellers selling the recommended products 111 to the user, based on a geo-location of the crop and/or the user 101. In an embodiment, the identification system 107 may authenticate the user 101 and the one or more sellers at the time of registering them into the identification system 107 to establish a common platform for the user 101 and the one or more sellers to directly connect with each other.
In an embodiment, the identification system 107 may also allow the user 101 to access any information related to a previously identified crop infestation 109 and/or information related to the one or more products recommended for controlling such crop infestation 109 from a database associated with the identification system 107. This allows the user 101 to control the crop infestation 109 even in scenarios where the user 101 is unable to capture the images of the crop and/or when the user 101 does not have access to the Internet. In an embodiment, the user 101 may share and discuss his/her own analysis and knowledge about various crop infestations 109 with a community of users 101 (for example, the farmers), who have come across similar crop infestations 109, using a database or a live chat window provided on the application.
In another embodiment, the identification system 107 may facilitate the user 101 in scouting one or more farms managed by the user 101 for determining a field-specific occurrence of the crop infestation 109. In an implementation, the user 101 may register the one or more farms by entering details related to the one or more farms. The information related to the one or more farms may include, without limitation, a field name, a method of cultivation used in the one or more farms (for example, direct seeded or transplanted), a seed variety (for example, inbred or hybrid), a sowing date, a harvesting date, a field area (for example, manually entering the field area or geo-tagging and drawing a boundary of the farm) and the like. In an implementation, by using the above-said information on the one or more farms of the user 101, the farms may be added to the application. Subsequently, the user 101 may be allowed to select the farms before scouting for any crop infestation 109 across the farms.
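The farm-registration details listed above can be represented as a simple record. The field names and types below are illustrative assumptions, not the disclosed data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Farm:
    """Details entered when registering a farm for scouting (illustrative fields)."""
    field_name: str
    cultivation_method: str   # e.g. "direct seeded" or "transplanted"
    seed_variety: str         # e.g. "inbred" or "hybrid"
    sowing_date: date
    harvesting_date: date
    field_area_ha: float      # manually entered, or derived from a geo-tagged boundary

# Hypothetical registration of one farm before scouting.
farm = Farm("north plot", "transplanted", "hybrid",
            date(2021, 6, 15), date(2021, 11, 20), 2.5)
```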
In another embodiment, the identification system 107 may also provide a product listing and search functionality to the user 101, using which the user 101 may create a wish list of the recommended products from the scouting prediction or the products page.

In another embodiment, the identification system 107 may also provide field-specific information related to historical weather, current weather, and a future forecast for the one or more farms managed by the user 101. As an example, the weather information may include, without limitation, whether the day is a sunny day or a rainy day and whether special events like a thunderstorm or a snowfall are expected during the day. Such field-specific information helps the user make suitable decisions on various activities to be taken up in the farms.
FIG. 2 shows a detailed block diagram of an identification system for automatically identifying a crop infestation in accordance with some embodiments of the present disclosure.
In an implementation, the identification system 107 may include a processor 201, an I/O interface 203 and a memory 205. The memory 205 may comprise data 207 and one or more modules 209. The memory 205 may be communicatively coupled to the processor 201. The processor 201 may be configured to perform one or more functions of the identification system 107 for automatically identifying the crop infestation 109 in a real-time environment using the one or more modules 209 stored in the memory 205. The I/O interface 203 may be configured for establishing a connection with a user device 103 and other entities like a remote server or a database associated with the identification system 107.
In an embodiment, the data 207 may include, without limitation, a user selection 211, one or more images 105, metadata 213, crop infestation 109 information and other data 215. In an implementation, the user selection 211 may be an option selected by the user 101 among the plurality of options provided to the user 101. The plurality of options may be provided on a dashboard of the user device 103. The plurality of options provided may comprise information related to the crop infestation 109. As an example, the crop infestation 109 may comprise at least one of diseases, weeds, and insects. In an implementation, the one or more images 105 may be the images of the affected crop captured using a user device 103 associated with the user 101 and/or images uploaded from a storage space.
In an implementation, the metadata 213 may include, without limitation, at least one of information related to a location of the affected crop, a resolution of the one or more images, a time and date of capturing the one or more images, and a label associated with each of the one or more images 105. As an example, the label associated with the images may include, without limitation, a name and/or stage of the crop infestation 109. In an embodiment, the stage of the crop infestation may include, without limitation, egg, larvae, nymph, adult, and damage to the crop. The other data 215 may include various temporary data and files generated by the one or more modules 209 while performing various functions of the identification system 107.
In an embodiment, the data 207 may be processed by the one or more modules 209. In some implementations, the one or more modules 209 may be communicatively coupled to the processor 201 for performing one or more functions of the identification system 107. In an implementation, the one or more modules 209 may include, without limitation, a receiving module 217, verifying module 219, predicting module 221, analyzing module 223, pre-trained neural network model 225 and other modules 227.
As used herein, the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. In an embodiment, the other modules 227 may be used to perform various miscellaneous functionalities of the identification system 107. It will be appreciated that such one or more modules 209 may be represented as a single module or a combination of different modules.
In an embodiment, the receiving module 217 may be configured to receive a user selection 211 on at least one option from a plurality of options provided to the user 101, wherein each of the plurality of options relates to a type of the crop infestation 109. Further, the receiving module 217 may be configured to receive one or more images 105 of an affected crop and metadata 213 associated with the one or more images 105 of the affected crop. In an embodiment, the verifying module 219 may be configured to verify each of the one or more images 105 using a pre-trained neural network for determining a usability of each of the one or more images 105. As an example, the pre-trained neural network may include, without limitation, at least one of a Convolution Neural Network (CNN), a Recurrent Neural Network (RNN), a Long Short-Term Memory (LSTM) network, a recursive neural network, a graph convolutional network and a sequential neural network.
In an embodiment, the predicting module 221 may be configured to predict an infected region in the verified one or more images 105 according to the option selected by the user 101. In an implementation, the infected region corresponds to a region affected by the crop infestation 109. In an embodiment, the analyzing module 223 may be configured to analyze the infected region using the pre-trained neural network model 225 for identifying the crop infestation 109 in the infected region.
FIG. 3 shows a flowchart illustrating a method for automatically identifying a crop infestation in accordance with some embodiments of the present disclosure.
As illustrated in FIG. 3, the method 300 may include one or more blocks illustrating a method for automatically identifying a crop infestation 109 in a real-time environment using an identification system 107 as illustrated in FIG. 1. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 302, the method 300 includes receiving, by an identification system 107, a user selection 211 on at least one option from a plurality of options provided to the user, wherein each of the plurality of options relates to a type of the crop infestation 109. In an implementation, the type of the crop infestation 109 comprises at least one of diseases of crops, weeds across the crops, and insects that damage the crops. As an example, the user 101 may select the option from the plurality of options displayed on a dashboard of the application using the user device 103.
At block 304, the method 300 includes receiving, by the identification system 107, one or more images 105 of an affected crop and metadata 213 associated with the one or more images 105 of the affected crop. In an implementation, for receiving the one or more images 105, the user 101 may capture the one or more images 105 of the affected crop in real-time using a user device 103, and/or the user 101 may upload the one or more images 105 of the affected crop from a storage space. As an example, when the user 101 chooses the option among the plurality of options, the identification system 107 activates an on-device camera of the user device 103 and allows the user to capture one or more real-time images 105 of the affected crop. In an alternative embodiment, when the user 101 chooses the option from the plurality of options, the identification system 107 allows the user to upload one or more images 105 of the crop from a pre-defined local storage. In an implementation, the pre-defined local storage may be a database associated with the user device 103.
In an implementation, the metadata associated with the one or more images 105 may include, without limitation, information related to a location of the affected crop, a resolution of the one or more images, a time and date of capturing the one or more images, and a label associated with each of the one or more images. As an example, the label associated with each of the one or more images may include, without limitation, a name and/or stage of the crop infestation. In an embodiment, the stage of the disease or the insect in the crop may include, without limitation, eggs, larvae, nymph, adult, damages and the like.
At block 306, the method 300 includes verifying, by the identification system 107, each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images 105. In an implementation, the pre-trained neural network comprises a plurality of layers. The plurality of layers may include, without limitation, convolution layers, max-pooling layers, dense or fully connected layers, and the like. In an embodiment, the pre-trained neural network may be modified based on the crop infestation 109 that has to be identified. The neural network may be trained using a plurality of reference images of the crop captured from a real-world environment of the crop.
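The disclosure performs this usability check with a pre-trained convolutional network. As a lightweight, dependency-free stand-in for that check, the sketch below screens out frames that are obviously unusable (too dark, too bright, or too flat to analyze) using simple intensity statistics; the thresholds are assumptions chosen for illustration, not values from the disclosure.

```python
def is_usable(pixels, min_mean=30, max_mean=225, min_std=15):
    """Rough usability screen standing in for the pre-trained network:
    reject images that are too dark, too bright, or too low-contrast.
    `pixels` is a flat sequence of 0-255 grayscale values."""
    n = len(pixels)
    mean = sum(pixels) / n
    # Standard deviation of intensities as a crude contrast measure.
    std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    return min_mean <= mean <= max_mean and std >= min_std
```

A rejected image would, per the disclosure's advantage of on-device filtering, prompt the user to recapture rather than being sent to the server.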
At block 308, the method 300 includes predicting, by the identification system 107, an infected region in the one or more images according to the option selected by the user 101.
At block 310, the method includes analyzing, by the identification system 107, the infected region using the pre-trained neural network for identifying the crop infestation 109 in the infected region.
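One way to read blocks 308 and 310 together is that the option selected by the user routes the image to a type-specific detector, whose predicted region is then analyzed. The sketch below shows only this dispatch pattern; the detector stubs are placeholders and are not the disclosed pre-trained networks.

```python
def predict_region(option, image):
    """Route the image to the detector matching the selected infestation
    type (disease, weed, or insect). The detectors here are stubs that
    stand in for the pre-trained neural networks."""
    detectors = {
        "disease": lambda img: {"type": "disease", "region": img},
        "weed":    lambda img: {"type": "weed", "region": img},
        "insect":  lambda img: {"type": "insect", "region": img},
    }
    if option not in detectors:
        raise ValueError(f"unknown infestation type: {option!r}")
    return detectors[option](image)
```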
Further, the method 300 includes recommending, to the user, one or more products 111 for controlling the crop infestation 109 identified in the affected crop. As an example, the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like. In an embodiment, the information related to the one or more recommended products 111 may include, without limitation, a product name, a product description, contact details of the sellers of the products, dosage information of the products, Stock Keeping Unit (SKU) information of the products and the like. In an implementation, the one or more products are recommended based on the metadata 213 associated with the one or more images 105 used for identifying the crop infestation 109. In an embodiment, the one or more products are recommended based on the location of the crop and/or the user 101.
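A recommendation keyed to the identified infestation and the location drawn from the image metadata could be filtered as below. The catalog, product names, and region codes are all invented for this example; the disclosure does not specify a catalog format.

```python
# Illustrative product catalog; names, infestations, and region codes are
# assumptions made for the example only.
CATALOG = [
    {"name": "Fungicide A", "controls": "late blight", "regions": {"IN-MH", "IN-GJ"}},
    {"name": "Insecticide B", "controls": "fall armyworm", "regions": {"IN-MH"}},
    {"name": "Insecticide C", "controls": "fall armyworm", "regions": {"IN-PB"}},
]

def recommend(infestation, region):
    """Return products mapped to the identified infestation (109) that are
    available in the region of the crop and/or the user."""
    return [p["name"] for p in CATALOG
            if p["controls"] == infestation and region in p["regions"]]
```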
Further, the method 300 includes providing information related to the one or more sellers of the one or more products recommended to the user. Further, the method 300 includes providing field-specific information related to historical weather, current weather and future forecast for the one or more farms managed by the user. Additionally, the method 300 includes facilitating the user for scouting one or more farms managed by the user for determining a field-specific occurrence of the crop infestation.
FIG. 4 shows a flowchart for identifying a crop disease and recommending products to a user in accordance with some embodiments of the present disclosure.
At step 402, a user 101 may login to a personalized account created on the application. As an example, the user 101 may create his personalized account by signing up or registering with an application. In an implementation, the user 101 may download and install the application on a user device 103 and login into the application.
At step 404, the application may display a plurality of options to the user 101 on a UI associated with the user device 103 to receive a user selection on at least one option from the plurality of options. The plurality of options relates to a type of the crop infestation 109. Subsequently, a camera in the user device 103 may be launched to capture at least one image of the affected crop, as indicated in step 406.
At step 408, the application may receive the user selection 211. The user selection 211 may be at least one option selected from the plurality of options by the user 101.
At step 410, the user 101 may capture the one or more images 105 of the affected crop using the image acquisition device associated with the user device of the user 101. In another embodiment, the user 101 may upload the one or more images 105 of the affected crop from a storage space associated with the user device of the user 101.

At step 412, the application may predict the type of the crop infestation 109. In an implementation, the type of the crop infestation 109 comprises at least one of diseases of the crops, weeds across the crops, and insects that damage the crops. That is, after capturing or uploading the one or more images 105 of the affected crop, the pre-trained neural network analyses the one or more images 105 to identify the crop infestation 109.
At step 414, the application may recommend one or more products 111 to the user to control the identified crop infestation 109. As an example, the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like.
At step 416, the application may recommend one or more sellers or distributors 113 of the one or more products 111. In an implementation, the user 101 may be connected to a computing unit or server associated with the identification system 107 through a network, using the user device 103. Further, the one or more sellers may also connect to the network and the computing unit using their respective devices. The computing unit may provide details about the recommended products that can be used on the identified crop infestation 109, as well as the corresponding details of the one or more sellers selling the recommended products 111 to the user, based on the geo-location of the crop and the user 101.
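Selecting sellers "based on the geo-location of the crop and the user" can be sketched as a nearest-neighbour ranking over seller coordinates. The great-circle (haversine) distance used below is a standard choice for this, and the seller records are invented for the example; the disclosure does not specify how distance is computed.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_sellers(user_loc, sellers, k=3):
    """Rank sellers of a recommended product by distance to the user and
    return the k nearest (cf. the three nearest distributors mentioned
    in the advantages section)."""
    return sorted(sellers,
                  key=lambda s: haversine_km(*user_loc, s["lat"], s["lon"]))[:k]
```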
At step 418, the user 101 may select an option to refer to the historical data after logging into the application and the user may select the image as indicated in step 420. The application may be configured to store the information related to previously identified crop infestation (i.e., historical data) provided to the user 101 on a storage associated with the application, to ensure that any reference or analysis previously made by the user 101 is not lost and can be revisited at any time. As a result, the user 101 may directly contact the sellers to enquire about the one or more recommended products 111.
Exemplary Computer System
FIG. 5 illustrates a block diagram of an exemplary computer system 500 for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 500 may be the identification system 107 illustrated in FIG. 1, which automatically identifies a crop infestation in a real-time environment. The computer system 500 may include a Central Processing Unit (“CPU” or “processor”) 502. The processor 502 may comprise at least one data processor for executing program components for executing user- or system-generated processes. The processor 502 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 502 may be disposed in communication with one or more Input/Output (I/O) devices (510 and 511) via I/O interface 501. The I/O interface 501 may employ communication protocols/methods such as, without limitation, audio, analogue, digital, stereo, IEEE®-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE® 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System For Mobile Communications (GSM), Long-Term Evolution (LTE) or the like), etc. Using the I/O interface 501, the computer system 500 may communicate with one or more I/O devices 510 and 511.
In some embodiments, the processor 502 may be disposed in communication with a communication network 509 via network interface 503. The network interface 503 may communicate with the communication network 509. The network interface 503 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11a/b/g/n/x, etc. Using the network interface 503 and the communication network 509, the computer system 500 may improve the process of automatically identifying the crop infestation 109 in the affected crop. Further, the communication network 509 may be connected with the user device 103 associated with the user 101.
In an implementation, the communication network 509 may be implemented as one of the several types of networks, such as intranet or Local Area Network (LAN) and such within the organization. The communication network 509 may either be a dedicated network or a shared network, which represents an association of several types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 509 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.

In some embodiments, the processor 502 may be disposed in communication with a memory 505 (e.g., RAM 512, ROM 513, etc. as shown in FIG. 5) via a storage interface 504. The storage interface 504 may connect to the memory 505 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), Fibre Channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 505 may store a collection of program or database components, including, without limitation, user/application interface 506, an operating system 507, a web browser 508, and the like. In some embodiments, computer system 500 may store user/application data 506, such as the data, variables, records, etc. as described in this invention. Such databases may be implemented as fault-tolerant, scalable, secure databases such as distributed databases.
The operating system 507 may facilitate resource management and operation of the computer system 500. Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (E.G., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD, etc.), LINUX® DISTRIBUTIONS (E.G., RED HAT®, UBUNTU®, KUBUNTU®, etc.), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®/7/8, 10 etc.), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
The user interface 506 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, the user interface 506 may provide computer interaction interface elements on a display system operatively connected to the computer system 500, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, and the like. Further, Graphical User Interfaces (GUIs) may be employed, including, without limitation, APPLE® MACINTOSH® operating systems’ Aqua®, IBM® OS/2®, MICROSOFT® WINDOWS® (e.g., Aero, Metro, etc.), web interface libraries (e.g., ActiveX®, JAVA®, JAVASCRIPT®, AJAX, HTML, ADOBE® FLASH®, etc.), or the like.
The web browser 508 may be a hypertext viewing application. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), and the like. The web browsers 508 may utilize facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs), and the like. Further, the computer system 500 may implement a mail server stored program component. The mail server may utilize facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI SCRIPTS, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS®, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 500 may implement a mail client stored program component. The mail client may be a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD®, and the like.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, nonvolatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
Advantages of the embodiments of the present disclosure are illustrated herein.
In an embodiment, the present disclosure uses on-device filtering techniques to identify and reject distorted/unsuitable images of the crops and prompts the farmers to capture quality images. As a result, the analysis of the unwanted images is avoided and the unwanted images are also prevented from entering the server. Consequently, the method of the present disclosure provides a faster analysis and optimal usage of computing resources.
In an embodiment, according to the present disclosure, the pre-trained neural network, used for identifying the crop infestation, is trained with a plurality of images captured from the real-world environment of the crops, rather than with images captured from a controlled environment. Consequently, the neural network of the present disclosure identifies the crop infestation with utmost precision.
In an embodiment, the present disclosure can serve a greater number of users at a time. Further, the present disclosure efficiently identifies all the infestations occurring on the crops, and it also recommends suitable products to control the identified crop infestation.
In an embodiment, the user may select his/her one or more farms before scouting for crop infestation. As a result, the present disclosure helps in understanding the field-specific occurrence of biotic stresses.
In an embodiment, the present disclosure creates more awareness and better mapping of the one or more recommended products by using multiple predictions.
In an embodiment, the present disclosure provides a product listing and search functionality. As a result, the user may see all the products for herbicides, fungicides, insecticides, and Bio-Solutions.
In an embodiment, the present disclosure allows the user to search one or more products based on weed, disease or insect names, or based on a product's name. Consequently, the user may directly search for the best products if they already know the name of their weed, disease or insect.
In an embodiment, according to the present disclosure, the user may wishlist a product from the scouting prediction or products page. As a result, the user details may be forwarded to the three nearest distributors of the one or more recommended products. Consequently, the present disclosure increases the connectivity between the users and the sellers.
In an embodiment, the present disclosure enables the users to get the best price of their crop produce by looking into the market price of the respective countries with open-source dynamic websites.
In light of the technical advancements provided by the proposed method and the identification system, the claimed steps, as discussed above, are not routine, conventional, or well-known aspects in the art, as the claimed steps provide the aforesaid solutions to the technical problems existing in the conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself, as the claimed steps provide a technical solution to a technical problem.
The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all the items are mutually exclusive, unless expressly specified otherwise. The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device/article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device/article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Reference Numerals:

Reference Number Description
100 Exemplary arrangement
101 User
103 User device
105 Images
107 Identification system
109 Crop infestation
111 Recommended products
113 Recommended distributors
201 Processor
203 I/O Interface
205 Memory
207 Data
209 Modules
211 User selection
213 Metadata
215 Other data
217 Receiving module
219 Verification module
221 Predicting module
223 Analyzing module
225 Pre-trained model

227 Other modules
500 Computer system
501 Input/Output interface of the computer
502 Processor of the computer
503 Network interface
504 Storage interface
505 Memory of the computer
506 User/Application interface
507 Operating system
508 Web browser
509 Communication network
510 Input devices
511 Output devices
512 Random access memory
513 Read only memory

WE CLAIM:
1. A method for automatically identifying a crop infestation (109) in a real-time environment, the method comprising:
receiving, by an identification system (107), a user selection (211) on at least one option from a plurality of options provided to the user, wherein each of the plurality of options relate to a type of the crop infestation (109);
receiving, by the identification system (107), one or more images (105) of an affected crop and metadata (213) associated with the one or more images (105) of the affected crop;
verifying, by the identification system (107), each of the one or more images (105) using a pre-trained neural network for determining a usability of each of the one or more images (105);
predicting, by the identification system (107), an infected region in the one or more images (105) according to the option selected by the user (101); and
analyzing, by the identification system (107), the infected region using the pre-trained neural network for identifying the crop infestation (109) in the infected region.
2. The method as claimed in claim 1, wherein the type of the crop infestation (109) comprises at least one of diseases, weeds and insects.
3. The method as claimed in claim 1, wherein receiving the one or more images (105) comprises:
capturing the one or more images (105) of the affected crop in real time using a user device (103); and/or
uploading the one or more images (105) of the affected crop from a storage space.
4. The method as claimed in claim 1, wherein the metadata (213) associated with the one or more images (105) comprises at least one of information related to location of the affected crop, resolution of the one or more images, time and date of capturing the one or more images, and a label associated with each of the one or more images.

5. The method as claimed in claim 1, wherein the neural network is trained using a plurality of crop infestation (109) images and one or more crop infestation regions identified in each of the plurality of crop infestation images.
6. The method as claimed in claim 1, further comprises:
recommending one or more products for controlling the crop infestation (109) identified in the affected crop to the user (101); and
providing information related to one or more sellers of the one or more products recommended to the user (101).
7. The method as claimed in claim 6 further comprises providing field-specific information related to historical weather, current weather and a future forecast for the one or more farms managed by the user.
8. The method as claimed in claim 6, wherein recommending the one or more products further comprises:
mapping the one or more products with the crop infestation (109) identified; and
providing a comparative analysis of price of each of the one or more products.
9. The method as claimed in claim 1 further comprises facilitating the user for scouting one or more farms managed by the user (101) for determining a field-specific occurrence of the crop infestation (109).
10. The method as claimed in claim 1, wherein identifying the crop infestation (109) further comprises:
receiving a user (101) input on the accuracy of the crop infestation (109) identified; and
assigning a confidence score to the pre-trained neural network based on the user input.
11. An identification system (107) for automatically identifying a crop infestation (109) in a real-time environment, the identification system comprising:
a processor (201); and

a memory (205), communicatively coupled to the processor (201), wherein the memory (205) stores the processor (201) executable instructions, which, on execution, cause the processor (201) to:
receive a user selection (211) on at least one option from a plurality of options provided to the user (101), wherein each of the plurality of options relate to a type of crop infestation (109);
receive one or more images (105) of an affected crop and metadata (213) associated with the one or more images (105) of the affected crop;
verify each of the one or more images (105) using a pre-trained neural network for determining a usability of each of the one or more images (105);
predict an infected region in the one or more images (105) according to the option selected by the user (101); and
analyze the infected region using the pre-trained neural network for identifying the crop infestation (109) in the infected region.
12. The identification system (107) as claimed in claim 11, wherein the type of the crop infestation (109) comprises at least one of diseases, weeds and insects.
13. The identification system (107) as claimed in claim 11, wherein the processor (201) receives the one or more images (105) when:
the one or more images (105) of the affected crop are captured in real time using a user device (103); and/or
the one or more images (105) of the affected crop are uploaded from a storage space.
14. The identification system (107) as claimed in claim 11, wherein the metadata (213) associated with the one or more images (105) comprises at least one of information related to location of the affected crop, resolution of the one or more images, time and date of capturing the one or more images, and a label associated with each of the one or more images.
15. The identification system (107) as claimed in claim 11, wherein the processor (201) trains the neural network using a plurality of crop infestation images and one or more crop infestation regions identified in each of the plurality of crop infestation images.

16. The identification system (107) as claimed in claim 11, wherein the processor (201) is further configured to:
recommend one or more products for controlling the crop infestation (109) identified in the affected crop to the user (101); and
provide information related to one or more sellers of the one or more products recommended to the user (101).
17. The identification system (107) as claimed in claim 16, wherein the processor (201) is further configured to:
provide field-specific information related to historical weather, current weather and a future forecast for the one or more farms managed by the user (101).
18. The identification system (107) as claimed in claim 16, wherein the processor (201) is further configured to:
map the one or more products with the crop infestation (109) identified; and
provide a comparative analysis of the price of each of the one or more products.
19. The identification system (107) as claimed in claim 11, wherein the processor (201) is configured to facilitate the user (101) for scouting one or more farms managed by the user for determining a field-specific occurrence of the crop infestation (109).
20. The identification system (107) as claimed in claim 11, wherein the processor (201) is further configured to:
receive a user input on the accuracy of the crop infestation (109) identified; and assign a confidence score to the pre-trained neural network based on the user input.

Documents

Application Documents

# Name Date
1 202121054111-STATEMENT OF UNDERTAKING (FORM 3) [24-11-2021(online)].pdf 2021-11-24
2 202121054111-PROVISIONAL SPECIFICATION [24-11-2021(online)].pdf 2021-11-24
3 202121054111-FORM 1 [24-11-2021(online)].pdf 2021-11-24
4 202121054111-DRAWINGS [24-11-2021(online)].pdf 2021-11-24
5 202121054111-DECLARATION OF INVENTORSHIP (FORM 5) [24-11-2021(online)].pdf 2021-11-24
6 202121054111-FORM-26 [11-01-2022(online)].pdf 2022-01-11
7 202121054111-Proof of Right [28-03-2022(online)].pdf 2022-03-28
8 202121054111-Request Letter-Correspondence [24-11-2022(online)].pdf 2022-11-24
9 202121054111-REQUEST FOR CERTIFIED COPY [24-11-2022(online)].pdf 2022-11-24
10 202121054111-Power of Attorney [24-11-2022(online)].pdf 2022-11-24
11 202121054111-Form 1 (Submitted on date of filing) [24-11-2022(online)].pdf 2022-11-24
12 202121054111-DRAWING [24-11-2022(online)].pdf 2022-11-24
13 202121054111-Covering Letter [24-11-2022(online)].pdf 2022-11-24
14 202121054111-CORRESPONDENCE-OTHERS [24-11-2022(online)].pdf 2022-11-24
15 202121054111-COMPLETE SPECIFICATION [24-11-2022(online)].pdf 2022-11-24
16 202121054111-FORM 18 [25-11-2022(online)].pdf 2022-11-25
17 202121054111-CORRESPONDENCE(IPO)(CERTIFIED COPY)-28-11-2022.pdf 2022-11-28
18 Abstract1.jpg 2022-12-09
19 202121054111-CORRESPONDENCE(IPO)-(WIPO DAS)-12-12-2022.pdf 2022-12-12
20 202121054111-FORM 3 [02-01-2023(online)].pdf 2023-01-02
21 202121054111-FER.pdf 2023-11-16
22 202121054111-FORM 3 [13-05-2024(online)].pdf 2024-05-13
23 202121054111-OTHERS [14-05-2024(online)].pdf 2024-05-14
24 202121054111-FER_SER_REPLY [14-05-2024(online)].pdf 2024-05-14
25 202121054111-DRAWING [14-05-2024(online)].pdf 2024-05-14
26 202121054111-CORRESPONDENCE [14-05-2024(online)].pdf 2024-05-14
27 202121054111-COMPLETE SPECIFICATION [14-05-2024(online)].pdf 2024-05-14
28 202121054111-CLAIMS [14-05-2024(online)].pdf 2024-05-14
29 202121054111-US(14)-HearingNotice-(HearingDate-01-08-2024).pdf 2024-07-08
30 202121054111-US(14)-ExtendedHearingNotice-(HearingDate-02-08-2024)-1100.pdf 2024-07-26
31 202121054111-FORM-26 [30-07-2024(online)].pdf 2024-07-30
32 202121054111-Correspondence to notify the Controller [30-07-2024(online)].pdf 2024-07-30
33 202121054111-Correspondence to notify the Controller [31-07-2024(online)].pdf 2024-07-31
34 202121054111-Written submissions and relevant documents [16-08-2024(online)].pdf 2024-08-16
35 202121054111-PatentCertificate27-08-2024.pdf 2024-08-27
36 202121054111-IntimationOfGrant27-08-2024.pdf 2024-08-27

Search Strategy

1 SearchHistory_202121054111E_10-11-2023.pdf

ERegister / Renewals

3rd: 27 Nov 2024

From 24/11/2023 - To 24/11/2024

4th: 27 Nov 2024

From 24/11/2024 - To 24/11/2025

5th: 19 Nov 2025

From 24/11/2025 - To 24/11/2026