
An Artificial Intelligence (AI) Based Computing System for Calculating Toxicity Exposure Ratio (TER) in Plants

Abstract: An Artificial Intelligence (AI) based computing system for calculating Toxicity Exposure Ratio (TER) in plants is disclosed. An AI based computing system (104) comprises one or more hardware processors and a memory (206) coupled to the one or more hardware processors. The memory (206) comprises a set of program instructions in the form of a plurality of modules (106), configured to be executed by the one or more hardware processors. The plurality of modules (106) comprises an image receiver module (208), an operation performing module (210), an infection detection module (212), a parameter detection module (214), a toxicity determination module (216), and a data output module (218). The data output module (218) is configured to output the determined TER on a user interface screen of one or more electronic devices (102).


Patent Information

Filing Date: 28 March 2022
Publication Number: 16/2022
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2023-06-29

Applicants

Blu Cocoon Digital Private Limited
ASO 306, South Wing, Astra Towers, 2C/1 Action Area II C, Rajarhat, Newtown Kolkata, North 24 Parganas, West Bengal – 700115 India

Inventors

1. Pinaki Bhattacharyya
53DD/5, Mangalganthi Anupama Co-operative Housing Society, VIP Road, Kolkata 700052, West Bengal, India (Landmark: Behind Haldiram’s Prabhuji)
2. Souvik Debnath
Flat No A101, Canopy Citadel, 7th Cross, Bank Avenue Extension, Babushapalya Main Road, Kalyan Nagar, Bangalore – 560043, Karnataka India

Specification

FIELD OF INVENTION

[0001] Embodiments of the present disclosure relate to the fields of agriculture, pest control, food security analysis, and environmental risk analysis, and more particularly to an Artificial Intelligence (AI) based computing system for calculating Toxicity Exposure Ratio (TER) in plants.
BACKGROUND
[0002] The agricultural sector plays a strategic role in the economic development of a country. In developing and underdeveloped countries, its role is critical to food security and self-reliance. However, to make agriculture and horticulture profitable in the present world, they must be practised at large scale with the integration of technology. With the increased adoption of technological tools and applications, this sector stands to benefit radically. Inadequate outcomes due to overlooked defects in crops, and a lack of smarter implementations, can deter the expansion of agriculture. It is the need of the hour to augment conventional techniques for examining crops, seasonal pattern changes, and the aftermath of harvesting with modern techniques.
[0003] Existing solutions do not detect diseases in plants by image processing. In research labs, scientists may detect a disease with the help of high-resolution cameras; however, the symptoms captured by these cameras must already be visible to the naked eye. Detecting disease in a plant or vegetation at an early stage is extremely important so that necessary steps can be taken at the right time; failing to do so may lead to loss of profit. In countries such as India and elsewhere, logistics may be a severe issue, as it takes days to transport samples of a diseased plant to a lab for testing and further days to obtain remediation in time. The sample of the diseased plant is a portion of infected or non-infected plants which has to be examined in the lab. Disease detection and classification require a custom feature extraction model due to the complexity of the images. The visual manifestation of various diseases differs greatly in location, pattern, colour, and the like, so it is important how the AI model is trained to cover such a broad spectrum.
[0004] Hence, there is a need for an improved Artificial Intelligence (AI) based computing system for calculating Toxicity Exposure Ratio (TER) in plants.
SUMMARY
[0005] This summary is provided to introduce a selection of concepts, in a simple manner, which is further described in the detailed description of the disclosure. This summary is neither intended to identify key or essential inventive concepts of the subject matter nor to determine the scope of the disclosure.
[0006] Embodiments of the present disclosure comprise an Artificial Intelligence (AI) based computing system for calculating Toxicity Exposure Ratio (TER) in plants. The AI based computing system comprises one or more hardware processors and a memory coupled to the one or more hardware processors. The memory comprises a set of program instructions in the form of a plurality of modules, configured to be executed by the one or more hardware processors. The plurality of modules comprises an image receiver module configured to receive one or more images of one or more plants grown in a specific area from one or more electronic devices. The plurality of modules further comprises an operation performing module configured to perform one or more operations on the received one or more images to enhance the quality of the received one or more images. The plurality of modules further comprises an infection detection module configured to detect an infected region and a non-infected region from the enhanced one or more images of the one or more plants based on one or more features by using a crop management based AI model. The plurality of modules further comprises a parameter detection module configured to detect one or more soil parameters and one or more environmental parameters associated with the specific area by using Internet of Things (IoT) sensors. The plurality of modules further comprises a toxicity determination module configured to determine the Toxicity Exposure Ratio (TER) in the one or more plants based on the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters, and the detected one or more environmental parameters by using the crop management based AI model. The plurality of modules further comprises a data output module configured to output the determined TER on a user interface screen of the one or more electronic devices.
[0007] Another embodiment of the present disclosure comprises an Artificial Intelligence (AI) based method for calculating Toxicity Exposure Ratio (TER) in plants. The AI based method comprises receiving, by one or more hardware processors, one or more images of one or more plants grown in a specific area from one or more electronic devices. The AI based method further comprises performing, by the one or more hardware processors, one or more operations on the received one or more images to enhance the quality of the received one or more images. The AI based method further comprises detecting, by the one or more hardware processors, an infected region and a non-infected region from the enhanced one or more images of the one or more plants based on one or more features by using a crop management based AI model. The AI based method further comprises detecting, by the one or more hardware processors, one or more soil parameters and one or more environmental parameters associated with the specific area by using Internet of Things (IoT) sensors. The AI based method further comprises determining, by the one or more hardware processors, the Toxicity Exposure Ratio (TER) in the one or more plants based on the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters, and the detected one or more environmental parameters by using the crop management based AI model. The AI based method further comprises outputting, by the one or more hardware processors, the determined TER on a user interface screen of the one or more electronic devices.
[0008] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF DRAWINGS
[0009] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0010] FIG. 1 is a block diagram depicting a vegetation and plant disease detection environment, in accordance with an embodiment of the present disclosure;
[0011] FIG. 2 is a block diagram depicting plurality of modules of an AI based computing system, in accordance with an embodiment of the present disclosure;
[0012] FIG. 3 is a process flowchart depicting an Artificial Intelligence (AI) based method for calculating Toxicity Exposure Ratio (TER) in plants, in accordance with an embodiment of the present disclosure;
[0013] FIG. 4 is a tabular representation depicting exemplary crop names and diseases of crops, in accordance with an embodiment of the present disclosure;
[0014] FIG. 5 is a pictorial representation depicting a sample of wheat, in accordance with an embodiment of the present disclosure;
[0015] FIG. 6A-B are pictorial representations depicting an exemplary image of sample of wheat with enhanced image quality and a preprocessed image of sample of wheat, in accordance with an embodiment of the present disclosure;
[0016] FIG. 7A-C are pictorial representations depicting process of image segmentation for disease detection in sample of wheat, in accordance with an embodiment of the present disclosure; and
[0017] FIG. 8 is a schematic representation depicting a disease report of sample of wheat, in accordance with an embodiment of the present disclosure.
[0018] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0019] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated online platform, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0020] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0021] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or subsystems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, subsystems, elements, structures, components, additional devices, additional subsystems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0022] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0023] Throughout this document, the terms browser and browser application may be used interchangeably to mean the same thing. In some aspects, the terms web application and web app may be used interchangeably to refer to an application, including metadata, that is installed in a browser application. In some aspects, the terms web application and web app may be used interchangeably to refer to a website and/or application to which access is provided over a network (e.g., the Internet) under a specific profile (e.g., a website that provides email service to a user under a specific profile). The terms extension application, web extension, web extension application, extension app and extension may be used interchangeably to refer to a bundle of files that are installed in the browser application to add functionality to the browser application. In some aspects, the term application, when used by itself without modifiers, may be used to refer to, but is not limited to, a web application and/or an extension application that is installed or is to be installed in the browser application.
[0024] A computer system (standalone, client or server computer system) configured by an application may constitute a “module” (or “subsystem”) that is configured and operated to perform certain operations. In one embodiment, the “module” or “subsystem” may be implemented mechanically or electronically, so a module may comprise dedicated circuitry or logic that is permanently configured (within a special-purpose processor) to perform certain operations. In another embodiment, a “module” or “subsystem” may also comprise programmable logic or circuitry (as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations.
[0025] Accordingly, the term “module” or “subsystem” should be understood to encompass a tangible entity, be that an entity that is physically constructed permanently configured (hardwired) or temporarily configured (programmed) to operate in a certain manner and/or to perform certain operations described herein.
[0026] Referring now to the drawings, and more particularly to FIGs. 1 through 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[0027] FIG. 1 is a block diagram depicting a vegetation and plant disease detection environment 100, in accordance with an embodiment of the present disclosure. The vegetation and plant disease detection environment 100 comprises one or more electronic devices 102, an AI based computing system 104, a plurality of modules 106, a network 108, and the like. The one or more electronic devices 102 may be a smart mobile device utilized to capture an image of a plant or the vegetation. A detailed explanation of the AI based computing system 104 and the plurality of modules 106 is provided in FIG. 2. The network 108 may be a wired network or a wireless network. The present invention is a mobile application which detects disease and infection on a plant or vegetation at an early stage, when the infection or disease is barely visible to the naked eye. The present invention removes noise and background from a received image to the extent of differentiating between mechanical injury, necrosis, and disease infection. The present invention detects diseases for standing crops or plantations. Identification of the plant and detection of the disease on the plant are merged in the present invention. The present invention deals with pattern recognition based on differences between a pixel and its neighbours. This may predict deformations, which eventually leads to identifying the defective part irrespective of crop or plant. If the received image is classified as infected, the present invention approximates the percentage of the infected part by creating contours. After negating the non-relevant or non-focal parts of the received image, the percentage of disease spread in the plant is calculated with respect to the plant part or parts. Further, the present invention’s algorithm also calculates the growth stage of the plant to predict the chances of disease spread in the plant.
The present invention’s model is trained for multiple classifications based on classic image processing and mainly focuses on feature extraction. The model is further improved with custom image feature extraction engineering. The present invention adopts hybrid approaches, as images must be treated differently depending on their zooming dimension. Contrast enhancement is utilized along with edge-based, cluster-based, and probabilistic segmentation. The present invention’s algorithm is preferred for identifying features of a processed image; mainly, the features may be the gradient matrix or a histogram distribution of gradients.
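The gradient features mentioned above can be sketched in a few lines. The following is an illustrative NumPy computation of per-pixel gradient magnitude and orientation, the raw ingredients of a HOG-style descriptor, and not the patent's actual implementation:

```python
import numpy as np

def gradient_mag_orient(gray):
    """Per-pixel gradient magnitude and unsigned orientation, the raw
    ingredients of a HOG-style descriptor. np.gradient uses central
    differences in the interior and one-sided differences at the borders."""
    g = np.asarray(gray, dtype=float)
    gy, gx = np.gradient(g)                       # derivatives along rows, columns
    mag = np.hypot(gx, gy)                        # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # orientation folded to [0, 180)
    return mag, ang
```

A full HOG descriptor would additionally bin these orientations, weighted by magnitude, over local cells.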
[0029] The present invention detects the disease in the plant or vegetation at a very early stage of infection, from one or more electronic devices, within thirty seconds. The present invention may calculate the probable spread of the infection based on the processed image and the algorithm, after enhancing the received image. In this case, features are selected manually. Here, Histogram of Oriented Gradients (HOG) is applied, which provides the magnitude and orientation (angle) of each pixel with respect to its neighbours; a significant pixel difference in positive or negative orientation, together with an increasing magnitude, may help in distinguishing between toxicity and various diseases, as their intensities and gradient values differ with respect to their neighbours. The present invention calculates the Toxicity Exposure Ratio if pesticides are applied to protect the crop. This requires data of various kinds apart from image analysis: a subscription to Internet of Things (IoT) data pertinent to soil, weather-related data, and the like. It also has a rule-based algorithmic component. Here, the ratio of the lethal dose 50 (LD50) or the no observed effect concentration (NOEC) to the Predicted Environmental Concentration (PEC) is termed the Toxicity Exposure Ratio (TER). The mobile application has been conceptualised to calculate the amount of chemical to be applied on crops depending on the spread of the disease in a given land size; this in turn yields the PEC of the chemical with the help of IoT devices and a backend algorithm in a stepwise approach. Moreover, meteorological data of a geographical region may be taken into consideration for the calculation of the TER. Plants may be crops, vegetation, and the like.
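The TER ratio itself is simple arithmetic once the toxicity endpoint and the PEC are known; a minimal sketch:

```python
def toxicity_exposure_ratio(toxicity_endpoint, pec):
    """TER = toxicity endpoint (LD50 or NOEC) / Predicted Environmental
    Concentration (PEC). A larger TER indicates a wider safety margin
    between toxicity and expected exposure."""
    if pec <= 0:
        raise ValueError("PEC must be a positive concentration")
    return toxicity_endpoint / pec
```

For example, an NOEC of 100 mg/L against a PEC of 2 mg/L gives a TER of 50 (endpoint and PEC must be expressed in the same units).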
[0029] FIG. 2 is a block diagram depicting plurality of modules 106 of an AI based computing system 104, in accordance with an embodiment of the present disclosure.
[0030] The processor(s) 202, as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[0031] A memory 206 includes the plurality of modules 106 stored in the form of executable program which instructs the processor 202 via a bus 204 to perform the method steps illustrated in FIG. 2.
[0032] The AI based computing system 104 comprises one or more hardware processors and the memory 206 coupled to the one or more hardware processors. The memory 206 comprises a set of program instructions in the form of the plurality of modules 106, configured to be executed by the one or more hardware processors.
[0033] The plurality of modules 106 comprises an image receiver module 208 configured to receive one or more images of one or more plants grown in a specific area from one or more electronic devices 102.
[0034] The plurality of modules 106 further comprises an operation performing module 210 configured to perform one or more operations on the received one or more images to enhance the quality of the received one or more images. The one or more operations comprise processing the received one or more images of the one or more plants by using the crop management based AI model. The processing of the received one or more images comprises contrast improvement and image size alignment. The one or more operations further comprise segregating the processed one or more images for colours of interest based on pixel value. The colours of interest comprise green, yellow, blue, and brown. The one or more operations further comprise evaluating the segregated one or more images to remove image noise and unwanted objects. The one or more operations further comprise extracting the one or more features from the evaluated one or more images by using the crop management based AI model. The one or more features comprise variance, energy, contrast, correlation, dissimilarity, and homogeneity.
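Features such as energy, contrast, dissimilarity, and homogeneity are classically derived from a grey-level co-occurrence matrix (GLCM). The following NumPy sketch computes them for horizontally adjacent pixel pairs; the 8-bin quantisation and the horizontal offset are illustrative assumptions, not the patent's stated configuration:

```python
import numpy as np

def glcm_features(gray, levels=8):
    """Texture features from a grey-level co-occurrence matrix (GLCM) of
    horizontally adjacent pixels. `levels` quantisation bins and the
    horizontal-neighbour offset are illustrative choices."""
    g = np.asarray(gray, dtype=int)
    q = np.clip(g * levels // 256, 0, levels - 1)              # quantise 0..255 into bins
    glcm = np.zeros((levels, levels), dtype=float)
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # count co-occurring pairs
    glcm /= glcm.sum()                                         # joint probabilities
    i, j = np.indices((levels, levels))
    return {
        "contrast":      float(np.sum(glcm * (i - j) ** 2)),
        "energy":        float(np.sum(glcm ** 2)),
        "homogeneity":   float(np.sum(glcm / (1.0 + np.abs(i - j)))),
        "dissimilarity": float(np.sum(glcm * np.abs(i - j))),
    }
```

A perfectly uniform patch yields contrast 0 and energy 1, while a high-frequency pattern such as a checkerboard yields maximal contrast, which is why these statistics discriminate diseased from healthy texture.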
[0035] The plurality of modules 106 further comprises an infection detection module 212 configured to detect infected region and non-infected region from the enhanced one or more images of the one or more plants based on one or more features by using crop management based AI model.
[0036] The plurality of modules 106 further comprises a parameter detection module 214 configured to detect one or more soil parameters and one or more environmental parameters associated with the specific area by using Internet of Things (IOT) sensors. The one or more soil parameters comprise: soil texture, soil pH, dispersion and colour of soil. The one or more environmental parameters comprise: temperature, humidity, air temperature, atmospheric pressure, precipitation, solar radiation, and wind.
[0037] The plurality of modules 106 further comprises a toxicity determination module 216 configured to determine Toxicity Exposure Ratio (TER) in the one or more plants based on the detected infected region, the detected non-infected region, number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters by using crop management based AI model. The toxicity determination module 216 is configured to correlate the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters with each other by using the crop management based AI model and determine the TER in the one or more plants based on the result of correlation.
[0038] The plurality of modules 106 further comprises a data output module 218 configured to output the determined TER on user interface screen of the one or more electronic devices 102.
[0039] The AI based computing system 104 further comprises a pesticide volume determination module 220 configured to determine the volume of pesticide required to treat the detected infected region based on the determined TER, the detected infected region, the number of the one or more plants, the detected one or more soil parameters, the detected one or more environmental parameters, the size of the specific area, and one or more predefined rules by using the crop management based AI model. The volume of pesticide required to treat the detected infected region based on the determined TER is outputted on a user interface screen of the one or more electronic devices 102.
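A rule of this kind can be illustrated with a deliberately simplified sketch. Both the proportional-dose rule and the TER trigger value of 10 below are hypothetical illustrations, not the patent's predefined rules:

```python
def pesticide_volume(area_ha, infected_fraction, dose_l_per_ha, ter, ter_trigger=10.0):
    """Hypothetical rule-based sketch: spray only the infected share of the
    field at the label dose, and withhold treatment entirely when the TER
    falls below a trigger value (both rules are illustrative assumptions)."""
    if not 0.0 <= infected_fraction <= 1.0:
        raise ValueError("infected_fraction must lie in [0, 1]")
    if ter < ter_trigger:
        return 0.0  # exposure risk deemed too high; do not apply pesticide
    return area_ha * infected_fraction * dose_l_per_ha
```

For instance, a 2-hectare plot that is half infected, at a dose of 1.5 L/ha and an acceptable TER, would need 1.5 L in this toy model.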
[0040] FIG. 3 is a process flowchart depicting an Artificial Intelligence (AI) based method 300 for calculating Toxicity Exposure Ratio (TER) in plants, in accordance with an embodiment of the present disclosure. At step 302, one or more images of one or more plants grown in a specific area are received by one or more hardware processors from one or more electronic devices 102. At step 304, one or more operations are performed by the one or more hardware processors on the received one or more images to enhance the quality of the received one or more images. At step 306, an infected region and a non-infected region are detected by the one or more hardware processors from the enhanced one or more images of the one or more plants based on one or more features by using the crop management based AI model. The infected region and the non-infected region of the one or more plants are depicted in FIG. 7C. At step 308, one or more soil parameters and one or more environmental parameters associated with the specific area are detected by the one or more hardware processors by using Internet of Things (IoT) sensors. At step 310, the Toxicity Exposure Ratio (TER) in the one or more plants is determined by the one or more hardware processors based on the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters, and the detected one or more environmental parameters by using the crop management based AI model. At step 312, the determined TER is outputted by the one or more hardware processors on a user interface screen of the one or more electronic devices 102.
[0041] In another embodiment, the AI based method 300 comprises one or more operations. The one or more operations comprise processing the received one or more images of the one or more plants by using the crop management based AI model. The processing of the received one or more images comprises contrast improvement and image size alignment. The one or more operations further comprise segregating the processed one or more images for colours of interest based on pixel value. The colours of interest comprise green, yellow, blue, and brown. The one or more operations further comprise evaluating the segregated one or more images to remove image noise and unwanted objects. The one or more operations further comprise extracting the one or more features from the evaluated one or more images by using the crop management based AI model. The one or more features comprise variance, energy, contrast, correlation, dissimilarity, and homogeneity. The AI based method 300 further comprises determining the TER in the one or more plants based on the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters, and the detected one or more environmental parameters by using the crop management based AI model. Determining the TER comprises correlating the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters, and the detected one or more environmental parameters with each other by using the crop management based AI model, and determining the TER in the one or more plants based on the result of the correlation.
The AI based method 300 further comprises determining the volume of pesticide required to treat the detected infected region based on the determined TER, the detected infected region, the number of the one or more plants, the detected one or more soil parameters, the detected one or more environmental parameters, the size of the specific area, and one or more predefined rules by using the crop management based AI model. In the AI based method 300, the one or more soil parameters comprise soil texture, soil pH, dispersion, and colour of soil. The one or more environmental parameters comprise temperature, humidity, air temperature, atmospheric pressure, precipitation, solar radiation, and wind.
[0042] FIG. 4 is a tabular representation depicting exemplary crop names and diseases of crops 400, in accordance with an embodiment of the present disclosure. Here, it is inferred that the diseases detected on the brinjal crop are Phomopsis Blight, Leaf Spot, and Wilt. Further, the disease detected on the cucumber or cucurbit crop is Downy mildew. Further, the diseases detected on the paddy crop are Sheath Rot, False Smut, Blast, Blight, and the like. Further, the diseases detected on potato are Late Blight and Aphids.
[0043] FIG. 5 is a pictorial representation depicting a sample of wheat 500, in accordance with an embodiment of the present disclosure. Diseases which are detected on wheat are Yellow rust and Root rot. An image of the sample of wheat 500 is captured through the one or more electronic devices 102. The one or more electronic devices 102 is a smart mobile device onto which a mobile application is downloaded; any individual may capture an image of the sample of wheat 500 and upload it over the internet to the mobile application to find out, on the spot, the disease detected on the plant.
[0044] FIG. 6A-B are pictorial representations depicting a preprocessed image 600A of the sample of wheat 500 and the sample of wheat 500 with enhanced image quality 600B, in accordance with an embodiment of the present disclosure. The AI model is trained to distinctively identify diseases, as their visual characteristics change from disease to disease. The type of AI model may be K-Nearest Neighbour (KNN), Gradient Boosting, or a support vector machine (SVM). The AI model that generalizes with the higher f1-score is considered. The present invention’s algorithm may distinguish diseases even if they appear similar to the naked eye on a plant or vegetation. Principal component analysis (PCA) and feature selection models are utilized to select independent features. For example, yellow rust and yellow patch may appear similar due to external influence. Image pre-processing remains the same in most scenarios. Features differ with respect to the disease specifications of the respective plant species, as each may come with its own specific disease types. The types of features are colours, texture, and the like.
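The selection rule, keeping the candidate model with the highest held-out F1 score, can be sketched without any ML library. The candidate names and scores below are hypothetical placeholders:

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

def best_model(scores):
    """Pick the candidate (e.g. KNN, gradient boosting, SVM) with the
    highest held-out F1 score, per the selection rule described above."""
    return max(scores, key=scores.get)
```

For example, given hypothetical held-out scores `{"knn": 0.81, "gbm": 0.88, "svm": 0.85}`, the gradient boosting model would be retained.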
[0045] FIG. 7A-C are pictorial representations depicting the process of image segmentation 700A-C for disease detection in the sample of wheat 500, in accordance with an embodiment of the present disclosure. FIG. 7A is a pictorial representation depicting the intersection of the actual image and the masked image of the sample of wheat 500. FIG. 7B and FIG. 7C are pictorial representations depicting the masked image of the sample of wheat 500. The infected region and the non-infected region of the sample of wheat 500 are depicted in FIG. 7C. The process of image segmentation 700A-C remains the same for identifying yellow rust or any image with a pattern similar to yellow rust. The algorithm for image segmentation changes based on the type of plant or object of interest and, most importantly, on the variety of image appearances. The present invention focuses on image quantization. Examples of the object of interest are any portion or part of the plant and any pathogen which may or may not be associated with the plant disease. The present invention applies either clustering-based image segmentation or probabilistic image segmentation. Clustering-based image segmentation starts from a rough initial clustering of pixels; gradient ascent methods then iteratively refine the clusters until some convergence criterion is met to form image segments or superpixels. These types of algorithms aim to minimise the distance between the cluster centre and each pixel in the image. Probabilistic image segmentation helps in situations where clusters overlap: data points or pixels in an overlap region have some probability of being assigned to either cluster.
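The clustering-based variant can be illustrated by a toy k-means over pixel colour vectors, which iterates exactly the assign-then-re-estimate loop that minimises pixel-to-centre distance. This is a minimal sketch, not a superpixel algorithm such as SLIC:

```python
import numpy as np

def kmeans_pixels(pixels, k=2, iters=10, seed=0):
    """Minimal clustering-based segmentation: assign each pixel vector to its
    nearest cluster centre, then move each centre to its cluster mean,
    repeating for a fixed number of iterations (toy sketch)."""
    pixels = np.asarray(pixels, dtype=float)
    rng = np.random.default_rng(seed)
    centres = pixels[rng.choice(len(pixels), size=k, replace=False)].copy()
    for _ in range(iters):
        # distance of every pixel to every centre, then nearest-centre labels
        dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(k):
            members = pixels[labels == c]
            if len(members):
                centres[c] = members.mean(axis=0)  # re-estimate cluster mean
    return labels, centres
```

On an image, `pixels` would be the flattened H×W×3 array, and the resulting labels reshaped back to H×W give the segment map.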
[0046] FIG. 8 is a schematic representation depicting a disease report 800 of the sample of wheat 500, in accordance with an embodiment of the present disclosure. Given below is the analysis, from image segmentation onward, used to generate the disease report 800 of the sample of wheat 500. This is achieved based on image quantization and threshold tuning. The analysis proceeds in the following steps. At step one, pixel intensities or values are identified from the segmented regions affected by the disease and are stored separately for each type of disease. At step two, the features of a segment affected by the disease are present in the form of pixels. At step three, a new input image is pre-processed and segmented, and it is checked whether any segmented area is closest to any predefined feature; if yes, the name of the disease corresponding to the identified feature is returned. In the disease report 800, the estimated air light is 188. The disease statistics given below show a yellow ratio of 0.529986; accordingly, the yellow rust infection percentage is 52.9986%.
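The two quantities in the report — a nearest-feature disease label and an infection percentage derived from a ratio like the 0.529986 above — can be sketched as below. The per-disease mean colours are hypothetical illustrations, not values taken from the report.

```python
def infection_ratio(mask):
    """Fraction of pixels flagged (1) as disease-coloured in a binary mask."""
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)

def classify_segment(segment_mean_rgb, disease_features):
    """Return the disease whose stored mean colour is nearest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(disease_features,
               key=lambda name: dist(segment_mean_rgb, disease_features[name]))

FEATURES = {  # hypothetical per-disease mean colours, for illustration only
    "yellow_rust": (180, 160, 60),
    "yellow_patch": (150, 140, 90),
}
```

Multiplying the ratio by 100 gives the reported infection percentage (0.529986 → 52.9986%).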
[0047] In an embodiment, the present invention comprises a classification algorithm for vegetation and plant disease detection. The classification algorithm predicts whether the plant is infected, and this may be predicted from the level of spread of the disease in the plant. The present invention first pre-processes a received image. Image enhancement for contrast improvement in darker images is performed with the help of an algorithm called Contrast Limited Adaptive Histogram Equalization (CLAHE). Next, the present invention resizes the received image to 256x256 and converts it from BGR to RGB. Further, the present invention performs colour analysis on pixels utilizing a sliding window: a colour of interest, for example yellow or brown, is analysed through pixel value analysis on the R, G, and B channels. Next, computational values are measured between channels or tensors based on the identified thresholds; for example, in this case, the relative yellow colour value is at or below 0.8. Given below are the colour ranges based on the present invention's analysis:
Green colour range: [47,75,52] to [200,225,23]
Yellow colour range: [97,80,14] to [220,201,99]
Blue colour range: [1,140,221] to [147,183,251]
Brown colour range: [44,77,64] to [147,113,86]
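The colour segregation above can be sketched as a channel-wise inclusive range test per pixel, using the yellow range quoted in the table (the green range's quoted upper bound appears internally inconsistent as published, so yellow is used for illustration):

```python
# Lower/upper RGB bounds for yellow, as quoted in the specification's table
YELLOW_RANGE = ((97, 80, 14), (220, 201, 99))

def in_range(pixel, colour_range):
    """Channel-wise inclusive containment test for an RGB pixel."""
    lo, hi = colour_range
    return all(l <= c <= h for c, l, h in zip(pixel, lo, hi))

def colour_ratio(pixels, colour_range):
    """Fraction of pixels falling inside the given colour range."""
    return sum(in_range(p, colour_range) for p in pixels) / len(pixels)
```

In the full pipeline this test would be applied within a sliding window over the resized RGB image, and the resulting per-window ratios compared against thresholds such as the 0.8 relative-yellow value mentioned above.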

[0048] Further, the present invention removes noise and background from the received image. Noise and background other than the objects of interest are removed based on colour and pixel analysis; for example, the sky, a bird, or the like may appear as an unwanted object. Next, a Hue-Saturation-Value (HSV) representation is generated for the received image. Further, the present invention performs feature extraction and texture analysis. Here, the grey-level co-occurrence matrix (GLCM) algorithm is utilized with active angles measured between 0 and 150 and a colour scale between 1 and 3. Next, variance, energy, contrast, correlation, dissimilarity, and homogeneity are computed as a feature set and correlated for image analysis. Given below are observations when an image is infected by the disease:
[0049] Energy > 0.4, ASM > 0.2, Dissimilarity > 35, and Homogeneity > 0.4.
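The GLCM features referenced above can be computed from scratch for one offset; the sketch below builds the co-occurrence matrix for horizontal neighbour pairs only (the specification sweeps multiple angles) and derives the ASM, energy, dissimilarity, and homogeneity features used in the thresholds of [0049].

```python
def glcm_features(img, levels):
    """Normalised grey-level co-occurrence matrix at offset (0, 1) plus
    Haralick-style features: ASM, energy, dissimilarity, homogeneity."""
    glcm = [[0.0] * levels for _ in range(levels)]
    n = 0
    for row in img:
        for a, b in zip(row, row[1:]):  # horizontal neighbour pairs
            glcm[a][b] += 1
            n += 1
    asm = dissim = homog = 0.0
    for i in range(levels):
        for j in range(levels):
            p = glcm[i][j] / n                # normalised co-occurrence
            asm += p * p                      # angular second moment
            dissim += p * abs(i - j)          # dissimilarity
            homog += p / (1 + (i - j) ** 2)   # homogeneity
    return {"asm": asm, "energy": asm ** 0.5,
            "dissimilarity": dissim, "homogeneity": homog}
```

A perfectly uniform image yields ASM = energy = homogeneity = 1 and dissimilarity = 0; texture introduced by lesions moves these values toward the infected-image thresholds quoted above.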
[0050] Further, the present invention performs a model prediction algorithm. Here, regions are identified based on colour and pixel analysis, and texture and features are analysed. Next, the infected proportion is calculated from the infected region and the non-infected region (assumption: images are captured normally across the entire plant and in groups). Next, images are cropped based on the identified region and re-evaluated in case an identified region matches the colour of interest but was not identified as diseased in the initial evaluation; it is further evaluated to determine whether the appearance is due to aging. Next, toxicity is evaluated based on the region of spread and the number of plants captured in the picture. Subsequently, the pesticide volume to be applied is estimated based on predefined rules.
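The last two steps — computing the infected proportion and applying a rule to estimate pesticide volume — can be sketched as follows. The specification does not publish its rule, so the per-plant dose constant here is a hypothetical placeholder.

```python
def infected_fraction(infected_px, non_infected_px):
    """Share of plant pixels classified as infected."""
    total = infected_px + non_infected_px
    return infected_px / total if total else 0.0

def pesticide_volume_ml(spread_fraction, n_plants, dose_per_plant_ml=5.0):
    """Illustrative rule: dose scales with disease spread and plant count.
    dose_per_plant_ml is a hypothetical constant, not from the specification."""
    return spread_fraction * n_plants * dose_per_plant_ml
```

For example, a 53% spread over 10 plants under this placeholder rule yields 26.5 ml; the actual rule set would also factor in the soil and environmental parameters listed in the claims.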
[0051] In an embodiment, the present invention has the following advantages. The present invention detects disease in a plant or vegetation at a very early stage of infection, from one or more electronic devices, within thirty seconds. The present invention may calculate the probable spreading of the infection based on the processed image and the algorithm, after enhancing the received image. In this case, features are selected manually. Here, the Histogram of Oriented Gradients (HOG) descriptor provides the magnitude and orientation (angle) of each pixel relative to its neighbours; when the pixel difference is significant in a positive or negative orientation and the magnitude increases, this may help distinguish toxicity from various diseases, since their intensities and gradient values differ from those of their neighbours. The present invention calculates the Toxicity Exposure Ratio (TER) if pesticides are applied to protect the crop. This requires data of various kinds apart from image analysis, including a subscription to Internet of Things (IoT) data pertinent to soil, weather, and the like, and it also has a rule-based algorithmic component. Here, the ratio of the toxicity endpoint (LD50 or NOEC) to the Predicted Environmental Concentration (PEC) is termed the TER. The mobile application has been conceptualised in such a way that it calculates the amount of chemical to be applied to crops depending on the spread of disease in a given land size. It then calculates the PEC of the chemical with the help of IoT devices and a backend algorithm in a stepwise approach. Moreover, meteorological data of a geographical region may be taken into consideration for the calculation of the TER. The purpose of the present invention is to leverage state-of-the-art AI-powered computer vision to detect plant diseases on the spot, without the need for sophisticated tools, laboratories, or any prior experience.
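The TER calculation described above — toxicity endpoint (LD50 or NOEC) divided by PEC — can be sketched as below. The single-compartment PEC formula is a deliberate simplification for illustration; the specification derives PEC stepwise from IoT soil and weather data.

```python
def pec_mg_per_kg(applied_mg, soil_mass_kg):
    """Simplified single-compartment PEC: chemical mass per mass of soil.
    (Illustrative assumption; the actual PEC is computed stepwise from
    IoT soil and weather data in the backend.)"""
    return applied_mg / soil_mass_kg

def ter(endpoint_mg_per_kg, pec):
    """Toxicity Exposure Ratio: toxicity endpoint (LD50 or NOEC) / PEC."""
    if pec <= 0:
        raise ValueError("PEC must be positive")
    return endpoint_mg_per_kg / pec

# Example: an endpoint of 100 mg/kg against a PEC of 4 mg/kg gives TER = 25
exposure = pec_mg_per_kg(applied_mg=40, soil_mass_kg=10)
ratio = ter(100, exposure)
```

A higher TER indicates a wider safety margin between the toxic dose and the predicted field concentration, which is why regulatory frameworks compare TER against trigger values.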
[0052] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[0053] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0054] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
[0055] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
[0056] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus 204 to various devices such as a random-access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
[0057] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0058] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention. When a single device or article is described herein, it will be apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[0059] The specification has described a method and a system for calculating Toxicity Exposure Ratio (TER) in plants. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open-ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[0060] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

WE CLAIM:

1. An Artificial Intelligence (AI) based computing system (104) for calculating Toxicity Exposure Ratio (TER) in plants, the AI based computing system (104) comprising:
one or more hardware processors; and
a memory (206) coupled to the one or more hardware processors, wherein the memory (206) comprises a set of program instructions in the form of a plurality of modules (106), configured to be executed by the one or more hardware processors, wherein the plurality of modules (106) comprises:
an image receiver module (208) configured to receive one or more images of one or more plants grown in a specific area from one or more electronic devices (102);
an operation performing module (210) configured to perform one or more operations on the received one or more images to enhance quality of the received one or more images;
an infection detection module (212) configured to detect infected region and non-infected region from the enhanced one or more images of the one or more plants based on one or more features by using crop management based AI model;
a parameter detection module (214) configured to detect one or more soil parameters and one or more environmental parameters associated with the specific area by using Internet of Things (IOT) sensors;
a toxicity determination module (216) configured to determine Toxicity Exposure Ratio (TER) in the one or more plants based on the detected infected region, the detected non-infected region, number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters by using crop management based AI model; and
a data output module (218) configured to output the determined TER on user interface screen of the one or more electronic devices (102).

2. The AI based computing system (104) as claimed in claim 1, wherein the one or more operations comprise:
processing the received one or more images of the one or more plants by using the crop management based AI model, wherein the processing of the received one or more images comprises: contrast improvement and image size alignment;
segregating the processed one or more images for colour of interests based on pixel value, wherein the colour of interests comprises green, yellow, blue and brown;
evaluating the segregated one or more images to remove image noise and unwanted objects; and
extracting the one or more features from the evaluated one or more images by using the crop management based AI model, wherein the one or more features comprises variance, energy, contrast, correlation, dissimilarity, and homogeneity.

3. The AI based computing system (104) as claimed in claim 1, wherein in determining TER in the one or more plants based on the detected infected region, the detected non-infected region, number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters by using crop management based AI model, the toxicity determination module (216) is configured to:
correlate the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters with each other by using the crop management based AI model; and
determine the TER in the one or more plants based on the result of correlation.

4. The AI based computing system (104) as claimed in claim 1, further comprising a pesticide volume determination module (220) configured to determine volume of pesticide required to treat the detected infected region based on the determined TER, the detected infected region, the number of the one or more plants, the detected one or more soil parameters, the detected one or more environmental parameters, size of the specific area and one or more predefined rules by using crop management based AI model.

5. The AI based computing system (104) as claimed in claim 1, wherein the one or more soil parameters comprise: soil texture, soil pH, dispersion and colour of soil and wherein the one or more environmental parameters comprise: temperature, humidity, air temperature, atmospheric pressure, precipitation, solar radiation and wind.

6. An Artificial Intelligence (AI) based method (300) for calculating Toxicity Exposure Ratio (TER) in plants, the AI based method (300) comprising:
receiving, by one or more hardware processors, one or more images of one or more plants grown in a specific area from one or more electronic devices (102);
performing, by the one or more hardware processors, one or more operations on the received one or more images to enhance quality of the received one or more images;
detecting, by the one or more hardware processors, infected region and non-infected region from the enhanced one or more images of the one or more plants based on one or more features by using crop management based AI model;
detecting, by the one or more hardware processors, one or more soil parameters and one or more environmental parameters associated with the specific area by using Internet of Things (IOT) sensors;
determining, by the one or more hardware processors, Toxicity Exposure Ratio (TER) in the one or more plants based on the detected infected region, the detected non-infected region, number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters by using crop management based AI model; and
outputting, by the one or more hardware processors, the determined TER on user interface screen of the one or more electronic devices (102).

7. The AI based method (300) as claimed in claim 6, wherein the one or more operations comprise:
processing the received one or more images of the one or more plants by using the crop management based AI model, wherein the processing of the received one or more images comprises: contrast improvement and image size alignment;
segregating the processed one or more images for colour of interests based on pixel value, wherein the colour of interests comprises green, yellow, blue and brown;
evaluating the segregated one or more images to remove image noise and unwanted objects; and
extracting the one or more features from the evaluated one or more images by using the crop management based AI model, wherein the one or more features comprises variance, energy, contrast, correlation, dissimilarity, and homogeneity.

8. The AI based method (300) as claimed in claim 6, wherein determining TER in the one or more plants based on the detected infected region, the detected non-infected region, number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters by using crop management based AI model comprises:
correlating the detected infected region, the detected non-infected region, the number of the one or more plants in the enhanced one or more images, the detected one or more soil parameters and the detected one or more environmental parameters with each other by using the crop management based AI model; and
determining the TER in the one or more plants based on the result of correlation.

9. The AI based method (300) as claimed in claim 6, further comprising determining volume of pesticide required to treat the detected infected region based on the determined TER, the detected infected region, the number of the one or more plants, the detected one or more soil parameters, the detected one or more environmental parameters, size of the specific area and one or more predefined rules by using crop management based AI model.

10. The AI based method (300) as claimed in claim 6, wherein the one or more soil parameters comprise: soil texture, soil pH, dispersion and colour of soil and wherein the one or more environmental parameters comprise: temperature, humidity, air temperature, atmospheric pressure, precipitation, solar radiation and wind.

Documents

Application Documents

# Name Date
1 202231017779-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2022(online)].pdf 2022-03-28
2 202231017779-STARTUP [28-03-2022(online)].pdf 2022-03-28
3 202231017779-PROOF OF RIGHT [28-03-2022(online)].pdf 2022-03-28
4 202231017779-POWER OF AUTHORITY [28-03-2022(online)].pdf 2022-03-28
5 202231017779-FORM28 [28-03-2022(online)].pdf 2022-03-28
6 202231017779-FORM-9 [28-03-2022(online)].pdf 2022-03-28
7 202231017779-FORM FOR STARTUP [28-03-2022(online)].pdf 2022-03-28
8 202231017779-FORM FOR SMALL ENTITY(FORM-28) [28-03-2022(online)].pdf 2022-03-28
9 202231017779-FORM 18A [28-03-2022(online)].pdf 2022-03-28
10 202231017779-FORM 1 [28-03-2022(online)].pdf 2022-03-28
11 202231017779-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-03-2022(online)].pdf 2022-03-28
12 202231017779-EVIDENCE FOR REGISTRATION UNDER SSI [28-03-2022(online)].pdf 2022-03-28
13 202231017779-DRAWINGS [28-03-2022(online)].pdf 2022-03-28
14 202231017779-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2022(online)].pdf 2022-03-28
15 202231017779-COMPLETE SPECIFICATION [28-03-2022(online)].pdf 2022-03-28
16 202231017779-FER.pdf 2022-05-26
17 202231017779-POA [25-11-2022(online)].pdf 2022-11-25
18 202231017779-OTHERS [25-11-2022(online)].pdf 2022-11-25
19 202231017779-MARKED COPIES OF AMENDEMENTS [25-11-2022(online)].pdf 2022-11-25
20 202231017779-FORM 3 [25-11-2022(online)].pdf 2022-11-25
21 202231017779-FORM 13 [25-11-2022(online)].pdf 2022-11-25
22 202231017779-FER_SER_REPLY [25-11-2022(online)].pdf 2022-11-25
23 202231017779-ENDORSEMENT BY INVENTORS [25-11-2022(online)].pdf 2022-11-25
24 202231017779-CLAIMS [25-11-2022(online)].pdf 2022-11-25
25 202231017779-AMMENDED DOCUMENTS [25-11-2022(online)].pdf 2022-11-25
26 202231017779-US(14)-HearingNotice-(HearingDate-27-04-2023).pdf 2023-04-03
27 202231017779-FORM-26 [06-04-2023(online)].pdf 2023-04-06
28 202231017779-Correspondence to notify the Controller [06-04-2023(online)].pdf 2023-04-06
29 202231017779-Annexure [06-04-2023(online)].pdf 2023-04-06
30 202231017779-Written submissions and relevant documents [12-05-2023(online)].pdf 2023-05-12
31 202231017779-POA [12-05-2023(online)].pdf 2023-05-12
32 202231017779-MARKED COPIES OF AMENDEMENTS [12-05-2023(online)].pdf 2023-05-12
33 202231017779-FORM 3 [12-05-2023(online)].pdf 2023-05-12
34 202231017779-FORM 13 [12-05-2023(online)].pdf 2023-05-12
35 202231017779-Annexure [12-05-2023(online)].pdf 2023-05-12
36 202231017779-AMMENDED DOCUMENTS [12-05-2023(online)].pdf 2023-05-12
37 202231017779-PatentCertificate29-06-2023.pdf 2023-06-29
38 202231017779-IntimationOfGrant29-06-2023.pdf 2023-06-29
39 202231017779-RELEVANT DOCUMENTS [05-09-2023(online)].pdf 2023-09-05
40 202231017779-PROOF OF ALTERATION [24-05-2024(online)].pdf 2024-05-24
41 202231017779-PROOF OF ALTERATION [30-09-2024(online)].pdf 2024-09-30
42 202231017779-PROOF OF ALTERATION [30-09-2024(online)]-1.pdf 2024-09-30

Search Strategy

1 SearchHistory(23)E_25-05-2022.pdf
2 NPL2AE_20-03-2023.pdf

ERegister / Renewals

3rd: 19 Feb 2024

From 28/03/2024 - To 28/03/2025

4th: 26 Mar 2025

From 28/03/2025 - To 28/03/2026