
System And Method To Differentiate Between Plant Disease And Plant Mechanical Injuries

Abstract: A computing system (100) to classify plant disease infection and plant mechanical injuries is disclosed. The plurality of subsystems includes an image receiving subsystem (112), configured to receive images of plants grown in a specific area. The plurality of subsystems includes an image contrast improving subsystem (114), configured to process the received images of plants using an artificial intelligence-based image enhancing technique. The plurality of subsystems includes an image evaluation subsystem (116), configured to segregate the processed images for colours of interest based on pixel value and evaluate the segregated images to remove image noise. The plurality of subsystems includes a feature extraction subsystem (118), configured to extract feature values from the evaluated one or more images. The plurality of subsystems includes an artificial intelligence (AI) based differentiating subsystem (120), configured to apply the extracted one or more feature values to a trained model and classify the extracted one or more feature values into one of plant disease and mechanical injuries.


Patent Information

Application #: 202231017757
Filing Date: 28 March 2022
Publication Number: 16/2022
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Legal Status: Granted
Grant Date: 2023-08-08

Applicants

Blu Cocoon Digital Private Limited
ASO 306, South Wing, Astra Towers, 2C/1 Action Area II C, Rajarhat, Newtown Kolkata, North 24 Parganas, West Bengal – 700115 India

Inventors

1. Pinaki Bhattacharyya
53DD/5, Mangalganthi Anupama Co-operative Housing Society, VIP Road, Kolkata 700052, West Bengal, India. Landmark: Behind Haldiram's Prabhuji
2. Souvik Debnath
Flat No A101, Canopy Citadel, 7th Cross, Bank Avenue Extension, Babushapalya Main Road, Kalyan Nagar, Bangalore – 560043, Karnataka, India

Specification

FIELD OF INVENTION

[1] Embodiments of the present disclosure relate to decision-making systems, and more particularly to a system and a method to differentiate between plant diseases and plant mechanical injuries.
BACKGROUND
[2] Plant health is often neglected even though healthy plants are crucial to human and animal survival. Nearly 80% of the food consumed by humans is provided by plants, and plants are simultaneously the primary source of nutrition for livestock. Plant diseases, pests and injuries often threaten the availability and safety of plants for human and animal consumption. Plant diseases are mainly caused by pathogenic organisms such as fungi, bacteria, viruses and protozoa, as well as insects and parasitic plants. Early detection of plant-related problems will surely help human and animal survival in the long run.
[3] Conventionally, no system or process helps in easily differentiating among plant diseases, mechanical injuries and necrosis. Experienced individuals such as agricultural experts or entomologists can detect the difference with the naked eye. Detection of plant diseases, mechanical injuries and necrosis is required at an early growth stage to stop further spreading. In this way, plants are protected before an infection spreads.
[4] Furthermore, an easy detection facility is needed to understand the probability of spread if the problem is left untreated. In research labs with high-resolution cameras, scientists may easily detect any infection associated with the plants and predict the spread manually by eye. However, deploying such high-resolution cameras is not economical, and creating a controlled environment in the middle of an agricultural field is not feasible.
[5] Hence, there is a need for an improved system to differentiate between plant disease and plant mechanical injuries and a method to operate the same to address the aforementioned issues.
BRIEF DESCRIPTION
[6] In accordance with one embodiment of the disclosure, a system to differentiate between plant disease and plant mechanical injuries is disclosed. The computing system includes a hardware processor. The computing system also includes a memory coupled to the hardware processor. The memory includes a set of program instructions in the form of a plurality of subsystems and configured to be executed by the hardware processor.
[7] The plurality of subsystems includes an image receiving subsystem. The image receiving subsystem is configured to receive one or more images of plants grown in a specific area as captured via one or more image capturing devices. The plurality of subsystems also includes an image contrast improving subsystem. The image contrast improving subsystem is configured to process the received one or more images of the plants using an artificial intelligence-based image enhancing technique.
[8] The plurality of subsystems also includes an image evaluation subsystem. The image evaluation subsystem is configured to segregate the processed one or more images for colours of interest based on pixel value. The image evaluation subsystem is also configured to evaluate the segregated one or more images to remove image noise and unwanted objects.
[9] The plurality of subsystems also includes a feature extraction subsystem. The feature extraction subsystem is configured to extract one or more feature values from the evaluated one or more images using artificial intelligence-based image feature extraction techniques. The plurality of subsystems also includes an artificial intelligence (AI) based differentiating subsystem. The artificial intelligence (AI) based differentiating subsystem is configured to apply the extracted one or more feature values to a trained artificial intelligence-based differentiator model. The artificial intelligence (AI) based differentiating subsystem is also configured to classify the extracted one or more feature values into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model. The artificial intelligence (AI) based differentiating subsystem is also configured to output the classified one or more feature values on a user interface.
[10] In accordance with one embodiment of the disclosure, a method to differentiate between plant disease and plant mechanical injuries is disclosed. The method includes receiving one or more images of plants grown in a specific area as captured via one or more image capturing devices. The method also includes processing the received one or more images of the plants using an artificial intelligence-based image enhancing technique. The method also includes segregating the processed one or more images for colours of interest based on pixel value.
[11] The method also includes evaluating the segregated one or more images to remove image noise and unwanted objects. The method also includes extracting one or more feature values from the evaluated one or more images using artificial intelligence-based image feature extraction techniques. The method also includes applying the extracted one or more feature values to a trained artificial intelligence-based differentiator model. The method also includes classifying the extracted one or more feature values into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model. The method also includes outputting the classified one or more feature values on a user interface.
[12] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[13] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[14] FIG. 1 is a block diagram illustrating an exemplary computing system to differentiate between plant disease and plant mechanical injuries in accordance with an embodiment of the present disclosure;
[15] FIG. 2A is an exemplary image of a wheat plant in accordance with an embodiment of the present disclosure;
[16] FIG. 2B is an exemplary contrast enhanced image of the wheat plant in accordance with an embodiment of the present disclosure;
[17] FIGs. 3A-3C are exemplary processed images of the wheat plant in accordance with an embodiment of the present disclosure; and
[18] FIG. 4 is a process flowchart illustrating an exemplary method to differentiate between plant disease and plant mechanical injuries in accordance with an embodiment of the present disclosure.
[19] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[20] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated online platform, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[21] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or subsystems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, subsystems, elements, structures, components, additional devices, additional subsystems, additional elements, additional structures or additional components. Appearances of the phrases "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[22] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[23] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[24] A computer system (standalone, client or server computer system) configured by an application may constitute a “subsystem” that is configured and operated to perform certain operations. In one embodiment, the “subsystem” may be implemented mechanically or electronically, so a subsystem may comprise dedicated circuitry or logic that is permanently configured (within a special-purpose processor) to perform certain operations. In another embodiment, a “subsystem” may also comprise programmable logic or circuitry (as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations.
[25] Accordingly, the term “subsystem” should be understood to encompass a tangible entity, be that an entity that is physically constructed permanently configured (hardwired) or temporarily configured (programmed) to operate in a certain manner and/or to perform certain operations described herein.
[26] FIG. 1 is a block diagram illustrating an exemplary computing system 100 to differentiate between plant disease and plant mechanical injuries in accordance with an embodiment of the present disclosure. Plant pathogens pose significant challenges to the agricultural industry in many countries, as the pathogens destroy crops and plants. Both humans and livestock suffer as a result.
[27] To solve this problem, the computing system 100 helps to easily detect and classify plants as infected plants, plants with a disorder such as an environmental effect, nutrient deficiency or chemical injury, and mechanically injured plants. For example, the disorders may be non-infectious, such as environmental effects (drought, flood, frost, thunder), nutrient deficiency (soil nutrients), chemical injury (chemical pesticide spill or overspray) and mechanical injury (improper ploughing). In accordance with the classification, a caretaker may implement a treatment process to stop the spread. Examples of plant infections include black spot, powdery mildew, downy mildew, blight and the like.
[28] Specific crops have specific infections associated with them. Brinjal plants may be infected with Phomopsis blight, leaf spot, wilt and the like; cucumber plants with downy mildew; paddy plants with sheath rot, false smut and the like; and potato plants with late blight, aphids and the like. The computing system 100 employs contrast enhancement along with edge-based, cluster-based and probabilistic segmentation to detect whether a specific plant is infected. After such detection, the computing system 100 uses specific feature values to classify between plant infection and mechanical injuries.
[29] The computing system 100 includes a hardware processor 108. The computing system 100 also includes a memory 102 coupled to the hardware processor 108. The memory 102 comprises a set of program instructions in the form of a plurality of subsystems and configured to be executed by the hardware processor 108. Input/output (I/O) devices 110 (including but not limited to keyboards, displays, pointing devices, etc.) may be coupled to the computing system 100 either directly or through intervening I/O controllers.
[30] The hardware processor(s) 108, as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[31] The memory 102 includes a plurality of subsystems stored in the form of executable program which instructs the hardware processor 108 via bus 104 to perform method steps. The memory 102 has following plurality of subsystems: an image receiving subsystem 112, an image contrast improving subsystem 114, an image evaluation subsystem 116, a feature extraction subsystem 118 and an artificial intelligence (AI) based differentiating subsystem 120.
[32] Computer memory elements may include any suitable memory device(s) for storing data and executable program, such as read only memory, random access memory, erasable programmable read only memory, electrically erasable programmable read only memory, hard drive, removable media drive for handling memory cards and the like. Embodiments of the present subject matter may be implemented in conjunction with program modules, including functions, procedures, data structures, and application programs, for performing tasks, or defining abstract data types or low-level hardware contexts. Executable program stored on any of the above-mentioned storage media may be executable by the hardware processor(s) 108.
[33] The plurality of subsystems includes an image receiving subsystem 112. The image receiving subsystem 112 is configured to receive one or more images of plants grown in a specific area as captured via one or more image capturing devices. In one specific embodiment, the one or more image capturing devices are positioned at a specific point associated with the plants and configured to capture continuous images. The one or more image capturing devices cover a specific area of interest.
[34] In such embodiment, the one or more image capturing devices include mobile-device cameras, handheld cameras and the like. The one or more image capturing devices may capture single or multiple images of the associated plants. In such embodiment, the captured one or more images are inputted into the computing system 100.
[35] The plurality of subsystems also includes an image contrast improving subsystem 114. The image contrast improving subsystem 114 is configured to process the received one or more images of the plants using an artificial intelligence-based image enhancing technique. In one particular embodiment, the processing of the captured one or more images is performed using a contrast improvement and image size alignment technique.
[36] In such embodiment, the artificial intelligence-based image enhancing technique includes the Contrast Limited Adaptive Histogram Equalization (CLAHE) technique. CLAHE is used, for example, for improving the visibility level of foggy images or video.
[37] In operation, CLAHE is a variant of adaptive histogram equalization (AHE) which limits the over-amplification of contrast in the captured one or more images. CLAHE operates on small regions in the image, called tiles, rather than on the entire image. The neighbouring tiles are then combined using bilinear interpolation to remove artificial boundaries. As used herein, the term "contrast" is the degree of difference between two colours, or between the lightest lights and darkest darks, in the image. CLAHE overcomes the over-amplification drawback of AHE by confining equalization to specific regions instead of operating globally.
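A minimal sketch of this step with OpenCV is given below; the clip limit, tile grid size and the choice to equalize the lightness channel of LAB space are illustrative assumptions, not values from the specification.

```python
import cv2

def enhance_contrast(image_bgr, clip_limit=2.0, tile_grid_size=(8, 8)):
    """Apply CLAHE to the lightness channel of a BGR image.

    clip_limit and tile_grid_size are illustrative defaults, not values
    prescribed by the specification.
    """
    # Equalize only the L channel of LAB space so hues are not distorted
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid_size)
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```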
[38] The image contrast improving subsystem 114 also resizes the captured image to 256 pixels in width and 256 pixels in height. In such embodiment, resizing the image includes no resampling: the image's size is changed without changing the amount of data in that image, and resizing without resampling changes the image's physical size without changing the pixel dimensions in the image. In such embodiment, no data is added to or removed from the image. In one specific embodiment, one or more interpolation techniques are used to increase or decrease the image size, such as INTER_AREA, INTER_CUBIC, INTER_LINEAR, INTER_NEAREST and the like.
[39] The image contrast improving subsystem 114 also converts the captured image from Blue Green Red (BGR) colour orientation to Red Green Blue (RGB) colour orientation. In such embodiment, the cvtColor function from the OpenCV library is used for the conversion. The main difference between the two orientations is the ordering of the Red, Green and Blue channels.
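The resizing and colour-orientation steps might be sketched as follows; the per-direction choice of interpolation flag is an assumption, since the specification lists several usable flags without prescribing one.

```python
import cv2

def prepare_image(image_bgr, size=(256, 256)):
    """Resize to 256x256 and convert BGR to RGB."""
    # INTER_AREA is commonly used for shrinking and INTER_CUBIC for enlarging;
    # this heuristic is an assumption, any of the listed flags could be used.
    shrinking = image_bgr.shape[0] > size[1] or image_bgr.shape[1] > size[0]
    flag = cv2.INTER_AREA if shrinking else cv2.INTER_CUBIC
    resized = cv2.resize(image_bgr, size, interpolation=flag)
    # Convert from OpenCV's default BGR channel order to RGB
    return cv2.cvtColor(resized, cv2.COLOR_BGR2RGB)
```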
[40] The plurality of subsystems also includes an image evaluation subsystem 116. The image evaluation subsystem 116 is configured to segregate the processed one or more images for colours of interest based on pixel value. As used herein, the term "pixel" (or picture element) is the smallest item of information in an image. Each pixel of a captured image has a pixel value which describes how bright that pixel is, or what colour the pixel should be. The colours of interest include green, yellow, blue and brown. The pixel value is analysed based on the red, green and blue channels. In such embodiment, the segregation is mainly done by applying either clustering-based or probabilistic-based segmentation. For example, the image segmentation is done using pixel values: the pixel value [130,163,120] is closer to the pixel value [138,170,110] than to the pixel value [150,124,148]. In such exemplary embodiment, the former cluster is chosen.
[41] Computational values are measured between channels or tensors based on the identified thresholds. As used herein, "tensors" are mathematical objects that may be used to describe physical properties and may simply be understood as arrays.
[42] In one exemplary embodiment, the yellow colour range is less than or below 0.8. In another exemplary embodiment, the green colour range is from [47,75,52] to [200,225,23], the yellow colour range is from [97,80,14] to [220,201,99], the blue colour range is from [1,140,221] to [147,183,251] and the brown colour range is from [44,77,64] to [147,113,86]. A range signifies that pixel values within it are more likely to be that particular colour.
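A minimal sketch of the clustering-based segregation described above: each pixel is assigned to its nearest colour cluster by Euclidean distance, matching the example in which [130,163,120] is grouped with [138,170,110] rather than [150,124,148]. The cluster centres are supplied by the caller and are illustrative.

```python
import numpy as np

def segregate_by_colour(image_rgb, cluster_centres):
    """Assign each pixel to its nearest colour cluster (Euclidean distance in RGB)."""
    pixels = image_rgb.reshape(-1, 3).astype(np.float32)
    centres = np.asarray(cluster_centres, dtype=np.float32)
    # Distance from every pixel to every cluster centre, shape (n_pixels, n_clusters)
    dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(image_rgb.shape[:2])

# Worked example from the description: the first centre is the closer one
print(np.linalg.norm(np.array([130, 163, 120]) - np.array([138, 170, 110])))  # ~14.6
print(np.linalg.norm(np.array([130, 163, 120]) - np.array([150, 124, 148])))  # ~52.0
```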
[43] The image evaluation subsystem 116 is also configured to evaluate the segregated one or more images to remove image noise and unwanted objects. Pixel values of unwanted objects are converted to [0,0,0], which is black. In such embodiment, images in Hue Saturation Value (HSV) format are generated after evaluation of the segregated one or more images. HSV is a cylindrical colour model that remaps the Red Green Blue primary colours into dimensions that are easier for humans to understand. Image noise is random variation of brightness or colour information in images and is usually an aspect of electronic noise.
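The removal of unwanted objects might be sketched as below: pixels outside an HSV range of interest are set to [0,0,0] (black), consistent with the masked image of FIG. 3B. The HSV bounds shown are placeholders, not values from the specification.

```python
import cv2
import numpy as np

def remove_unwanted(image_rgb, lower_hsv, upper_hsv):
    """Black out every pixel outside the given HSV range of interest."""
    hsv = cv2.cvtColor(image_rgb, cv2.COLOR_RGB2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    # bitwise_and keeps pixels inside the mask; the rest become [0, 0, 0]
    return cv2.bitwise_and(image_rgb, image_rgb, mask=mask)

# Placeholder yellowish range in OpenCV's HSV scale (H: 0-179, S/V: 0-255):
# cleaned = remove_unwanted(image_rgb, (20, 80, 80), (35, 255, 255))
```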
[44] The plurality of subsystems also includes a feature extraction subsystem 118. The feature extraction subsystem 118 is configured to extract one or more feature values from the evaluated one or more images using artificial intelligence-based image feature extraction techniques. The one or more feature values comprises variance value, energy value, contrast value, correlation value, dissimilarity value, pixel value and homogeneity value. In such embodiment, the feature values are extracted after the image contrast is improved and the image noise has been removed. Correlation measures the joint probability occurrence of the specified pixel pairs.
[45] As used herein, the term "variance" provides an idea of how the pixel values are spread across the image. As used herein, the term "energy" is a measure of the localized change in the image. Homogeneity expresses how similar certain elements (pixels) of the image are.
[46] In one embodiment, the artificial intelligence-based image feature extraction techniques may include the gray-level co-occurrence matrix (GLCM) algorithm. A gray-level co-occurrence matrix (GLCM) is a matrix defined over an image as the distribution of co-occurring pixel values (grayscale values, or colours) at a given offset.
[47] The gray-level co-occurrence matrix (GLCM) gives a measure of the variation in intensity at the pixel of interest. GLCM texture considers the relation between two pixels at a time, called the reference pixel and the neighbour pixel. First, the computing system calculates the gray-level co-occurrence matrix for a given angle and scale. Using the matrix, all the features, such as variance, energy and the like, are calculated. For implementation, the system 100 imports the greycomatrix and greycoprops functions from the skimage library.
[48] In such embodiment, the gray-level co-occurrence matrix (GLCM) algorithm is used with active angles measured between 0 and 150 and colour scales between 1 and 3. In such embodiment, by specifying the angle and scale, the computing system also predicts the corresponding distance.
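A sketch of the GLCM feature extraction with scikit-image follows. Recent scikit-image releases spell the functions graycomatrix and graycoprops (older releases use the greycomatrix/greycoprops spelling named above); variance is not a graycoprops property, so it is computed directly from the pixel values here, which is an assumption.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # greycomatrix/greycoprops in older releases

def glcm_features(gray, distances=(1,), angles=(0.0, np.pi / 4, np.pi / 2)):
    """Extract texture features from an 8-bit grayscale (uint8) image via its GLCM."""
    glcm = graycomatrix(gray, distances=list(distances), angles=list(angles),
                        levels=256, symmetric=True, normed=True)
    names = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation")
    features = {n: float(graycoprops(glcm, n).mean()) for n in names}
    features["variance"] = float(gray.var())  # assumption: variance over raw pixel values
    return features
```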
[49] The plurality of subsystems also includes an artificial intelligence (AI) based differentiating subsystem 120. The artificial intelligence (AI) based differentiating subsystem 120 is configured to apply the extracted one or more feature values to a trained artificial intelligence-based differentiator model. The artificial intelligence (AI) based differentiating subsystem 120 is also configured to classify the extracted one or more feature values into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model. In such embodiment, classifying the extracted one or more feature values into one of plant disease and mechanical injuries includes determining the type of the one or more feature values. The classification also includes determining the current values of the one or more feature values and classifying based on the type and the determined current values.
[50] The application of the trained artificial intelligence-based differentiator model includes learning from a set of sample images, generating the artificial intelligence-based differentiator model and training the generated artificial intelligence-based differentiator model using the learnt set of sample images.
[51] In one embodiment, for mechanical injuries, the red pixel value ranges from 230 to 250, the green pixel value ranges from 186 to 220 and the blue pixel value ranges from 127 to 145. The artificial intelligence (AI) based differentiating subsystem 120 is also configured to output the differentiated result. In such embodiment, for mechanical injury determination, the energy value should be greater than 0.6, the dissimilarity value should be greater than 55 and the homogeneity value should be greater than 0.6.
[52] In one specific exemplary embodiment, the computing system 100 compares the extracted variance value, energy value, contrast value, correlation value, dissimilarity value, pixel value and homogeneity value with the prestored variance value, energy value, contrast value, correlation value, dissimilarity value, pixel value and homogeneity value corresponding to a mechanical injury scenario. In such exemplary embodiment, the artificial intelligence (AI) based differentiating subsystem 120 easily classifies whether the plant has an infection or has suffered a mechanical injury.
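The exemplary thresholds above can be restated as a simple rule-based sketch; an actual trained differentiator model would learn such decision boundaries from sample images rather than hard-code them.

```python
def classify_injury_vs_disease(features, mean_rgb):
    """Illustrative differentiator using the exemplary ranges quoted above.

    features: dict with 'energy', 'dissimilarity' and 'homogeneity' values.
    mean_rgb: mean (R, G, B) pixel value of the segregated region.
    """
    r, g, b = mean_rgb
    # Exemplary mechanical-injury pixel ranges from the description
    injury_colour = 230 <= r <= 250 and 186 <= g <= 220 and 127 <= b <= 145
    # Exemplary mechanical-injury texture thresholds from the description
    injury_texture = (features["energy"] > 0.6
                      and features["dissimilarity"] > 55
                      and features["homogeneity"] > 0.6)
    return "mechanical injury" if injury_colour and injury_texture else "plant disease"
```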
[53] The computing system 100 also includes an infection spread detection subsystem. The infection spread detection subsystem is configured to identify the current spread level of infection by analysing the evaluated one or more images. The infection spread detection subsystem then applies the identified current spread level of infection to a trained artificial intelligence based probabilistic model. In one embodiment, an artificial intelligence-based object detection technique is trained to detect insects spread over a plant leaf as objects. In such embodiment, object detection is a computer vision technique that allows the system to identify and locate insects in the processed plant image. In such embodiment, the computing system 100 predicts a probable infection spread value based on the result of the trained artificial intelligence based probabilistic model.
[54] In one embodiment, for predicting the probable infection spread value based on the result of the trained artificial intelligence based probabilistic model, the infection spread detection subsystem is configured to compute an infection spread score for the identified current spread level of infection. The infection spread detection subsystem is also configured to determine whether the computed infection spread score amounts to a higher level of infection spread. The infection spread detection subsystem is also configured to predict the probable infection spread value based on the determination.
[55] In one exemplary embodiment, the computing system 100 may detect the presence of insects over 20% of the processed plant leaf image. The infection spread detection subsystem, on comparison with prestored probabilistic infection spread data, may predict in how many days the insects will completely infect the whole plant leaf. For example, in a given image, if the BBCH growth stage is 15 and 30% of the image is infected, the computing system 100 calculates the spread rate on the basis of the growth stage. The computing system 100 also calculates how much of the field's area has been captured in the image (by negating non-focal parts such as sky, soil, trees, background and humans/animals/birds) and, on that basis, predicts the chance of spreading across the whole field.
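A hypothetical sketch of the infection spread scoring follows; the weighting by BBCH growth stage and the decision threshold are assumptions made for illustration, not values from the specification.

```python
def infection_spread_score(infected_fraction, bbch_stage):
    """Hypothetical spread score: infection detected at an earlier BBCH
    growth stage is weighted as a greater threat to the whole field."""
    stage_weight = max(0.1, 1.0 - bbch_stage / 100.0)  # assumed weighting scheme
    return infected_fraction * stage_weight

score = infection_spread_score(0.30, 15)  # example from the description: 30% infected at BBCH 15
high_spread_risk = score > 0.2            # threshold is an assumption
```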
[56] FIG. 2A is an exemplary image 202 of a wheat plant in accordance with an embodiment of the present disclosure. In one embodiment, a user with a mobile device may capture an image of the wheat plant. The user inputs the image into the computing system 100. FIG. 2A depicts two features, namely colour and texture. The colour indicates the type of disease; for example, yellow disease detection looks for yellow appearing on leaves and the like. The texture indicates the type of damage; for example, a physically damaged plant might also appear yellow, but infected leaves have more texture. FIG. 2B is an exemplary contrast enhanced image 204 of the wheat plant in accordance with an embodiment of the present disclosure. In such embodiment, the contrast of the captured wheat image 202 is improved using the artificial intelligence-based image enhancing technique.
[57] FIGs. 3A-3C are exemplary processed images 302, 304 and 306 of the wheat plant in accordance with an embodiment of the present disclosure. In such embodiment, the wheat plant image is processed in accordance with pixel value. FIG. 3B is a masked image 304 created using the yellow range in the Hue Saturation Value (HSV) channel. FIG. 3C is formed using a bit-wise operation of the masked image 304 on the original image in Red Green Blue channel orientation.
[58] FIG. 4 is a process flowchart illustrating an exemplary method 400 to differentiate between plant disease and plant mechanical injuries in accordance with an embodiment of the present disclosure. In step 402, one or more images of plants grown in a specific area are received as captured via one or more image capturing devices. In one aspect of the present embodiment, the one or more images of plants grown in the specific area are received by an image receiving subsystem.
[59] In step 404, the received one or more images of the plants are processed using an artificial intelligence-based image enhancing technique. In one aspect of the present embodiment, the received one or more images of the plants are processed by an image contrast improving subsystem. In such embodiment, the processing of the captured one or more images comprises contrast improvement and image size alignment. The artificial intelligence-based image enhancing technique comprises the Contrast Limited Adaptive Histogram Equalization (CLAHE) technique.
[60] In step 406, the processed one or more images are segregated for colours of interest based on pixel value. In one aspect of the present embodiment, the processed one or more images are segregated by an image evaluation subsystem. In such embodiment, the colours of interest comprise green, yellow, blue and brown.
[61] In step 408, the segregated one or more images are evaluated to remove image noise and unwanted objects. In one aspect of the present embodiment, the segregated one or more images are evaluated by the image evaluation subsystem.
[62] In step 410, one or more feature values are extracted from the evaluated one or more images using artificial intelligence-based image feature extraction techniques. In one aspect of the present embodiment, the one or more feature values are extracted from the evaluated one or more images by a feature extraction subsystem. In such embodiment, the one or more feature values comprise a variance value, an energy value, a contrast value, a correlation value, a dissimilarity value and a homogeneity value.
[63] In step 412, the extracted one or more feature values are applied to a trained artificial intelligence-based differentiator model. In one aspect of the present embodiment, the extracted one or more feature values are applied by an artificial intelligence (AI) based differentiating subsystem 120. In such embodiment, applying a trained artificial intelligence-based differentiator model includes learning from a set of sample images, generating the artificial intelligence-based differentiator model and training the generated artificial intelligence-based differentiator model using the learnt set of sample images.
[64] In step 414, the extracted one or more feature values are classified into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model. In one aspect of the present embodiment, the extracted one or more feature values are classified into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model by the artificial intelligence (AI) based differentiating subsystem 120.
[65] In one embodiment, for classifying the extracted one or more feature values into one of plant disease and mechanical injuries, the method includes determining the type of the one or more feature values, determining the current values of the one or more feature values and classifying based on the type and the determined current values.
[66] In step 416, the classified one or more feature values are outputted on a user interface. In one aspect of the present embodiment, the classified one or more feature values are outputted on the user interface by the artificial intelligence (AI) based differentiating subsystem 120.
[67] The method 400 includes identifying the current spread level of infection by analysing the evaluated one or more images. Furthermore, the method 400 includes applying the identified current spread level of infection to a trained artificial intelligence based probabilistic model. The method 400 also includes predicting a probable infection spread value based on the result of the trained artificial intelligence based probabilistic model.
[68] In one embodiment, for predicting the probable infection spread value based on the result of the trained artificial intelligence based probabilistic model, the method 400 includes computing an infection spread score for the identified current spread level of infection and determining whether the computed infection spread score amounts to a higher level of infection spread. In such embodiment, the method 400 includes predicting the probable infection spread value based on the determination.
[69] Various embodiments of the present disclosure provide an easy way to differentiate whether a plant has suffered a mechanical injury or an infection is spreading. After such differentiation, the computing system 100 may also predict the probability of spread of infection, the growth stage and the percentage chance of loss from the infection.
[70] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[71] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[72] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
[73] Input/output (I/O) devices (as shown in FIG. 1) (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
[74] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus to various devices such as a random-access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.
[75] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[76] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention. When a single device or article is described herein, it will be apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be apparent that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[77] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.
[78] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and is not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.

WE CLAIM:

1. A system (100) to differentiate between plant disease and plant mechanical injuries, the system (100) comprising:
a hardware processor (108); and
a memory (102) coupled to the hardware processor (108), wherein the memory (102) comprises a set of program instructions in the form of a plurality of subsystems, configured to be executed by the hardware processor (108), wherein the plurality of subsystems comprises:
an image receiving subsystem (112) configured to receive one or more images of plants grown in a specific area as captured via one or more image capturing devices;
an image contrast improving subsystem (114) configured to process the received one or more images of the plants using an artificial intelligence-based image enhancing technique, wherein the processing of the captured one or more images comprises contrast improvement and image size alignment;
an image evaluation subsystem (116) configured to
segregate the processed one or more images for colours of interest based on pixel value, wherein the colours of interest comprise green, yellow, blue and brown; and
evaluate the segregated one or more images to remove image noise and unwanted objects;
a feature extraction subsystem (118) configured to extract one or more feature values from the evaluated one or more images using artificial intelligence-based image feature extraction techniques, wherein the one or more feature values comprises variance value, energy value, contrast value, correlation value, dissimilarity value, and homogeneity value; and
an artificial intelligence (AI) based differentiating subsystem (120) configured to:
apply the extracted one or more feature values to a trained artificial intelligence-based differentiator model;
classify the extracted one or more feature values into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model; and
output the classified one or more feature values on a user interface.
2. The system (100) as claimed in claim 1, further comprising an infection spread detection subsystem configured to:
identify current spread level of infection by analysing the evaluated one or more images;
apply the identified current spread level of infection to a trained artificial intelligence based probabilistic model; and
predict probable spreading of infection value based on result of the trained artificial intelligence based probabilistic model.
3. The system (100) as claimed in claim 1, wherein the artificial intelligence-based image enhancing technique comprises Contrast Limited Adaptive Histogram Equalization (CLAHE) technique.
4. The system (100) as claimed in claim 2, wherein for predicting probable spreading of infection value based on result of the trained artificial intelligence based probabilistic model, the infection spread detection subsystem is configured to:
compute an infection spread score for the identified current spread level of infection;
determine whether the computed infection spread score amounts to higher level of infection spread; and
predict probable spreading of infection value based on the determination.
5. The system (100) as claimed in claim 2, wherein applying a trained artificial intelligence-based differentiator model comprises:
learning from a set of sample images;
generating the artificial intelligence-based differentiator model; and
training the generated artificial intelligence-based differentiator model using the learnt set of sample images.
6. The system (100) as claimed in claim 2, wherein, for classifying the extracted one or more feature values into one of plant disease and mechanical injuries, the artificial intelligence (AI) based differentiating subsystem (120) is configured for:
determining type of the one or more feature values;
determining current values of the one or more feature values; and
classifying based on type and determined current values.
7. A method (400) to differentiate between plant disease and plant mechanical injuries, the method (400) comprising:
receiving, by a processor (108), one or more images of plants grown in a specific area as captured via one or more image capturing devices (402);
processing, by the processor (108), the received one or more images of the plants using an artificial intelligence-based image enhancing technique, wherein the processing of the captured one or more images comprises contrast improvement and image size alignment (404);
segregating, by the processor (108), the processed one or more images for colours of interest based on pixel value, wherein the colours of interest comprise green, yellow, blue and brown (406);
evaluating, by the processor (108), the segregated one or more images to remove image noise and unwanted objects (408);
extracting, by the processor (108), one or more feature values from the evaluated one or more images using artificial intelligence-based image feature extraction techniques, wherein the one or more feature values comprises variance value, energy value, contrast value, correlation value, dissimilarity value, and homogeneity value (410);
applying, by the processor (108), the extracted one or more feature values to a trained artificial intelligence-based differentiator model (412);
classifying, by the processor (108), the extracted one or more feature values into one of plant disease and mechanical injuries based on results of the trained artificial intelligence-based differentiator model (414); and
outputting, by the processor (108), the classified one or more feature values on a user interface (416).
8. The method (400) as claimed in claim 7, wherein the method (400) comprises:
identifying current spread level of infection by analysing the evaluated one or more images;
applying the identified current spread level of infection to a trained artificial intelligence based probabilistic model; and
predicting probable spreading of infection value based on result of the trained artificial intelligence based probabilistic model.
9. The method (400) as claimed in claim 7, wherein applying a trained artificial intelligence-based differentiator model comprises:
learning from a set of sample images;
generating the artificial intelligence-based differentiator model; and
training the generated artificial intelligence-based differentiator model using the learnt set of sample images.
10. The method (400) as claimed in claim 7, wherein the artificial intelligence-based image enhancing technique comprises Contrast Limited Adaptive Histogram Equalization (CLAHE) technique.
11. The method (400) as claimed in claim 7, wherein, for classifying the extracted one or more feature values into one of plant disease and mechanical injuries, the method (400) comprises:
determining type of the one or more feature values;
determining current values of the one or more feature values; and
classifying based on type and determined current values.
12. The method (400) as claimed in claim 7, wherein for predicting probable spreading of infection value based on result of the trained artificial intelligence based probabilistic model, the method (400) comprises:
computing an infection spread score for the identified current spread level of infection;
determining whether the computed infection spread score amounts to higher level of infection spread; and
predicting probable spreading of infection value based on the determination.

Documents

Application Documents

# Name Date
1 202231017757-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2022(online)].pdf 2022-03-28
2 202231017757-STARTUP [28-03-2022(online)].pdf 2022-03-28
3 202231017757-PROOF OF RIGHT [28-03-2022(online)].pdf 2022-03-28
4 202231017757-POWER OF AUTHORITY [28-03-2022(online)].pdf 2022-03-28
5 202231017757-FORM28 [28-03-2022(online)].pdf 2022-03-28
6 202231017757-FORM-9 [28-03-2022(online)].pdf 2022-03-28
7 202231017757-FORM FOR STARTUP [28-03-2022(online)].pdf 2022-03-28
8 202231017757-FORM FOR SMALL ENTITY(FORM-28) [28-03-2022(online)].pdf 2022-03-28
9 202231017757-FORM 18A [28-03-2022(online)].pdf 2022-03-28
10 202231017757-FORM 1 [28-03-2022(online)].pdf 2022-03-28
11 202231017757-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [28-03-2022(online)].pdf 2022-03-28
12 202231017757-EVIDENCE FOR REGISTRATION UNDER SSI [28-03-2022(online)].pdf 2022-03-28
13 202231017757-DRAWINGS [28-03-2022(online)].pdf 2022-03-28
14 202231017757-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2022(online)].pdf 2022-03-28
15 202231017757-COMPLETE SPECIFICATION [28-03-2022(online)].pdf 2022-03-28
16 202231017757-FER.pdf 2022-04-27
17 202231017757-OTHERS [29-09-2022(online)].pdf 2022-09-29
18 202231017757-FORM 3 [29-09-2022(online)].pdf 2022-09-29
19 202231017757-FER_SER_REPLY [29-09-2022(online)].pdf 2022-09-29
20 202231017757-ENDORSEMENT BY INVENTORS [29-09-2022(online)].pdf 2022-09-29
21 202231017757-DRAWING [29-09-2022(online)].pdf 2022-09-29
22 202231017757-CLAIMS [29-09-2022(online)].pdf 2022-09-29
23 202231017757-US(14)-HearingNotice-(HearingDate-07-02-2023).pdf 2023-01-02
24 202231017757-Correspondence to notify the Controller [17-01-2023(online)].pdf 2023-01-17
25 202231017757-Annexure [17-01-2023(online)].pdf 2023-01-17
26 202231017757-FORM-26 [06-02-2023(online)].pdf 2023-02-06
27 202231017757-US(14)-ExtendedHearingNotice-(HearingDate-13-04-2023).pdf 2023-02-07
28 202231017757-Correspondence to notify the Controller [23-02-2023(online)].pdf 2023-02-23
29 202231017757-Annexure [23-02-2023(online)].pdf 2023-02-23
30 202231017757-Written submissions and relevant documents [24-04-2023(online)].pdf 2023-04-24
31 202231017757-POA [24-04-2023(online)].pdf 2023-04-24
32 202231017757-MARKED COPIES OF AMENDEMENTS [24-04-2023(online)].pdf 2023-04-24
33 202231017757-FORM 13 [24-04-2023(online)].pdf 2023-04-24
34 202231017757-Annexure [24-04-2023(online)].pdf 2023-04-24
35 202231017757-AMMENDED DOCUMENTS [24-04-2023(online)].pdf 2023-04-24
36 202231017757-PatentCertificate08-08-2023.pdf 2023-08-08
37 202231017757-IntimationOfGrant08-08-2023.pdf 2023-08-08
38 202231017757-PROOF OF ALTERATION [24-05-2024(online)].pdf 2024-05-24
39 202231017757-PROOF OF ALTERATION [30-09-2024(online)].pdf 2024-09-30
40 202231017757-PROOF OF ALTERATION [30-09-2024(online)]-1.pdf 2024-09-30

Search Strategy

1 202231017757E_27-04-2022.pdf

ERegister / Renewals

3rd: 12 Feb 2024

From 28/03/2024 - To 28/03/2025

4th: 26 Mar 2025

From 28/03/2025 - To 28/03/2026