
Agriculture Produce Grade Assessment System And Method

Abstract: An agriculture produce grade assessment system (102) for grade assessment of agriculture produce is provided herein. The agriculture produce grade assessment system (102) includes an image acquisition module (202) configured to capture an image of agriculture produce lying on a predetermined background cloth with a reference object, and a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the agriculture produce, based on an image of the reference object. The agriculture produce grade assessment system (102) further includes an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce. The agriculture produce grade assessment system (102) further includes an output module (208) configured to generate a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce. Representative Figure: Figure 1


Patent Information

Application #
Filing Date
20 November 2017
Publication Number
21/2019
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
patent@adastraip.com
Parent Application

Applicants

Agricxlab Pvt. Ltd.
B/1403, Twinkle Tower, Dhokali Balkum Road, Thane, Maharashtra India – 400607

Inventors

1. DHOOT, Ritesh
D-104, St Johns Wood, No-80, St Johns Wood, Koramangala, Bangalore- 560029, Karnataka
2. KUMAR, Saurabh
B/1403, Twinkle Tower, Dhokali Balkum Road, Thane, Maharashtra India – 400607

Specification

Claims

What is claimed is:

1. An agriculture produce grade assessment system (102) comprising a processor (110) and a memory (112), the memory (112) storing:
an image acquisition module (202) configured to capture an image of agriculture produce lying on a predetermined background cloth with a reference object;
a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the agriculture produce, based on an image of the reference object;
an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce; and
an output module (208) configured to output a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.

2. The agriculture produce grade assessment system (102) of claim 1, wherein a color of the background cloth is different from the color of the agriculture produce.

3. The agriculture produce grade assessment system (102) of claim 2, wherein a color of the reference object is different from the color of the agriculture produce and the color of the background cloth.

4. The agriculture produce grade assessment system (102) of claim 1, wherein the lighting compensation module (204) is configured to calculate a degree of variation in color of the reference object, based on a known color of the reference object.

5. The agriculture produce grade assessment system (102) of claim 4, wherein the lighting compensation module (204) is configured to apply the degree of variation in color of the reference object to all pixels of the captured image of the agriculture produce.

6. The agriculture produce grade assessment system (102) of claim 1, wherein the image processing module (206) further comprises an object detection sub-module to detect objects of the agriculture produce.

7. The agriculture produce grade assessment system (102) of claim 1, wherein the image processing module (206) further comprises a feature extraction sub-module to calculate height, width, length, breadth, and volume of the agriculture produce.

8. The agriculture produce grade assessment system (102) of claim 1, wherein the image processing module (206) further comprises an object classification sub-module to perform classification of each object in the agriculture produce for a defect type, a defect variant, detection of chemicals, and pathogens for food safety.

9. The agriculture produce grade assessment system (102) of claim 1, further comprising a training module (210) configured to train the image processing module (206) using a U-Net architecture.

10. A computer-implemented method for grade assessment of agriculture produce, the computer-implemented method comprising:
capturing an image of agriculture produce lying on a predetermined background cloth with a reference object;
compensating for variation in lighting during image acquisition of the agriculture produce, based on an image of the reference object;
processing the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce; and
providing a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.
Description

FIELD OF THE INVENTION
[001] Embodiments of the present invention generally relate to grading of agriculture produce, and in particular relate to grade assessment of agriculture produce by image processing.
BACKGROUND
[002] Optimum value realization for agriculture produce is necessary for farmers and other stakeholders in the agriculture value chain to increase their revenue and profits. Conventional practices of quality inspection, grading, and safety control for agriculture produce rely on manual inspections by buyers and sellers.
[003] However, these manual inspections for grading and quality assessment are labor intensive and time consuming. Further, the accuracy of grading may be jeopardized by subjective human judgments. Furthermore, it is not always easy for humans to perform fast and accurate grading and quality assessment of the agriculture produce.
[004] Hence, traditional methods of agricultural produce quality assessment are tedious and costly, and are easily influenced by subjective and inconsistent evaluation. Agricultural produce quality plays a critical role in all food industry quality assessments. Thus, in some conventional methods, computer technologies such as image capturing and sending have been utilized to construct new machines for agricultural produce quality assessment. For value determination of the agriculture produce, images of the produce are captured at a warehouse and sent to buyers at different locations to determine the price and quality of the produce.
[005] However, these conventional methods also suffer from many disadvantages. First, due to variation in lighting conditions in the warehouse, the image of the agriculture produce is affected and does not present the exact quality of the produce. Further, agricultural produce is characterized by a wide variety of physical characteristics and features, which are associated with various aspects relating to agriculture, horticulture, environment, geography, climate, and ecology of the crop from which the agriculture produce is derived. These factors result in wide variation in the quality of the agriculture produce, which necessitates changes in the associated values of the agriculture produce.
[006] Therefore, there is a need for an improved system and method for grading of agriculture produce that solves the above disadvantages associated with the conventional methods.
SUMMARY
[007] According to an aspect of the present disclosure, an agriculture produce grade assessment system (102) for grade assessment of agriculture produce is provided herein. The agriculture produce grade assessment system (102) includes an image acquisition module (202) configured to capture an image of agriculture produce lying on a predetermined background cloth with a reference object, and a lighting compensation module (204) configured to compensate for variation in lighting during image capturing of the agriculture produce, based on an image of the reference object. The agriculture produce grade assessment system (102) further includes an image processing module (206) configured to process the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce. The agriculture produce grade assessment system (102) further includes an output module (208) configured to output a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.
[008] According to another aspect of the present disclosure, a computer-implemented method for grade assessment of agriculture produce is provided herein. The computer-implemented method includes capturing an image of agriculture produce lying on a predetermined background cloth with a reference object, and compensating for variation in lighting during image acquisition of the agriculture produce, based on an image of the reference object. The computer-implemented method further includes processing the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce, and providing a grade assessment report about accuracy of size, color, and defect detection of the agriculture produce.
[009] The preceding is a simplified summary to provide an understanding of some aspects of embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0011] FIG. 1 is a block diagram depicting a network environment according to an embodiment of the present invention;
[0012] FIG. 2 is a block diagram of modules stored in memory, according to an embodiment of the present invention;
[0013] FIG. 3 is a schematic diagram of a captured image of agriculture produce, according to an embodiment of the present invention;
[0014] FIG. 4 is a schematic diagram of a masked image created by using the U-NET model during image processing, according to an embodiment of the present invention;
[0015] FIG. 5 is a schematic diagram of the captured image after applying distance transform during image processing, according to an embodiment of the present invention;
[0016] FIG. 6 is a schematic diagram of an ellipse object created during image processing, according to an embodiment of the present invention;
[0017] FIG. 7 is a schematic diagram of the image after applying template matching during image processing, according to an embodiment of the present invention;
[0018] FIG. 8 is a schematic diagram of peaks found in the image during image processing, according to an embodiment of the present invention;
[0019] FIG. 9 is a schematic diagram of labelling of peaks in the image during image processing, according to an embodiment of the present invention;
[0020] FIG. 10 is a schematic diagram of overlaying of borders to image containing all peaks during image processing, according to an embodiment of the present invention;
[0021] FIG. 11 is a schematic diagram of image after applying watershed algorithm during image processing, according to an embodiment of the present invention; and
[0022] FIG. 12 depicts an exemplary flowchart illustrating a grade assessment method of agriculture produce, according to an embodiment of the present invention.
[0023] To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures.
DETAILED DESCRIPTION
[0024] As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to.
[0025] The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
[0026] The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.
[0027] The term “automatic” and variations thereof, as used herein, refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material”.
[0028] FIG. 1 illustrates an exemplary network environment (100) where various embodiments of the present invention may be implemented. The network environment (100) includes an agriculture produce grade assessment system (102) connected to various electronic devices 104a (mobile), 104b (tablet),...104n (hereinafter referred to as 104) via a network (106). The network (106) may include, but is not restricted to, a communication network such as the Internet, PSTN, Local Area Network (LAN), Wide Area Network (WAN), Metropolitan Area Network (MAN), and so forth. In an embodiment, the network (106) can be a data network such as the Internet. Further, the messages exchanged between the agriculture produce grade assessment system (102) and the mobile devices (104) can comprise any suitable message format and protocol capable of communicating the information necessary for the agriculture produce grade assessment system (102) to provide grade assessment of the agriculture produce, such as potatoes. The mobile devices (104) may utilize the agriculture produce grade assessment system (102) to capture images of the agriculture produce and provide the captured images to the agriculture produce grade assessment system (102).
[0029] In an embodiment of the present invention, the agriculture produce grade assessment system (102) may be a computing device. In operation, a user of the mobile device (104) may access the agriculture produce grade assessment system (102) to capture an image of the agriculture produce spread on a background cloth with a reference object. The agriculture produce grade assessment system (102) includes a processor (110) and a memory (112). In one embodiment, the processor (110) includes a single processor and resides at the agriculture produce grade assessment system (102). In another embodiment, the processor (110) may include multiple sub-processors and may reside at the agriculture produce grade assessment system as well as the mobile device.
[0030] Further, the memory (112) includes one or more instructions that may be executed by the processor (110) to capture an image of agriculture produce lying on a predetermined background cloth with a reference object, process the captured image to perform object detection, object segmentation, and feature extraction for the agriculture produce, and generate a report about accuracy of size, color, and defect detection of the agriculture produce. In one embodiment, the memory (112) includes the modules (114), a database (116), and other data (not shown in the figure). The other data may include various data generated during processing of the captured images. In one embodiment, the database (116) is stored internal to the agriculture produce grade assessment system (102). In another embodiment, the database (116) may be stored external to the agriculture produce grade assessment system (102), and may be accessed via the network (106). Furthermore, the memory (112) of the agriculture produce grade assessment system (102) is coupled to the processor (110).
[0031] Referring to FIG. 2, the modules (114) include an image acquisition module (202), a lighting compensation module (204), an image processing module (206), an output module (208), and a training module (210). The modules (114) are instructions stored in the memory and may process a captured image to facilitate grading of the agriculture produce.
[0032] The image acquisition module (202) is configured to capture an image of agriculture produce lying on a predetermined background cloth with a reference object. In an embodiment, before capturing the image of the agriculture produce, a sample of the agriculture produce is prepared. The sample is spread on the predetermined background cloth having a different or contrasting color (for example, black, blue, or red) than the agriculture produce. Further, the texture of the cloth may be non-reflecting to avoid glare, and the size of the background cloth may be 1 m × 1 m.
[0033] Further, the reference object is kept on the background cloth. In an embodiment, the reference object facilitates minimizing the distortion due to varied lighting conditions. The reference object has a different or contrasting bright color, such as red, blue, or green, or any color contrasting with both the produce and the background cloth. Further, the texture of the reference object is non-reflecting to avoid glare, its shape is square, circular, or any symmetrical geometric shape, and its size may be from 50 mm to 200 mm.
[0034] In an embodiment, the sample of agriculture produce, such as potatoes, is spread on the background cloth away from the reference object. The sample is spread so that there is no piling of produce. Further, the sample may have a count of around 50-200 units of agriculture produce, such as potatoes. Those skilled in the art will appreciate that the location for image acquisition may be a cold storage, farm, or plant, and does not require a laboratory. After the agriculture produce is spread well on the background cloth with the reference object, an image of the agriculture produce is captured using the camera of a mobile device, as shown in FIG. 3.
[0035] Further, the lighting compensation module (204) is configured to compensate for variation in lighting based on the reference object. According to an embodiment of the present invention, the lighting variation is compensated by calculating the degree of variation of the color of the reference object, as captured, from the already known color of the reference object, and this variation in the lighting is applied to all the pixels of the captured image by the lighting compensation module (204).
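The compensation described in [035] can be sketched as a per-channel gain correction. This is a minimal illustration, not the specification's implementation; the function name, reference color, and pixel values are all illustrative assumptions:

```python
import numpy as np

def compensate_lighting(image, ref_region, known_color):
    """Scale every pixel so that the reference object's observed mean
    color matches its known color (a simple per-channel gain model)."""
    observed = ref_region.reshape(-1, 3).mean(axis=0)       # mean RGB of the reference patch
    gain = np.asarray(known_color, dtype=float) / observed  # per-channel correction factor
    corrected = image.astype(float) * gain                  # apply the same gain to all pixels
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Example: the whole scene was captured 20% too dark.
known = (200, 40, 40)                                       # known red color of the reference object
scene = np.full((4, 4, 3), (160, 32, 32), dtype=np.uint8)   # observed (dimmed) scene
ref_patch = scene[:2, :2]                                   # pixels covering the reference object
fixed = compensate_lighting(scene, ref_patch, known)
print(fixed[0, 0])                                          # corrected back to the known reference color
```

A single multiplicative gain is the simplest model; a real system might instead fit a per-channel offset or a full color-transfer matrix from the reference object.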
[0036] Further, the image processing module (206) is configured to process the captured image and detect objects of the agriculture produce. According to an embodiment of the present invention, the image processing module (206) includes various sub-modules to process the captured image and perform object detection for accurate count, color determination, detection of exact boundaries of each object, classification of each object for the defect type and defect variant, and detection of the presence of certain chemicals and pathogens for food safety. In an embodiment, the sub-modules of the image processing module (206) include, but are not limited to, an object detection sub-module (206-A), an object segmentation sub-module (206-B), a feature extraction sub-module (206-C), and an object classification sub-module (206-D).
[0037] In an embodiment, the object detection sub-module (206-A) may first determine the pixel-to-cm ratio using a marker of known dimension; then the colorspace is modified by converting from BGR (blue, green, and red) into RGB (red, green, and blue). Then, noise is filtered to smoothen the image using a Gaussian filter. Further, in an embodiment, a support vector machine (SVM) classifier may be used to obtain micro images of the sample image having a potato at the centre.
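The calibration and colorspace steps in [037] amount to a single ratio and a channel reversal. The marker and object sizes below are made-up numbers for illustration:

```python
import numpy as np

# BGR -> RGB: reverse the channel axis (the effect of OpenCV's COLOR_BGR2RGB conversion).
bgr = np.array([[[10, 20, 30]]], dtype=np.uint8)   # one pixel in BGR channel order
rgb = bgr[..., ::-1]                               # same pixel, now [30, 20, 10] in RGB order

# Pixel-to-cm ratio from a marker of known physical size.
marker_width_px = 180.0        # marker width as measured in the image (illustrative)
marker_width_cm = 9.0          # known physical width of the marker (illustrative)
px_per_cm = marker_width_px / marker_width_cm      # 20 pixels per centimeter

# Any later pixel measurement in the same image converts with the same ratio.
potato_length_px = 130.0
potato_length_cm = potato_length_px / px_per_cm    # 6.5 cm
print(potato_length_cm)
```

The ratio is only valid while the camera height and focal length stay fixed, which is why the marker sits in every captured scene.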
[0038] Further, in an embodiment, the object segmentation sub-module (206-B) may apply the U-NET model to get a masked image, as shown in FIG. 4, and then may modify the micro images produced by the object detection sub-module into masked micro-images. Further, the object segmentation sub-module (206-B) may apply contouring on the masked micro images. During contouring, the object segmentation sub-module is configured to first apply a distance transform to the mask, as shown in FIG. 5. Further, the object segmentation sub-module is configured to add padding, thereby creating a separate ellipse object, as shown in FIG. 6.
[0039] Further, the object segmentation sub-module is configured to apply template matching, as shown in FIG. 7. Further, the object segmentation sub-module is configured to find peaks in the image, as shown in FIG. 8. Further, the object segmentation sub-module is configured to label each of the peaks with unique values, as shown in FIG. 9. Further, the object segmentation sub-module is configured to find the borders of the potatoes and overlay these borders onto the image containing all peaks, as shown in FIG. 10. Further, the object segmentation sub-module is configured to apply a watershed algorithm, as shown in FIG. 11. Further, the object segmentation sub-module is configured to use the output of the watershed algorithm to get a different mask having micro-images where the centre of each potato is present, and then contouring is applied on the mask.
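The distance-transform and peak-labelling steps (FIGs. 5, 8, and 9) can be sketched on a synthetic mask of two touching discs. This sketch uses `scipy.ndimage` in place of the OpenCV/U-Net pipeline of the description, purely to show why peaks of the distance transform separate touching objects:

```python
import numpy as np
from scipy import ndimage

# Synthetic binary mask: two overlapping discs standing in for touching potatoes.
yy, xx = np.mgrid[0:60, 0:100]
mask = (((xx - 30) ** 2 + (yy - 30) ** 2) < 400) | (((xx - 65) ** 2 + (yy - 30) ** 2) < 400)

# Distance transform (FIG. 5): each foreground pixel gets its Euclidean distance
# to the background, so the interior of each object becomes a bright peak.
dist = ndimage.distance_transform_edt(mask)

# Threshold near the peak heights (FIG. 8): the narrow waist between the discs
# has a small distance value, so thresholding splits the blob in two.
peaks = dist > 0.6 * dist.max()

# Label each peak with a unique value (FIG. 9); these labels would seed the watershed.
markers, n_objects = ndimage.label(peaks)
print(n_objects)   # → 2, one marker per touching disc
```

The labelled peaks are exactly the seed markers a watershed step would then grow back out to the object borders.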
[0040] In an embodiment, the feature extraction sub-module (206-C) of the image processing module (206) is configured to calculate the height, width, length, breadth, and volume of the agriculture produce. The feature extraction sub-module of the image processing module (206) may first fit the minimum area rectangle into the contours of the potato image, find the length and breadth using the coordinates of the edges, and multiply the obtained length and width by the cm-per-pixel ratio to calculate the height and width of each individual object (such as a potato) in the agriculture produce.
[0041] Further, the feature extraction sub-module may determine length by calculating the maximum distance between any two points on the contour, and may determine width by calculating the maximum distance between any two points on the contour along the axis perpendicular to the length axis. Further, the feature extraction sub-module may calculate volume by using the formula: volume = (4/3) × 3.14 × length × width × breadth.
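The length and width definitions in [041] can be sketched as follows. The contour is a toy axis-aligned example, and the volume line simply applies the description's formula as stated (breadth is set equal to width here purely for illustration):

```python
import numpy as np

def length_and_width(points):
    """Length: maximum distance between any two contour points.
    Width: maximum extent along the axis perpendicular to the length axis."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)  # all pairwise distances
    i, j = np.unravel_index(d.argmax(), d.shape)
    length = d[i, j]
    axis = (pts[j] - pts[i]) / length            # unit vector along the length axis
    perp = np.array([-axis[1], axis[0]])         # perpendicular unit vector
    proj = pts @ perp                            # contour projected onto the perpendicular
    return length, proj.max() - proj.min()

# Toy contour: the four extreme points of an 8 cm x 4 cm outline.
contour = [(4, 0), (-4, 0), (0, 2), (0, -2)]
length, width = length_and_width(contour)
print(length, width)                             # → 8.0 4.0

# Volume per the description's stated formula (breadth assumed equal to width).
breadth = width
volume = (4 / 3) * 3.14 * length * width * breadth
```

On a dense real contour the pairwise-distance matrix grows quadratically; `cv2.minAreaRect`, mentioned in [040], is the usual cheaper approximation.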
[0042] Further, the object classification sub-module (206-D) of the image processing module (206) may perform classification of each object for the defect type and defect variant, and detection of the presence of certain chemicals and pathogens for food safety, by using deep learning classification and an auto-encoder.
[0043] Finally, the output module (208) is configured to output a grade assessment report of the agriculture produce, having accuracy of size, distribution, and color of the agriculture produce. In an embodiment, the output module (208) may provide output in the form of a report about the quality of the agriculture produce, such as size distribution, color distribution, external defect detection, internal defect detection, and chemical and microbial profiling.
[0044] In an exemplary report provided by the output module (208) of the agriculture produce grade assessment system (102), size distribution may be 99% accurate, color distribution may be 95% accurate, external defect detection may be 95% accurate, internal defect detection may be 95% accurate, and chemical and microbial profiling may be 95% accurate. Those skilled in the art will appreciate that images of the produce may be captured in a warehouse, the grade assessment may be performed at a remote location, and a buyer may utilize the grade assessment to determine the price and quality of the produce.
[0045] Further, in an embodiment, the training module (210) is configured to train the image processing module (206) for various analyses of the image. In an embodiment, a standard U-Net architecture may be used by the training module (210). Further, the training images may consist of color photographs of potatoes spread across a floor or mat. Each image can contain up to 200 potato objects. The potatoes may be well spread or touching each other.
[0046] In an embodiment, for each training image, the training module (210) may utilize a mask image to train the U-Net. In an embodiment, the mask image is a grayscale image where the potato regions may be marked as white and the rest of the objects and the background may be black. Before feeding the images into the U-Net algorithm, the training module (210) is configured to split each image into 5×5 equal segments.
[0047] In an embodiment, the training module (210) is configured to use the standard U-Net architecture with the training images as input and the masks as targets. Further, for training, each image segment may be converted into 256×256 size. Further, the masks may also be rescaled accordingly. Further, during training, to avoid overfitting, image augmentation may be done by the training module (210) using random horizontal flips, random shift-scale-rotate, and random hue-saturation-value shifts.
[0048] Further, in an embodiment, each image may be cut into 25 equal segments, and masks may be predicted for each segment. Finally, the masks may be stitched together by the training module (210) to get the whole mask. In an embodiment, the training module (210) may feed the mask into computer vision (CV) algorithms to calculate the dimensions of each potato.
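The 5×5 split-and-stitch described in [046]-[048] is plain array slicing; the toy image size below is an illustrative assumption:

```python
import numpy as np

def split_5x5(image):
    """Split an image into 25 equal tiles (5 rows x 5 columns), row-major order."""
    h, w = image.shape[:2]
    th, tw = h // 5, w // 5
    return [image[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(5) for c in range(5)]

def stitch_5x5(tiles):
    """Reassemble 25 row-major tiles into the full mask."""
    rows = [np.hstack(tiles[r * 5:(r + 1) * 5]) for r in range(5)]
    return np.vstack(rows)

image = np.arange(100 * 150).reshape(100, 150)   # toy 100x150 "image"
tiles = split_5x5(image)                         # 25 tiles of 20x30 each
restored = stitch_5x5(tiles)                     # per-tile masks would be stitched the same way
```

In the described pipeline each 20×30-style tile would additionally be resized to 256×256 before entering the U-Net, and the predicted masks rescaled back before stitching.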
[0049] FIG. 12 illustrates an exemplary flowchart of a grade assessment method of the agriculture produce, according to an embodiment of the present invention.
[0050] Initially, at step 1202, an image of agriculture produce on a background cloth with a reference object is captured. In an embodiment, a sample of the agriculture produce may be spread on a predetermined background cloth having a different or contrasting color (for example, black, blue, or red) than the agriculture produce. Further, the reference object may have a contrasting and bright color, such as red, blue, or green, or any color contrasting with both the produce and the background cloth.
[0051] At step 1204, lighting variation for the captured image is compensated based on the reference object color variation. In an embodiment, the lighting variation may be compensated by calculating a degree of variation in color of the reference object, as captured from the already known color of the reference object, and applying this variation in the lighting to all the pixels of the captured image.
[0052] At step 1206, image processing is performed on the captured image to detect objects with boundaries and classification. According to an embodiment of the present invention, the image processing may include object detection, object segmentation, and feature extraction, as explained above.
[0053] At step 1208, it is determined whether image processing has been done on all objects in the captured image of the agriculture produce. If yes, the method proceeds to step 1210. Otherwise, the method returns to step 1206.
[0054] At step 1210, an output is provided as a grade assessment report of the agriculture produce, having accuracy of size, distribution, color, and defects of the agriculture produce. In an embodiment, the report generated may provide size distribution as 99% accurate, color distribution as 95% accurate, external defect detection as 95% accurate, internal defect detection as 95% accurate, and chemical and microbial profiling as 95% accurate. Those skilled in the art will appreciate that buyers, even at remote locations, may utilize the grading to determine the price and quality of the produce.
[0055] The agriculture produce grade assessment system (102) and the method (1200) performed by the agriculture produce grade assessment system (102) advantageously provide grade assessment of agriculture produce, such as potatoes, for size distribution, color distribution, external defect detection, internal defect detection, and chemical and microbial profiling. Such grade assessment of the agriculture produce may be utilized by buyers to determine the price and quality of the produce. Further, the agriculture produce grade assessment system (102) advantageously compensates for variation in lighting during image capturing by utilizing a reference object. Those skilled in the art will appreciate that such compensation of lighting facilitates uniform image capturing of the agriculture produce, helps buyers to determine the quality of the agriculture produce, and results in better value realization of the agriculture produce for farmers.
[0056] The foregoing discussion of the present invention has been presented for purposes of illustration and description. It is not intended to limit the present invention to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the present invention are grouped together in one or more embodiments, configurations, or aspects for the purpose of streamlining the disclosure. The features of the embodiments, configurations, or aspects may be combined in alternate embodiments, configurations, or aspects other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the present invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment, configuration, or aspect. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of the present invention.
[0057] Moreover, though the description of the present invention has included description of one or more embodiments, configurations, or aspects and certain variations and modifications, other variations, combinations, and modifications are within the scope of the present invention, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative embodiments, configurations, or aspects to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Documents

Application Documents

# Name Date
1 201721041407-FORM FOR STARTUP [20-11-2017(online)].pdf 2017-11-20
2 201721041407-FORM FOR SMALL ENTITY(FORM-28) [20-11-2017(online)].pdf 2017-11-20
3 201721041407-FORM 1 [20-11-2017(online)].pdf 2017-11-20
4 201721041407-FIGURE OF ABSTRACT [20-11-2017(online)].pdf 2017-11-20
5 201721041407-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-11-2017(online)].pdf 2017-11-20
6 201721041407-EVIDENCE FOR REGISTRATION UNDER SSI [20-11-2017(online)].pdf 2017-11-20
7 201721041407-DRAWINGS [20-11-2017(online)].pdf 2017-11-20
8 201721041407-DECLARATION OF INVENTORSHIP (FORM 5) [20-11-2017(online)].pdf 2017-11-20
9 201721041407-COMPLETE SPECIFICATION [20-11-2017(online)].pdf 2017-11-20
10 201721041407-FORM-26 [20-04-2018(online)].pdf 2018-04-20
11 Abstract.jpg 2018-08-11
12 201721041407-ORIGINAL UR 6( 1A) FORM 26-140518.pdf 2019-01-10
13 201721041407-FORM 3 [27-05-2019(online)].pdf 2019-05-27
14 201721041407-FORM 18 [22-11-2021(online)].pdf 2021-11-22
15 201721041407-FER.pdf 2022-04-21

Search Strategy

1 201721041407E_21-04-2022.pdf