Abstract: A method to count and classify produce in a plant 10 is described. The method comprises capturing, by a camera, an image 12 of produce growing on plants, and segmenting the image 14 captured by the camera into a plurality of segments. The method further comprises classifying each segment 16 of the plurality of segments into at least one maturity stage of the produce using at least one trained model, and determining the quantity of produce 18 in each stage of maturity based on the classification.
Claims: We Claim
1. A method to count and classify produce in a plant (10), the method comprising:
capturing an image (12) of produce that are growing on plants by a camera;
segmenting the image (14) captured by said camera into a plurality of segments;
classifying each segment (16) of the plurality of segments into at least one maturity stage of the produce using at least one trained model; and
determining the quantity of produce (18) in each stage of maturity based on said classification.
2. The method to count and classify produce in a plant (10) in accordance with Claim 1, further comprising classifying each segment of the plurality of segments into a group (20), wherein each group (20) corresponds to a particular stage of maturity of the produce.
3. The method to count and classify produce in a plant (10) in accordance with Claim 1, wherein the quantity of produce in each stage of maturity is determined using at least one of a cluster based approach (22) and a boundary based approach (24).
4. The method to count and classify produce in a plant (10) in accordance with Claim 3, wherein the boundary based approach (24) includes drawing a boundary around each produce and determining the number of boundaries that are drawn.
5. The method to count and classify produce in a plant (10) in accordance with Claim 3, wherein the cluster based approach (22) includes drawing a cluster around bunches of produce that are spaced from each other and determining the number of produce in each cluster.
6. The method to count and classify produce in a plant (10) in accordance with Claim 1, wherein the number of segments into which the image is segmented is user defined.
Description: Complete Specification
The following specification describes and ascertains the nature of this invention and the manner in which it is to be performed.
Field of the invention
[0001] This invention relates to a method to count and classify produce by means of computer vision.
Background of the invention
[0002] CN 108491892 A describes a produce sorting system based on machine vision. The system realizes produce sorting through hardware design and model selection as well as software system development using machine vision technology. It comprises the steps of designing and selecting a machine vision industrial camera, a lens, and a lighting scheme so that the produce sorting system continuously and stably obtains a high-quality produce image; suppressing salt-and-pepper noise and Gaussian noise in the produce image using median filtering and binomial filtering, while highlighting edge information of the produce image using an unsharp masking method; classifying produce using a supervised Gaussian mixture model clustering algorithm; and calibrating the data grasped by a robot guided by a monocular camera through the Zhang Zhengyou calibration method to calculate the actual grasping position and pose.
Brief description of the accompanying drawing
[0003] Figure 1 illustrates a flow chart depicting the method to count and classify produce in one embodiment of the invention.
Detailed description of the embodiments
[0004] Figure 1 illustrates a flow chart depicting the method to count and classify produce 10 in one embodiment of the invention. The method comprises capturing, by a camera, an image 12 of produce growing on plants, and segmenting the image 14 captured by the camera into a plurality of segments. The method further comprises classifying each segment 16 of the plurality of segments into at least one maturity stage of the produce using at least one trained model, and determining the quantity of produce 18 in each stage of maturity based on the classification.
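The segmentation step 14 splits the captured image into a plurality of segments before classification. The specification does not fix a segmentation scheme, so a simple user-defined grid split is assumed here purely as an illustrative sketch; the function name and the 2-D-list image representation are hypothetical.

```python
# Hedged sketch of segmentation step 14: split an image (a 2-D list of pixel
# values) into a user-defined grid of n_rows x n_cols segments. The grid
# scheme is an assumption; the patent only says the segment count is
# user defined.
def segment_image(image, n_rows, n_cols):
    """Return a list of n_rows * n_cols rectangular segments of the image."""
    h, w = len(image), len(image[0])
    rh, cw = h // n_rows, w // n_cols  # segment height and width in pixels
    return [
        [row[c * cw:(c + 1) * cw] for row in image[r * rh:(r + 1) * rh]]
        for r in range(n_rows)
        for c in range(n_cols)
    ]
```

Each resulting segment is then passed independently to the classification step 16.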
[0005] The method to count and classify produce in a plant 10 comprises securing a camera to a drone 12. In an alternate exemplary embodiment, the camera may be secured to any transportation device, such as a movable robot or an automobile, or may be carried by a human operator, that can capture images of produce growing on plants. The camera on the transportation device acquires high quality images of the agricultural scene, including the stems, produce, and flowers that are growing on the plants. The produce typically includes both developing produce and flowers growing on the plants. The images of produce growing on plants are then transmitted to a remote computer server for post processing.
[0006] The method to count and classify produce in a plant 10 comprises segmenting the image 14 captured by the camera into a plurality of segments. The number of segments into which the image is segmented is user defined and can be manually changed. Each segment of the plurality of segments is classified 16 into at least one maturity stage of the produce, or bin, using at least one trained model 17. The trained model 17 is created as follows. A plurality of bins 18 are created, each bin 18 corresponding to a particular stage of maturity of the produce. For example, flowers and the least mature produce are deposited manually by a user in a first bin 18, partially mature produce in a second bin 18, and fully mature produce in a third bin 18. By manually depositing thousands of sample produce into the first bin 18, the second bin 18, and the third bin 18 in this manner, the trained model 17 learns to automatically recognize and deposit a given produce into the first bin 18, the second bin 18, or the third bin 18 depending on its stage of maturity. As more sample produce are deposited manually into the bins 18, the accuracy of the trained model 17 increases; hence the accuracy of depositing sample produce automatically into the correct bin increases with the number of sample produce.
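The specification does not name a model architecture for the trained model 17, so the bin-classification idea can only be sketched under assumptions: here a nearest-centroid classifier over a single scalar ripeness feature stands in for the trained model, and the bin names and feature are hypothetical, chosen for illustration only.

```python
# Hedged sketch of trained model 17: three maturity bins 18, learned from
# manually deposited (feature, bin) samples. A nearest-centroid rule over one
# scalar feature is an assumption; the patent does not specify the model.
BINS = ("first_bin", "second_bin", "third_bin")  # flower/immature, partial, mature

def train(samples):
    """samples: list of (ripeness_feature, bin_index) pairs deposited manually."""
    sums = [0.0] * len(BINS)
    counts = [0] * len(BINS)
    for feature, bin_index in samples:
        sums[bin_index] += feature
        counts[bin_index] += 1
    # One centroid per bin; more manual samples sharpen the centroids.
    return [s / c for s, c in zip(sums, counts)]

def classify(centroids, feature):
    """Deposit a segment into the bin whose centroid is closest."""
    distances = [abs(feature - c) for c in centroids]
    return BINS[distances.index(min(distances))]
```

As in the specification, accuracy grows with the number of manually deposited samples, since each new sample refines the per-bin statistics the model classifies against.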
[0007] Since each segment of the plurality of segments is classified into at least one maturity stage of the produce using the at least one trained model 17, the produce in each segment is classified as flower / immature produce that is deposited in the first bin 18, partially mature produce that is deposited in the second bin 18, or fully mature produce that is deposited in the third bin 18. Once the produce is deposited in the first bin 18, the second bin 18, and the third bin 18 respectively, the quantity of produce in each stage of maturity, or each bin, is determined based on the classification.
[0008] The quantity of produce in each bin is determined based on two approaches: a cluster based approach 24 and a boundary based approach 22. Once the produce is deposited in the first bin 18, the second bin 18, and the third bin 18 respectively, the quantity of immature produce / flowers deposited in the first bin 18 is determined using the cluster based approach 24. More specifically, the cluster based approach 24 includes drawing a cluster around a bunch of produce and determining the number of produce within that cluster by measuring the dimension of that cluster. The spacing between bunches of produce is treated as the spacing between two clusters. Therefore, a first cluster is drawn around a first bunch of produce and a second cluster around a second bunch of produce, with a spacing between the first cluster and the second cluster. The spacing between the first bunch and the second bunch is used to determine whether a separate cluster is required to be drawn between them. When a number of bunches of produce are spaced from one another, a cluster is drawn around each bunch and the number of produce in each cluster is determined. In the exemplary embodiment, a neural network calculates the number of produce within each cluster based on the dimension of that cluster. As more clusters are estimated, the ability of the neural network to estimate the number of produce within each cluster improves; therefore, with an increased number of iterations, the accuracy of the neural network in estimating the number of produce within each cluster correspondingly increases.
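The cluster based approach 24 maps each cluster's measured dimension to a produce count. The specification assigns that mapping to a neural network; as a minimal stand-in sketch, dividing a cluster's area by an assumed average per-produce area is used below, so the function names and the area-ratio heuristic are assumptions, not the patented network.

```python
# Hedged sketch of cluster based approach 24: estimate the count inside each
# drawn cluster from its measured dimension. The area-ratio rule stands in
# for the neural network the patent describes.
def count_in_cluster(cluster_area, avg_produce_area):
    """Estimate produce in one cluster; a drawn cluster holds at least one."""
    return max(1, round(cluster_area / avg_produce_area))

def cluster_count(cluster_areas, avg_produce_area):
    """Sum per-cluster estimates over every spaced bunch found in the image."""
    return sum(count_in_cluster(a, avg_produce_area) for a in cluster_areas)
```

For example, two clusters measuring 10 and 25 area units, with an average produce area of 5, would yield an estimate of 2 + 5 = 7 produce.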
[0009] The quantity of partially mature produce deposited in the second bin 18 is determined using both the cluster based approach 24 and the boundary based approach 22. The cluster based approach 24 proceeds as described above: a cluster is drawn around each bunch of partially mature produce, the number of produce within each cluster is determined by measuring the dimension of that cluster, and the spacing between bunches is treated as the spacing between clusters. The boundary based approach 22 includes drawing a boundary around each produce within each segment and determining the number of boundaries drawn in each segment. Therefore, a first boundary is drawn around a first produce, a second boundary around a second produce, and so on. The number of boundaries within each segment is determined, and hence the total number of boundaries within the image captured by the camera is computed. The counts obtained from the boundary based approach 22 and the cluster based approach 24 are then compared, and the greater of the two is taken as the number of produce within the image.
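The second-bin rule above reduces to summing the boundaries drawn per segment and taking the greater of the two counts. A minimal sketch, with hypothetical function names, could look as follows:

```python
# Hedged sketch of combining the two approaches for the second bin 18:
# boundary based approach 22 counts one drawn boundary per produce, and the
# greater of the two counts is taken as the number of produce in the image.
def boundary_count(boundaries_per_segment):
    """boundaries_per_segment: number of boundaries drawn in each segment."""
    return sum(boundaries_per_segment)

def second_bin_count(boundaries_per_segment, cluster_estimate):
    """Take the greater of the boundary-based and cluster-based counts."""
    return max(boundary_count(boundaries_per_segment), cluster_estimate)
```

Taking the maximum is a conservative choice against under-counting: a boundary can miss produce hidden inside a dense bunch, while a cluster estimate can miss isolated produce, so the larger count is kept.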
[0010] The quantity of fully mature produce deposited in the third bin 18 is determined using the boundary based approach 22. More specifically, the boundary based approach 22 includes drawing a boundary around each produce within each segment and determining the number of boundaries drawn in each segment. Therefore, a first boundary is drawn around a first produce, a second boundary around a second produce, and so on. The number of boundaries within each segment classified into the third bin 18 is determined, and hence the number of boundaries within the image captured by the camera is computed. The count obtained from the boundary based approach 22 is taken as the number of produce within the image.
[0011] Once the number of produce within the first bin 18 is computed using the cluster based approach 24, the number of produce within the second bin 18 using the boundary based approach 22 and the cluster based approach 24, and the number of produce within the third bin 18 using the boundary based approach 22, the total number of produce per image is determined by summing the number of produce within the first bin 18, the second bin 18, and the third bin 18. Therein, the total number of produce in each frame of the image is computed. Once the total number of produce in each frame is computed, the totals over all the frames are summed up to constitute the total number of produce captured by the camera. Therefore, the total number of produce over the geographical area covered by the camera is determined.
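The totaling step is two nested sums: per-bin counts within a frame, then per-frame totals across all frames the camera captured. A minimal sketch, with hypothetical names:

```python
# Hedged sketch of the totaling step: per-image counts from the three bins 18
# are summed, then the per-frame totals are summed over all captured frames.
def image_total(first_bin, second_bin, third_bin):
    """Total produce in one frame: sum of the three bin counts."""
    return first_bin + second_bin + third_bin

def survey_total(per_frame_counts):
    """per_frame_counts: list of (first, second, third) counts per frame."""
    return sum(image_total(*frame) for frame in per_frame_counts)
```

For instance, two frames counting (4, 3, 2) and (1, 0, 5) produce per bin give a survey total of 15 produce over the captured geographical area. Note this sketch assumes each produce appears in only one frame; the patent does not address overlap between frames.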
[0012] The working of the method to count and classify produce is described as an example. The camera on the transportation device captures high quality images 12 of the agricultural scene, including the stems, produce, and flowers that are growing on plants. The images of produce growing on plants are then transmitted to a remote computer server for post processing. The high quality images captured by the camera are segmented 14 into a plurality of segments. Each segment 14 of the plurality of segments is classified 16 into at least one maturity stage of the produce, or bin 18, using at least one trained model 17. Therefore, the trained model 17 helps in classifying each segment 14 of the plurality of segments into a respective bin 18.
[0013] Since each segment 14 of the plurality of segments is classified into at least one maturity stage of the produce using the at least one trained model 17, the produce in each segment is classified 16 as flower / immature produce that is deposited in the first bin 18, partially mature produce that is deposited in the second bin 18, or fully mature produce that is deposited in the third bin 18. Once the produce is deposited in the first bin 18, the second bin 18, and the third bin 18 respectively, the quantity of produce in each stage of maturity, or each bin 18, is determined based on the classification.
[0014] The quantity of produce in the first bin 18 is determined based on the cluster based approach 24; determining the number of produce in each cluster is thus used to count flowers and produce that are in the early stages of development. Therein, the quantity of partially mature produce in the second bin 18 is determined based on both the boundary based approach 22 and the cluster based approach 24. More specifically, these approaches are used to count produce in the ripening stages of development, and the greater of the counts from the boundary based approach 22 and the cluster based approach 24 is taken into consideration for determining the number of produce in the second bin 18. Therein, the quantity of mature produce in the third bin 18 is determined based on the boundary based approach 22. Therefore, drawing a boundary around each produce and determining the number of boundaries drawn is used to count produce in the fully mature stage of development.
[0015] Once the number of produce within the first bin 18 is computed using the cluster based approach 24, the number of produce within the second bin 18 using the boundary based approach 22 and the cluster based approach 24, and the number of produce within the third bin 18 using the boundary based approach 22, the total number of produce per image is determined by summing the total number of produce within the first bin 18, the second bin 18, and the third bin 18 respectively. Therein, the total number of produce in each frame of the image is computed. Once the total number of produce in each frame is computed, the totals over all the frames are summed up to constitute the total number of produce captured by the camera. Therefore, the total number of produce over the geographical area covered by the camera is determined. This constitutes the working of the method to count and classify produce growing within a geographical area.
[0016] It must be understood that the embodiments explained above are only illustrative and do not limit the scope of the disclosure. Many modifications in the embodiments with regard to dimensions of various components are envisaged and form a part of this invention. The scope of the invention is only limited by the scope of the claims.
| # | Name | Date |
|---|---|---|
| 1 | 201941020753-Annexure [23-12-2020(online)].pdf | 2020-12-23 |
| 2 | 201941020753-POWER OF AUTHORITY [24-05-2019(online)].pdf | 2019-05-24 |
| 3 | 201941020753-Response to office action [23-12-2020(online)].pdf | 2020-12-23 |
| 4 | 201941020753-FORM 1 [24-05-2019(online)].pdf | 2019-05-24 |
| 5 | 201941020753-DRAWINGS [24-05-2019(online)].pdf | 2019-05-24 |
| 6 | 201941020753-Annexure [27-11-2020(online)].pdf | 2020-11-27 |
| 7 | 201941020753-Covering Letter [27-11-2020(online)].pdf | 2020-11-27 |
| 8 | 201941020753-DECLARATION OF INVENTORSHIP (FORM 5) [24-05-2019(online)].pdf | 2019-05-24 |
| 9 | 201941020753-Form 1 (Submitted on date of filing) [27-11-2020(online)].pdf | 2020-11-27 |
| 10 | 201941020753-COMPLETE SPECIFICATION [24-05-2019(online)].pdf | 2019-05-24 |
| 11 | abstract 201941020753.jpg | 2019-05-27 |
| 12 | 201941020753-Power of Attorney [27-11-2020(online)].pdf | 2020-11-27 |
| 13 | 201941020753-Response to office action [27-11-2020(online)].pdf | 2020-11-27 |
| 14 | 201941020753-Request Letter-Correspondence [27-11-2020(online)].pdf | 2020-11-27 |