Abstract: Disclosed herein is a method and system for counting a plurality of objects placed in a region. An image of the region is captured and partitioned into segments based on the depth of the plurality of objects. Further, the shape of each of the plurality of objects in each object region of each segment is determined and validated based on a comparison of the determined shape with predetermined shapes. Finally, the count of the plurality of objects of each shape is aggregated to determine the count of the plurality of objects in the region. In an embodiment, the present disclosure helps in automatically recognizing and counting the plurality of objects of multiple dimensions and multiple shapes, even when the image of the region includes a distorted/unfavorable background. FIG. 1
Claims:
WE CLAIM:
1. A method for counting a plurality of objects placed in a region, the method comprising:
partitioning, by an object counting system (103), an image (102) of the region, comprising the plurality of objects, into one or more segments (206) based on depth of each of the plurality of objects, wherein the image (102) of the region is received from an image capturing unit (101) associated with the object counting system (103);
identifying, by the object counting system (103), one or more object regions (207) in each of the one or more segments (206);
determining, by the object counting system (103), shape (208) of each of the plurality of objects in each of the one or more object regions (207) of each of the one or more segments (206);
validating, by the object counting system (103), the shape (208) of each of the plurality of objects based on comparison of the shape (208) of each of the plurality of objects with predetermined shapes (210); and
aggregating, by the object counting system (103), count of the plurality of objects of each shape (208) in each of the one or more segments (206) for determining count (105) of the plurality of objects in the region.
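The depth-based partitioning recited in claim 1 can be sketched as follows. This is a minimal illustration, not the claimed implementation: it assumes a per-pixel depth map aligned with the image (claim 2 suggests a depth sensor in the image capturing unit (101)), and the depth bin edges are purely hypothetical.

```python
import numpy as np

def partition_by_depth(image, depth_map, bin_edges):
    """Split an image into segments by binning an aligned depth map.

    image: (H, W, C) array; depth_map: (H, W) per-pixel depth;
    bin_edges: ascending depth boundaries defining the segments.
    """
    segments = []
    for near, far in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (depth_map >= near) & (depth_map < far)
        segment = np.zeros_like(image)
        segment[mask] = image[mask]          # keep only pixels in this depth range
        segments.append((mask, segment))
    return segments

# Toy usage: a 4x4 "image" whose right half lies on a farther depth plane.
img = np.full((4, 4, 3), 200, dtype=np.uint8)
depth = np.zeros((4, 4))
depth[:, 2:] = 1.5
segs = partition_by_depth(img, depth, bin_edges=[0.0, 1.0, 2.0])
print(len(segs))                 # 2 segments
print(int(segs[0][0].sum()))     # 8 pixels fall in the near segment
```

Each later step of the claim (region identification, shape determination, validation, counting) would then run per segment.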
2. The method as claimed in claim 1, wherein the depth of each of the plurality of objects in each of the one or more segments (206) is determined using a depth sensor configured in the image capturing unit (101).
3. The method as claimed in claim 1, wherein the one or more object regions (207) are identified in each of the one or more segments (206) using a pre-trained machine learning classifier configured in the object counting system (103).
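Claim 3 names a pre-trained machine learning classifier but does not specify its form. A sliding-window sketch is one plausible reading; the window size, stride, threshold, and the toy mean-intensity "classifier" below are all illustrative stand-ins.

```python
import numpy as np

def find_object_regions(segment, classify, win=8, stride=8, thresh=0.5):
    """Slide a window over a segment; keep windows the classifier flags as 'object'.

    `classify` stands in for the pre-trained classifier of claim 3: it maps
    a (win, win) patch to an object probability in [0, 1].
    """
    H, W = segment.shape[:2]
    regions = []
    for y in range(0, H - win + 1, stride):
        for x in range(0, W - win + 1, stride):
            patch = segment[y:y + win, x:x + win]
            if classify(patch) >= thresh:
                regions.append((x, y, win, win))   # (x, y, width, height) box
    return regions

# Toy classifier: call a patch an "object" if its mean intensity is high.
seg = np.zeros((16, 16))
seg[0:8, 8:16] = 255.0
boxes = find_object_regions(seg, classify=lambda p: p.mean() / 255.0)
print(boxes)   # [(8, 0, 8, 8)]
```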
4. The method as claimed in claim 1, wherein determining the shape (208) of each of the plurality of objects comprises:
obtaining spatial frequencies of each of the one or more object regions (207) comprising the plurality of objects;
generating a magnitude spectrum histogram of the spatial frequencies of each of the one or more object regions (207); and
associating a peak value of the magnitude spectrum histogram with a predetermined scaling factor for obtaining pixel coordinates corresponding to the plurality of objects, thereby determining the shape (208) of the plurality of objects.
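The spectral steps of claim 4 can be sketched with a 2-D FFT. Note the claim does not define how the scaled peak maps to pixel coordinates, so this sketch stops at the scaled peak value; the bin count and scaling factor are assumptions.

```python
import numpy as np

def dominant_spectral_peak(region, bins=32, scaling_factor=1.0):
    """Sketch of claim 4: obtain spatial frequencies of an object region,
    histogram the magnitude spectrum, and scale the peak bin's centre by a
    predetermined scaling factor."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(region)))    # magnitude spectrum
    counts, edges = np.histogram(spectrum.ravel(), bins=bins)
    peak_bin = int(np.argmax(counts))
    peak_value = 0.5 * (edges[peak_bin] + edges[peak_bin + 1]) # bin centre
    return peak_value * scaling_factor

region = np.zeros((16, 16))
region[4:12, 4:12] = 1.0           # a square "object" in the region
scale = dominant_spectral_peak(region, scaling_factor=2.0)
print(scale >= 0.0)                # magnitudes are non-negative, so True
```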
5. The method as claimed in claim 1, wherein validating the shape (208) of each of the plurality of objects comprises:
assigning a probability score for the shape (208) of each of the plurality of objects based on comparison of the shape (208) of each of the plurality of objects with the predetermined shapes (210); and
validating the shape (208) of each of the plurality of objects when the probability score of the shape (208) is higher than a threshold probability score.
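The validation of claim 5 can be sketched as scoring a detected shape against each predetermined shape (210) and accepting only above-threshold matches. Representing shapes as feature vectors and using an inverse-distance similarity are assumptions; the claims do not fix the scoring function.

```python
def validate_shape(shape_descriptor, predetermined_shapes, threshold=0.7):
    """Sketch of claim 5: assign a probability score per predetermined shape
    and validate the detection only if the best score clears the threshold."""
    def similarity(a, b):
        dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return 1.0 / (1.0 + dist)          # 1.0 means identical descriptors

    scores = {name: similarity(shape_descriptor, ref)
              for name, ref in predetermined_shapes.items()}
    best = max(scores, key=scores.get)
    if scores[best] >= threshold:
        return best, scores[best]          # validated shape and its score
    return None, scores[best]              # rejected: below threshold

refs = {"circle": [1.0, 0.0], "square": [0.0, 1.0]}
label, score = validate_shape([0.9, 0.1], refs)
print(label)   # circle
```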
6. The method as claimed in claim 1, further comprising eliminating overlap in the plurality of objects identified in the one or more object regions (207) in each of the one or more segments (206) by:
determining pixel indices corresponding to pixel area occupied by each of the plurality of objects in the one or more object regions (207);
identifying one or more overlapping objects upon detecting intersection among the pixel indices corresponding to one or more of the plurality of objects;
computing a Mutual Shape Co-occurrence Factor (MSCF) for each pair in the one or more overlapping objects based on an image gradient identified for each of the one or more overlapping objects; and
selecting one object among each pair of the one or more overlapping objects for elimination, when the value of the MSCF, corresponding to the selected one object, is more than a predefined threshold MSCF.
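The overlap-elimination steps of claim 6 can be sketched end to end. The claims state only that the MSCF is computed from the image gradient of each overlapping object; the concrete formula below (the fraction of an object's mean gradient magnitude contributed inside the overlap) is a hypothetical stand-in, as are the threshold and the pixel-set object representation.

```python
import numpy as np

def eliminate_overlaps(objects, image, mscf_threshold=0.5):
    """Sketch of claim 6. `objects` maps object id -> set of (row, col) pixel
    indices. Overlaps are detected as intersections of pixel-index sets, and
    a hypothetical gradient-based MSCF decides which object of a pair to drop."""
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)                      # per-pixel gradient magnitude

    def mscf(pixels, overlap):
        own = grad[tuple(zip(*overlap))].mean()
        total = grad[tuple(zip(*pixels))].mean() + 1e-9
        return own / total                       # stand-in MSCF, not the patent's formula

    removed = set()
    ids = list(objects)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if a in removed or b in removed:
                continue
            overlap = objects[a] & objects[b]
            if not overlap:
                continue
            # drop whichever object of the pair scores above the MSCF threshold
            victim = a if mscf(objects[a], overlap) > mscf(objects[b], overlap) else b
            if mscf(objects[victim], overlap) > mscf_threshold:
                removed.add(victim)
    return [o for o in ids if o not in removed]

img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0
objs = {"A": {(r, c) for r in range(2, 6) for c in range(2, 6)},
        "B": {(r, c) for r in range(3, 7) for c in range(3, 7)}}
survivors = eliminate_overlaps(objs, img)
print(survivors)
```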
7. An object counting system (103) for counting a plurality of objects placed in a region, the object counting system (103) comprising:
a processor (202); and
a memory (203), communicatively coupled to the processor (202), wherein the memory (203) stores processor-executable instructions, which on execution cause the processor (202) to:
partition an image (102) of the region, comprising the plurality of objects, into one or more segments (206) based on depth of each of the plurality of objects, wherein the image (102) of the region is received from an image capturing unit (101) associated with the object counting system (103);
identify one or more object regions (207) in each of the one or more segments (206);
determine shape (208) of each of the plurality of objects in each of the one or more object regions (207) of each of the one or more segments (206);
validate the shape (208) of each of the plurality of objects based on comparison of the shape (208) of each of the plurality of objects with predetermined shapes (210); and
aggregate count of the plurality of objects of each shape (208) in each of the one or more segments (206) to determine count (105) of the plurality of objects in the region.
8. The object counting system (103) as claimed in claim 7, wherein the processor (202) determines the depth of each of the plurality of objects in each of the one or more segments (206) using a depth sensor configured in the image capturing unit (101).
9. The object counting system (103) as claimed in claim 7, wherein the processor (202) identifies the one or more object regions (207) in each of the one or more segments (206) using a pre-trained machine learning classifier configured in the object counting system (103).
10. The object counting system (103) as claimed in claim 7, wherein to determine the shape (208) of each of the plurality of objects, the processor (202) is configured to:
obtain spatial frequencies of each of the one or more object regions (207) comprising the plurality of objects;
generate a magnitude spectrum histogram of the spatial frequencies of each of the one or more object regions (207); and
associate a peak value of the magnitude spectrum histogram with a predetermined scaling factor to obtain pixel coordinates corresponding to the plurality of objects.
11. The object counting system (103) as claimed in claim 7, wherein to validate the shape (208) of each of the plurality of objects, the processor (202) is configured to:
assign a probability score for the shape (208) of each of the plurality of objects based on comparison of the shape (208) of each of the plurality of objects with the predetermined shapes (210); and
validate the shape (208) of each of the plurality of objects when the probability score of the shape (208) is higher than a threshold probability score.
12. The object counting system (103) as claimed in claim 7, wherein to eliminate overlaps in the plurality of objects identified in the one or more object regions (207) in each of the one or more segments (206), the processor (202) is configured to:
determine pixel indices corresponding to pixel area occupied by each of the plurality of objects in the one or more object regions (207);
identify one or more overlapping objects upon detecting intersection among the pixel indices corresponding to one or more of the plurality of objects;
compute a Mutual Shape Co-occurrence Factor (MSCF) for each pair in the one or more overlapping objects based on an image gradient identified for each of the one or more overlapping objects; and
select one object among each pair of the one or more overlapping objects for elimination, when the value of the MSCF, corresponding to the selected one object, is more than a predefined threshold MSCF.
Dated this 28th day of March 2018
SWETHA S. N
OF K&S PARTNERS
ATTORNEY FOR THE APPLICANT
Description:
TECHNICAL FIELD
The present subject matter is, in general, related to image analysis and, more particularly but not exclusively, to a method and system for counting a plurality of objects placed in a region by means of image processing techniques.
| # | Name | Date |
|---|---|---|
| 1 | 201841011737-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2018(online)].pdf | 2018-03-28 |
| 2 | 201841011737-REQUEST FOR EXAMINATION (FORM-18) [28-03-2018(online)].pdf | 2018-03-28 |
| 3 | 201841011737-POWER OF AUTHORITY [28-03-2018(online)].pdf | 2018-03-28 |
| 4 | 201841011737-FORM 18 [28-03-2018(online)].pdf | 2018-03-28 |
| 5 | 201841011737-FORM 1 [28-03-2018(online)].pdf | 2018-03-28 |
| 6 | 201841011737-DRAWINGS [28-03-2018(online)].pdf | 2018-03-28 |
| 7 | 201841011737-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2018(online)].pdf | 2018-03-28 |
| 8 | 201841011737-COMPLETE SPECIFICATION [28-03-2018(online)].pdf | 2018-03-28 |
| 9 | 201841011737-REQUEST FOR CERTIFIED COPY [04-05-2018(online)].pdf | 2018-05-04 |
| 10 | 201841011737-REQUEST FOR CERTIFIED COPY [04-05-2018(online)]-1.pdf | 2018-05-04 |
| 11 | 201841011737-Proof of Right (MANDATORY) [30-07-2018(online)].pdf | 2018-07-30 |
| 12 | Correspondence by Agent _Form 1_01-08-2018.pdf | 2018-08-01 |
| 13 | 201841011737-PETITION UNDER RULE 137 [16-02-2021(online)].pdf | 2021-02-16 |
| 14 | 201841011737-OTHERS [16-02-2021(online)].pdf | 2021-02-16 |
| 15 | 201841011737-FORM 3 [16-02-2021(online)].pdf | 2021-02-16 |
| 16 | 201841011737-FER_SER_REPLY [16-02-2021(online)].pdf | 2021-02-16 |
| 17 | 201841011737-DRAWING [16-02-2021(online)].pdf | 2021-02-16 |
| 18 | 201841011737-CORRESPONDENCE [16-02-2021(online)].pdf | 2021-02-16 |
| 19 | 201841011737-COMPLETE SPECIFICATION [16-02-2021(online)].pdf | 2021-02-16 |
| 20 | 201841011737-CLAIMS [16-02-2021(online)].pdf | 2021-02-16 |
| 21 | 201841011737-ABSTRACT [16-02-2021(online)].pdf | 2021-02-16 |
| 22 | 201841011737-FER.pdf | 2021-10-17 |
| 23 | 201841011737-US(14)-HearingNotice-(HearingDate-06-06-2023).pdf | 2023-05-09 |
| 24 | 201841011737-POA [17-05-2023(online)].pdf | 2023-05-17 |
| 25 | 201841011737-FORM 13 [17-05-2023(online)].pdf | 2023-05-17 |
| 26 | 201841011737-Correspondence to notify the Controller [17-05-2023(online)].pdf | 2023-05-17 |
| 27 | 201841011737-AMENDED DOCUMENTS [17-05-2023(online)].pdf | 2023-05-17 |
| 28 | 201841011737-Written submissions and relevant documents [23-06-2023(online)].pdf | 2023-06-23 |
| 29 | 201841011737-FORM-26 [23-06-2023(online)].pdf | 2023-06-23 |
| 30 | 201841011737-FORM 3 [23-06-2023(online)].pdf | 2023-06-23 |
| 31 | 201841011737-PatentCertificate13-07-2023.pdf | 2023-07-13 |
| 32 | 201841011737-IntimationOfGrant13-07-2023.pdf | 2023-07-13 |