Abstract: This disclosure relates generally to video analysis, and more particularly to systems and methods for mapping object coordinates from a video frame view to real-world coordinates using perspective transformation. In one embodiment, a processor-implemented video frame coordinate transformation method is disclosed. The method may include obtaining an image from an image capture device and identifying, via one or more hardware processors, an object depicted in the image. Further, the method may include determining image-frame object coordinates for the object, and selecting one of a plurality of coordinate transformation matrices associated with the image capture device, based on the image-frame object coordinates for the object. Also, the method may include calculating, via the one or more hardware processors, real-world object coordinates for the object using the image-frame object coordinates and the selected coordinate transformation matrix, and determining a trajectory of the object using the calculated real-world object coordinates.
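The core of the disclosed method is a perspective transformation from image-frame coordinates to real-world coordinates. A minimal sketch of one way such a transformation matrix could be computed and applied, assuming a planar homography fitted from four point correspondences (all names and values here are hypothetical illustrations, not taken from the disclosure):

```python
import numpy as np

def homography_from_quad(src_quad, dst_quad):
    """Solve for a 3x3 perspective (homography) matrix H mapping each
    image-frame point in src_quad to its real-world point in dst_quad.
    Uses the standard 8-equation linear system with H[2][2] fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_quad, dst_quad):
        # u = (h11*x + h12*y + h13) / (h31*x + h32*y + 1)
        # v = (h21*x + h22*y + h23) / (h31*x + h32*y + 1)
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def to_real_world(H, point):
    """Map an image-frame (x, y) to real-world (u, v) via homogeneous coords."""
    x, y = point
    u, v, w = H @ np.array([x, y, 1.0])
    return (u / w, v / w)

# Hypothetical calibration: a floor quadrilateral seen in the image (pixels)
# and its known real-world footprint (metres).
image_quad = [(100, 200), (400, 210), (420, 500), (90, 480)]
world_quad = [(0, 0), (2, 0), (2, 3), (0, 3)]
H = homography_from_quad(image_quad, world_quad)
print(to_real_world(H, (250, 350)))  # real-world position of an object pixel
```

With four non-collinear correspondences the 8x8 system has a unique solution, so the calibration quadrilateral's corners map exactly to their real-world counterparts.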
CLAIMS
We claim:
1. A processor-implemented video frame coordinate transformation method, comprising:
obtaining an image from an image capture device;
identifying, via one or more hardware processors, an object depicted in the image;
determining, via the one or more hardware processors, image-frame object coordinates for the object;
selecting, via the one or more hardware processors, one of a plurality of coordinate transformation matrices associated with the image capture device, based on the image-frame object coordinates for the object;
calculating, via the one or more hardware processors, real-world object coordinates for the object using the image-frame object coordinates and the selected coordinate transformation matrix; and
determining, via the one or more hardware processors, a trajectory of the object using the calculated real-world object coordinates.
2. The method of claim 1, wherein selecting the one of the plurality of coordinate transformation matrices comprises:
identifying a quadrilateral associated with a static object in the image as closest, to the image-frame object coordinates for the object, among quadrilaterals used to determine the plurality of coordinate transformation matrices; and
selecting a coordinate transformation matrix determined using the identified quadrilateral as the one of the plurality of coordinate transformation matrices.
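The selection step of claim 2 can be illustrated as a nearest-quadrilateral lookup; a minimal sketch assuming calibration is stored as (quadrilateral centroid, matrix) pairs, with all names and data hypothetical:

```python
def select_matrix(object_xy, calibration):
    """calibration: list of (centroid, matrix) pairs, one per static
    reference quadrilateral used to derive a transformation matrix.
    Returns the matrix whose quadrilateral centroid lies closest to
    the object's image-frame coordinates."""
    def dist2(centroid):
        return (centroid[0] - object_xy[0]) ** 2 + (centroid[1] - object_xy[1]) ** 2
    _, matrix = min(calibration, key=lambda pair: dist2(pair[0]))
    return matrix

# Hypothetical example: two calibration quadrilaterals in one camera view.
calibration = [((150, 300), "H_near"), ((500, 120), "H_far")]
print(select_matrix((180, 280), calibration))  # picks the nearer quadrilateral's matrix
```

Using the matrix calibrated from the quadrilateral nearest the object keeps the perspective correction locally accurate across a wide field of view.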
3. The method of claim 1, further comprising:
classifying, via the one or more hardware processors, the image as an image that includes object motion; and
identifying, via the one or more hardware processors, the object depicted in the image after classifying the image as an image that includes object motion.
4. The method of claim 1, further comprising:
extracting, via the one or more hardware processors, one or more features of the object for uniquely identifying the object; and
determining, via the one or more hardware processors, the trajectory of the object based on uniquely identifying the object using the one or more extracted features.
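Claim 4's feature-based unique identification can be sketched as a nearest-neighbour match between an extracted feature vector and a gallery of previously seen objects; the gallery structure and distance metric here are illustrative assumptions, not specified in the claims:

```python
import numpy as np

def match_object(features, gallery):
    """gallery: dict mapping object_id -> stored feature vector.
    Returns the id whose stored features are closest (Euclidean) to the
    newly extracted features, so detections across frames can be linked
    into a single trajectory for the same object."""
    return min(gallery, key=lambda oid: np.linalg.norm(gallery[oid] - features))

# Hypothetical 4-dimensional appearance features for two tracked objects.
gallery = {"obj_A": np.array([0.9, 0.1, 0.4, 0.2]),
           "obj_B": np.array([0.1, 0.8, 0.7, 0.9])}
print(match_object(np.array([0.85, 0.15, 0.45, 0.25]), gallery))
```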
5. The method of claim 1, wherein determining the trajectory of the object includes:
determining, via the one or more hardware processors, a position of the object on a map of a site in which the image capture device is located.
6. The method of claim 1, further comprising:
determining, via the one or more hardware processors, that a position on a map of a site in which the image capture device is located is not included in any trajectory of any object for a predetermined period of time; and
classifying, via the one or more hardware processors, the position on the map as a blind spot.
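The blind-spot classification of claim 6 can be sketched by tracking, per map position, the time of the most recent trajectory point; positions quiet for the predetermined period are flagged. The grid-cell representation and timestamps are illustrative assumptions:

```python
def find_blind_spots(last_seen, now, period):
    """last_seen: dict mapping a map grid cell (col, row) -> timestamp of
    the most recent trajectory point falling in that cell. Cells with no
    trajectory activity for at least `period` seconds are classified as
    blind spots on the site map."""
    return {cell for cell, t in last_seen.items() if now - t >= period}

# Hypothetical activity log: cell (0, 0) was visited recently, (3, 2) was not.
last_seen = {(0, 0): 995.0, (3, 2): 100.0}
print(find_blind_spots(last_seen, now=1000.0, period=600.0))
```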
7. The method of claim 1, further comprising:
obtaining, via the one or more hardware processors, a second image from a second image capture device; and
determining, via the one or more hardware processors, the trajectory of the object using the second image captured from the second image capture device.
8. The method of claim 7, further comprising:
calculating a first position of the object on a map of a site using the image;
calculating a second position of the object on the map of the site using the second image;
determining that a distance between the first position and the second position is less than a predetermined threshold; and
identifying a region of overlap on the map of the site based on determining that the distance between the first position and the second position is less than the predetermined threshold.
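The overlap-region logic of claim 8 can be illustrated as a distance test between the two map positions computed for the same object from two cameras; taking the midpoint as a representative overlap point is an assumption of this sketch, not stated in the claim:

```python
import math

def overlap_point(pos_a, pos_b, threshold):
    """pos_a, pos_b: positions of the same object on the site map, each
    computed from a different camera's image. If they agree to within
    `threshold`, return their midpoint as a point in the cameras' region
    of overlap; otherwise return None."""
    if math.dist(pos_a, pos_b) < threshold:
        return ((pos_a[0] + pos_b[0]) / 2, (pos_a[1] + pos_b[1]) / 2)
    return None

# Hypothetical positions in metres; the two cameras agree within 0.5 m.
print(overlap_point((4.1, 7.0), (4.3, 7.2), threshold=0.5))
```

Accumulating such midpoints over many detections would trace out the full overlap region between the two camera views.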
9. A video frame coordinate transformation system, comprising:
one or more hardware processors; and
one or more memory devices storing instructions executable by the one or more hardware processors for:
obtaining an image from an image capture device;
identifying, via the one or more hardware processors, an object depicted in the image;
determining, via the one or more hardware processors, image-frame object coordinates for the object;
selecting, via the one or more hardware processors, one of a plurality of coordinate transformation matrices associated with the image capture device, based on the image-frame object coordinates for the object;
calculating, via the one or more hardware processors, real-world object coordinates for the object using the image-frame object coordinates and the selected coordinate transformation matrix; and
determining, via the one or more hardware processors, a trajectory of the object using the calculated real-world object coordinates.
10. The system of claim 9, wherein selecting the one of the plurality of coordinate transformation matrices comprises:
identifying a quadrilateral associated with a static object in the image as closest, to the image-frame object coordinates for the object, among quadrilaterals used to determine the plurality of coordinate transformation matrices; and
selecting a coordinate transformation matrix determined using the identified quadrilateral as the one of the plurality of coordinate transformation matrices.
11. The system of claim 9, the one or more memory devices further storing instructions executable by the one or more hardware processors for:
classifying, via the one or more hardware processors, the image as an image that includes object motion; and
identifying, via the one or more hardware processors, the object depicted in the image after classifying the image as an image that includes object motion.
12. The system of claim 9, the one or more memory devices further storing instructions executable by the one or more hardware processors for:
extracting, via the one or more hardware processors, one or more features of the object for uniquely identifying the object; and
determining, via the one or more hardware processors, the trajectory of the object based on uniquely identifying the object using the one or more extracted features.
13. The system of claim 9, wherein determining the trajectory of the object includes:
determining, via the one or more hardware processors, a position of the object on a map of a site in which the image capture device is located.
14. The system of claim 9, the one or more memory devices further storing instructions executable by the one or more hardware processors for:
determining, via the one or more hardware processors, that a position on a map of a site in which the image capture device is located is not included in any trajectory of any object for a predetermined period of time; and
classifying, via the one or more hardware processors, the position on the map as a blind spot.
15. The system of claim 9, the one or more memory devices further storing instructions executable by the one or more hardware processors for:
obtaining, via the one or more hardware processors, a second image from a second image capture device; and
determining, via the one or more hardware processors, the trajectory of the object using the second image captured from the second image capture device.
16. The system of claim 15, the one or more memory devices further storing instructions executable by the one or more hardware processors for:
calculating a first position of the object on a map of a site using the image;
calculating a second position of the object on the map of the site using the second image;
determining that a distance between the first position and the second position is less than a predetermined threshold; and
identifying a region of overlap on the map of the site based on determining that the distance between the first position and the second position is less than the predetermined threshold.
17. A non-transitory computer-readable medium storing computer-executable video frame coordinate transformation instructions for:
obtaining an image from an image capture device;
identifying, via one or more hardware processors, an object depicted in the image;
determining, via the one or more hardware processors, image-frame object coordinates for the object;
selecting, via the one or more hardware processors, one of a plurality of coordinate transformation matrices associated with the image capture device, based on the image-frame object coordinates for the object;
calculating, via the one or more hardware processors, real-world object coordinates for the object using the image-frame object coordinates and the selected coordinate transformation matrix; and
determining, via the one or more hardware processors, a trajectory of the object using the calculated real-world object coordinates.
Dated this 29th day of January, 2015
Swetha S.N
Of K&S Partners
Agent for the Applicant
TECHNICAL FIELD
This disclosure relates generally to video analysis, and more particularly to systems and methods for mapping object coordinates from a video frame view to real-world coordinates using perspective transformation.
| # | Name | Date |
|---|---|---|
| 1 | 420-CHE-2015 FORM-9 29-01-2015.pdf | 2015-01-29 |
| 2 | 420-CHE-2015 FORM-18 29-01-2015.pdf | 2015-01-29 |
| 3 | 420CHE2015_CertifiedCopyRequest.pdf ONLINE | 2015-02-12 |
| 4 | 420-CHE-2015-Request For Certified Copy-Online(12-02-2015).pdf | 2015-02-12 |
| 5 | IP29966_Spec.pdf | 2015-03-12 |
| 6 | IP29966-FIG.pdf | 2015-03-12 |
| 7 | FORM 5-IP29966.pdf | 2015-03-12 |
| 8 | FORM 3-IP29966.pdf | 2015-03-12 |
| 9 | 420CHE2015_CertifiedCopyRequest.pdf | 2015-03-13 |
| 10 | 420-CHE-2015 POWER OF ATTORNEY 22-05-2015.pdf | 2015-05-22 |
| 11 | 420-CHE-2015 FORM-1 22-05-2015.pdf | 2015-05-22 |
| 12 | 420-CHE-2015 CORRESPONDENCE OTHERS 22-05-2015.pdf | 2015-05-22 |
| 13 | REQUEST FOR CERTIFIED COPY [16-09-2015(online)].pdf | 2015-09-16 |
| 14 | 420-CHE-2015-FER.pdf | 2019-09-16 |
| 15 | 420-CHE-2015-ABSTRACT [16-03-2020(online)].pdf | 2020-03-16 |
| 16 | 420-CHE-2015-CLAIMS [16-03-2020(online)].pdf | 2020-03-16 |
| 17 | 420-CHE-2015-CORRESPONDENCE [16-03-2020(online)].pdf | 2020-03-16 |
| 18 | 420-CHE-2015-DRAWING [16-03-2020(online)].pdf | 2020-03-16 |
| 19 | 420-CHE-2015-RELEVANT DOCUMENTS [16-03-2020(online)].pdf | 2020-03-16 |
| 20 | 420-CHE-2015-FER_SER_REPLY [16-03-2020(online)].pdf | 2020-03-16 |
| 21 | 420-CHE-2015-PETITION UNDER RULE 137 [16-03-2020(online)].pdf | 2020-03-16 |
| 22 | 420-CHE-2015-OTHERS [16-03-2020(online)].pdf | 2020-03-16 |
| 23 | 420-CHE-2015-FORM 3 [16-03-2020(online)].pdf | 2020-03-16 |
| 24 | 420-CHE-2015-Information under section 8(2) [16-03-2020(online)].pdf | 2020-03-16 |
| 25 | 420-CHE-2015-US(14)-HearingNotice-(HearingDate-27-12-2022).pdf | 2022-12-07 |
| 26 | 420-CHE-2015-AMENDED DOCUMENTS [13-12-2022(online)].pdf | 2022-12-13 |
| 27 | 420-CHE-2015-Correspondence to notify the Controller [13-12-2022(online)].pdf | 2022-12-13 |
| 28 | 420-CHE-2015-FORM 13 [13-12-2022(online)].pdf | 2022-12-13 |
| 29 | 420-CHE-2015-POA [13-12-2022(online)].pdf | 2022-12-13 |
| 30 | 420-CHE-2015-US(14)-ExtendedHearingNotice-(HearingDate-02-01-2023).pdf | 2022-12-27 |
| 31 | 420-CHE-2015-FORM 3 [16-01-2023(online)].pdf | 2023-01-16 |
| 32 | 420-CHE-2015-PETITION UNDER RULE 137 [16-01-2023(online)].pdf | 2023-01-16 |
| 33 | 420-CHE-2015-Written submissions and relevant documents [16-01-2023(online)].pdf | 2023-01-16 |
| 34 | 420-CHE-2015-IntimationOfGrant06-02-2023.pdf | 2023-02-06 |
| 35 | 420-CHE-2015-PatentCertificate06-02-2023.pdf | 2023-02-06 |
| 36 | 2019-09-1310-27-28_13-09-2019.pdf | |
| 37 | 2020-06-2315-07-33AE_23-06-2020.pdf | |