A METHOD OF STITCHING IMAGES CAPTURED BY A VEHICLE, AND A SYSTEM THEREOF

Abstract: The present disclosure relates to a method of stitching images captured by a vehicle (101). A first image (105) and a second image (106) are received. The first image (105) and the second image (106) are segmented based on characteristics of pixels. Groups of pixels having similar characteristics are identified to form clusters in a predetermined portion of overlap of the first image (105) and the second image (106). A confidence score is generated for each of the first image (105) and the second image (106). A difference in the confidence scores is computed. At least one of the first image capturing unit (102) and the second image capturing unit (103) is aligned, based on the difference in the confidence scores, to capture at least one of a first aligned image and a second aligned image. The first aligned image and the second aligned image are stitched.

Figure 3
We claim:
1. A method of stitching images captured by a vehicle (101), comprising:
receiving, by an Electronic Control Unit (ECU) (104) of the vehicle (101), a first image (105) comprising a first portion of a scene (100), from a first image capturing unit (102) installed in the vehicle (101) and a second image (106) comprising a second portion of the scene (100), from a second image capturing unit (103) installed in the vehicle (101);
segmenting, by the ECU (104), the first image (105) and the second image (106) based on one or more characteristics of a plurality of pixels of the first image (105) and the second image (106);
identifying, by the ECU (104), one or more groups of pixels from the plurality of pixels in each of the first image (105) and the second image (106) having similar characteristics from the one or more characteristics, wherein the identified one or more groups of pixels form one or more clusters, wherein a centroid is determined for each of the one or more clusters in a predetermined portion of overlap;
generating, by the ECU (104), a confidence score for the first image (105) and the second image (106) based on the centroid of each of the one or more clusters;
computing, by the ECU (104), a difference in the confidence score of the first image (105) and the second image (106);
aligning, by the ECU (104), at least one of, the first image capturing unit (102) and the second image capturing unit (103) based on the difference in the confidence score, wherein at least one of, the aligned first image capturing unit (102) and the aligned second image capturing unit (103) captures at least one of, a first aligned image and a second aligned image respectively; and
stitching, by the ECU (104), the first aligned image and the second aligned image.
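For illustration only, the following is a minimal, runnable Python sketch of the flow recited in claim 1, using crude numpy stand-ins: grey-level quantisation in place of segmentation, a toy centroid-based confidence score, and an arbitrary 0.05 score-gap threshold. Every function name, constant, and the synthetic data here are hypothetical and form no part of the claims.

```python
import numpy as np

def segment(img, levels=3):
    """Stand-in segmentation: quantise grey levels into coarse bands."""
    edges = np.linspace(img.min(), img.max() + 1e-9, levels + 1)
    return np.digitize(img, edges[1:-1])

def centroids(seg):
    """(row, col) centroid of every segment label in the overlap strip."""
    return {lab: np.argwhere(seg == lab).mean(axis=0) for lab in np.unique(seg)}

def confidence(cents, shape):
    """Toy score: 1 minus the mean normalised centroid offset from centre."""
    pts = np.array(list(cents.values())) / np.array(shape)
    return 1.0 - float(np.abs(pts - 0.5).mean())

# Two synthetic frames sharing a 20-column predetermined overlap.
rng = np.random.default_rng(0)
scene = rng.random((100, 180))
img1, img2 = scene[:, :100], scene[:, 80:]
ov1, ov2 = img1[:, -20:], img2[:, :20]

s1 = confidence(centroids(segment(ov1)), ov1.shape)
s2 = confidence(centroids(segment(ov2)), ov2.shape)
diff = abs(s1 - s2)                      # difference in the confidence score

if diff > 0.05:                          # scores disagree: realign first
    print("realign camera(s); score gap:", diff)
else:                                    # scores agree: stitch directly
    panorama = np.hstack([img1, img2[:, 20:]])
    print("stitched shape:", panorama.shape)
```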
2. The method as claimed in claim 1, wherein the segmentation of the first image (105) and the second image (106) is performed using Neural Networks, wherein the first image (105) and the second image (106) have the predetermined portion of overlap.
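Claim 2 does not name a particular network. As one possible reading, a pretrained semantic-segmentation model can label the pixels of each captured frame; the sketch below uses torchvision's DeepLabV3 purely as an example, and that model choice is an assumption, not the claimed method.

```python
import torch
from torchvision.models.segmentation import (
    DeepLabV3_ResNet50_Weights, deeplabv3_resnet50,
)

weights = DeepLabV3_ResNet50_Weights.DEFAULT
model = deeplabv3_resnet50(weights=weights).eval()
preprocess = weights.transforms()        # resize + normalise for this model

frame = torch.rand(3, 240, 320)          # a captured frame, (C, H, W) in [0, 1]
with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))["out"]
labels = logits.argmax(dim=1)            # per-pixel class map, (1, H', W')
print(labels.shape, labels.unique())
```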
3. The method as claimed in claim 1, wherein the one or more clusters are formed based on at least one of, the similar characteristics in the plurality of pixels from the one or more characteristics, a class of objects in the first image (105) and the second image (106), and a relative distance between the one or more pixels, wherein the one or more characteristics comprises at least one of, a gray scale level of the plurality of pixels, a power spectrum of the first image (105) and the second image (106), a texture of objects in the first image (105) and the second image (106), a shape of objects in the first image (105) and the second image (106), an intensity of the plurality of pixels, a spatial location of the objects, and a color of the plurality of pixels.
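The clustering of claim 3 can be pictured as k-means over per-pixel feature vectors. The sketch below is an assumed concrete instance, not the claimed method: it clusters grey level plus normalised spatial location with scikit-learn and reports each cluster's spatial centroid; the cluster count of 4 and the synthetic overlap strip are arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

# One feature vector per pixel of the overlap strip: grey level plus
# normalised (row, col) location; claim 3's other characteristics
# (colour, texture, shape, ...) could be appended as extra columns.
rng = np.random.default_rng(1)
overlap = rng.random((60, 20))
rows, cols = np.indices(overlap.shape)
features = np.column_stack([
    overlap.ravel(),                     # grey-scale level / intensity
    rows.ravel() / 60.0,                 # spatial location, normalised
    cols.ravel() / 20.0,
])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(features)
for k in range(4):                       # spatial centroid of each cluster
    m = km.labels_ == k
    print(k, rows.ravel()[m].mean(), cols.ravel()[m].mean())
```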
4. The method as claimed in claim 1, wherein aligning the first image capturing unit (102) and the second image capturing unit (103) comprises:
determining, a pair of centroids in each of the first image (105) and the second image (106);
determining, a distance of the pair of centroids from respective centroids in the first image (105) and the second image (106) using Bayesian Conditional Probability to determine a misalignment between the first image (105) and the second image (106); and
aligning at least one of, the first image capturing unit (102) and the second image capturing unit (103) based on the misalignment.
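A hedged sketch of the alignment step of claim 4: corresponding centroids are compared across the two images and their mean offset is treated as the misalignment. The claim's Bayesian Conditional Probability weighting is not reproduced here; a plain Euclidean displacement stands in for it, and the example coordinates are invented.

```python
import numpy as np

def misalignment(cents1, cents2):
    """Mean displacement between corresponding centroids in the two
    images; its magnitude is taken as the misalignment to correct."""
    shift = (cents2 - cents1).mean(axis=0)   # average (d_row, d_col)
    return shift, float(np.linalg.norm(shift))

# One pair of centroids per image, e.g. the two largest overlap clusters.
c1 = np.array([[12.0, 8.0], [40.0, 11.0]])   # first image
c2 = np.array([[14.5, 8.5], [42.5, 11.5]])   # second image, shifted
shift, size = misalignment(c1, c2)
print("correct pose by", -shift, "| magnitude:", size)
```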
5. The method as claimed in claim 1, wherein stitching the first aligned image and the second aligned image comprises adding the plurality of pixels of the second aligned image to the plurality of pixels of the first aligned image along an overlapping end of the respective images.
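Claim 5's stitching can be read as appending the second image's pixels to the first along the overlapping end. The sketch below adds an assumed linear feathering of the shared strip, which the claim does not require, simply to soften the seam.

```python
import numpy as np

def stitch(first, second, overlap):
    """Append the second image's pixels to the first along the overlapping
    end, linearly feathering the shared strip to soften the seam."""
    w = np.linspace(1.0, 0.0, overlap)       # per-column blend weights
    seam = first[:, -overlap:] * w + second[:, :overlap] * (1.0 - w)
    return np.hstack([first[:, :-overlap], seam, second[:, overlap:]])

a, b = np.ones((4, 10)), np.zeros((4, 10))
print(stitch(a, b, overlap=3).shape)         # -> (4, 17)
```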
6. An Electronic Control Unit (ECU) (104) of a vehicle (101), for stitching images captured by the vehicle (101), comprising:
one or more processors (203); and
a memory (202), wherein the memory (202) stores processor-executable instructions, which, on execution, cause the one or more processors (203) to:
receive a first image (105) comprising a first portion of a scene (100), from a first image capturing unit (102) installed in the vehicle (101) and a second image (106) comprising a second portion of the scene (100), from a second image capturing unit (103) installed in the vehicle (101);
segment the first image (105) and the second image (106) based on one or more characteristics of a plurality of pixels of the first image (105) and the second image (106);
identify one or more groups of pixels from the plurality of pixels in each of the first image (105) and the second image (106) having similar characteristics from the one or more characteristics, wherein the identified one or more groups of pixels form one or more clusters, wherein a centroid is determined for each of the one or more clusters;
generate a confidence score for the first image (105) and the second image (106) based on the centroid of each of the one or more clusters, wherein the confidence score of each of the first image (105) and the second image (106) indicates a predetermined portion of overlap of the first image (105) and the second image (106) respectively;
compute a difference in the confidence score of the first image (105) and the second image (106);
align at least one of, the first image capturing unit (102) and the second image capturing unit (103) based on the difference in the confidence score, wherein at least one of, the aligned first image capturing unit (102) and the aligned second image capturing unit (103) captures at least one of, a first aligned image and a second aligned image respectively; and
stitch the first aligned image and the second aligned image.
7. The ECU (104) as claimed in claim 6, wherein the one or more processors (203) segments the first image (105) and the second image (106) using Neural Networks.
8. The ECU (104) as claimed in claim 6, wherein the one or more processors (203) forms the one or more clusters based on at least one of, the similar characteristics in the plurality of pixels, a class of objects in the first image (105) and the second image (106), and a relative distance between the one or more pixels.
9. The ECU (104) as claimed in claim 6, wherein the one or more processors (203) aligns the first image capturing unit (102) and the second image capturing unit (103) by:
determining, a pair of centroids in each of the first image (105) and the second image (106);
determining, a distance of the pair of centroids from respective centroids in the first image (105) and the second image (106) using Bayesian Conditional Probability to determine a misalignment between the first image (105) and the second image (106); and
aligning at least one of, the first image capturing unit (102) and the second image capturing unit (103) based on the misalignment.
10. The ECU (104) as claimed in claim 6, wherein the one or more processors (203) stitches the first aligned image and the second aligned image by adding the plurality of pixels of the second aligned image to the plurality of pixels of the first aligned image along an overlapping end of the respective images.
| # | Name | Date |
|---|---|---|
| 1 | 202041013697-COMPLETE SPECIFICATION [28-03-2020(online)].pdf | 2020-03-28 |
| 2 | 202041013697-DRAWINGS [28-03-2020(online)].pdf | 2020-03-28 |
| 3 | 202041013697-DECLARATION OF INVENTORSHIP (FORM 5) [28-03-2020(online)].pdf | 2020-03-28 |
| 4 | 202041013697-Form 1 (Submitted on date of filing) [28-03-2020(online)].pdf | 2020-03-28 |
| 5 | 202041013697-FORM 1 [28-03-2020(online)].pdf | 2020-03-28 |
| 6 | 202041013697-FORM 18 [28-03-2020(online)].pdf | 2020-03-28 |
| 7 | 202041013697-REQUEST FOR EXAMINATION (FORM-18) [28-03-2020(online)].pdf | 2020-03-28 |
| 8 | 202041013697-STATEMENT OF UNDERTAKING (FORM 3) [28-03-2020(online)].pdf | 2020-03-28 |
| 9 | 202041013697-POWER OF AUTHORITY [28-03-2020(online)].pdf | 2020-03-28 |
| 10 | 202041013697-Power of Attorney [28-03-2020(online)].pdf | 2020-03-28 |
| 11 | 202041013697-Request Letter-Correspondence [28-03-2020(online)].pdf | 2020-03-28 |
| 12 | 202041013697-FER.pdf | 2022-09-19 |
| 13 | SearchHistoryE_19-09-2022.pdf | 2022-09-19 |
| 14 | 202041013697-AMENDED DOCUMENTS [20-01-2023(online)].pdf | 2023-01-20 |
| 15 | 202041013697-CLAIMS [20-01-2023(online)].pdf | 2023-01-20 |
| 16 | 202041013697-COMPLETE SPECIFICATION [20-01-2023(online)].pdf | 2023-01-20 |
| 17 | 202041013697-FER_SER_REPLY [20-01-2023(online)].pdf | 2023-01-20 |
| 18 | 202041013697-FORM 13 [20-01-2023(online)].pdf | 2023-01-20 |
| 19 | 202041013697-FORM 3 [20-01-2023(online)].pdf | 2023-01-20 |
| 20 | 202041013697-OTHERS [20-01-2023(online)].pdf | 2023-01-20 |
| 21 | 202041013697-PETITION UNDER RULE 137 [20-01-2023(online)].pdf | 2023-01-20 |
| 22 | 202041013697-PETITION UNDER RULE 137 [20-01-2023(online)]-1.pdf | 2023-01-20 |
| 23 | 202041013697-POA [20-01-2023(online)].pdf | 2023-01-20 |
| 24 | 202041013697-Proof of Right [15-02-2023(online)].pdf | 2023-02-15 |
| 25 | 202041013697-PatentCertificate10-05-2024.pdf | 2024-05-10 |
| 26 | 202041013697-IntimationOfGrant10-05-2024.pdf | 2024-05-10 |