Abstract: Calibration and distance prediction for driving assistance are provided. A camera of a vehicle is calibrated to obtain a distance data set. The distance data set includes a distance of each row of pixels of a first image captured by the camera. The distance data set may be further utilized in real time to predict a distance of an object from the vehicle. Based on the predicted distance, a warning message for an impending collision may be generated and communicated to a driver of the vehicle, thereby facilitating driving assistance to the driver in real time.
1. A method, comprising:
identifying, by circuitry, in a first image including at least a plurality of lines, a first plurality of rows of pixels corresponding to the plurality of lines, wherein a first line of the plurality of lines is at a known distance from a second line of the plurality of lines;
estimating, by the circuitry, a first distance and a second distance of the first line and the second line, respectively, from an image-capturing device used for capturing the first image, wherein the first distance and the second distance are estimated based on at least the known distance, a first width of the first line in the first image, and a second width of the second line in the first image;
estimating, by the circuitry, a third distance of each row of pixels of a second plurality of rows of pixels in the first image from the image-capturing device based on at least a focal length of the image-capturing device and a third width of a third line corresponding to each row of pixels, wherein the third width is estimated based on at least the plurality of lines and each row of pixels; and
storing, by the circuitry in a memory, a distance data set including at least the first distance, the second distance, and the third distance, wherein a fourth distance of a first object from a second object is predicted based on the stored distance data set and a second image of the first object.
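The calibration in claim 1 can be sketched under the usual pinhole-camera assumptions: if the two reference lines share the same real-world width (e.g. standard lane markings), then from Z = f·W/w and the known gap Z2 − Z1, the product f·W follows directly, and every other row's distance comes from the line width interpolated at that row. All function and parameter names below are illustrative, not from the claims.

```python
def calibrate_distance_table(w1_px, w2_px, known_gap_m, row1, row2, n_rows):
    """Build a per-row distance data set from two reference lines.

    w1_px, w2_px : pixel widths of the near and far reference line
    known_gap_m  : known real-world distance between the lines (metres)
    row1, row2   : image rows of the near and far line (row1 > row2)
    n_rows       : total image rows; rows with no valid width map to None
    """
    # Z1 = f*W/w1, Z2 = f*W/w2 and Z2 - Z1 = gap  =>  f*W = gap / (1/w2 - 1/w1)
    fW = known_gap_m / (1.0 / w2_px - 1.0 / w1_px)
    z1 = fW / w1_px  # estimated distance of the near line
    z2 = fW / w2_px  # estimated distance of the far line

    table = {}
    for row in range(n_rows):
        # Linearly interpolate/extrapolate the line's pixel width at this
        # row, then apply the same pinhole relation Z = f*W / w_row.
        w_row = w1_px + (w2_px - w1_px) * (row - row1) / (row2 - row1)
        table[row] = fW / w_row if w_row > 0 else None
    return table, z1, z2
```

For example, a near line 40 px wide at row 400 and a far line 20 px wide at row 300, known to be 10 m apart, yield distances of 10 m and 20 m respectively, and the table assigns every other row a distance by the same relation.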
2. The method of claim 1, further comprising:
converting, by the circuitry, the first image from a first color model to a second color model;
filtering, by the circuitry, the converted first image to obtain a filtered first image including a known color; and
identifying, by the circuitry, the first plurality of rows of pixels from the filtered first image.
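The color-model conversion and filtering of claim 2 can be illustrated with a minimal stdlib-only sketch: convert each RGB pixel to HSV and keep rows containing pixels near a known target hue. The thresholds and names here are assumptions for illustration; a production system would use a vectorized image library instead of per-pixel loops.

```python
import colorsys


def rows_with_known_color(rgb_image, target_hue, hue_tol=0.05):
    """Return indices of rows containing the known color.

    rgb_image : list of rows, each a list of (r, g, b) tuples in 0..255
    target_hue: hue in 0..1 (e.g. ~1/6 for yellow lane markings)
    """
    matching_rows = []
    for row_idx, row in enumerate(rgb_image):
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            # Keep only saturated, bright pixels whose hue is near target.
            if abs(h - target_hue) <= hue_tol and s > 0.4 and v > 0.4:
                matching_rows.append(row_idx)
                break
    return matching_rows
```

Filtering in HSV rather than RGB makes the hue test largely independent of illumination, which is why the claims convert between color models before filtering.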
3. The method of claim 2, wherein
the first and second color models are at least one of a black and white color model, a greyscale color model, a red, green, and blue (RGB) color model, a hue, saturation, and value (HSV) color model, a cyan, magenta, yellow, and black (CMYK) color model, a hue, saturation, and brightness (HSB) color model, a hue, saturation, and lightness (HSL) color model, or a hue, chroma, and value (HCV) color model, and
wherein the first color model is different from the second color model.
4. The method of claim 1, further comprising:
estimating, by the circuitry, the focal length of the image-capturing device based on at least one of the first line or the second line.
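The focal-length estimation of claim 4 follows from the same pinhole relation: once a reference line's distance has been estimated, w_px = f·W/Z inverts to give the focal length in pixels. This sketch assumes the line's real-world width is known; all names are illustrative.

```python
def estimate_focal_length_px(line_width_px, line_width_m, line_distance_m):
    """Recover focal length in pixels from one reference line.

    Pinhole model: w_px = f * W / Z, hence f = w_px * Z / W.
    """
    return line_width_px * line_distance_m / line_width_m
```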
5. The method of claim 1, wherein the first image comprises the first and second pluralities of rows of pixels.
6. The method of claim 1, further comprising:
detecting, by the circuitry, a bottom edge of the first object based on the second image;
identifying, by the circuitry, a third row of pixels in the second image corresponding to the detected bottom edge; and
retrieving, by the circuitry from the memory, a distance value corresponding to a fourth row of pixels associated with the first image based on the third row of pixels, wherein the retrieved distance value indicates the fourth distance of the first object from the second object, and wherein a row number of the third row of pixels is equal to a row number of the fourth row of pixels.
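The prediction path of claim 6 reduces to a table lookup: the bottom edge of the detected object is the lowest image row the object occupies, and because the runtime image shares the calibration image's row geometry, that row number indexes directly into the stored distance data set. A minimal sketch, with a binary object mask standing in for the detector output:

```python
def bottom_edge_row(object_mask):
    """object_mask: list of rows of 0/1 flags marking object pixels.
    Returns the largest row index containing the object (its bottom
    edge), or None if the object is absent from the mask."""
    rows = [i for i, row in enumerate(object_mask) if any(row)]
    return max(rows) if rows else None


def predict_distance(distance_table, object_mask):
    """Look up the calibrated distance for the object's bottom-edge row."""
    row = bottom_edge_row(object_mask)
    return None if row is None else distance_table.get(row)
```

The lookup is O(1) per frame once the table exists, which is what makes the scheme practical for real-time driving assistance.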
7. A method, comprising:
capturing, by an image-capturing device of a vehicle device installed in a vehicle, a first image of a first object;
detecting, by a processor of the vehicle device, a bottom edge of the first object based on the first image;
identifying, by the processor, a first row of pixels in the first image corresponding to the detected bottom edge; and
predicting, by the processor, a first distance of the first object from the vehicle based on a distance value, wherein the distance value is retrieved, based on the first row of pixels, from a distance data set stored in a memory of the vehicle device, and wherein the stored distance data set is estimated using steps comprising:
identifying, in a second image including at least a plurality of lines, a second plurality of rows of pixels corresponding to the plurality of lines, wherein a first line of the plurality of lines is at a known distance from a second line of the plurality of lines;
estimating a second distance and a third distance of the first line and the second line, respectively, from a second object, wherein the second distance and the third distance are estimated based on at least the known distance, a first width of the first line in the second image, and a second width of the second line in the second image; and
estimating the distance data set of a third plurality of rows of pixels in the second image from the second object based on at least a third width of a third line corresponding to each of the third plurality of rows of pixels and one of the first line or the second line, wherein the distance data set further includes at least the second distance and the third distance.
8. The method of claim 7, wherein the second plurality of rows of pixels are identified using steps comprising:
converting the second image from a first color model to a second color model;
filtering the converted second image to obtain a filtered second image including a known color; and
identifying the second plurality of rows of pixels from the filtered second image.
9. The method of claim 8, wherein
the first and second color models are at least one of a black and white color model, a greyscale color model, a red, green, and blue (RGB) color model, a hue, saturation, and value (HSV) color model, a cyan, magenta, yellow, and black (CMYK) color model, a hue, saturation, and brightness (HSB) color model, a hue, saturation, and lightness (HSL) color model, or a hue, chroma, and value (HCV) color model, and
wherein the first color model is different from the second color model.
10. The method of claim 7, wherein the second image comprises the second and third pluralities of rows of pixels.
11. The method of claim 7, wherein the third width is estimated based on at least the plurality of lines and each row of pixels of the third plurality of rows of pixels.
12. The method of claim 7, wherein the retrieved distance value is associated with a fourth row of pixels in the second image, and wherein a row number of the first row of pixels in the first image is equal to a row number of the fourth row of pixels in the second image.
13. The method of claim 7, further comprising:
generating, by the processor, a warning message based on at least the first distance of the first object from the vehicle; and
communicating, by the processor to a driver of the vehicle, the warning message indicating an impending collision.
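The warning generation of claim 13 can be sketched by comparing the predicted distance against a stopping-distance threshold. The threshold formula (reaction distance plus braking distance) and all parameter values below are illustrative assumptions; the claims do not specify a particular criterion.

```python
def collision_warning(distance_m, speed_mps, reaction_time_s=1.5,
                      decel_mps2=6.0):
    """Return a warning message when the object is within stopping range.

    Stopping distance = reaction distance (v * t_react) plus braking
    distance (v^2 / 2a). Returns None when no warning is needed.
    """
    stopping_m = (speed_mps * reaction_time_s
                  + speed_mps ** 2 / (2 * decel_mps2))
    if distance_m <= stopping_m:
        return (f"WARNING: impending collision, "
                f"object {distance_m:.1f} m ahead")
    return None
```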
14. A system, comprising:
an image-capturing device configured to:
capture a first image of a first object; and
a processor configured to:
detect a bottom edge of the first object based on the first image;
identify a first row of pixels in the first image corresponding to the detected bottom edge; and
predict a first distance of the first object from a vehicle based on a distance value, wherein the distance value is retrieved, based on the first row of pixels, from a distance data set stored in a memory, and wherein the stored distance data set is estimated using steps comprising:
identifying, in a second image including at least a plurality of lines, a second plurality of rows of pixels corresponding to the plurality of lines, wherein a first line of the plurality of lines is at a known distance from a second line of the plurality of lines;
estimating a second distance and a third distance of the first line and the second line, respectively, from a second object, wherein the second distance and the third distance are estimated based on at least the known distance, a first width of the first line in the second image, and a second width of the second line in the second image; and
estimating the distance data set of a third plurality of rows of pixels in the second image from the second object based on at least a third width of a third line corresponding to each of the third plurality of rows of pixels and one of the first line or the second line, wherein the distance data set further includes at least the second distance and the third distance.
15. The system of claim 14, wherein the second plurality of rows of pixels are identified using steps comprising:
converting the second image from a first color model to a second color model;
filtering the converted second image to obtain a filtered second image including a known color; and
identifying the second plurality of rows of pixels from the filtered second image.
16. The system of claim 15, wherein
the first and second color models are at least one of a black and white color model, a greyscale color model, a red, green, and blue (RGB) color model, a hue, saturation, and value (HSV) color model, a cyan, magenta, yellow, and black (CMYK) color model, a hue, saturation, and brightness (HSB) color model, a hue, saturation, and lightness (HSL) color model, or a hue, chroma, and value (HCV) color model, and
wherein the first color model is different from the second color model.
17. The system of claim 14, wherein the second image comprises the second and third pluralities of rows of pixels.
18. The system of claim 14, wherein the third width is estimated based on at least the plurality of lines and each row of pixels of the third plurality of rows of pixels.
19. The system of claim 14, wherein the retrieved distance value is associated with a fourth row of pixels in the second image, and wherein a row number of the first row of pixels in the first image is equal to a row number of the fourth row of pixels in the second image.
20. The system of claim 14, wherein the processor is further configured to:
generate a warning message based on at least the first distance of the first object from the vehicle; and
communicate, to a driver of the vehicle, the warning message indicating an impending collision.
| # | Name | Date |
|---|---|---|
| 1 | 201941005110-Correspondence to notify the Controller [15-10-2024(online)].pdf | 2024-10-15 |
| 2 | 201941005110-FORM 1 [08-02-2019(online)].pdf | 2019-02-08 |
| 3 | 201941005110-IntimationOfGrant10-03-2025.pdf | 2025-03-10 |
| 4 | 201941005110-Written submissions and relevant documents [29-11-2024(online)].pdf | 2024-11-29 |
| 5 | 201941005110-DRAWINGS [08-02-2019(online)].pdf | 2019-02-08 |
| 6 | 201941005110-PatentCertificate10-03-2025.pdf | 2025-03-10 |
| 7 | 201941005110-US(14)-HearingNotice-(HearingDate-14-11-2024).pdf | 2024-10-15 |
| 8 | 201941005110-ABSTRACT [12-01-2024(online)].pdf | 2024-01-12 |
| 9 | 201941005110-COMPLETE SPECIFICATION [08-02-2019(online)].pdf | 2019-02-08 |
| 10 | 201941005110-CLAIMS [12-01-2024(online)].pdf | 2024-01-12 |
| 11 | 201941005110-FORM 3 [18-04-2019(online)].pdf | 2019-04-18 |
| 12 | 201941005110-FORM-26 [30-04-2019(online)].pdf | 2019-04-30 |
| 13 | 201941005110-COMPLETE SPECIFICATION [12-01-2024(online)].pdf | 2024-01-12 |
| 14 | Correspondenc by Agent_Power Of Attorney_06-05-2019.pdf | 2019-05-06 |
| 15 | 201941005110-FER_SER_REPLY [12-01-2024(online)].pdf | 2024-01-12 |
| 16 | 201941005110-FORM-26 [12-01-2024(online)].pdf | 2024-01-12 |
| 17 | 201941005110-Proof of Right (MANDATORY) [15-05-2019(online)].pdf | 2019-05-15 |
| 18 | 201941005110-Information under section 8(2) [12-01-2024(online)].pdf | 2024-01-12 |
| 19 | Correspondence by Agent_ Form 1_20-05-2019.pdf | 2019-05-20 |
| 20 | 201941005110-FER.pdf | 2023-07-12 |
| 21 | 201941005110-FORM 3 [04-10-2019(online)].pdf | 2019-10-04 |
| 22 | 201941005110-ENDORSEMENT BY INVENTORS [01-12-2020(online)].pdf | 2020-12-01 |
| 23 | 201941005110-FORM 13 [11-04-2023(online)].pdf | 2023-04-11 |
| 24 | 201941005110-Covering Letter [15-09-2021(online)].pdf | 2021-09-15 |
| 25 | 201941005110-POA [11-04-2023(online)].pdf | 2023-04-11 |
| 26 | 201941005110-FORM 18 [07-02-2023(online)].pdf | 2023-02-07 |
| 27 | 201941005110-RELEVANT DOCUMENTS [11-04-2023(online)].pdf | 2023-04-11 |
| 28 | SearchHistoryE_11-07-2023.pdf | |