Abstract: A method and device for identifying a path boundary for vehicle navigation are disclosed. The method includes capturing a plurality of images of a path being traversed by a vehicle, through a plurality of cameras placed to meet predefined placement criteria. The method further includes processing shadowed regions within each of the plurality of images based on an associated Hue, Saturation, and Value (HSV) color space to generate a plurality of shadow processed images. The method includes identifying boundaries of the path within each of the plurality of shadow processed images based on a histogram of each of the plurality of shadow processed images. The method further includes estimating a distance between the boundaries of the path identified in the plurality of shadow processed images, based on a disparity map created for the plurality of shadow processed images and parameters associated with the plurality of cameras. FIG. 3
Claims:
WE CLAIM
1. A method for identifying a path boundary for vehicle navigation, the method comprising:
capturing, by a vehicle navigation device, a plurality of images of a path being traversed by a vehicle, through a plurality of cameras placed to meet predefined placement criteria;
processing, by the vehicle navigation device, shadowed regions within each of the plurality of images based on an associated Hue, Saturation, and Value (HSV) color space to generate a plurality of shadow processed images;
identifying, by the vehicle navigation device, boundaries of the path within each of the plurality of shadow processed images based on a histogram of each of the plurality of shadow processed images; and
estimating, by the vehicle navigation device, a distance between the boundaries of the path identified in the plurality of shadow processed images, based on a disparity map created for the plurality of shadow processed images and parameters associated with the plurality of cameras.
2. The method of claim 1, further comprising computing navigation coordinates for the vehicle on the path being traversed by the vehicle, based on the distance estimated between the boundaries of the path.
3. The method of claim 1, wherein the predefined placement criteria comprise at least one of:
placement of each of the plurality of cameras in the same horizontal plane; and
parallel placement of the optical axes of the plurality of cameras.
4. The method of claim 1, further comprising converting each of the plurality of images captured by the plurality of cameras to the HSV color space.
5. The method of claim 1, wherein processing a shadowed region in an image from the plurality of images comprises:
identifying at least one shadowed pixel having a V value greater than a shadow threshold value;
computing a spectral ratio correction factor based on a sum of the S value of each pixel in the image and a sum of the V value of each pixel in the image; and
applying the spectral ratio correction factor to each of the at least one shadowed pixel in the image to generate a shadow processed image.
6. The method of claim 5, wherein processing the shadowed region in the image further comprises blurring the shadow processed image with median filtering.
7. The method of claim 1, wherein identifying boundaries of the path within a shadow processed image from the plurality of shadow processed images comprises:
converting the shadow processed image to a grayscale image;
plotting a histogram associated with the grayscale image;
identifying local minima and local maxima of peaks in the histogram;
determining a threshold pixel value corresponding to a path region in the shadow processed image, and a peak value associated with the threshold pixel value in the histogram; and
plotting contours using the histogram based on the threshold pixel value to identify boundaries of the path.
8. The method of claim 7, wherein identifying boundaries of the path within the shadow processed image further comprises applying a Ramer-Douglas-Peucker (RDP) algorithm to the shadow processed image when the path is unstructured.
9. The method of claim 8, wherein estimating the distance between the boundaries of the path comprises:
extracting a plurality of patches along epipolar lines from each of the plurality of shadow processed images;
determining a plurality of mean squared error values based on a comparison of each block within each of the plurality of patches with each block within the remaining plurality of patches; and
creating a disparity map for the plurality of shadow processed images based on the plurality of mean squared error values.
10. The method of claim 1, wherein a stereo camera comprises the plurality of cameras.
11. A vehicle navigation device for identifying a path boundary, the vehicle navigation device comprising:
a plurality of cameras, wherein placement of the plurality of cameras satisfies predefined placement criteria;
a processor communicatively coupled to the plurality of cameras; and
a memory communicatively coupled to the processor and having processor instructions stored thereon, which, on execution, cause the processor to:
capture a plurality of images of a path being traversed by a vehicle through the plurality of cameras;
process shadowed regions within each of the plurality of images based on an associated Hue, Saturation, and Value (HSV) color space to generate a plurality of shadow processed images;
identify boundaries of the path within each of the plurality of shadow processed images based on a histogram of each of the plurality of shadow processed images; and
estimate a distance between the boundaries of the path identified in the plurality of shadow processed images, based on a disparity map created for the plurality of shadow processed images and parameters associated with the plurality of cameras.
12. The vehicle navigation device of claim 11, wherein the processor instructions further cause the processor to compute navigation coordinates for the vehicle on the path being traversed by the vehicle, based on the distance estimated between the boundaries of the path.
13. The vehicle navigation device of claim 11, wherein the predefined placement criteria comprise at least one of:
placement of each of the plurality of cameras in the same horizontal plane; and
parallel placement of the optical axes of the plurality of cameras.
14. The vehicle navigation device of claim 11, wherein the processor instructions further cause the processor to convert each of the plurality of images captured by the plurality of cameras to the HSV color space.
15. The vehicle navigation device of claim 11, wherein to process a shadowed region in an image from the plurality of images, the processor instructions further cause the processor to:
identify at least one shadowed pixel having a V value greater than a shadow threshold value;
compute a spectral ratio correction factor based on a sum of the S value of each pixel in the image and a sum of the V value of each pixel in the image; and
apply the spectral ratio correction factor to each of the at least one shadowed pixel in the image to generate a shadow processed image.
16. The vehicle navigation device of claim 15, wherein to process the shadowed region in the image, the processor instructions further cause the processor to blur the shadow processed image with median filtering.
17. The vehicle navigation device of claim 11, wherein to identify boundaries of the path within a shadow processed image from the plurality of shadow processed images, the processor instructions further cause the processor to:
convert the shadow processed image to a grayscale image;
plot a histogram associated with the grayscale image;
identify local minima and local maxima of peaks in the histogram;
determine a threshold pixel value corresponding to a path region in the shadow processed image, and a peak value associated with the threshold pixel value in the histogram; and
plot contours using the histogram based on the threshold pixel value to identify boundaries of the path.
18. The vehicle navigation device of claim 17, wherein the processor instructions further cause the processor to apply a Ramer-Douglas-Peucker (RDP) algorithm to the shadow processed image when the path is unstructured.
19. The vehicle navigation device of claim 18, wherein to estimate the distance between the boundaries of the path, the processor instructions further cause the processor to:
extract a plurality of patches along epipolar lines from each of the plurality of shadow processed images;
determine a plurality of mean squared error values based on a comparison of each block within each of the plurality of patches with each block within the remaining plurality of patches; and
create a disparity map for the plurality of shadow processed images based on the plurality of mean squared error values.
Dated this 9th day of August, 2017
R Ramya Rao
of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
This disclosure relates generally to autonomous vehicles, and more particularly to a method and device for identifying a path boundary for vehicle navigation.