
Method And System For Detecting Obstacles By Autonomous Vehicles In Real Time

Abstract: The present disclosure relates to detection of obstacles by an autonomous vehicle in real-time. An obstacle detection system of the vehicle obtains point cloud data, from single-frame Light Detection and Ranging (LIDAR) data, and camera data of the surroundings of the vehicle. The system processes the camera data to identify and extract regions comprising the obstacles, extracts the point cloud data corresponding to the obstacles, and enhances that point cloud data with the detailed information of the obstacles provided by the camera data. The system then processes the enhanced point cloud data to determine the obstacles along with their structure and orientation. Upon determining these details, the system provides instructions to the vehicle to manoeuvre around the obstacles. The disclosed obstacle detection system provides accurate data about the structure, orientation and location of the obstacles. Figure 2


Patent Information

Application #:
Filing Date: 30 January 2018
Publication Number: 31/2019
Publication Type: INA
Invention Field: COMMUNICATION
Status:
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-06-21
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035.

Inventors

1. RANJITH ROCHAN MEGANATHAN
90, Block No. 11, Jail Hill Revenue Quarters, The Nilgiris 643001, Ooty, Tamil Nadu.
2. SUJATHA JAGANNATH
O-103, HMT Township, sector 1, Jalahalli, Bangalore-560013.

Specification

Claims:

We claim:

1. A method for obstacle detection by a vehicle in real-time, comprising:
receiving, by an Electronic Control Unit (ECU) of a vehicle, one or more images of surroundings of the vehicle from an imaging unit and point cloud data of the surroundings of the vehicle from a Light Detection and Ranging (LIDAR) unit;
generating, by the ECU, a disparity image by analysing the one or more images;
extracting, by the ECU, one or more regions comprising one or more obstacles, from the disparity image;
determining, by the ECU, boundary points of each of the one or more obstacles, for determining a position of each of the one or more obstacles in terms of image co-ordinates in the disparity image;
extracting, by the ECU, point cloud data points from the point cloud data based on the image co-ordinates of the boundary points corresponding to the one or more obstacles; and
determining, by the ECU, a gradient for each of the point cloud data points to determine a structural orientation of the one or more obstacles, thereby detecting the one or more obstacles.
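For illustration only, the point-extraction step of claim 1 can be sketched in Python. Everything here is an assumption made for the sketch: a 3x4 camera projection matrix and a rectangular boundary box stand in for the claimed image co-ordinates of the boundary points, and none of the names come from the specification.

```python
import numpy as np

def extract_obstacle_points(points_xyz, projection, boundary_box):
    """Keep LIDAR points whose image projection falls inside an
    obstacle's boundary box (u_min, v_min, u_max, v_max)."""
    # Project 3-D points into the image plane (homogeneous coordinates).
    homo = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    uvw = homo @ projection.T          # assumed 3x4 camera projection matrix
    uv = uvw[:, :2] / uvw[:, 2:3]      # normalise by depth
    u_min, v_min, u_max, v_max = boundary_box
    mask = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
            (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max) &
            (uvw[:, 2] > 0))           # only points in front of the camera
    return points_xyz[mask]
```

A point is kept when its projection falls inside the obstacle's boundary and it lies in front of the camera; the projection matrix would come from the LIDAR-camera calibration.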

2. The method as claimed in claim 1, wherein determining the structural orientation of the one or more obstacles comprises:
identifying, by the ECU, at least one point from the point cloud data points;
projecting, by the ECU, a first virtual hemisphere and a second virtual hemisphere with the identified at least one point as a centre of the first virtual hemisphere and the second virtual hemisphere respectively, wherein the first virtual hemisphere projects outwardly towards a top direction and the second virtual hemisphere projects outwardly towards a bottom direction of the identified at least one point, wherein the projection of the first virtual hemisphere and the second virtual hemisphere is performed until one of: a radius of the virtual hemispheres is equal to a first threshold value, and a number of point cloud data points present in the virtual hemisphere is equal to a second threshold value;
connecting, by the ECU, the identified at least one point to each of the point cloud data points that lie within the first virtual hemisphere, for forming a gradient corresponding to each of the point cloud data points in the first virtual hemisphere;
connecting, by the ECU, the identified at least one point to each of the point cloud data points that lie within the second virtual hemisphere, for forming a gradient corresponding to each of the point cloud data points in the second virtual hemisphere;
determining, by the ECU, a resultant gradient in the first virtual hemisphere and the second virtual hemisphere based on the gradient corresponding to each of the point cloud data points in the first virtual hemisphere and the second virtual hemisphere, with respect to the at least one point; and
accumulating, by the ECU, the resultant gradient in the first virtual hemisphere and the second virtual hemisphere, of each of the point cloud data points, for determining the structural orientation of the one or more obstacles.
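The hemisphere-based gradient accumulation of claim 2 can be sketched as follows. This is a minimal illustration, not the claimed implementation: the radius and point-count thresholds are arbitrary, and unit vectors from the seed point to each neighbour stand in for the claimed gradients.

```python
import numpy as np

def resultant_gradients(points, centre_idx, radius=1.0, max_pts=16):
    """For one seed point, gather neighbours in an upper and a lower
    virtual hemisphere and sum the unit vectors (gradients) from the
    seed to each neighbour.  Growth stops at `radius` or `max_pts`."""
    centre = points[centre_idx]
    offsets = points - centre
    dists = np.linalg.norm(offsets, axis=1)
    near = (dists > 0) & (dists <= radius)      # first threshold: radius
    upper = near & (offsets[:, 2] >= 0)         # hemisphere above the seed
    lower = near & (offsets[:, 2] < 0)          # hemisphere below the seed

    def accumulate(mask):
        idx = np.flatnonzero(mask)[:max_pts]    # second threshold: point count
        if len(idx) == 0:
            return np.zeros(3)
        grads = offsets[idx] / dists[idx, None] # unit gradient per neighbour
        return grads.sum(axis=0)                # resultant gradient

    return accumulate(upper), accumulate(lower)
```

Accumulating the two resultants over every seed point would give the structural orientation described in the claim.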

3. The method as claimed in claim 1, wherein detecting each of the one or more obstacles comprises:
mapping, by the ECU, each pixel within the boundary points of each of the one or more obstacles to the gradient formed for each point of the point cloud data points of the corresponding one or more obstacles, based on a distance error, to generate an enhanced point cloud data corresponding to the one or more obstacles;
dividing, by the ECU, the point cloud data of the surroundings into a plurality of two-dimensional (2D) virtual grids, wherein each of the 2D virtual grids comprises a plurality of enhanced point cloud data points; and
detecting, by the ECU, one or more 2D virtual grids representing each of the one or more obstacles based on a height difference between at least two enhanced point cloud data points in each of the one or more 2D virtual grids, wherein the height difference exceeds a height threshold.
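The grid-based detection of claim 3 can be illustrated with a short sketch, assuming square X-Y cells and using the spread of Z values within a cell as the claimed height difference; the cell size and height threshold are arbitrary values chosen for the example.

```python
import numpy as np

def detect_obstacle_cells(points, cell=0.5, height_threshold=0.3):
    """Divide points into 2-D virtual grid cells in the X-Y plane and
    flag a cell as an obstacle when the spread of Z values inside it
    exceeds the height threshold (flat ground shows little spread)."""
    cells = {}
    for x, y, z in points:
        key = (int(np.floor(x / cell)), int(np.floor(y / cell)))
        cells.setdefault(key, []).append(z)
    return {key for key, zs in cells.items()
            if max(zs) - min(zs) > height_threshold}
```

Ground cells contain points of nearly equal height, so only cells crossed by a vertical structure survive the threshold.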

4. The method as claimed in claim 1, wherein the point cloud data of the surroundings comprises a plurality of three-dimensional (3D) data points defined by corresponding Cartesian co-ordinates.

5. The method as claimed in claim 1, wherein the disparity image is generated from the one or more images comprising a first image and a second image, wherein generating of the disparity image comprises:
matching each pixel in the first image to a corresponding pixel in the second image;
computing a disparity value between each pixel in the first image to a corresponding pixel in the second image; and
generating a disparity image based on the disparity value, wherein each pixel in the disparity image comprises the disparity value.
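The disparity computation of claims 5 and 10 can be sketched as a brute-force block matcher: for each pixel of the first (left) image, candidate pixels are searched along the same row of the second (right) image, and the shift with the lowest sum-of-absolute-differences cost becomes that pixel's disparity value. The window size and disparity range are assumptions; real stereo pipelines use far faster matchers.

```python
import numpy as np

def disparity_image(left, right, max_disp=16, window=2):
    """Per-pixel disparity between two rectified grayscale images via
    exhaustive sum-of-absolute-differences (SAD) matching per row."""
    h, w = left.shape
    pad = window
    L = np.pad(left.astype(np.float64), pad, mode="edge")
    R = np.pad(right.astype(np.float64), pad, mode="edge")
    disp = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            patch = L[y:y + 2 * pad + 1, x:x + 2 * pad + 1]
            best, best_d = np.inf, 0
            for d in range(min(max_disp, x) + 1):
                cand = R[y:y + 2 * pad + 1, x - d:x - d + 2 * pad + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Each pixel of the returned image holds the disparity value for the matched pixel pair, as the claim requires.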

6. An obstacle detection system for detecting obstacles by a vehicle in real-time, the obstacle detection system comprising:
an Electronic Control Unit (ECU), configured to:
receive, one or more images of surroundings of the vehicle from an imaging unit and point cloud data of the surroundings of the vehicle from a Light Detection and Ranging (LIDAR) unit;
generate, a disparity image by analysing the one or more images;
extract, one or more regions comprising one or more obstacles, from the disparity image;
determine, boundary points of each of the one or more obstacles, for determining a position of each of the one or more obstacles in terms of image co-ordinates in the disparity image;
extract, point cloud data points from the point cloud data based on the image co-ordinates of the boundary points corresponding to the one or more obstacles; and
determine, a gradient for each of the point cloud data points to determine a structural orientation of the one or more obstacles, thereby detecting each of the one or more obstacles.

7. The obstacle detection system as claimed in claim 6, wherein during determining the structural orientation of the one or more obstacles, the ECU is configured to:
identify, at least one point from the point cloud data points;
project, a first virtual hemisphere and a second virtual hemisphere with the identified at least one point as a centre of the first virtual hemisphere and the second virtual hemisphere respectively, wherein the first virtual hemisphere projects outwardly towards a top direction and the second virtual hemisphere projects outwardly towards a bottom direction of the identified at least one point, wherein the projection of the first virtual hemisphere and the second virtual hemisphere is performed until one of: a radius of the virtual hemispheres is equal to a first threshold value, and a number of point cloud data points present in the virtual hemisphere is equal to a second threshold value;
connect, the identified at least one point to each of the point cloud data points that lie within the first virtual hemisphere, for forming a gradient corresponding to each of the point cloud data points in the first virtual hemisphere;
connect, the identified at least one point to each of the point cloud data points that lie within the second virtual hemisphere, for forming a gradient corresponding to each of the point cloud data points in the second virtual hemisphere;
determine, a resultant gradient in the first virtual hemisphere and the second virtual hemisphere based on the gradient corresponding to each of the point cloud data points in the first virtual hemisphere and the second virtual hemisphere, with respect to the at least one point; and
accumulate, the resultant gradient in the first virtual hemisphere and the second virtual hemisphere, of each of the point cloud data points, for determining the structural orientation of the one or more obstacles.

8. The obstacle detection system as claimed in claim 6, wherein during detecting of one or more obstacles in the surroundings of the vehicle, the ECU is configured to:
map, each pixel within the boundary points of each of the one or more obstacles to the gradient formed for each point of the point cloud data points of the corresponding one or more obstacles, based on a distance error, to generate an enhanced point cloud data corresponding to the one or more obstacles;
divide, the point cloud data of the surroundings into a plurality of two-dimensional (2D) virtual grids, wherein each of the 2D virtual grids comprises a plurality of enhanced point cloud data points; and
detect, one or more 2D virtual grids representing each of the one or more obstacles based on a height difference between at least two enhanced point cloud data points in each of the one or more 2D virtual grids, wherein the height difference exceeds a height threshold.

9. The obstacle detection system as claimed in claim 6, wherein the point cloud data of the surroundings comprises a plurality of three-dimensional (3D) data points defined by corresponding Cartesian co-ordinates.

10. The obstacle detection system as claimed in claim 6, wherein the disparity image is generated from the one or more images comprising a first image and a second image, wherein generating of the disparity image comprises:
matching each pixel in the first image to a corresponding pixel in the second image;
computing a disparity value between each pixel in the first image to a corresponding pixel in the second image; and
generating a disparity image based on the disparity value, wherein each pixel in the disparity image comprises the disparity value.

Dated this 30th day of January 2018

R Ramya Rao
IN/PA-1607
Of K&S Partners
Agent for the Applicant

Description:

TECHNICAL FIELD
The present disclosure relates to autonomous vehicles. Particularly, but not exclusively, the present disclosure relates to a method and a system for detecting obstacles around an autonomous vehicle in real-time.

Documents

Application Documents

# Name Date
1 201841003468-STATEMENT OF UNDERTAKING (FORM 3) [30-01-2018(online)].pdf 2018-01-30
2 201841003468-REQUEST FOR EXAMINATION (FORM-18) [30-01-2018(online)].pdf 2018-01-30
3 201841003468-REQUEST FOR CERTIFIED COPY [30-01-2018(online)].pdf 2018-01-30
4 201841003468-POWER OF AUTHORITY [30-01-2018(online)].pdf 2018-01-30
5 201841003468-FORM 18 [30-01-2018(online)].pdf 2018-01-30
6 201841003468-FORM 1 [30-01-2018(online)].pdf 2018-01-30
7 201841003468-DRAWINGS [30-01-2018(online)].pdf 2018-01-30
8 201841003468-DECLARATION OF INVENTORSHIP (FORM 5) [30-01-2018(online)].pdf 2018-01-30
9 201841003468-COMPLETE SPECIFICATION [30-01-2018(online)].pdf 2018-01-30
10 201841003468-REQUEST FOR CERTIFIED COPY [06-03-2018(online)].pdf 2018-03-06
11 201841003468-Proof of Right (MANDATORY) [23-04-2018(online)].pdf 2018-04-23
12 Correspondence by Agent_Form30,Form1_26-04-2018.pdf 2018-04-26
13 201841003468-CLAIMS [29-09-2021(online)].pdf 2021-09-29
14 201841003468-COMPLETE SPECIFICATION [29-09-2021(online)].pdf 2021-09-29
15 201841003468-DRAWING [29-09-2021(online)].pdf 2021-09-29
16 201841003468-FER_SER_REPLY [29-09-2021(online)].pdf 2021-09-29
17 201841003468-FORM 3 [29-09-2021(online)].pdf 2021-09-29
18 201841003468-Information under section 8(2) [29-09-2021(online)].pdf 2021-09-29
19 201841003468-OTHERS [29-09-2021(online)].pdf 2021-09-29
20 201841003468-PETITION UNDER RULE 137 [29-09-2021(online)].pdf 2021-09-29
21 201841003468-FER.pdf 2021-10-17
22 201841003468-IntimationOfGrant21-06-2022.pdf 2022-06-21
23 201841003468-PatentCertificate21-06-2022.pdf 2022-06-21
24 201841003468-PROOF OF ALTERATION [05-09-2022(online)].pdf 2022-09-05

Search Strategy

1 2021-03-2616-37-47E_26-03-2021.pdf

E-Register / Renewals

3rd: 05 Sep 2022

From 30/01/2020 - To 30/01/2021

4th: 05 Sep 2022

From 30/01/2021 - To 30/01/2022

5th: 05 Sep 2022

From 30/01/2022 - To 30/01/2023

6th: 12 Jan 2023

From 30/01/2023 - To 30/01/2024

7th: 18 Jan 2024

From 30/01/2024 - To 30/01/2025

8th: 24 Jan 2025

From 30/01/2025 - To 30/01/2026