
Method And System For Generating And Updating Vehicle Navigation Maps With Features Of Navigation Paths

Abstract: This disclosure relates generally to vehicle navigation maps, and more particularly to a method and system for generating and updating vehicle navigation maps with features of navigation paths. In one embodiment, a method may include receiving a position of a vehicle and an environmental field of view (FOV) of the vehicle along a navigation path of the vehicle on a navigation map, extracting features of the navigation path from the environmental FOV, correlating the features with the navigation path on the navigation map based on the position, generating a features based navigation map based on the correlation, and transmitting the features based navigation map to a server of a navigation map service provider for storage and subsequent use. The features based navigation map, when required to assist a navigation of another vehicle, may be accessed to assess the features of the navigation path and to provide the assessment to the other vehicle. (FIGURE 2)


Patent Information

Application #: 201841000427
Filing Date: 04 January 2018
Publication Number: 27/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-07-31
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. RANJITH ROCHAN MEGANATHAN
90, Block No.11 Jail Hill Revenue Quarters, Ooty, The Nilgiris 643001, Tamil Nadu, India.
2. AARTHI ALAGAMMAI KASI
25, EVR Salai, West Govindapuram, Dindigul 624001, Tamil Nadu, India.
3. SUJATHA JAGANNATH
O-103, HMT Township, Sector 1, Jalahalli, Bangalore-560013, Karnataka, India.
4. RAGHOTTAM MANNOPANTAR
Pristine Paradise, #105, Near Shantiniketan School, Bilekahalli, Bangalore -560076, Karnataka, India.

Specification

Claims:

WE CLAIM
1. A method of generating a vehicle navigation map with features of a navigation path, the method comprising:
receiving, via a map generation device, a position of a vehicle and an environmental field of view (FOV) of the vehicle along a navigation path of the vehicle on a navigation map;
extracting, via the map generation device, a plurality of features of the navigation path from the environmental FOV;
correlating, via the map generation device, the plurality of features with the navigation path on the navigation map based on the position;
generating, via the map generation device, a features based navigation map based on the correlation; and
transmitting, via the map generation device, the features based navigation map to a server of a navigation map service provider for storage and for subsequent use, wherein, when required to assist a navigation of another vehicle, the features based navigation map is accessed, the plurality of features of the navigation path is assessed, and the assessment is provided to the other vehicle.
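
The pipeline recited in claim 1 can be sketched as follows; every name here (`build_feature_map`, `extract_features`, `upload`, the dict-based map) is a hypothetical stand-in for the claimed map generation device, not an API from the specification:

```python
def build_feature_map(position, environmental_fov, navigation_map,
                      extract_features, upload):
    """Sketch of the claimed steps: extract, correlate, generate, transmit.

    extract_features: callable mapping an environmental FOV to a dict of
    path features. upload: callable that sends the features based map to
    the provider's server. Both are illustrative placeholders.
    """
    # Extract features of the navigation path from the environmental FOV.
    features = extract_features(environmental_fov)
    # Correlate the features with the navigation path at the vehicle's
    # position, producing a features based navigation map.
    features_map = dict(navigation_map)
    features_map[position] = features
    # Transmit for storage and subsequent use by other vehicles.
    upload(features_map)
    return features_map
```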

2. The method of claim 1, wherein the position of the vehicle along the navigation path is received from a GPS sensor, wherein the environmental FOV along the navigation path is received from a light detection and ranging (LIDAR) scanner placed in front of the vehicle, and wherein the navigation path and the navigation map are received from the navigation map service provider.

3. The method of claim 1, wherein the plurality of features are extracted by extracting and analyzing ground data points from environmental data points cloud of the environmental FOV.

4. The method of claim 1, wherein the plurality of features comprises at least one of a width of the navigation path, a hump on the navigation path, an angle of elevation of the navigation path, a number of intersections on the navigation path, or an angle of turn at an intersection on the navigation path.

5. The method of claim 4, wherein extracting the width of the navigation path from the environmental FOV comprises:
extracting ground data points from environmental data points cloud of the environmental FOV;
determining a gradient between adjacent data points in the ground data points;
determining boundary data points based on the corresponding gradients and a threshold value; and
determining the width of the navigation path based on a distance between the boundary data points.
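
A minimal sketch of the width extraction in claim 5, assuming the ground data points arrive as one lateral scan line of (offset, height) samples; the gradient threshold and all names are illustrative, not from the specification:

```python
import numpy as np

def path_width(ground_points, grad_threshold=0.3):
    """Estimate path width from a lateral scan of ground points.

    ground_points: (N, 2) sequence of (lateral_offset, height) samples,
    sorted by lateral offset.
    """
    pts = np.asarray(ground_points, dtype=float)
    # Gradient (height change per unit lateral distance) between
    # adjacent ground data points.
    grad = np.abs(np.diff(pts[:, 1]) / np.diff(pts[:, 0]))
    # Boundary data points: where the gradient exceeds the threshold
    # (e.g. a kerb or road edge).
    boundary = np.where(grad > grad_threshold)[0]
    if boundary.size < 2:
        return None  # no clear edges in this scan
    left, right = boundary[0] + 1, boundary[-1]
    # Width is the lateral distance between the boundary data points.
    return float(pts[right, 0] - pts[left, 0])
```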

6. The method of claim 4, wherein extracting the angle of turn at the intersection on the navigation path from the environmental FOV comprises:
from the environmental FOV, determining a width of the navigation path and a width of an intersecting navigation path;
determining center data points of the navigation path based on the width of the navigation path, and center data points of the intersecting navigation path based on the width of the intersecting navigation path;
determining a slope of a center line of the navigation path based on the center data points of the navigation path, and a slope of a center line of the intersecting navigation path based on the center data points of the intersecting navigation path; and
determining the angle of turn at the intersection based on the slope of the center line of the navigation path and the slope of the center line of the intersecting navigation path.
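
The steps of claim 6 reduce to comparing two centre-line slopes. A sketch, assuming a least-squares slope fit per set of centre data points (the fitting method and names are illustrative):

```python
import math

def turn_angle(center_a, center_b):
    """Angle of turn at an intersection from two sets of centre points.

    center_a / center_b: lists of (x, y) centre data points of the
    navigation path and the intersecting navigation path.
    """
    def slope(points):
        # Least-squares slope of a centre line through the points.
        n = len(points)
        mx = sum(x for x, _ in points) / n
        my = sum(y for _, y in points) / n
        num = sum((x - mx) * (y - my) for x, y in points)
        den = sum((x - mx) ** 2 for x, _ in points)
        return num / den

    m1, m2 = slope(center_a), slope(center_b)
    denom = 1 + m1 * m2
    if denom == 0:
        return 90.0  # perpendicular centre lines
    # Angle between the two centre lines from their slopes.
    return math.degrees(math.atan(abs((m1 - m2) / denom)))
```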

7. The method of claim 4, wherein extracting the angle of elevation of the navigation path from the environmental FOV comprises:
extracting ground data points from environmental data points cloud of the environmental FOV;
clustering the ground data points to obtain clustered data points; and
determining the angle of elevation of the navigation path based on a relative gradient between the clustered data points.
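
A sketch of claim 7's elevation extraction; fixed distance bins stand in for the claimed clustering, and the relative gradient is taken between the nearest and farthest clusters (bin size and names are illustrative):

```python
import math

def elevation_angle(ground_points, bin_size=5.0):
    """Angle of elevation from clustered ground data points.

    ground_points: (distance_ahead, height) samples from the
    environmental FOV.
    """
    # Cluster the ground data points into fixed distance bins.
    clusters = {}
    for d, h in ground_points:
        clusters.setdefault(int(d // bin_size), []).append((d, h))
    # One representative (mean distance, mean height) per cluster.
    reps = sorted(
        (sum(d for d, _ in c) / len(c), sum(h for _, h in c) / len(c))
        for c in clusters.values()
    )
    # Relative gradient between the nearest and farthest clusters.
    (d0, h0), (d1, h1) = reps[0], reps[-1]
    return math.degrees(math.atan2(h1 - h0, d1 - d0))
```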

8. The method of claim 4, wherein extracting the hump on the navigation path from the environmental FOV comprises:
obtaining a set of frames of environmental FOV at different time intervals;
overlaying the set of frames of environmental FOV on top of each other to generate an overlaid frame;
extracting ground data points from the overlaid frame;
determining a gradient, in a vertical direction, between adjacent data points in the ground data points;
iterating the process of obtaining a new set of frames and determining a new gradient, based on the gradient and a threshold value;
determining a coordinate data point for each iteration based on the corresponding gradient; and
determining a slope of the hump based on the coordinate data points.
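
A condensed sketch of the hump detection in claim 8; overlaying the frames is modelled as concatenating and sorting their ground points, and the iterative coordinate-point collection is collapsed into one pass over the vertical gradients (the threshold and names are illustrative):

```python
import numpy as np

def hump_slope(frames, grad_threshold=0.05):
    """Estimate the slope of a hump from overlaid FOV frames.

    frames: list of (N, 2) arrays of (distance, height) ground data
    points captured at different time intervals.
    """
    # Overlay the frames and sort the combined ground points by distance.
    overlaid = np.concatenate([np.asarray(f, dtype=float) for f in frames])
    overlaid = overlaid[np.argsort(overlaid[:, 0])]
    # Vertical gradient between adjacent ground data points.
    grad = np.diff(overlaid[:, 1]) / np.diff(overlaid[:, 0])
    # Coordinate data points where the rising face of the hump starts
    # and where it peaks.
    rising = np.where(grad > grad_threshold)[0]
    if rising.size == 0:
        return 0.0  # no hump found
    start, peak = overlaid[rising[0]], overlaid[rising[-1] + 1]
    # Slope of the hump from the two coordinate data points.
    return float((peak[1] - start[1]) / (peak[0] - start[0]))
```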

9. The method of claim 1, wherein the plurality of features are correlated with the navigation path by overlaying each of the plurality of features with the navigation path for a given position on the navigation map.

10. The method of claim 1, wherein assisting the navigation of the other vehicle comprises providing, based on the assessment, at least one of an alert, a warning message, a notification, an instruction, or an alternate navigation path to at least one of an infotainment device of the other vehicle, an autonomous navigation module of the other vehicle, or a personal device of a passenger of the other vehicle.

11. The method of claim 1, further comprising:
receiving a feedback from the other vehicle with respect to a presence of one or more new features or an absence of one or more existing features of the navigation path; and
dynamically updating the features based navigation map based on the feedback.

12. A system for generating a vehicle navigation map with features of a navigation path, the system comprising:
a map generation device comprising at least one processor and a computer-readable medium storing instructions that, when executed by the at least one processor, cause the at least one processor to perform operations comprising:
receiving a position of a vehicle and an environmental field of view (FOV) of the vehicle along a navigation path of the vehicle on a navigation map;
extracting a plurality of features of the navigation path from the environmental FOV;
correlating the plurality of features with the navigation path on the navigation map based on the position;
generating a features based navigation map based on the correlation; and
transmitting the features based navigation map to a server of a navigation map service provider for storage and for subsequent use, wherein, when required to assist a navigation of another vehicle, the features based navigation map is accessed, the plurality of features of the navigation path is assessed, and the assessment is provided to the other vehicle.

13. The system of claim 12, wherein the plurality of features are extracted by extracting and analyzing ground data points from environmental data points cloud of the environmental FOV.

14. The system of claim 12, wherein the plurality of features comprises at least one of a width of the navigation path, a hump on the navigation path, an angle of elevation of the navigation path, a number of intersections on the navigation path, or an angle of turn at an intersection on the navigation path.

15. The system of claim 12, wherein the plurality of features are correlated with the navigation path by overlaying each of the plurality of features with the navigation path for a given position on the navigation map.

16. The system of claim 12, wherein assisting the navigation of the other vehicle comprises providing, based on the assessment, at least one of an alert, a warning message, a notification, an instruction, or an alternate navigation path to at least one of an infotainment device of the other vehicle, an autonomous navigation module of the other vehicle, or a personal device of a passenger of the other vehicle.

17. The system of claim 12, wherein the operations further comprise:
receiving a feedback from the other vehicle with respect to a presence of one or more new features or an absence of one or more existing features of the navigation path; and
dynamically updating the features based navigation map based on the feedback.

Dated this 4th day of January 2018

Swetha SN
IN/PA-2123
Of K&S Partners
Agent for the Applicant
Description:

TECHNICAL FIELD
This disclosure relates generally to vehicle navigation maps, and more particularly to a method and system for generating and updating vehicle navigation maps with features of navigation paths.

Documents

Application Documents

# Name Date
1 201841000427-STATEMENT OF UNDERTAKING (FORM 3) [04-01-2018(online)].pdf 2018-01-04
2 201841000427-REQUEST FOR EXAMINATION (FORM-18) [04-01-2018(online)].pdf 2018-01-04
3 201841000427-REQUEST FOR CERTIFIED COPY [04-01-2018(online)].pdf 2018-01-04
4 201841000427-POWER OF AUTHORITY [04-01-2018(online)].pdf 2018-01-04
5 201841000427-FORM 18 [04-01-2018(online)].pdf 2018-01-04
6 201841000427-FORM 1 [04-01-2018(online)].pdf 2018-01-04
7 201841000427-DRAWINGS [04-01-2018(online)].pdf 2018-01-04
8 201841000427-DECLARATION OF INVENTORSHIP (FORM 5) [04-01-2018(online)].pdf 2018-01-04
9 201841000427-COMPLETE SPECIFICATION [04-01-2018(online)].pdf 2018-01-04
10 abstract 201841000427 .jpg 2018-01-05
11 201841000427-REQUEST FOR CERTIFIED COPY [12-03-2018(online)].pdf 2018-03-12
12 201841000427-Proof of Right (MANDATORY) [23-04-2018(online)].pdf 2018-04-23
13 Correspondence by Agent_Form 1_26-04-2018.pdf 2018-04-26
14 201841000427-FORM 3 [04-03-2021(online)].pdf 2021-03-04
15 201841000427-Information under section 8(2) [04-03-2021(online)].pdf 2021-03-04
16 201841000427-PETITION UNDER RULE 137 [04-03-2021(online)].pdf 2021-03-04
17 201841000427-RELEVANT DOCUMENTS [04-03-2021(online)].pdf 2021-03-04
18 201841000427-FER_SER_REPLY [22-03-2021(online)].pdf 2021-03-22
19 201841000427-FER.pdf 2021-10-17
20 201841000427-IntimationOfGrant31-07-2023.pdf 2023-07-31
21 201841000427-PatentCertificate31-07-2023.pdf 2023-07-31
22 201841000427-PROOF OF ALTERATION [04-10-2023(online)].pdf 2023-10-04

Search Strategy

1 SearchStrategy_201841000427E_21-09-2020.pdf

ERegister / Renewals

3rd: 16 Oct 2023

From 04/01/2020 - To 04/01/2021

4th: 16 Oct 2023

From 04/01/2021 - To 04/01/2022

5th: 16 Oct 2023

From 04/01/2022 - To 04/01/2023

6th: 16 Oct 2023

From 04/01/2023 - To 04/01/2024

7th: 16 Oct 2023

From 04/01/2024 - To 04/01/2025

8th: 02 Jan 2025

From 04/01/2025 - To 04/01/2026