
A Vehicle Tracking System And A Method Thereof

Abstract: The present invention provides a system and method for tracking the current position of a vehicle by capturing surrounding information from the vicinity of the vehicle. The method comprises steps to process the captured surrounding information, in the form of image frames, using a locally adapted processor or a cloud server implemented with a big data-like framework. The location related data are transmitted to a remote server or cloud, or locally stored and processed in a fog computing environment. The location related data are analyzed over a period of time, based on historical image frames captured during the transit of the vehicle from one location to another, to compute the route information undertaken by the vehicle. Thereby, the actual position of the vehicle is determined on the map data by estimating from the route information and the captured image frames being transmitted or stored locally.


Patent Information

Application #
Filing Date
15 February 2017
Publication Number
33/2018
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
cal@patentindia.com
Parent Application

Applicants

ITC INFOTECH INDIA LIMITED
37 J.L. Nehru Road Kolkata West Bengal India 700071

Inventors

1. NEELAKANTAN, Naresh
No.18, ITC Infotech India Limited, Dodda Banaswadi main road, Maruthisevanagar, Bangalore Karnataka India 560005

Specification

Claims:

1. A method for tracking a location of at least one vehicle employing an Advanced Driver Assistance System (ADAS), said method comprising:
capturing, by means of an image capturing device, a plurality of surrounding information from the vicinity of said vehicle, when said vehicle is transiting from one location to another, wherein said surrounding information is captured as a plurality of image frames;
storing, by means of a remotely adapted and/or locally adapted storage unit, said plurality of image frames as historical data captured during the transit of said vehicle;
transmitting said historical data to an image processor means equipped with map data, wherein said processor means is remotely located or locally adapted in said vehicle;
processing, by said image processor means, said image frames so as to compute at least one route information undertaken by said vehicle according to said historical data;
determining, by said image processor means, at least one information corresponding to said location of said vehicle on said map data according to said route information; and thereby
synchronizing the route information of said vehicle by updating said map data with said location information of the vehicle.

2. A method for tracking a location of at least one vehicle, comprising:
capturing, by means of an image capturing device, a plurality of surrounding information from the vicinity of said vehicle, when said vehicle is transiting from one location to another, wherein said surrounding information is captured as a plurality of image frames;
storing, by means of a remotely adapted and/or locally adapted storage unit, said plurality of image frames as historical data captured during the transit of said vehicle;
transmitting said historical data to an image processor means equipped with map data, wherein said processor means is remotely located or locally adapted in said vehicle;
processing, by said image processor means, said image frames so as to compute at least one route information undertaken by said vehicle according to said historical data;
calculating, by said image processor means, at least one information corresponding to said location of said vehicle on said map data according to said route information; and thereby
synchronizing the route information of said vehicle by updating said map data with said location information of the vehicle.

3. The method as claimed in claim 1, wherein said image capturing device is selected from a camera device or vision sensor unit of ADAS implemented in said vehicle.

4. The method as claimed in claim 2, wherein said image capturing device is a camera or vision sensor unit mounted in said vehicle.

5. The method as claimed in claim 1 or claim 2, wherein said image capturing device captures said surrounding information selected from traffic signs, road signs, locality information from direction boards, billboards, hoarding information, landmark information, address information, and any combination thereof.

6. The method as claimed in claim 1 or claim 2, wherein said image processing means is operated on a big data-like framework to perform text mining and text processing on said image frames using a MapReduce-like method.

7. The method as claimed in claim 1 or claim 2, wherein said image processing means is selected from a cloud system remotely located from the vehicle or a fog system locally adapted with said vehicle.

8. The method as claimed in claim 7, wherein said map data corresponds to an offline map configured using said locally adapted fog computing processor.

9. The method as claimed in claim 7, wherein said map data is configured using said remotely adapted cloud server.

10. The method as claimed in claim 1 or claim 2, wherein the step of transmitting said historical data to a remotely adapted image processor means is by means of mobile network data.

11. A tracking system for obtaining location based information of at least one vehicle having an Advanced Driver Assistance System (ADAS) by performing the method steps as claimed in method claim 1, wherein said system comprises:
means for capturing a plurality of surrounding information from the vicinity of said vehicle in the form of a plurality of image frames, when said vehicle is transiting from one location to another;
an image processing means, remotely adapted or locally adapted in said vehicle, wherein said image processing means is configured to process said plurality of image frames captured during the transit and compute a route information undertaken by said vehicle according to said image frames;
an image analysis means adapted to extrapolate said location based information of said vehicle by comparing said route information on a map data equipped with said remotely located image processing means and/or said locally adapted processing means; and
displaying means to indicate the current location of said vehicle according to said location based information and thereby continuously synchronize said location based information back into the computed route information to update said map data.

12. The system as claimed in claim 11, wherein said means for capturing plurality of surrounding information includes a camera or vision sensor selected from said advanced driver assistance system.

13. The system as claimed in claim 11, wherein said image processing means is selected from a cloud system remotely located from the vehicle or a fog system locally adapted with said ADAS of vehicle.

14. The system as claimed in claim 11, wherein said plurality of image frames are stored as historical data by using said storage unit.

15. The system as claimed in claim 11, wherein said image processing means is operated on a big data-like framework to perform text mining and text processing on said image frames using a MapReduce-like method.

16. A tracking system for obtaining location based information of at least one vehicle by performing the method steps as claimed in method claim 2, wherein said system comprises:
means for capturing a plurality of surrounding information from the vicinity of said vehicle in the form of a plurality of image frames, when said vehicle is transiting from one location to another;
an image processing means, remotely adapted or locally adapted in said vehicle, wherein said image processing means is configured to process said plurality of image frames captured during the transit and compute a route information undertaken by said vehicle according to said image frames;
an image analysis means adapted to extrapolate said location based information of said vehicle by comparing said route information on a map data equipped with said remotely located image processing means and/or said locally adapted processing means; and
displaying means to indicate the current location of said vehicle according to said location based information and thereby continuously synchronize said location based information back into the computed route information to update said map data.
Description:

TECHNICAL FIELD

[001] The present subject matter described herein relates, in general, to the field of vehicle tracking systems. More specifically, the present invention relates to a system and method for tracking a vehicle location by gathering surrounding/locality information.

BACKGROUND

[002] Locating the position of a moving vehicle on a map is a common feature in the present world. Satellite-based navigation systems such as the Global Positioning System (GPS), or telecommunication triangulation systems based on the Global System for Mobile communication (GSM), are commonly used to locate a moving vehicle. A navigation system based on radio waves from GPS satellites accurately estimates the present position of the vehicle/host vehicle in which it is installed.

[003] However, the satellite or telecom signal often undergoes interference or jamming, leading to inaccuracy in locating the vehicle. When radio waves cannot be received from the GPS satellites, the error in the position located by the autonomous navigation method is amplified with the passage of time. Thus, the accuracy of the position gradually declines.

[004] With Advanced Driver Assistance Systems (ADAS) becoming a standard feature in new vehicles, the way is paved for building a stand-alone vehicle tracking system that also complements existing GPS or GSM based vehicle tracking systems as an add-on. This new method of locating the position using ADAS increases the accuracy of existing GPS and GSM based vehicle tracking systems, which are currently prone to failure and reduced accuracy under certain conditions.

[005] Some existing techniques for tracking the vehicle location are as follows. Reference is made to US4891650, which teaches a vehicle location system. The prior art document describes a system to locate a vehicle based on alarm signals from it when an emergency situation is encountered.

[006] Reference is made to US5043736, which teaches a cellular position locating system. This prior art describes a method of locating a handheld device equipped with a cellular signal. Using a cellular modem along with GPS satellites, the location of the handheld device is narrowed down, thereby giving the latitude and longitude of the location.

[007] Reference is made to US4928106, which teaches a Global Positioning System receiver with improved radio frequency and digital processing. This prior art describes a traditional GPS system which gives the location, velocity and other critical parameters of the receiver based on signals received from a minimum number of satellites.

[008] Reference is made to US20110161032 A1, which describes correction of a vehicle position by means of landmarks. This prior art describes a method of correcting the position of a vehicle, as given by a tracking apparatus, on the basis of the landmarks encountered during the transit of the vehicle, at the instant each respective landmark is reached.

[009] Reference is made to US8918277 B2, which describes a method for recognizing road signs in the vicinity of a vehicle and for synchronizing them to road sign information from a digital map: where road signs are recognized, road-sign recognition data are generated, navigation data are provided for localizing the vehicle in digital map data, and the road-sign recognition data are synchronized to the map data. This document limits the captured image data to road signs and the method of synchronizing the same onto a digital map. The system does not gather additional information, such as locality, address or landmark information, which gets collected en route, and does not combine all of this collected information to track the host vehicle on the map data.

[0010] Reference is made to US9177212 B2, wherein a method combines a road sign recognition system and a lane detection system of a motor vehicle. The road sign recognition system generates road sign information from sensor data of a camera-based or video-based sensor system, and the lane detection system generates lane course information from the sensor data. Meaning-indicating data for road signs are generated from the lane course information, and are used to check the plausibility of and/or to interpret the road sign information. Data indicating the course of the lane are generated from the road sign information, and are used to check the plausibility of and/or to interpret the lane course information. The method described in this prior art, however, is not capable of locating or tracking the host vehicle on a digital map based on the collected road sign and lane course information. Also, as in the previous case, the collected image data set is limited to road signs and lane course data.

[0011] Reference is made to US 20150185021 A1, wherein a method of measuring a position of a vehicle using cloud computing includes obtaining surrounding information according to a driving of the vehicle and driving information of the vehicle. The obtained surrounding information and the driving information of the vehicle are transmitted to a server which is remotely located from the vehicle and equipped with map data. A position of the vehicle is calculated from the surrounding information and the driving information of the vehicle by the server. The calculated position of the vehicle is transmitted to the vehicle and outputted. The prior art does provide location information from both the surrounding information captured by LIDAR and the driving information synchronized with a cloud computing platform. However, the system capturing the surrounding information is limited to a relatively expensive LIDAR sensor, as compared to the camera or vision sensor used in the current method. Added to this, the surrounding information is also limited to what a LIDAR sensor can perceive, in comparison to a camera or vision sensor, due to the type of sensor used. Further, a synchronization mechanism is described on the cloud computing platform, which might not be required in the current method if an offline digital map is present in the processing platform on the host vehicle.

[0012] Further reference is made to US9208389 B2, which discloses an apparatus and method for recognizing a current position of a vehicle. The apparatus receives information about an initial position of a vehicle from an external input, receives an image signal from an image sensor to take a picture of an identifiable object and extracts image signal information corresponding to the identifiable object from the image signal, connects to an internal network of the vehicle and receives information about a traveling state of the vehicle, and calculates the current position of the vehicle based on the information obtained. The method used for locating a vehicle has been described using a map matching method.

[0013] Yet another reference is made to US20140372020 A1, which relates generally to a navigation system and, more specifically, to devices and techniques for augmenting navigational instructions with visual information determined from image information acquired from one or more image acquisition devices. Further, the disclosed embodiments in the document teach a vehicle which may include a GPS sensor configured to provide an output representative of the position of the vehicle. The vehicle may also include one or more image acquisition devices configured to acquire a plurality of images of an environment outside of the vehicle. The vehicle may include a navigation system with at least one processing device configured to determine, based on an output of one or more position sensors associated with the navigation system, a current location of at least one component associated with the navigation system, and to determine a destination location different from the current location. This document thus describes a navigation system, which may include a GPS, going from one location to another; the system may augment the navigation device with image information captured during the course of the journey in the navigation direction from one place to another. The captured images, by themselves, are thus not capable of locating the vehicle in which the image capturing device is present. A further disadvantage of this prior art system is that it needs GPS for navigation.

[0014] Thus, the drawbacks of the conventional techniques to track the location of a vehicle are summarized as follows:
1. The functionality of conventional tracking systems is restricted to navigation-based data such as GPS or GSM, and in the absence of the same, the system will not generate any output.
2. Tracking of a vehicle fails when the GPS and cellular signals are lost, due to: GPS jamming; spectrum and band compatibility issues as new areas like V2V/V2X are explored; space weather, which distorts GPS signal reception; non-visibility of satellites; war zones where GPS satellites are blocked; long tunnels and underground road networks; and the like.
3. Further, substantial time is lost during rerouting of the navigation, which is not desirable.
4. A system capturing the surrounding information using a LIDAR sensor is expensive as compared to a camera or vision sensor.

[0015] Accordingly, in view of the aforementioned drawbacks of the prior art techniques, there is a dire need for an improved vehicle tracking system which generates a perceivable location output, making it unique in that it uses a method comparable to a human's observation of the objects around him during his transit from one location to another.

SUMMARY OF THE PRESENT INVENTION

[0016] The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the present invention. It is not intended to identify the key/critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description of the invention presented later.

[0017] The object of the present invention is to estimate the vehicle’s current position without using GPS or GSM based navigation systems.

[0018] Another object of the present invention is to track the current location of the vehicle/host vehicle by gathering surrounding/locality information using an image capturing device mounted on the host vehicle or using the ADAS system in the said vehicle.

[0019] Yet another object of the present invention is to provide a method and a system to locate or track the position of the host vehicle using a remotely or locally stored digital map according to the analyzed image frames captured from the surrounding information from the vicinity of the vehicle when the vehicle is transiting from one location to another.

[0020] Yet another object of the present invention is to provide a method and a system to locate or track the position of the host vehicle using a remotely or locally stored digital map according to the analyzed image frames captured from the surrounding of the vehicle when the vehicle is transiting from one location to another.

[0021] Yet another object of the present invention is to provide an automotive vehicle tracking system based on analysis and estimation of vehicle’s current position from the surrounding information nearby the vehicle which are obtained from image frames captured by the camera/vision sensor of the ADAS system in the vehicle.

[0022] Still another object of the present invention is to provide an offline digital map present in the processing platform on the host vehicle. Further processing in the current method uses a local existing big data like processing technique which may be like MapReduce for text mining and processing collected locality information from images.

[0023] Accordingly, in a first aspect of the present invention, there is provided a method for tracking a location of at least one vehicle employing an Advanced Driver Assistance System (ADAS), said method comprising:
• capturing, by means of an image capturing device, a plurality of surrounding information from the vicinity of said vehicle, when said vehicle is transiting from one location to another, wherein said surrounding information is captured as a plurality of image frames;
• storing, by means of a remotely adapted and/or locally adapted storage unit, said plurality of image frames as historical data captured during the transit of said vehicle;
• transmitting said historical data to an image processor means equipped with map data, wherein said processor means is remotely located or locally adapted in said vehicle;
• processing, by said image processor means, said image frames so as to compute at least one route information undertaken by said vehicle according to said historical data;
• determining, by said image processor means, at least one information corresponding to said location of said vehicle on said map data according to said route information; and thereby
• synchronizing the route information of said vehicle by updating said map data with said location information of the vehicle.

[0024] In a second aspect of the present invention, there is provided a tracking system for obtaining location based information of at least one vehicle using an Advanced Driver Assistance System (ADAS) and performing the method mentioned above, wherein said system comprises:
means for capturing a plurality of surrounding information from the vicinity of said vehicle in the form of a plurality of image frames, when said vehicle is transiting from one location to another;
an image processing means, remotely adapted or locally adapted in said vehicle, wherein said image processing means is configured to process said plurality of image frames captured during the transit and compute a route information undertaken by said vehicle according to said image frames;
an image analysis means adapted to extrapolate said location based information of said vehicle by comparing said route information on a map data equipped with said remotely located image processing means and/or said locally adapted processing means; and
displaying means to indicate the current location of said vehicle according to said location based information and thereby continuously synchronize said location based information back into the computed route information to update said map data.

[0025] In a third aspect, there is provided a method for tracking a location of at least one vehicle, said method comprising:
• capturing, by means of an image capturing device, a plurality of surrounding information from the vicinity of said vehicle, when said vehicle is transiting from one location to another, wherein said surrounding information is captured as a plurality of image frames;
• storing, by means of a remotely adapted and/or locally adapted storage unit, said plurality of image frames as historical data captured during the transit of said vehicle;
• transmitting said historical data to an image processor means equipped with map data, wherein said processor means is remotely located or locally adapted in said vehicle;
• processing, by said image processor means, said image frames so as to compute at least one route information undertaken by said vehicle according to said historical data;
• determining, by said image processor means, at least one information corresponding to said location of said vehicle on said map data according to said route information; and thereby
• synchronizing the route information of said vehicle by updating said map data with said location information of the vehicle.

[0026] In a fourth aspect, there is provided a tracking system for obtaining location based information of at least one vehicle by performing the method steps mentioned above, wherein said system comprises:
means for capturing a plurality of surrounding information from the vicinity of said vehicle in the form of a plurality of image frames, when said vehicle is transiting from one location to another;
an image processing means, remotely adapted or locally adapted in said vehicle, wherein said image processing means is configured to process said plurality of image frames captured during the transit and compute a route information undertaken by said vehicle according to said image frames;
an image analysis means adapted to extrapolate said location based information of said vehicle by comparing said route information on a map data equipped with said remotely located image processing means and/or said locally adapted processing means; and
displaying means to indicate the current location of said vehicle according to said location based information and thereby continuously synchronize said location based information back into the computed route information to update said map data.

[0027] Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS

The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings in which:

[0028] Figure 1 illustrates a top view of a front camera installed behind the windshield of the vehicle to capture an image frame of a direction information board, according to an embodiment of the present invention.

[0029] Figure 2 illustrates an Image Frame containing direction sign board with location based information in accordance with an embodiment of the present invention.

[0030] Figure 3 illustrates a block diagram representation of the system in accordance with an embodiment of the present invention.

[0031] Figure 4 illustrates a process flow diagram for computing the current location of the vehicle based on the historical data captured during the transit, in accordance with an embodiment of the present invention.

[0032] Figure 5 illustrates a map view showing the location of the vehicle as estimated by the analysis, in accordance with an embodiment of the present invention.

[0033] Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may not have been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various exemplary embodiments of the present disclosure. Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.

DETAILED DESCRIPTION OF THE PRESENT INVENTION

[0034] The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary.

[0035] Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.

[0036] The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only, and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.

[0037] It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.

[0038] By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.

[0039] Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments.

[0040] It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

[0041] The present invention relates to a method of tracking for obtaining the current position of the vehicle based on information captured from the surrounding environment, which may include, but is not limited to, street and locality names and other sign board information. This surrounding information is obtained from image frames captured by an image capturing device mounted on the vehicle, or by the image capturing device which forms part of the ADAS. The vehicle is traced by keeping track of the path it takes from the previously known position, or from historical evidence/data collected during the journey of the vehicle from one location to another and cross-referenced on map data. Finally, the vehicle's position can be estimated in near real time without using GPS or GSM based navigation systems.

[0042] In one implementation, the information is captured using an image capturing device selected from a camera device or vision sensor unit mounted on said vehicle, or the camera device of the ADAS implemented in the vehicle. The image capturing device may capture surrounding information selected from, but not limited to, traffic signs, road signs, locality information from direction boards, billboards, hoarding information, landmark information, address information, or any combination thereof.

[0043] In one implementation, the historical data collected during the journey of the vehicle are communicated to a locally adapted processor for processing. However, in an alternate embodiment, such historical data may also be processed at a remotely adapted image processor by communicating the captured images to the processor by means of mobile network data.

[0044] In one implementation, the ADAS in the vehicle comprises: at least one camera or vision sensor adapted to capture a plurality of surrounding information near said vehicle; image processing means remotely adapted using a cloud computing system and/or locally adapted using a fog computing system; and a storage unit coupled with and readable by the processor means, containing a plurality of instructions that, when executed by the processor, enable the processor to estimate the current location of said vehicle.

[0045] In the present invention, the system for tracking the current location of the vehicle can be operated using a remotely located cloud computing server or a locally adapted fog computing system employed in the vehicle.

[0046] The subject matter of the present invention can be performed either by using a Fog Computing system or by using a Cloud Computing system. The Cloud Computing system empowers its client computers by sharing resources for computing and data storage located in a server at a remote data centre, or “Cloud”, whereas Fog Computing makes these resources available to the networked clients very near to the source of the data. Owing to its proximity to the clients on the “ground”, it is called Fog Computing, drawing an analogy with ‘fog’ near the ground and ‘clouds’ in the sky.

[0047] Thus, in the vehicle position tracking method using cloud computing according to an exemplary embodiment of the present invention, the map data may not be provided to the vehicle itself, but to the remotely located server. The entire procedure of processing the plurality of captured image data and extrapolating the current location using the computed route information on the map data can be performed in the server, and the result is transmitted to the vehicle. Thus, continuous near real time position tracking is possible in the vehicle according to the embodiment.

[0048] According to another exemplary embodiment, in the vehicle position tracking method using a fog system, the map data is locally stored using a storage unit in the processor. The entire procedure of processing the plurality of captured image data can be performed using a locally adapted fog system implemented within the network of the vehicle, thereby extrapolating the current location using the computed route information on offline map data locally stored in a storage unit within the network of the vehicle.

[0049] In one implementation, the image capturing device is preferably a camera device of the ADAS implemented in said vehicle. Referring to Figure 1, reference numeral (10) depicts the vehicle's top view, and reference numeral (12) is the front camera capturing the location based information from the direction sign board present on the roadway. Reference numeral (14) depicts the direction sign board having location based information and the distance to the next roadway.

[0050] In one implementation, the camera device in the ADAS of the vehicle is adapted to capture a plurality of surrounding information in the form of a plurality of image frames containing location related data during the transition of the vehicle from one location to another. The image frames can then be transmitted to a remote server, preferably a cloud server, for processing of the image frames to compute the route undertaken by the vehicle while the image frames were being captured. The analysis of the set of image frames can subsequently be carried out on a big data-like framework for better calculation of the route taken from the historical data. Accordingly, the actual position of the vehicle can be determined on the online map or navigational aid by estimating from the analyzed images, the captured image frames being transmitted or stored locally using a storage unit.
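
For concreteness, this flow can be sketched as below. The sketch is purely illustrative: real input would be image frames passed through an OCR step, whereas here each frame is represented by the place names already read off it, and all function names and data are assumptions of this sketch rather than part of the specification.

```python
# Illustrative sketch of the cloud-side flow in [0050], with pre-OCR'd text
# standing in for the image frames so the example stays self-contained.
from typing import Dict, List, Tuple

def compute_route(frame_texts: List[List[str]]) -> List[str]:
    """Flatten per-frame place names, oldest frame first, into a route."""
    route: List[str] = []
    for tokens in frame_texts:
        for place in tokens:
            if not route or route[-1] != place:  # drop immediate repeats
                route.append(place)
    return route

def locate_on_map(route: List[str],
                  map_index: Dict[str, Tuple[float, float]]) -> Tuple[float, float]:
    """Take the most recently observed waypoint found in the map data as the
    vehicle's approximate position."""
    for place in reversed(route):
        if place in map_index:
            return map_index[place]
    raise LookupError("no observed waypoint found in map data")

# Hypothetical data: two frames captured in transit, and a tiny map index.
frames = [["Roadway B", "Location E"], ["Location E", "Location H"]]
offline_map = {"Location E": (12.97, 77.59), "Location H": (13.08, 77.55)}
print(locate_on_map(compute_route(frames), offline_map))  # (13.08, 77.55)
```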

[0051] In another implementation, the camera device in the vehicle is adapted to capture a plurality of surrounding information in the form of a plurality of image frames containing location related data during the transition of the vehicle from one location to another. The image frames can then be transmitted to a locally adapted server/processor means, preferably the fog computing system, for processing of the image frames to compute the route undertaken by the vehicle while the image frames were being captured. The analysis of the set of image frames can subsequently be carried out on a big data-like framework for better calculation of the route taken from the historical data. The location related data can be analyzed over a period of time based on historical image evidence. Thereby, the actual position of the vehicle is computed by cross-referencing a digital map (offline map) locally stored and processed in the fog computing environment.

[0052] In both implementations, the surrounding information can be captured from, but is not limited to, traffic signs, locality information from direction boards, billboards, hoardings, or any combination thereof.

[0053] In one implementation, the vehicle may transmit the captured surrounding information and driving information to a remote server/processor, e.g., a cloud server, remotely located from the vehicle. Here, the processor means may be equipped with map data, and the vehicle may be able to perform data communication using mobile network connectivity in order to transmit the above surrounding information.
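
A minimal sketch of such a transmission over mobile network data is given below, assuming an ordinary HTTP upload; the endpoint URL and payload fields are hypothetical, since the specification does not define a transport format.

```python
# Hypothetical upload of one captured frame to the remote server ([0053]).
import json
import requests

def upload_frame(jpeg_bytes: bytes, captured_at: float,
                 server_url: str = "https://tracking.example.com/frames") -> None:
    """POST one image frame plus its capture time to the tracking server."""
    response = requests.post(
        server_url,
        files={"frame": ("frame.jpg", jpeg_bytes, "image/jpeg")},
        data={"meta": json.dumps({"captured_at": captured_at})},
        timeout=10,  # mobile links are unreliable; fail fast and retry later
    )
    response.raise_for_status()
```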

[0054] In another implementation, the vehicle may transmit the captured surrounding information and driving information to a locally adapted server/processor, such as the fog system in the ADAS of the vehicle. Here, the processor means may be equipped with digital map data, and the vehicle may be able to perform data communication, in order to transmit the above surrounding information, by means of interrelated computing devices.

[0055] Thus, in the present invention, the image frames from the front camera can be captured and processed during the transit of the vehicle. These image frames may contain location information that may include, but is not limited to, directions on sign boards, addresses from hoardings on top of buildings or shops, and landmarks. The flow of these address details is back-traced on the fog/local computing device in the vehicle. This facilitates identification of the exact position of the vehicle on the offline map with the help of a big data-like processing framework.

[0056] In one implementation, the present invention may use backtracking of the route information of the host vehicle based on an extensive big data-like framework. According to an exemplary embodiment, if the host vehicle has passed a direction signboard indicating location B to the right, and in the next update captures a landmark en route, then it has followed that path on the map, which constitutes the actual tracking of the vehicle on the map.

[0057] In one implementation, reference is made to Figure 2, which shows how the direction sign board can contain the route information. Reference numeral (20) indicates how far the vehicle is from the approaching locality when the image frame is captured by the ADAS system. Reference numeral (22) indicates the approaching locality, and reference numeral (24) indicates the heading direction for the approaching locality. These data are useful for calculating the route taken by the vehicle as the image frames are captured and processed during its transit.
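
As a simple illustration, the three fields read off the board in Figure 2 can be held in a small record, sketched below; the field names mirror the reference numerals, and the assumed text layout ("<distance> km <locality> <direction>") is an invention of this sketch, not a format fixed by the specification.

```python
from dataclasses import dataclass

@dataclass
class SignBoardReading:
    distance_km: float  # (20) distance to the approaching locality
    locality: str       # (22) name of the approaching locality
    direction: str      # (24) heading direction for the approaching locality

def parse_sign_text(text: str) -> SignBoardReading:
    """Parse OCR output of the assumed form '5 km Location E right'."""
    tokens = text.split()
    return SignBoardReading(
        distance_km=float(tokens[0]),
        locality=" ".join(tokens[2:-1]),
        direction=tokens[-1],
    )

print(parse_sign_text("5 km Location E right"))
# SignBoardReading(distance_km=5.0, locality='Location E', direction='right')
```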

[0058] In an exemplary embodiment, the direction sign board may contain more information about the current road on which the vehicle is traveling, such as an interstate highway or a city street with number information. Along with these direction boards, there can be instances where the front camera captures image frames of hoardings on buildings or shops containing the location based information of the current street. The location based information can be processed by an image processing module on board the car, in the fog computer, or it may be sent over to the cloud/server by the network in the vehicle for processing. Direction sign boards may be as simple as the one shown in Figure 2, or may present a complicated set of directions which have to be saved along the direction of the route to extrapolate the vehicle's location.

[0059] In one implementation, Figure 3 shows the block diagram of the system for tracking the current location of the vehicle. Initially, the images are captured, as indicated by the capture block. These image frames will contain location based information, the distance to a specified location being approached by the vehicle, and the route to be taken by the vehicle to reach it. This location based information can be processed by the cloud computer remotely located from the vehicle, or by the fog system in the vehicle, as indicated by the process block. The image frames can be analyzed to calculate the route which the vehicle has taken while the image frames were captured, as indicated by the analyze block. Analysis of the set of images can be carried out on extensive big data-like frameworks for better calculation of the route taken from the historical data. The current location of the vehicle is then extrapolated from the route information, as indicated by the update block. This location may be present in an offline map or an online server-based map on the cloud. The same location related information can be synchronized back into the existing route and the image frames captured on the go, as indicated by the synchronize block. The entire process can be repeated until the vehicle stops. This, in a sense, makes way for a new system of vehicle tracking close to real time, bounded by the time taken to extrapolate the information from the location based information captured in the image frames on the route.
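
The capture-process-analyze-update-synchronize cycle of Figure 3 can be written out as a loop, sketched below; the collaborator objects (camera, processor, map_view, vehicle) and their methods are stand-ins assumed for illustration, not components defined by the specification.

```python
import time

def tracking_loop(camera, processor, map_view, vehicle):
    history = []   # captured image frames kept as historical data
    position = None
    while vehicle.is_moving():
        frame = camera.capture()                  # Capture block
        history.append(frame)
        text = processor.extract(frame)           # Process block: mine location text
        route = processor.analyze(history, text)  # Analyze block: route from history
        position = map_view.update(route)         # Update block: extrapolate location
        processor.synchronize(position, route)    # Synchronize block: feed back
        time.sleep(0.5)  # pacing only; the real capture rate is unspecified
    return position
```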

[0060] In this implementation, the vehicle tracking system comprises a camera or vision sensor for capturing a plurality of surrounding information near said vehicle in the form of image frames when said vehicle is transiting from one location to another; the cameras of the vehicle may be part of an ADAS camera set. The image processor means is operated on a big data-like framework and can be remotely located from said vehicle or locally adapted in the vehicle, wherein said processor means can be configured to: receive the plurality of images captured during the transit of said vehicle, forming a historical data set; compute over the historical data set to identify the route information taken by the vehicle; extrapolate the location based information of the vehicle by comparing the route information with online or offline map data; and continuously synchronize said location based information back into the computed route information to update the map data.

[0061] In this implementation, the image processing means is operated on a big data-like framework to perform text mining and text processing, using MapReduce or similar methods, on digital map information, and is remotely located from said host vehicle, assuming mobile network connectivity exists for transmitting the captured image information without GPS, and/or is locally adapted using said ADAS.
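
A toy MapReduce-style pass over the mined text might look as follows: the map step emits (place name, 1) pairs from each frame's OCR output, and the reduce step counts the sightings, giving a crude confidence per place name. The title-case heuristic for spotting place names is a deliberate simplification assumed for this sketch.

```python
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def map_phase(frame_texts: Iterable[str]) -> List[Tuple[str, int]]:
    """Emit a (place name, 1) pair for every candidate place name seen."""
    pairs = []
    for text in frame_texts:
        for word in text.split():
            if word.istitle():  # naive stand-in for place-name detection
                pairs.append((word, 1))
    return pairs

def reduce_phase(pairs: Iterable[Tuple[str, int]]) -> Dict[str, int]:
    """Sum the counts per place name."""
    counts: Dict[str, int] = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

ocr_texts = ["Maruthisevanagar 2 km right", "Maruthisevanagar straight ahead"]
print(reduce_phase(map_phase(ocr_texts)))  # {'Maruthisevanagar': 2}
```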

[0062] In one implementation, as shown in Figure 4, the process flow diagram describes the method for tracking the current position of the vehicle. The image frames captured on the route of the vehicle have to yield sensible information about the location of the vehicle. As indicated in the flow, a plurality of image frames are captured while the vehicle is transiting from one location to another. The image frames are then processed to obtain location based information, and the process is repeated until an accurate location is obtained. Once the route information is accurate, as judged by analyzing the historical data set, the final location is updated on the map, either offline or online. This result is synchronized, and the procedure is repeated until the vehicle stops.

[0063] As shown in Figure 4, the steps of the process flow diagram can be described as follows:
1. Information Capture - Initially, the images may be captured, as indicated by the capture block. These image frames may contain location based information, the distance to a specified location being approached by the vehicle, and the route to be taken by the vehicle to reach it.
2. Image Processing - The location based information can be processed by the cloud computer or the fog system in the vehicle, as indicated by the process block. The image frames are analyzed to calculate the route information which the vehicle has taken while the image frames were captured.
3. Image Analysis - Analysis of the data set of image frames can be carried out on extensive big data-like frameworks for better calculation of the route taken from the historical data. The current location of the vehicle is extrapolated from the route information, as indicated by the update block. This location may be present in an offline map or an online server-based map on the cloud.
4. Synchronization with offline maps and vehicle tracking - The same location is synchronized back to the existing route and the image frames captured on the go, as indicated by the synchronize block. The entire process may be repeated until the vehicle stops. This, in a sense, makes way for a new system of vehicle tracking close to real time, or near real time, bounded by the time taken to extrapolate the information from the location information captured in the image frames on the route.

[0064] According to one exemplary embodiment of the analysis of the image frames, assume the vehicle is heading to a location H from a location A. After the vehicle leaves location A, it may encounter location based information at a location C saying it is currently heading on a roadway B, approaching location E by taking a right turn at the next intersection D. The next image frame containing location information may be on a roadway F, after the intersection D, at the location E, giving the next direction to location H through direction G. By making use of this location based information and waypoint information such as the intersection, the information can be searched on the offline or online map data to estimate the route. The last captured location based information from the image frames gives the nearest location to the vehicle on the map. These location data can be further complemented with vehicle data coming from sensors in the vehicle, such as speed and heading direction at the particular instant an image frame is captured, to narrow down the point where the vehicle is approximately located. The analysis can be carried out in the fog computing environment or on a cloud/server, with or without big data-like frameworks.
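
This walk-through can be expressed as a search over map data, as sketched below; the tiny road graph and the matching rule (the observed waypoints must appear, in order, along a candidate path) are assumptions of this sketch.

```python
from typing import Dict, List

# A toy road graph for the [0064] example, as adjacency lists.
road_graph: Dict[str, List[str]] = {
    "A": ["C"], "C": ["D"], "D": ["E"], "E": ["F"], "F": ["G"], "G": ["H"],
}

def paths_from(start: str, graph: Dict[str, List[str]]) -> List[List[str]]:
    """Enumerate simple paths out of `start` by depth-first search."""
    stack, paths = [[start]], []
    while stack:
        path = stack.pop()
        nexts = [n for n in graph.get(path[-1], []) if n not in path]
        if not nexts:
            paths.append(path)
        for n in nexts:
            stack.append(path + [n])
    return paths

def consistent(path: List[str], seen: List[str]) -> bool:
    """True if `seen` appears, in order, as a subsequence of `path`."""
    it = iter(path)
    return all(waypoint in it for waypoint in seen)

observed = ["C", "D", "F", "G"]  # waypoints read off sign boards, in order
print([p for p in paths_from("A", road_graph) if consistent(p, observed)])
# [['A', 'C', 'D', 'E', 'F', 'G', 'H']]
```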

[0065] According to an exemplary embodiment, route analysis is carried out on the captured images. As described in the previous paragraph, a history of waypoints known from the vehicle's path can be analyzed. Assuming a vehicle headed to location H from location A, a number of route traversals can be obtained from the offline map data between the two locations A and H. These route traversals obtained from the map are then extrapolated against the image frames capturing locations along the direction of the vehicle's actual route, by matching locations along each traversal obtained from the map. Of these, the route that matches the greatest number of locations along the way is assumed to be the vehicle's traversal path. In this way, the route traversed can be obtained, and finally the next set of image frames gives the approximate heading location of the vehicle.
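
A minimal sketch of this selection rule follows: among the candidate traversals the map offers between A and H, the one sharing the most locations with the captured sign-board place names is chosen. The candidate routes shown are hypothetical.

```python
from typing import List

def best_traversal(candidates: List[List[str]],
                   captured: List[str]) -> List[str]:
    """Pick the candidate route matching the most captured place names."""
    captured_set = set(captured)
    return max(candidates, key=lambda route: len(captured_set & set(route)))

routes_from_map = [
    ["A", "B", "E", "G", "H"],  # traversal 1 offered by the offline map
    ["A", "K", "L", "H"],       # traversal 2
]
print(best_traversal(routes_from_map, ["E", "G"]))  # ['A', 'B', 'E', 'G', 'H']
```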

[0066] In one implementation, Figure 5 shows a representation of how the vehicle's location is updated on the map as interpreted. This map may be offline, without connectivity, in a fog computing environment, or may be online, connecting through the vehicle's telecom network. Reference numeral (30) indicates the location of the vehicle after extrapolation of the location from the analyzed set of image frame data.

[0067] In one implementation, a handful of sign boards may be encountered during the vehicle's transit from one location to another in the absence of GPS and GSM. The location based information on the sign boards at every instance can be captured as image frames, and the direction taken to go to the next corresponding location gives the route taken. This route can be correlated, step by step, on the offline map downloaded to the fog computer, to finally plot the route taken. The latest image frame gives the approximate location where the vehicle is present at that moment.

[0068] In another implementation, GSM or 3G or 4G or any other future telecom technology may exist alongside the present invention. Triangulation based on the telecom signal's position with respect to its nearby towers will give an approximate location of the vehicle. Combining this with any location based sign board information captured by the front camera of the ADAS locates the exact position of the vehicle on the map. Assume the vehicle passes by a building with location based information giving its address: the building number, street number, area and postal code of the location. This is translated onto the map and combined with the triangulated telecom information to determine the exact location of the vehicle.
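
One conventional way to combine the two estimates is an inverse-variance weighted average, sketched below; the specification does not fix a fusion formula, so both the weighting scheme and the accuracy figures are assumptions of this sketch.

```python
from typing import Tuple

def fuse(cell_fix: Tuple[float, float], cell_sigma_m: float,
         sign_fix: Tuple[float, float], sign_sigma_m: float) -> Tuple[float, float]:
    """Weight each (lat, lon) fix by the inverse of its variance."""
    w_cell = 1.0 / cell_sigma_m ** 2
    w_sign = 1.0 / sign_sigma_m ** 2
    total = w_cell + w_sign
    return ((cell_fix[0] * w_cell + sign_fix[0] * w_sign) / total,
            (cell_fix[1] * w_cell + sign_fix[1] * w_sign) / total)

# Hypothetical fixes: triangulation good to ~500 m, street address to ~50 m.
print(fuse((12.9700, 77.5900), 500.0, (12.9716, 77.5946), 50.0))
```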

[0069] In another implementation, GPS may exist on board the vehicle. This gives information about the location, provided the conditions necessary for satellite communication of GPS signals are met. Assuming the vehicle enters an underground road, or an area where there is no network and just sign boards on the waypoints of the vehicle, location based information is translated from the last existing GPS location to determine the location of the vehicle in the underground road network. This complements GPS based vehicle tracking.

[0070] In another implementation, the vehicle may be transiting in a militarized zone with no GPS or GSM telecom connectivity. Based on the address information on the route of the vehicle's transit and other waypoint information, such as a hill, bridge, tunnel or the like, the vehicle's approximate location can be decoded.

[0071] Advantages of the present invention include making vehicle tracking possible where GPS based vehicle tracking or GSM based triangulation fails. In more detail, this includes conditions where the GPS signal is jammed by nearby interference; where spectrum and band availability for GPS or GSM is reduced as new areas like V2V/V2X and Wi-Fi hotspots are explored; where the GPS signal is distorted by space weather; and where few or no satellites are visible in range of the GPS device, so the vehicle cannot otherwise be tracked. Militarized or war zones are critical areas where GPS satellites may be blocked by military regulations. Long tunnels and underground road networks without network connectivity are other areas where the current information, with a fog computer and offline map, can be extrapolated to obtain the vehicle location.

[0072] Thus, the present invention is a near real time vehicle tracking system for conditions of limited or no satellite or telecom connectivity. It is predominantly dependent on image frames containing location based information. These image frames make up a vehicle route containing a few minutes of historical data, without which it is very difficult to calculate the current location of the vehicle. On a route containing plenty of location based information in the captured image frames, the present invention qualifies as the best bet when no or limited satellite or telecom connectivity exists. On routes containing less location based information in the captured image frames, this invention can complement vehicle tracking with GPS and GSM connectivity.

[0073] While the foregoing written description of the invention enables one of ordinary skill to make and use what is considered presently to be the best mode thereof, those of ordinary skill will understand and appreciate the existence of variations, combinations, and equivalents of the specific embodiment, method, and examples herein. The invention should therefore not be limited by the above-described embodiment, method, and examples, but by all embodiments and methods within the scope and spirit of the invention as claimed.

[0074] Although an automatic vehicle tracking system and a method thereof have been described in language specific to structural features and/or methods, it is to be understood that the embodiments disclosed in the above section are not necessarily limited to the specific features or methods or elements described herein. Rather, the specific features are disclosed as examples of implementations of the automatic vehicle tracking system and a method thereof.

Documents

Application Documents

# Name Date
1 Form 3 [15-02-2017(online)].pdf 2017-02-15
2 Form 18 [15-02-2017(online)].pdf 2017-02-15
3 Form 18 [15-02-2017(online)].pdf_241.pdf 2017-02-15
4 Description(Complete) [15-02-2017(online)].pdf 2017-02-15
5 Description(Complete) [15-02-2017(online)].pdf_240.pdf 2017-02-15
6 Drawing [15-02-2017(online)].pdf 2017-02-15
7 201731005404-Proof of Right (MANDATORY) [12-08-2017(online)].pdf 2017-08-12
8 201731005404-FER.pdf 2020-06-25
9 201731005404-AbandonedLetter.pdf 2024-07-12

Search Strategy

1 searchE_24-06-2020.pdf