
Robust And Real Time Augmented Reality Based Driving Assistive System

Abstract: The present disclosure provides a collision avoidance system for a vehicle. The system includes one or more image capturing devices and one or more sensors on the vehicle. A processing unit is configured to receive the captured images and sensor information, and to extract, using machine learning techniques, the information needed to determine whether there is a possible danger that could lead to a collision. The processing unit further extracts the speed, travelling route/lane and future position of nearby vehicles, together with attributes of the driven vehicle such as speed, temperature and tire pressure, and transmits the extracted information to a display unit. The display unit projects the information onto the windshield of the driven vehicle, resulting in a Head-Up Display (HUD). The display takes the form of real-time augmented reality, helping the driver understand the situation easily and avoid a possible collision.


Patent Information

Application #: 202111056565
Filing Date: 06 December 2021
Publication Number: 23/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: info@khuranaandkhurana.com
Parent Application:

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. SINGH, Gurjinder
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
2. MANTRI, Archana
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
3. SALUJA, Nitin Kumar
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
4. SINGH, Narinder Pal
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
5. GHOSH, Debarshi
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
6. SINGH, Ashwani
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
7. RANI, Lekha
Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.

Specification

[0001] The present invention relates to a vehicle collision avoidance, driving assistive system. In particular, it relates to a vehicle safety system that improves peripheral vision, provides warnings, and improves reaction time for vehicle drivers.
BACKGROUND
[0002] Background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the
information provided herein is prior art or relevant to the presently claimed
invention, or that any publication specifically or implicitly referenced is prior art.
[0003] In general, road accidents are very common and can be fatal. Major causes of road accidents include driver distraction, errors of judgement, over-speeding, and the influence of drugs or alcohol. In addition, while driving a vehicle there are areas that are neither visible to the driver nor covered by the vehicle's mirrors; when other objects, vehicles or pedestrians fall within these blind spots, accidents and collisions occur.
[0004] Hence, there is a requirement in the art for a driving assistive system that warns the driver of any existing or approaching dangers on the road, predicts the future lane of travel of other objects on the road, and helps avoid them.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] The general object of the present disclosure is to provide a vehicle
collision avoidance, driving assistive system.
[0006] Another object of the present disclosure is to provide a driving
assistive system with a real-time augmented reality-based head-up display (HUD).
[0007] Another object of the present disclosure is to provide information
regarding the current and future paths of the vehicles/ living beings/ objects on the road and road conditions to avoid any possible danger.

[0008] Another object of the present disclosure is to provide a driving
assistive system to encourage a safe and comfortable driving experience and avoid accidents or collisions on the road, while driving.
SUMMARY
[0009] The present invention relates to a vehicle driving assistive system.
In particular, it relates to a vehicle safety system that improves peripheral vision, provides warnings, and improves reaction time for vehicle drivers.
[0010] In an aspect, the present disclosure provides a vehicle collision avoidance, driving assistive system. The driving assistive system may include one or more image capturing devices that may be located on the vehicle. The one or more image capturing devices may capture images pertaining to the road and one or more objects external to the vehicle. One or more object detecting sensors may be located on the vehicle, which may detect location, speed and direction of one or more objects external to the vehicle. A display unit may be operatively coupled to a windshield of the vehicle. A processing unit may be operatively coupled to a combination of the one or more image capturing devices, the one or more object detecting sensors on the vehicle, and the display unit. The processing unit may comprise a processor operatively coupled with a memory. The memory may store instructions executable by the processor to receive a first set of images captured by the one or more image capturing devices located on the vehicle. The processor may receive a second set of signals from the one or more object detecting sensors located on the vehicle. The processor may extract a first set of features from the received first set of images, pertaining to one or more critical road, driving and external environment conditions outside the vehicle. The processor may extract a second set of features from the received second set of signals, pertaining to location, speed and direction of one or more objects external to the vehicle. Based on the extracted first and second sets of features, the processor may predict a third set of features pertaining to speed, probable lane change, and road crossing of the one or more objects external to the vehicle, and may display the third set of features on the display unit on the windshield.
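
By way of illustration only, the following Python sketch outlines one way the receive, extract, predict and display steps of this aspect could be organised in software. It is a minimal sketch under stated assumptions: SensorReading, ObjectPrediction, assist_cycle and the camera/sensors/display interfaces are hypothetical names, and the extraction and prediction functions are stubs standing in for the machine learning components described later.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReading:
    """One element of the 'second set of signals' (hypothetical format)."""
    object_id: int
    distance_m: float      # distance from the driven vehicle
    speed_mps: float       # speed of the external object
    bearing_deg: float     # direction relative to the driven vehicle

@dataclass
class ObjectPrediction:
    """One element of the 'third set of features' (hypothetical format)."""
    object_id: int
    predicted_speed_mps: float
    probable_lane_change: bool
    probable_road_crossing: bool

def extract_road_features(frames) -> list:
    """First set of features: road/driving/environment conditions (stub)."""
    return []

def extract_object_features(readings: List[SensorReading]) -> list:
    """Second set of features: location, speed, direction per object (stub)."""
    return [(r.object_id, r.speed_mps) for r in readings]

def predict_behaviour(road_features, object_features) -> List[ObjectPrediction]:
    """Third set of features: predicted behaviour of each object (stub)."""
    return [ObjectPrediction(oid, spd, False, False)
            for oid, spd in object_features]

def assist_cycle(camera, sensors, display) -> None:
    """One receive -> extract -> predict -> display iteration."""
    frames = camera.capture()            # first set of images
    readings = sensors.read()            # second set of signals
    road = extract_road_features(frames)
    objects = extract_object_features(readings)
    display.render(predict_behaviour(road, objects))  # HUD on the windshield
```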
[0011] In another embodiment, a display unit on the windshield of the vehicle may display the third set of features, pertaining to any or a combination of: one or more objects external to the vehicle with speed, location and direction data; a highlighted dangerous object and its future location; the route of travel of the second vehicle; and the speed, temperature and tire pressure of the driven vehicle. The display unit may be a head-up display (HUD).
[0012] In yet another embodiment, the processing unit may be further configured to track in real-time one or more objects external to the vehicle, using data from the one or more image capturing devices and the one or more object detecting sensors. The processing unit may also be configured to predict a future location and a route of the one or more objects external to the vehicle, and may also determine in real-time a dangerous object from the one or more objects external to the vehicle, the dangerous object posing a potential collision risk.
[0013] In another embodiment, a vehicle detection unit may be operatively
coupled to the processing unit and may be further configured to detect nearby
second vehicles in the first set of images that may be captured by the image
capturing devices.
[0014] In another embodiment, an object detection unit may be operatively
coupled to the processing unit and may be further configured to detect one or more
objects, in the first set of images that may be captured by the one or more image
capturing devices.
[0015] In another embodiment, a human detection unit may be operatively
coupled to the object detection unit and may be further configured to detect one
or more living beings and may separate the one or more living beings from the
one or more objects that may be detected by the object detection unit.
[0016] In another embodiment, a lane detection unit may be operatively coupled to the processing unit and may be further configured to detect the trajectory, path or lane of travel of the second vehicles that may be detected by the vehicle detection unit.

[0017] Various objects, features, aspects, and advantages of the inventive
subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF DRAWINGS
[0018] The accompanying drawings are included to provide a further
understanding of the present disclosure and are incorporated in and constitute a
part of this specification. The drawings illustrate exemplary embodiments of the
present invention and, together with the description, serve to explain the principles
of the present disclosure.
[0019] FIG. 1 illustrates an exemplary network architecture with which or
in which the proposed system is implemented, in accordance with an exemplary
embodiment of the present disclosure.
[0020] FIG. 2 illustrates exemplary functional modules of a processing unit
in accordance with an exemplary embodiment of the present disclosure.
[0021] FIG. 3 illustrates the proposed system, within a vehicle.
DETAILED DESCRIPTION
[0022] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims. The present disclosure relates, in general, to a vehicle collision avoidance, driving assistive system. In particular, the present system relates to a vehicle safety system with a real-time augmented reality-based head-up display that helps drivers avoid collisions.
[0023] FIG. 1 illustrates an exemplary system architecture with which or in which the proposed system is implemented, in accordance with an exemplary embodiment of the present disclosure.

[0024] As illustrated in FIG. 1, in an aspect, the system 110 may be operatively coupled to a centralized server 112. The system 110 is further configured to connect to a plurality of vehicles (104-1, 104-2... 104-N) through a network 106. The system 110 may help the individual users/drivers (102-1, 102-2... 102-N) of the plurality of vehicles (104-1, 104-2... 104-N) to avoid collisions.
[0025] In an embodiment, the system (110) can be implemented using any
or a combination of hardware components and software components such as a cloud, a server, a computing system, a user device, a network device and the like. The system 110 can interact with the computing devices 104 through the network 106. Examples of the computing device 104 can include, but are not limited to, a smart phone, a portable computer, a personal digital assistant, a handheld device, a standalone unit, a laptop, a smart tablet and the like.
[0026] Further, the network (106) can be a wireless network, a wired network or a combination thereof that can be implemented as one of the different types of networks, such as Intranet, Local Area Network (LAN), Wide Area Network (WAN), Internet, Bluetooth and the like. The wireless technologies may include GPS, Bluetooth, RFID, Zigbee, Global System for Mobile communication (GSM), and Radio Frequency (RF) wireless communication such as Wi-Fi. Further, the network (106) can either be a dedicated network or a shared network. The shared network can represent an association of the different types of networks that can use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like.
[0027] In an exemplary embodiment, one or more image capturing devices (not shown in the figure), such as cameras, but not limited to the same, may be located on the vehicle, which may capture images pertaining to one or more objects external to the vehicle and the road, for example approaching speed breakers, gravel on the road, upcoming turns, but not limited to the same.
[0028] In an exemplary embodiment, one or more object detecting sensors, such as proximity sensors, but not limited to the same, may be located on the vehicle, which may detect location, speed and direction of one or more objects external to the vehicle.
[0029] In an embodiment, the processing unit 200 may include one or more processors 202 coupled with a memory 204, wherein the memory may store instructions which when executed by the one or more processors 202 may cause the processing unit 200 to process one or more functions associated with the collision avoidance system. FIG. 2 with reference to FIG. 1, illustrates an exemplary representation of the functional modules of the processing unit, in accordance with an embodiment of the present disclosure.
[0030] As illustrated, the processing unit 200 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the processing unit 200. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service.
[0031] The processing unit 200 can also include an interface(s) 206. The interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, transducers, actuators, and the like. The interface(s) 206 can facilitate communication of the processing unit 200 with various devices coupled to the processing unit. The interface(s) 206 can also provide a communication pathway for one or more components of the processing unit. Examples of such components include, but are not limited to, database 208.
[0032] In another embodiment, the processing unit 200 can receive a first set of images captured by the one or more image capturing devices located on the vehicle. For example, the first set of images may be of a second vehicle in proximity, or a pedestrian or an animal, but not limited to the same. The processing unit 200 can also receive a second set of signals from the one or more object detecting sensors on the vehicle, for example, the distance of the second vehicle from the driven vehicle, the location or position of the second vehicle with respect to the driven vehicle, and the speed of the second vehicle, but not limited to the same.
[0033] In another embodiment, the processing unit 200 can extract a first set of features from the received first set of images, pertaining to one or more critical road, driving and external environment conditions outside the vehicle, and can extract a second set of features from the received second set of signals, pertaining to location, speed and direction of one or more objects external to the vehicle.
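
As one illustrative way to derive the second set of features (location, speed and direction of an external object) from raw proximity-sensor signals, the sketch below finite-differences two successive range/bearing readings. The reading format and the function name are assumptions for this example, not part of the specification.

```python
import math

def second_set_features(prev, curr, dt_s):
    """Derive (location, speed, direction) for one external object from two
    successive sensor readings, each a (range_m, bearing_deg) pair."""
    def to_xy(range_m, bearing_deg):
        # Polar -> Cartesian in the driven vehicle's frame of reference.
        rad = math.radians(bearing_deg)
        return range_m * math.cos(rad), range_m * math.sin(rad)

    x0, y0 = to_xy(*prev)
    x1, y1 = to_xy(*curr)
    vx, vy = (x1 - x0) / dt_s, (y1 - y0) / dt_s   # finite-difference velocity
    speed_mps = math.hypot(vx, vy)
    direction_deg = math.degrees(math.atan2(vy, vx))
    return (x1, y1), speed_mps, direction_deg

# Example: object moved from 20 m at 10 degrees to 18 m at 12 degrees in 0.1 s.
location, speed, heading = second_set_features((20.0, 10.0), (18.0, 12.0), 0.1)
```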
[0034] In an exemplary embodiment, a vehicle detection unit 222 can be operatively coupled to the processing unit 200, further configured to detect second vehicles in the first set of images, in the defined proximity of the driven vehicle. For example, if there are multiple objects present in the first set of images, the vehicle detection unit 222 detects vehicles among them, such as trucks, cars, scooters, bicycles, but not limited to the same.
[0035] In another exemplary embodiment, a human detection unit 226 can be operatively coupled to the processing unit 200, further configured to detect living beings such as pedestrians on foot, animals, but not limited to the same, in the first set of images, in the defined proximity of the driven vehicle.
[0036] In another exemplary embodiment, an object detection unit 224 can be operatively coupled to the processing unit 200, further configured to detect objects other than vehicles and living beings, such as rocks, bottles, gravel, but not limited to the same, in the first set of images, in the defined proximity of the driven vehicle.
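
As a non-limiting illustration of how the vehicle detection unit 222, human detection unit 226 and object detection unit 224 might share a single detector, the sketch below runs a pretrained Faster R-CNN from torchvision (a stand-in; the specification names no particular model) and routes each detection by its COCO class id. The class-id sets, the score threshold and split_detections are assumptions for this example.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# COCO class ids used by torchvision detection models.
VEHICLE_CLASSES = {2, 3, 4, 6, 8}     # bicycle, car, motorcycle, bus, truck
LIVING_CLASSES = {1, 17, 18, 19, 21}  # person, cat, dog, horse, cow

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def split_detections(frame: torch.Tensor, score_thresh: float = 0.6):
    """Route each detection to the unit that would own it: vehicles (222),
    living beings (226), or remaining objects (224).

    `frame` is a 3xHxW float tensor with values in [0, 1].
    """
    with torch.no_grad():
        out = model([frame])[0]   # dict with 'boxes', 'labels', 'scores'
    vehicles, living, objects = [], [], []
    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if score < score_thresh:
            continue
        entry = (int(label), box.tolist(), float(score))
        if int(label) in VEHICLE_CLASSES:
            vehicles.append(entry)
        elif int(label) in LIVING_CLASSES:
            living.append(entry)
        else:
            objects.append(entry)
    return vehicles, living, objects
```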
[0037] In another embodiment, a lane detection unit 228 can be configured to receive a fourth set of signals from the vehicle detection unit 222, the human detection unit 226 and the object detection unit 224, pertaining to information regarding the entities detected in the first set of images. The lane detection unit 228 is further configured to detect the lane/route of travel of the detected entities.
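
The specification does not fix a lane detection method. A minimal classical sketch, using OpenCV edge detection followed by a probabilistic Hough transform, is given below; the function name, region mask and thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_lane_lines(frame_bgr: np.ndarray) -> np.ndarray:
    """Return candidate lane-line segments as (x1, y1, x2, y2) rows."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

    # Keep only the lower half of the frame, where lane markings appear.
    mask = np.zeros_like(edges)
    mask[edges.shape[0] // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)

    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=40, minLineLength=40, maxLineGap=20)
    return np.empty((0, 4), dtype=int) if lines is None else lines.reshape(-1, 4)
```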

[0038] In another embodiment, an ML engine 214 and a data signal requisition engine 212 can be operatively coupled to the processing unit 200 and configured to receive the signals from the detection units and the second set of signals from the sensors. The ML engine 214 can be further configured to process and predict in real time the future positions of the detected entities, whether they are dangerous, and whether they may cause a collision.
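
As a hedged stand-in for the ML engine's prediction step, the sketch below extrapolates each detected entity under a constant-velocity assumption and flags it as dangerous when its predicted separation from the driven vehicle falls below a collision radius within a short horizon. A learned model, as paragraph [0038] contemplates, could replace these hand-written rules; all names and thresholds here are assumptions.

```python
import numpy as np

def predict_future_position(pos, vel, horizon_s=2.0):
    """Constant-velocity extrapolation of an entity's position (metres)."""
    return np.asarray(pos, dtype=float) + np.asarray(vel, dtype=float) * horizon_s

def is_dangerous(ego_pos, ego_vel, obj_pos, obj_vel,
                 horizon_s=2.0, collision_radius_m=2.5):
    """True if predicted separation drops below the collision radius
    at any sampled instant within the horizon."""
    for t in np.linspace(0.0, horizon_s, 21):
        ego = np.asarray(ego_pos, dtype=float) + np.asarray(ego_vel, dtype=float) * t
        obj = np.asarray(obj_pos, dtype=float) + np.asarray(obj_vel, dtype=float) * t
        if np.linalg.norm(ego - obj) < collision_radius_m:
            return True
    return False

# Example: an object 10 m ahead, closing at 6 m/s, is flagged within 2 s.
flag = is_dangerous(ego_pos=(0, 0), ego_vel=(0, 0),
                    obj_pos=(0, 10), obj_vel=(0, -6))
```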
[0039] In another embodiment, a vehicle control unit 220 can be operatively coupled to the processing unit 200 and configured to receive information from the ML engine 214. The vehicle control unit is further configured to process and predict a safe lane and speed of travel for the driven vehicle, in order to avoid a collision.
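
A minimal rule-based sketch of the safe lane and speed prediction described for the vehicle control unit 220 follows. The lane bookkeeping and the two-second time-gap rule are illustrative assumptions, not part of the specification.

```python
def recommend_safe_lane(current_lane, lanes, dangerous_by_lane):
    """Prefer the current lane; otherwise the nearest lane with no flagged object.

    `dangerous_by_lane` maps lane index -> list of flagged objects in that lane.
    """
    if not dangerous_by_lane.get(current_lane):
        return current_lane
    for lane in sorted(lanes, key=lambda l: abs(l - current_lane)):
        if not dangerous_by_lane.get(lane):
            return lane
    return current_lane  # no clear lane: hold lane, rely on speed reduction

def recommend_speed(current_speed_mps, gap_m, min_time_gap_s=2.0):
    """Cap speed so the time gap to the nearest obstacle stays above a floor."""
    return min(current_speed_mps, gap_m / min_time_gap_s)

# Example: danger in lane 1 -> shift to the nearest clear lane (lane 0 here),
# and slow down so the gap to the obstacle stays at least 2 seconds.
lane = recommend_safe_lane(1, lanes=[0, 1, 2], dangerous_by_lane={1: ["truck"]})
speed = recommend_speed(current_speed_mps=25.0, gap_m=30.0)   # -> 15.0 m/s
```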
[0040] In an exemplary embodiment, a picture generation unit 216 and a computer graphics module 218 can be operatively coupled to the processing unit 200 and configured to receive information from the vehicle control unit 220 and the ML engine 214. The picture generation unit 216 can be configured to generate pictures using the information, and the computer graphics module 218 can be configured to produce a real-time augmented reality-based display.
[0041] In another embodiment, the processing unit 200 is operatively coupled to a display unit. The display unit may comprise one or more image projectors 308 and one or more mirrors 306. The processed output from the processing unit 200, with the help of the display unit, can be displayed on the windshield 302 of the driven vehicle as a Head-Up Display (HUD) 304.
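
By way of illustration, the sketch below composes the kind of translucent overlay the picture generation unit 216 and computer graphics module 218 might hand to the image projectors 308. The detections format is hypothetical, and OpenCV drawing stands in for the actual augmented-reality graphics pipeline.

```python
import cv2
import numpy as np

def render_hud_overlay(frame_bgr: np.ndarray, detections: list) -> np.ndarray:
    """Draw HUD-style annotations; red boxes mark flagged dangers.

    Each detection is a dict: {"box": (x1, y1, x2, y2),
    "speed_kmh": float, "dangerous": bool}  (a hypothetical format).
    """
    overlay = frame_bgr.copy()
    for det in detections:
        x1, y1, x2, y2 = det["box"]
        colour = (0, 0, 255) if det["dangerous"] else (0, 255, 0)  # BGR
        cv2.rectangle(overlay, (x1, y1), (x2, y2), colour, 2)
        cv2.putText(overlay, f'{det["speed_kmh"]:.0f} km/h', (x1, y1 - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, colour, 1, cv2.LINE_AA)
    # Blend for a translucent, windshield-projection look.
    return cv2.addWeighted(overlay, 0.8, frame_bgr, 0.2, 0)
```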
[0042] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0043] The present disclosure provides a vehicle collision avoidance,
driving assistive system.
[0044] The present disclosure provides a driving assistive system with a
real-time augmented reality-based head-up display (HUD).
[0045] The present disclosure provides information regarding the current
and future paths of the vehicles/ humans/ objects on the road and road conditions
to avoid any possible danger.
[0046] The present disclosure provides a driving assistive system to
encourage a safe and comfortable driving experience and avoid accidents or
collisions on the road, while driving.

We Claim:

1. A system for facilitating vehicle collision avoidance and driving assistance (110), said system comprising:
one or more image capturing devices (not shown in the figure) located on the vehicle, said one or more image capturing devices capture images pertaining to the road and one or more objects external to the vehicle;
one or more object detecting sensors (not shown in the figure) on the vehicle, said one or more object detecting sensors detect location, speed and direction of one or more objects external to the vehicle;
a display unit operatively coupled to a windshield (302) of the vehicle;
a processing unit (200) operatively coupled to a combination of the one or more image capturing devices, the one or more object detecting sensors on the vehicle, and the display unit, wherein the processing unit (200) comprises a processor (202) operatively coupled with a memory (204), said memory storing instructions executable by the processor (202) to:
receive a first set of images captured by the one or more image capturing devices located on the vehicle;
receive a second set of signals from the one or more object detecting
sensors on the vehicle;
extract a first set of features from the received first set of images, said first set of features pertaining to one or more critical road, driving and external environment conditions outside the vehicle;
extract a second set of features from the received second set of signals, said second set of features pertaining to location, speed and direction of one or more objects external to the vehicle;

based on the extracted first and second sets of features, predict a third set of features comprising speed, probable lane change, and road crossing of the one or more objects external to the vehicle; and
display the third set of features on the display unit on the windshield
(302).
2. The system as claimed in claim 1, wherein the display unit on the windshield (302) of the vehicle (300) displays the third set of features, said third set of features pertaining to any or a combination of: one or more objects external to the vehicle with speed, location and direction data; a highlighted dangerous object and its future location; the route of travel of the second vehicle; and the speed, temperature and tire pressure of the vehicle, wherein the display unit is a head up display (HUD) (304).
3. The system as claimed in claim 1, wherein the processing unit (200) is further configured to:
track in real-time one or more objects external to the vehicle, using data from the one or more image capturing devices and the one or more object detecting sensors;
predict a future location and a route of the one or more objects external to the vehicle; and
determine in real-time a dangerous object from the one or more objects external to the vehicle, wherein the dangerous object poses a potential collision.
4. The system as claimed in claim 1, wherein a vehicle detection unit (222), operatively coupled to the processing unit (200), is further configured to detect nearby second vehicles in the first set of images captured by the image capturing devices.
5. The system as claimed in claim 1, wherein an object detection unit (224), operatively coupled to the processing unit (200), is further configured to detect one or more objects in the first set of images captured by the one or more image capturing devices.
6. The system as claimed in claim 1, wherein a human detection unit (226), operatively coupled to the object detection unit (224), is further configured to detect one or more living beings and separate the one or more living beings from the one or more objects detected by the object detection unit.
7. The system as claimed in claim 1, wherein a lane detection unit (228), operatively coupled to the processing unit (200), is further configured to detect the trajectory, path or lane of travel of the second vehicles detected by the vehicle detection unit.

Documents

Application Documents

# Name Date
1 202111056565-STATEMENT OF UNDERTAKING (FORM 3) [06-12-2021(online)].pdf 2021-12-06
2 202111056565-POWER OF AUTHORITY [06-12-2021(online)].pdf 2021-12-06
3 202111056565-FORM FOR STARTUP [06-12-2021(online)].pdf 2021-12-06
4 202111056565-FORM FOR SMALL ENTITY(FORM-28) [06-12-2021(online)].pdf 2021-12-06
5 202111056565-FORM 1 [06-12-2021(online)].pdf 2021-12-06
6 202111056565-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [06-12-2021(online)].pdf 2021-12-06
7 202111056565-EVIDENCE FOR REGISTRATION UNDER SSI [06-12-2021(online)].pdf 2021-12-06
8 202111056565-DRAWINGS [06-12-2021(online)].pdf 2021-12-06
9 202111056565-DECLARATION OF INVENTORSHIP (FORM 5) [06-12-2021(online)].pdf 2021-12-06
10 202111056565-COMPLETE SPECIFICATION [06-12-2021(online)].pdf 2021-12-06
11 202111056565-Proof of Right [25-02-2022(online)].pdf 2022-02-25
12 202111056565-FORM 18 [24-08-2023(online)].pdf 2023-08-24
13 202111056565-FER.pdf 2025-03-04
14 202111056565-FORM 3 [03-06-2025(online)].pdf 2025-06-03
15 202111056565-FORM-5 [04-09-2025(online)].pdf 2025-09-04
16 202111056565-FORM-26 [04-09-2025(online)].pdf 2025-09-04
17 202111056565-FER_SER_REPLY [04-09-2025(online)].pdf 2025-09-04
18 202111056565-CORRESPONDENCE [04-09-2025(online)].pdf 2025-09-04

Search Strategy

1 202111056565E_21-07-2024.pdf