Abstract: A driver assistance management system for vehicles comprising a wirelessly associated computing unit for user input and mode selection, a rotatable artificial intelligence-based imaging unit 101 for real-time 3D mapping of surroundings, a holographic projection unit 103 for displaying assistive visuals derived from the map, at least two pedal resistance modules 201 and force sensors for measuring pedal force, a dump-truck assembly 203 coupled with a hydraulic piston 203a and a locking arrangement 203b for regulating pedal movement upon detecting excessive force, cross-referencing user mode and visuals to prevent collisions, a sensing module in the steering wheel, with FBG (Fibre Bragg Grating) and MQ-3 sensors, for monitoring health and intoxication, triggering alerts via the wirelessly associated computing unit and a speaker 104, which also prompts the user to pull over, with the microcontroller directing the resistance modules 201 to safely halt the vehicle, ensuring driving safety and precision.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to a driver assistance management system for vehicles that is capable of enhancing driving safety and precision by providing real-time adaptive assistance, monitoring driver health, and ensuring smooth vehicle movement.
BACKGROUND OF THE INVENTION
[0002] Modern vehicles are increasingly equipped with various driver assistance systems aimed at improving safety and convenience. These systems often include features like parking assistance, adaptive cruise control, and basic collision avoidance. However, existing solutions frequently lack comprehensive integration of real-time environmental awareness, personalized driver assistance based on skill level, continuous driver health monitoring, and proactive intervention mechanisms. There is a clear need for a more holistic and integrated system that dynamically adapts to diverse driving conditions, individual driver behaviour, and critical driver well-being indicators to provide a truly enhanced and safer driving experience.
[0003] Traditionally, driver assistance has focused on providing alerts or limited automatic interventions on the vehicle. While beneficial, such systems often fall short in offering intuitive, multi-modal assistance that directly addresses the varied challenges faced by both novice and experienced drivers. Furthermore, the capability to monitor critical driver health parameters and respond effectively to potential impairments like intoxication or sudden abnormal health conditions remains an area requiring significant improvement. The limitations of these conventional technologies lead to sub-optimal performance in complex driving scenarios and fail to fully mitigate risks associated with human error or impairment.
[0004] US8818042B2 discloses a driver assistance system for a vehicle that includes a forward-facing camera and a processor operable to process image data captured by the camera. Responsive to processing of captured image data, the driver assistance system is operable to determine a lane along which the vehicle is traveling and to detect oncoming vehicles approaching the vehicle in another lane that is to the right or left of the determined lane along which the vehicle is traveling. The driver assistance system is operable to control, at least in part, a light beam emanating from a headlamp of the vehicle and adjusts the light beam emanating from the headlamp to limit directing beam light towards the eyes of a driver of the detected oncoming vehicle. Responsive to processing of captured image data, the driver assistance system is operable to provide lane departure warning to a driver of the vehicle.
[0005] US7873187B2 discloses a driver assistance system for a vehicle that includes an imaging device having a field of view forward of the vehicle and in a direction of travel of the equipped vehicle, an image processor operable to process image data captured by the imaging device, and a global positioning system operable to determine a geographical location of the vehicle. The equipped vehicle includes an adaptive speed control system for controlling the speed of the equipped vehicle. The adaptive speed control system may reduce the speed of the equipped vehicle responsive at least in part to a detection of a curve in the road ahead of the equipped vehicle via processing by the image processor of image data captured by the imaging device.
[0006] Conventionally, many systems are disclosed within the realm of driver assistance, primarily focusing on singular aspects of vehicle operation or driver monitoring. Existing solutions often include basic collision warning systems utilizing radar or cameras, lane keeping assist functionalities, or adaptive cruise control systems that manage speed based on traffic flow. However, these systems typically operate in isolation, lacking a cohesive, integrated approach to real-time environmental understanding, adaptive physical intervention, and comprehensive driver health monitoring.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that integrates real-time 3D environmental mapping, adaptive physical controls, and comprehensive driver health monitoring. Such a system is needed to provide driver assistance, including holographic spatial guidance, intelligent pedal force regulation for collision avoidance, and automated safety protocols upon detecting driver impairment.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a system that offers real-time, personalized assistance to users by allowing selection between learner and experienced driving modes, thereby adapting the system's intervention and guidance based on individual skill levels.
[0010] Another object of the present invention is to enhance driving safety and precision by generating a real-time three-dimensional map of the vehicle's surroundings and displaying assistive spatial visuals, thus improving the user's visibility and situational awareness.
[0011] Yet another object of the present invention is to continuously monitor the driver's health parameters and detect intoxication, and, upon detection of abnormalities, to trigger alerts to pre-registered contacts and safely halt the vehicle, ensuring paramount driving safety.
[0012] Another object of the present invention is to proactively prevent collisions and ensure smooth vehicle movement by continuously commanding the pedal resistance modules to regulate pedal movement based on detected excessive force, user mode, and environmental data.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to a driver assistance management system for vehicles that is capable of enhancing driving safety, precision, and user experience by providing real-time, personalized assistance, continuously monitoring driver health, and intelligently intervening to prevent collisions and ensure smooth vehicle operation.
[0015] According to an embodiment of the present invention, a driver assistance management system for vehicles comprises a computing unit wirelessly associated with the system, including a user interface configured to receive contact details and destination information of a user, wherein the user interface is accessed by the user to select between a learner and an experienced mode; a microcontroller linked with the wirelessly associated computing unit for processing data to create a user profile stored in a linked database, the microcontroller further being connected to an electronic control unit (ECU) of the vehicle for identifying ignition status, based on which the microcontroller activates a rotatable artificial intelligence-based imaging unit for scanning and capturing images of the surroundings. The captured image data is then processed by a processor integrated with the imaging unit, employing artificial intelligence protocols, to extract relevant data from the images, and is transmitted to the microcontroller, which further processes it to generate a real-time three-dimensional map; a holographic projection unit to display assistive spatial visuals derived from the generated map for enhanced visibility, facilitating assistance in driving; at least two pedal resistance modules, each mounted behind the accelerator and brake pedals, including a plurality of vacuum-based mounting units for secured positioning; a force sensor installed on each pedal and linked with the resistance modules to measure force applied by the user, and upon detection of excessive force, the microcontroller activates a dump-truck assembly operatively coupled with a hydraulic piston and a locking arrangement to control pedal movement, wherein the microcontroller cross-references the user's selected mode, visuals, and force to regulate pedal movement for avoiding collisions and ensuring smooth movement; and a sensing module including a Fiber Bragg Grating (FBG) sensor and an MQ-3 sensor for measuring health parameters and detecting intoxication, wherein the microcontroller processes monitored data to detect abnormal health parameters and intoxication for triggering an alert to pre-registered contact details via notifications to computing units wirelessly associated with the microcontroller.
[0016] According to another embodiment of the present invention, a speaker configured with the vehicle and associated with the system is configured to trigger an alert prompting the user to pull the vehicle over when abnormal health or intoxication is detected, wherein the microcontroller directs the resistance modules to safely halt movement of the vehicle, thus ensuring driving safety and precision for the user; the imaging unit is equipped with a motorized sleeve for adjusting its orientation to optimize capturing of the surroundings; the microcontroller is configured to execute multiple machine learning protocols for analyzing obtained data and visuals to dynamically control the system's operations; a GPS (Global Positioning System) module is integrated with the microcontroller for providing navigational guidance, enabling the projection unit to display route information; the microcontroller fetches the vehicle's data regarding speed limit and battery status from the ECU, enabling the holographic projection unit to display visuals to assist the user; a piezoelectric unit is associated with the system and installed on each of the pedals, configured to generate vibrational feedback to notify the driver of limited pedal movement when excessive force is detected; the imaging unit is configured to detect speeding violations, congested traffic, blind spots, and driver drowsiness, based on which the microcontroller regulates operation of the hydraulic piston for limiting accelerator input or engaging braking; the microcontroller is configured to assign the user to the learner or experienced mode based on driving behavior analyzed by the machine learning protocols for adjusting component actuation for collision prevention; and a playback module is configured to display live recordings from the imaging unit for post-event analysis from the linked database.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a perspective view of a vehicle incorporating a driver assistance management system for vehicles; and
Figure 2 illustrates an isometric view of a pedal resistance module associated with the proposed system.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to a driver assistance management system for vehicles capable of enhancing driving safety, precision, and the overall user experience by providing real-time, personalized assistance, continuously monitoring driver health, and intelligently intervening to prevent collisions and ensure smooth vehicle operation across various driving conditions.
[0023] Referring to Figure 1 and Figure 2, a perspective view of a vehicle incorporating a driver assistance management system for vehicles and an isometric view of a pedal resistance module associated with the proposed system are illustrated, respectively, comprising a rotatable artificial intelligence-based imaging unit 101 equipped with a motorized sleeve 102 installed on the vehicle, a holographic projection unit 103 installed in the vehicle, a speaker 104 configured with the vehicle, at least two pedal resistance modules 201 associated with the system, each mounted behind the accelerator and brake pedals of the vehicle, the resistance modules 201 including a plurality of vacuum-based mounting units 202 for secured positioning, a dump-truck assembly 203 operatively coupled with a hydraulic piston 203a and a locking arrangement 203b assembled on the pedals, and a piezoelectric unit 204 installed on each of the pedals.
[0024] The system disclosed herein includes a computing unit wirelessly associated with the system, comprising a user interface (UI) to receive contact details and destination information of a user. The UI is accessed by the user to select between a learner and an experienced mode. The computing unit comprises a central processing unit (CPU), random access memory (RAM), and various input/output (I/O) controllers, all interconnected via a motherboard. When the user interacts with the UI, their inputs (contact details, destination, and mode selection) are received by the I/O controllers and temporarily stored in a short-term memory. The CPU then processes the data to validate inputs, format information, and prepare it for transmission to a microcontroller, which is linked with the computing unit for processing the data to create the user profile stored in a linked database.
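By way of illustration only, the input-validation and profile-formatting step described above may be sketched as follows; the field names, validation rules, and function name are assumptions for this sketch and are not part of the disclosure.

```python
# Illustrative sketch: validating raw UI inputs and formatting a user
# profile record before wireless transmission to the microcontroller.
# All field names and rules here are hypothetical.
def build_user_profile(contact: str, destination: str, mode: str) -> dict:
    """Validate UI inputs and return a profile record for the database."""
    if mode not in ("learner", "experienced"):
        raise ValueError("mode must be 'learner' or 'experienced'")
    if not contact.strip() or not destination.strip():
        raise ValueError("contact and destination are required")
    return {
        "contact": contact.strip(),
        "destination": destination.strip(),
        "mode": mode,
    }
```

In this sketch, invalid mode selections are rejected before transmission, mirroring the CPU's "validate inputs, format information" role described above.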
[0025] The microcontroller is configured to assign the user to the learner or experienced mode based on driving behavior analyzed by the machine learning protocols for adjusting component actuation for collision prevention. For instance, when the user selects the "learner mode", the CPU processes the selection and prepares a corresponding digital signal to be sent wirelessly. The computing unit's internal flash storage houses the operating system, the UI application, and potentially a local cache of the user profile, allowing for quick access and a responsive user experience.
[0026] An electronic control unit (ECU) of the vehicle is connected to the microcontroller and constantly monitors critical vehicle parameters via sensors. The microcontroller processes the sensors' data according to pre-programmed logic to control actuators like fuel injectors, the braking system, and many others. The microcontroller is configured to execute multiple machine learning protocols for analyzing the obtained data and visuals to dynamically control the system's operations. The ECU acts as a data gateway: when the microcontroller sends requests for identifying the ignition status of the vehicle, the ECU, using its internal communication protocols, retrieves and transmits the information back. This allows the system to react to the vehicle's operational state without directly controlling its fundamental functions.
[0027] Based on the ignition status of the vehicle, the microcontroller activates a rotatable artificial intelligence-based imaging unit 101, installed on the vehicle and equipped with a motorized sleeve 102 for adjusting its orientation to optimize capture of the surroundings, to generate a real-time three-dimensional (3D) map. The imaging unit 101 integrates a camera with a specialized processing unit. The optical sensor captures raw visual data from the vehicle's surroundings. The data is then fed to the processing unit, where pre-trained artificial intelligence models, specifically convolutional neural networks, analyze the incoming images. These models are configured to perform tasks like object detection, semantic segmentation, and depth estimation. The imaging unit's internal software processes these analyses to construct a real-time three-dimensional map, identifying potential hazards, traffic conditions, and blind spots. A motor control unit within the device adjusts the motorized sleeve 102, orienting the camera based on input from the artificial intelligence (AI) for optimal scanning coverage.
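As a purely illustrative sketch of how a per-pixel depth estimate can be lifted into 3D map coordinates, the standard pinhole back-projection may be used; the camera intrinsic parameters below are assumed values, not values disclosed by the invention.

```python
# Illustrative pinhole-camera back-projection: a pixel (u, v) with an
# estimated metric depth is mapped to camera-frame (x, y, z) coordinates.
# The focal lengths (fx, fy) and principal point (cx, cy) are assumed.
def pixel_to_3d(u, v, depth, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Back-project a pixel with metric depth into 3D camera coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A depth-estimation network supplies `depth` per pixel; accumulating the back-projected points over frames is one conventional way to build the kind of 3D map described above.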
[0028] The motorized sleeve 102 internally houses a small electric stepper motor connected to a gearing arrangement. The arrangement is precisely engineered to translate the motor's rotational motion into controlled angular movement of the imaging unit 101. A motor driver circuit, often controlled by the imaging unit's own processor, receives commands regarding desired orientation. In response, the driver circuit sends electrical pulses to the motor, causing it to rotate by a specific angle. Integrated potentiometers provide feedback on the sleeve's current position, ensuring accurate and stable adjustment for optimal environmental scanning.
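The pulse-count computation implied above can be sketched minimally as follows; the step angle and gear ratio are assumed example values (a common stepper step angle is 1.8 degrees), not values specified by the invention.

```python
# Illustrative sketch: signed number of stepper pulses needed to rotate
# the sleeve from its current angle to a target angle, through a gear
# reduction. step_angle_deg and gear_ratio are assumed example values.
def steps_for_angle(target_deg, current_deg, step_angle_deg=1.8, gear_ratio=10.0):
    """Return the signed motor step count for the requested sleeve rotation."""
    delta = target_deg - current_deg          # required sleeve rotation
    return round(delta * gear_ratio / step_angle_deg)
```

The potentiometer feedback described above would supply `current_deg`, closing the position-control loop.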
[0029] On the basis of the vehicle's data regarding speed limit and battery status from the ECU, the microcontroller enables a holographic projection unit 103 installed in the vehicle. The projection unit 103 is configured to display assistive spatial visuals derived from the generated map for enhanced visibility to the user, facilitating assistance to the user in driving the vehicle. The projection unit 103 internally utilizes a laser light source, which directs light towards a spatial light modulator. The spatial light modulator, a Liquid Crystal on Silicon (LCoS) device, dynamically manipulates the phase and amplitude of the incoming light based on the real-time 3D map data received from the microcontroller. The modulated light then passes through optical elements, such as lenses and diffusers, precisely arranged to reconstruct a wavefront that forms a 3D image or visual in a specific focal plane within the user's field of view. An integrated control board manages the spatial light modulator and laser light source, ensuring precise and dynamic display of assistive spatial visuals.
[0030] At least two pedal resistance modules 201 are associated with the system, each mounted behind the accelerator and brake pedals of the vehicle. The resistance modules 201 include a plurality of vacuum-based mounting units 202 for secured positioning. Each unit is connected to a miniature vacuum pump and a valve system. When activated by the microcontroller, the vacuum pump creates a partial vacuum within the mounting unit 202, causing it to firmly adhere to the surface behind the pedal. To increase pedal resistance, specific vacuum units are engaged, effectively creating a controlled frictional or physical impediment to the pedal's free movement.
[0031] A force sensor is installed on each pedal and linked with the resistance module 201 to measure the force applied by the user on the respective pedal. The force sensor is typically a strain gauge, which operates on the principle of converting mechanical force into an electrical signal. Internally, a strain gauge comprises a thin wire grid bonded to a flexible backing. When force is applied to the pedal, it deforms the sensor, causing a minute change in the electrical resistance of the grid. This change in resistance is precisely measured by an internal conditioning circuit, which amplifies and converts it into a calibrated electrical signal proportional to the applied force. The signal is then transmitted to the microcontroller for processing.
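The strain-gauge relation described above (fractional resistance change = gauge factor × strain) can be sketched as a simple conversion; the nominal resistance, gauge factor, and calibration stiffness below are assumed example values, not figures from the disclosure.

```python
# Illustrative strain-gauge conversion: resistance change -> strain ->
# applied force. Nominal 350-ohm gauge and gauge factor 2.0 are common
# example values; stiffness_n_per_strain is an assumed calibration constant.
def force_from_resistance(r_measured, r_nominal=350.0, gauge_factor=2.0,
                          stiffness_n_per_strain=2.0e6):
    """Estimate applied force (newtons) from measured gauge resistance."""
    strain = (r_measured - r_nominal) / r_nominal / gauge_factor
    return strain * stiffness_n_per_strain
```

The conditioning circuit would perform the equivalent of this conversion in analog/digital hardware before handing a calibrated force value to the microcontroller.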
[0032] Upon detection of excessive force, the microcontroller activates a dump-truck assembly 203 operatively coupled with a hydraulic piston 203a and a locking arrangement 203b to control pedal movement. The hydraulic piston 203a operates on the principle of converting hydraulic fluid pressure into linear mechanical force. Internally, it consists of a cylinder barrel housing a movable piston 203a attached to a rod. When a miniature hydraulic pump, activated by the microcontroller, pressurizes hydraulic fluid into one chamber of the cylinder, it pushes against the piston 203a. This force causes the piston 203a and its attached rod to extend or retract, generating a powerful, controlled linear movement. Seals around the piston 203a ensure that the fluid pressure is contained, preventing leaks and maximizing the force generated. Also, on the basis of data detected by the imaging unit 101 regarding speeding violations, congested traffic, blind spots, and driver drowsiness, the microcontroller regulates operation of the hydraulic piston 203a for limiting accelerator input or engaging braking.
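The pressure-to-force conversion described above follows the standard hydraulic relation F = P × A; a minimal worked sketch, with an assumed example pressure and bore diameter, is:

```python
import math

# Illustrative hydraulic relation: linear force = fluid pressure times
# piston face area (F = P * A). The bore diameter is an assumed example.
def piston_force(pressure_pa, bore_diameter_m):
    """Force (newtons) on the piston face from hydraulic pressure."""
    area = math.pi * (bore_diameter_m / 2.0) ** 2   # circular piston face
    return pressure_pa * area
```

For example, a 1 MPa supply acting on an assumed 50 mm bore yields roughly 1.96 kN, illustrating why a miniature pump can still restrict pedal depression.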
[0033] The locking arrangement 203b works in conjunction with the hydraulic piston 203a to precisely control pedal movement. This mechanical system consists of robust interlocking components, such as a series of cogs, a clutch, or a pawl. As the hydraulic piston 203a extends or retracts, it mechanically engages or disengages these components with the pedal's linkage. For instance, when excessive force is detected, the piston 203a extends to activate the locking arrangement 203b, creating a physical barrier or engagement that restricts further pedal depression. When the system needs to release control, the piston 203a retracts, disengaging the lock and allowing the pedal to move freely again.
[0034] Further, a piezoelectric unit 204 is associated with the system and installed on each of the pedals to generate a vibrational feedback to notify the driver of limited pedal movement when excessive force is detected. The piezoelectric unit 204, often a piezoelectric ceramic element, works internally by exploiting the piezoelectric effect. When mechanical stress is applied to this special material, its internal crystalline structure deforms. The deformation causes a displacement of electrical charges within the material, resulting in the generation of an electric voltage across its surfaces. Conversely, applying an electric voltage across the material causes it to mechanically deform, producing a physical vibration. For the pedal application, when excessive force on the pedal strains the piezoelectric unit 204, it generates a proportional electrical charge. This electrical signal is then detected by an internal conditioning circuit, which converts it into a voltage signal that the microcontroller interprets, thereby providing vibrational feedback to the user.
[0035] A sensing module, which includes an FBG (Fiber Bragg Grating) sensor and an MQ-3 sensor, is integrated in a steering wheel cover of the vehicle for measuring health parameters of the user, along with detection of intoxication. The FBG sensor works internally by having a periodic variation in the refractive index within the core of an optical fiber. When broadband light travels through this grating structure, a specific narrow band of wavelengths is reflected back, while others are transmitted. This reflected wavelength, known as the Bragg wavelength, is highly sensitive to changes in strain and temperature. An interrogator unit connected to the FBG precisely measures the wavelength shift, which is then correlated to the applied strain or temperature changes, allowing for the inference of physiological parameters like heart rate or respiration.
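The wavelength-to-strain correlation described above is commonly modeled (neglecting temperature) as Δλ/λ_B = (1 − p_e)·ε, where p_e is the effective photo-elastic coefficient; a minimal sketch with assumed typical constants (λ_B ≈ 1550 nm, p_e ≈ 0.22) is:

```python
# Illustrative FBG interrogation step: infer strain from the shift of the
# reflected Bragg wavelength, ignoring the temperature cross-sensitivity.
# The nominal Bragg wavelength and photo-elastic coefficient are assumed
# typical values, not parameters fixed by the disclosure.
def fbg_strain(lambda_measured_nm, lambda_bragg_nm=1550.0, photoelastic=0.22):
    """Strain inferred from the measured Bragg wavelength (dimensionless)."""
    shift = lambda_measured_nm - lambda_bragg_nm
    return shift / (lambda_bragg_nm * (1.0 - photoelastic))
```

A time series of such strain values, sampled from a grip-contact FBG, is what the microcontroller would analyze for pulse and respiration periodicity.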
[0036] The MQ-3 sensor is a metal oxide semiconductor gas sensor specifically designed to detect alcohol vapor. Internally, it consists of a sensing element, typically made of tin dioxide, coated onto a ceramic tube, with a heater coil running through its center. When the sensor is heated to a specific temperature, oxygen molecules from the surrounding air adsorb onto the surface of the tin dioxide, creating a depletion layer that increases the sensor's electrical resistance. When alcohol vapor is present in the air, the alcohol molecules react with the adsorbed oxygen on the sensor's surface. This reaction releases electrons back into the tin dioxide, causing its electrical conductivity to increase and its resistance to decrease. The change in resistance is inversely proportional to the alcohol concentration, providing an analog output voltage that varies with the amount of alcohol detected.
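The MQ-3 readout described above is conventionally taken through a load-resistor voltage divider, with the sensing resistance Rs computed from the output voltage and the Rs/R0 ratio mapped to concentration via a datasheet-style power-law fit; all circuit and fit constants below are assumed example values.

```python
# Illustrative MQ-3 readout. Rs is recovered from the voltage divider
# (Rs = RL * (Vc - Vout) / Vout); concentration falls as Rs/R0 falls.
# r_load, r0, and the power-law fit constants (a, b) are assumptions.
def mq3_sensor_resistance(v_out, v_circuit=5.0, r_load=200.0):
    """Sensing resistance Rs (kilo-ohms) from the divider output voltage."""
    return r_load * (v_circuit - v_out) / v_out

def mq3_alcohol_mg_per_l(rs, r0=60.0, a=0.4, b=-1.5):
    """Alcohol concentration from Rs/R0 via an assumed power-law calibration."""
    return a * (rs / r0) ** b
```

The negative exponent reflects the inverse relation stated above: more alcohol lowers Rs, so a smaller Rs/R0 ratio yields a higher estimated concentration.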
[0037] Upon receiving data from both the FBG and MQ-3 sensors, the microcontroller performs critical processing. For the FBG sensor, the microcontroller analyses the wavelength shift data to calculate and interpret the user's health parameters, such as heart rate variability and even respiratory patterns based on subtle movements. For the MQ-3 sensor, the microcontroller converts the analog voltage output into a quantifiable alcohol concentration level. The microcontroller then applies thresholds to this processed data. If the microcontroller detects abnormal health parameters or intoxication, such as a dangerously high or low heart rate, or an alcohol concentration from the MQ-3 sensor exceeding a pre-defined intoxication threshold, it triggers an immediate alert. This involves initiating notifications to pre-registered contact details via wirelessly associated computing units and activating a speaker 104 configured with the vehicle to prompt the user to pull over, ultimately directing the resistance modules 201 to safely halt the vehicle's movement, ensuring driving safety and precision for the user.
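The thresholding step described above can be sketched as a simple decision function; the specific heart-rate bounds and alcohol limit are assumed placeholder values, since the disclosure only states that pre-defined thresholds are applied.

```python
# Illustrative threshold check performed by the microcontroller on the
# processed FBG and MQ-3 readings. All numeric thresholds are assumed
# placeholders; the disclosure specifies only that thresholds exist.
def assess_driver(heart_rate_bpm, alcohol_mg_per_l,
                  hr_low=40.0, hr_high=150.0, alcohol_limit=0.25):
    """Return True when the alert-and-safe-halt sequence should trigger."""
    abnormal_hr = heart_rate_bpm < hr_low or heart_rate_bpm > hr_high
    intoxicated = alcohol_mg_per_l > alcohol_limit
    return abnormal_hr or intoxicated
```

A True result would correspond to the sequence above: notify pre-registered contacts, sound the speaker 104 prompt, and command the resistance modules 201 to halt the vehicle.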
[0038] The speaker 104 converts electrical audio signals into audible sound waves. The speaker 104 comprises a permanent magnet, a voice coil, and a diaphragm attached to the voice coil. When the microcontroller sends an electrical alert signal to the speaker 104, this signal is an alternating current that flows through the voice coil. The interaction between the magnetic field generated by the current in the voice coil and the static magnetic field of the permanent magnet causes the voice coil to move rapidly back and forth. Since the diaphragm is rigidly connected to the voice coil, it vibrates in unison. These vibrations create pressure waves in the surrounding air, which our ears perceive as sound, thus delivering the audible alert to the user.
[0039] A GPS (Global Positioning System) module is integrated with the microcontroller for providing navigational guidance. The GPS module internally contains a highly sensitive radio receiver and a precise timing unit. It continuously listens for weak radio signals transmitted by multiple orbiting GPS satellites. Each satellite broadcasts signals that include its precise orbital position and the exact time the signal was sent. Upon receiving signals from at least four satellites, the GPS module's internal processor calculates the time difference between when the signals were sent and when they were received. By knowing the exact time each signal was sent and the speed of light, the module determines the distance to each satellite. Through a process called trilateration, the GPS module's processor uses these distances to pinpoint its own precise geographical coordinates. This positional data is then formatted and transmitted to the microcontroller, which enables the projection unit 103 to display information regarding routes.
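The trilateration step described above can be illustrated in its simplest planar form: subtracting the range equations of three known anchors yields a 2x2 linear system for the receiver position. This is a didactic sketch only; a real GPS fix works in three dimensions with a fourth satellite resolving the receiver clock bias.

```python
# Illustrative 2D trilateration: given three anchor positions and the
# measured distances to each, solve the linearized range equations for
# the receiver position. (Real GPS solves the 3D case plus clock bias.)
def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Return (x, y) satisfying the three range constraints."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtract circle equations pairwise to eliminate the quadratic terms.
    a1, b1 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

Each distance here stands in for a satellite pseudorange (signal travel time multiplied by the speed of light), as described above.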
[0040] Additionally, a playback module is integrated with the wirelessly associated computing unit’s UI to display live recordings from the imaging unit 101 for post-event analysis, from the linked database. The playback module consists of a dedicated video decoder circuit, and local memory buffers. When activated by the microcontroller, it receives requests for recorded footage from the imaging unit 101. It then accesses the linked database, retrieving the relevant compressed video data, which are streamed wirelessly. The video decoder circuit then processes this compressed data, decompressing it into raw video frames. These frames are temporarily stored in a buffer before being rendered by the computing unit's graphics processor for display on the user interface. An internal control logic manages playback functions like pause, rewind, and fast-forward, ensuring smooth and responsive post-event analysis.
[0041] The present invention works best in the following manner, where the computing unit, wirelessly linked to the system, first receives the user's contact details and destination information through its user interface. Here, the user actively selects their driving proficiency, choosing between the learner mode and the experienced mode, which tailors the system's assistance level. Simultaneously, the microcontroller, connected to the vehicle's ECU, identifies the vehicle's ignition status. Upon ignition, the microcontroller activates the rotatable artificial intelligence-based imaging unit 101. The imaging unit 101, often adjusting its orientation via the motorized sleeve 102 for optimal capture, continuously scans the vehicle's surroundings to capture images. The image data is then processed by a processor integrated with the imaging unit 101, employing artificial intelligence protocols, to extract relevant data from the images, which is transmitted to the microcontroller for further processing to generate a real-time three-dimensional map. The generated 3D map is then leveraged by a holographic projection unit 103 installed within the vehicle. The projection unit 103 displays assistive spatial visuals directly in the user's field of view, providing enhanced visibility and facilitating driving assistance. Concurrently, the system actively monitors the user's pedal inputs. Force sensors, installed on both the accelerator and brake pedals, continuously measure the force applied by the user. If excessive force is detected, the microcontroller cross-references the user's selected mode, the real-time visuals, and the applied force. Based on this analysis, it activates the dump-truck assembly 203, comprising the hydraulic piston 203a and the locking arrangement 203b, to regulate pedal movement via the associated pedal resistance modules 201.
[0042] In continuation, this intelligent regulation prevents collisions and ensures smooth vehicle operation. The piezoelectric units 204 on the pedals provide vibrational feedback to alert the driver of limited pedal movement when excessive force is detected. The imaging unit 101 further enhances safety by detecting speeding violations, congested traffic, blind spots, and driver drowsiness, allowing the microcontroller to regulate the hydraulic piston 203a to limit acceleration or engage braking as needed. The sensing module, integrated into the steering wheel cover, incorporates the FBG sensor for health parameter monitoring and the MQ-3 sensor for intoxication detection. The microcontroller processes this data to identify abnormal health parameters or intoxication. Upon detection, the microcontroller immediately triggers the alert to pre-registered contact details via notifications to wirelessly associated computing units. Simultaneously, the speaker 104 within the vehicle prompts the user to pull over, and the microcontroller directs the resistance modules 201 to safely halt the vehicle's movement, ensuring paramount driving safety. Throughout operation, the microcontroller executes multiple machine learning protocols to analyze all obtained data and visuals, dynamically controlling the system's operations. The GPS module, integrated with the microcontroller, provides navigational guidance, with the holographic projection unit 103 displaying route information. The microcontroller also fetches vehicle data like speed limits and battery status from the ECU, displaying relevant visuals on the holographic projection unit 103 for user assistance. For post-event analysis, the user interface includes the playback module, allowing display of live recordings from the imaging unit 101 stored in the linked database.
[0043] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A driver assistance management system for vehicles, comprising:
a) a computing unit wirelessly associated with the system, comprising a user interface configured to receive contact details and destination information of a user, wherein the user interface is accessed by the user to select between a learner mode and an experienced mode;
b) a microcontroller linked with the wirelessly associated computing unit for processing the data to create a user profile stored in a linked database, wherein the microcontroller is connected to an electronic control unit (ECU) of the vehicle for identifying the ignition status of the vehicle, based on which the microcontroller activates a rotatable artificial intelligence-based imaging unit 101 installed on the vehicle for scanning the surroundings of the vehicle to generate a real-time three-dimensional map;
c) a holographic projection unit 103 installed in the vehicle, configured to display assistive spatial visuals, derived from the generated map, for enhanced visibility to the user, facilitating assistance to the user in driving the vehicle, wherein at least two pedal resistance modules 201, associated with the system, are mounted behind the accelerator and brake pedals of the vehicle, respectively, the resistance modules 201 including a plurality of vacuum-based mounting units 202 for secured positioning;
d) a force sensor installed on each pedal, and linked with the resistance modules 201, to measure the force applied by the user on the respective pedal, wherein upon detection of excessive force, the microcontroller activates a dump-truck assembly 203, operatively coupled with a hydraulic piston 203a and a locking arrangement 203b, to control pedal movement, the microcontroller cross-referencing the user's selected mode, the visuals, and the applied force to regulate the pedal movement for avoiding collisions and ensuring smooth movement of the vehicle;
e) a sensing module including an FBG (Fiber Bragg Grating) sensor and an MQ-3 sensor, integrated in a steering wheel cover of the vehicle, for measuring health parameters of the user and detecting intoxication, wherein the microcontroller processes the monitored data to detect abnormal health parameters or intoxication, triggering an alert to pre-registered contact details via notifications to computing units wirelessly associated with the microcontroller; and
f) a speaker 104 configured with the vehicle and associated with the system, configured to trigger an alert prompting the user to pull the vehicle over when abnormal health parameters or intoxication are detected, wherein the microcontroller directs the resistance modules 201 to safely halt movement of the vehicle, thus ensuring driving safety and precision for the user.
2) The driver assistance management system as claimed in claim 1, wherein the imaging unit 101 is equipped with a motorized sleeve 102 for adjusting the orientation of the imaging unit 101 to optimize capturing of the surroundings.
3) The driver assistance management system as claimed in claim 1, wherein the microcontroller is configured to execute multiple machine learning protocols for analyzing the obtained data and visuals to dynamically control the system's operations.
4) The driver assistance management system as claimed in claim 1, wherein a GPS (Global Positioning System) module is integrated with the microcontroller for providing navigational guidance, enabling the projection unit 103 to display route information.
5) The driver assistance management system as claimed in claim 1, wherein the microcontroller fetches the vehicle’s data regarding speed limit and battery status from the ECU of the vehicle, and enables the holographic projection unit 103 to display visuals to assist the user.
6) The driver assistance management system as claimed in claim 1, wherein a piezoelectric unit 204 is associated with the system and installed on each of the pedals, configured to generate vibrational feedback to notify the driver of limited pedal movement when excessive force is detected.
7) The driver assistance management system as claimed in claim 1, wherein the imaging unit 101 is configured to detect speeding violations, congested traffic, blind spots, and driver drowsiness, based on which the microcontroller regulates operation of the hydraulic piston 203a for limiting accelerator input or engaging braking.
8) The driver assistance management system as claimed in claim 1, wherein the microcontroller is configured to assign the user to the learner or experienced mode, based on driving behaviour analysed by the machine learning protocols for adjusting component actuation for collision prevention.
9) The driver assistance management system as claimed in claim 1, wherein the user interface is integrated with a playback module configured to display, from the linked database, live recordings from the imaging unit 101 for post-event analysis.