
Smart Vehicle Driver Alertness System And Method Thereof

Abstract: Disclosed herein is a smart vehicle driver alertness system and method thereof (100) that comprises an artificial intelligence-enabled smart sunglasses (102) equipped with an infrared eye-tracking sensor (104), a pupil dilation and gaze tracking sensor (106), and a real-time yawn detection sensor (108) to monitor drowsiness-related behaviours. A haptic feedback mechanism (110) provides immediate driver alerts, while a wireless communication network (112) transmits data to a cloud storage (114). The system (100) further includes an in-car artificial intelligence camera unit (116) and a smart steering wheel sensor (126) for comprehensive fatigue analysis. A processing unit (134), integrating a multi-sensor integration module (136), an adaptive machine learning algorithm (138), and a real-time alert generation module (140), triggers safety mechanisms including an emergency intervention module (142), an emergency notification module (144), an alarm unit (146), a vibration enabled seat unit (148), and an automated vehicle speed reduction unit (150).


Patent Information

Application #:
Filing Date: 03 April 2025
Publication Number: 17/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. NAFIS UDDIN KHAN
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Specification

Description:
FIELD OF DISCLOSURE
[0001] The present disclosure generally relates to the field of automotive safety systems and artificial intelligence and, more specifically, relates to a smart vehicle driver alertness system and method thereof.
BACKGROUND OF THE DISCLOSURE
[0002] One of the advantages of the smart vehicle driver alertness system and method thereof is the integration of artificial intelligence-driven multi-sensor technology, which continuously enhances accuracy in detecting early signs of drowsiness. The combination of infrared eye-tracking sensors, pupil dilation monitoring, gaze tracking, and yawn detection ensures that the system effectively identifies driver fatigue in real time. The inclusion of low-light and infrared night vision technology keeps the system functional in all lighting conditions, making it highly reliable for nighttime driving and long-haul journeys. Unlike conventional eye-monitoring systems that often generate false alarms, the artificial intelligence-based approach refines detection by analysing multiple physiological and behavioural indicators, reducing unnecessary distractions and improving overall safety.
[0003] Another advantage of the smart vehicle driver alertness system and method thereof is its pre-emptive alert mechanism, which actively prevents accidents by providing timely warnings before drowsiness impairs driving performance. The integration of a haptic feedback system in the smart sunglasses delivers gentle vibrations, prompting the driver to regain focus without requiring manual intervention. The steering wheel sensors monitor grip pressure, steering patterns, and heart rate variability, ensuring that any sign of fatigue is immediately detected. The artificial intelligence-based risk assessment system processes data from multiple sources to predict drowsiness, generating customized alerts, including visual, auditory, and vibration-based warnings, to ensure the driver stays alert.
[0004] Additionally, the smart vehicle driver alertness system and method thereof significantly improves road safety by incorporating an emergency response and prevention system. If extreme drowsiness is detected, the system triggers an in-car alarm or activates vibrations in the seat to immediately wake the driver. When integrated with the vehicle’s control system, it also reduces the vehicle’s speed automatically, minimizing the risk of high-speed collisions. The system proactively notifies emergency contacts or road safety authorities if necessary, ensuring immediate assistance in case of an emergency. The artificial intelligence-powered continuous learning mechanism adapts to individual driving patterns, making the system more efficient and personalized over time, thereby preventing fatigue-related accidents with greater precision.
[0005] One of the disadvantages of similar existing systems in use today is their reliance on single-sensor-based detection, which limits the accuracy and reliability of drowsiness detection. Many conventional systems focus only on monitoring eye closure, which often leads to false positives and false negatives. Factors such as wearing glasses, different eye shapes, and individual variations in blinking patterns create inconsistencies in detection. Additionally, some systems use steering pattern monitoring, which fails to differentiate between genuine drowsiness and external factors such as road conditions, vehicle type, and driver habits. The absence of multi-sensor integration reduces the overall effectiveness of existing systems in preventing drowsy-driving accidents.
[0006] Another disadvantage of similar existing systems is the lack of real-time adaptive response mechanisms that ensure immediate intervention when a driver exhibits signs of fatigue. Many current drowsiness detection systems provide only visual or auditory alerts, which drivers often ignore due to habituation or distraction. These systems do not incorporate haptic feedback mechanisms, such as vibration-based alerts in smart sunglasses or steering wheels, that provide a more direct and effective warning. Moreover, most existing systems do not utilize artificial intelligence-powered predictive analysis, limiting their ability to anticipate drowsiness before it becomes critical. This delayed response increases the risk of accidents by failing to engage the driver at the earliest stage of fatigue.
[0007] Additionally, similar existing systems lack an integrated emergency response and prevention system, which restricts their ability to actively prevent accidents in high-risk situations. Most conventional drowsiness detection systems do not feature automatic speed reduction, making them ineffective when the driver is unable to respond to warnings. The absence of emergency contact notifications and real-time road safety authority alerts further reduces the practicality of these systems in ensuring driver safety. Furthermore, the majority of existing systems do not include heart rate monitoring, grip pressure analysis, or artificial intelligence-based continuous learning, preventing them from adapting to individual driving patterns over time. This lack of personalization and automation decreases the efficiency of drowsiness detection, making traditional systems less effective in real-world driving scenarios.
[0008] Thus, in light of the above-stated discussion, there exists a need for a smart vehicle driver alertness system and method thereof.
SUMMARY OF THE DISCLOSURE
[0009] The following is a summary description of illustrative embodiments of the invention. It is provided as a preface to assist those skilled in the art to more rapidly assimilate the detailed design discussion which ensues and is not intended in any way to limit the scope of the claims which are appended hereto in order to particularly point out the invention.
[0010] According to illustrative embodiments, the present disclosure focuses on a smart vehicle driver alertness system and method thereof which overcomes the above-mentioned disadvantages or provides users with a useful or commercial choice.
[0011] An objective of the present disclosure is to develop an artificial intelligence-powered multi-sensor system that continuously monitors driver alertness using integrated data from smart sunglasses, in-car cameras, and steering wheel sensors.
[0012] Another objective of the present disclosure is to enhance real-time drowsiness detection accuracy by utilizing infrared eye-tracking sensors, facial recognition algorithms, and grip pressure analysis to identify fatigue indicators with high precision.
[0013] Another objective of the present disclosure is to provide a non-intrusive method for detecting driver fatigue by eliminating the need for wearable headbands or direct-contact sensors, ensuring a comfortable and distraction-free driving experience.
[0014] Another objective of the present disclosure is to implement a proactive risk assessment system that continuously analyses eye blinking patterns, pupil dilation, head movements, and heart rate variability to predict drowsiness before it leads to an accident.
[0015] Another objective of the present disclosure is to introduce a haptic feedback mechanism in smart sunglasses and the steering wheel, delivering immediate vibration alerts to the driver upon detecting early signs of fatigue.
[0016] Another objective of the present disclosure is to integrate an artificial intelligence-powered adaptive learning model that continuously refines detection accuracy based on individual driving habits and environmental conditions.
[0017] Another objective of the present disclosure is to incorporate a real-time emergency response feature that automatically triggers an in-car alarm, reduces vehicle speed, and notifies emergency contacts or road safety authorities upon detecting severe drowsiness.
[0018] Another objective of the present disclosure is to ensure reliable operation in all lighting conditions by employing low-light and infrared night vision technology, enabling accurate detection of fatigue-related behaviours during nighttime driving.
[0019] Yet another objective of the present disclosure is to develop a mobile application that synchronizes with the driver monitoring system, providing real-time alerts, personalized drowsiness reports, and predictive analytics for improved road safety.
[0020] Yet another objective of the present disclosure is to enhance overall vehicle safety by integrating the driver alertness system with existing automotive technologies, such as adaptive cruise control and lane-keeping assistance, for automated intervention in high-risk scenarios.
[0021] In light of the above, in one aspect of the present disclosure, a smart vehicle driver alertness system and method thereof is disclosed herein. The system comprises an artificial intelligence-enabled smart sunglasses configured to detect and analyse driver drowsiness parameters in real time, wherein the artificial intelligence-enabled smart sunglasses comprises the following. The system includes an infrared eye-tracking sensor configured to detect eye closure frequency and blinking patterns indicative of drowsiness. The system also includes a pupil dilation and gaze tracking sensor configured to monitor eye movement, gaze deviation, and focus shifts for early detection of fatigue. The system also includes a real-time yawn detection sensor configured to detect mouth movements and yawning frequency through an embedded microphone and motion sensors. The system also includes a haptic feedback mechanism configured to generate vibration alerts upon detecting drowsiness-related behaviours, thereby providing immediate driver feedback. The system also includes a wireless communication network configured to transmit real-time drowsiness detection data to a cloud storage. The system also includes an in-car artificial intelligence camera unit connected to the cloud storage through the wireless communication network and being configured to process facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection, wherein the in-car artificial intelligence camera unit comprises the following. The system also includes a facial recognition module configured to detect head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness. The system also includes a head pose estimation module configured to analyse head position deviations that correlate with fatigue-related symptoms. The system also includes a driver posture monitoring module configured for continuously detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms. The system also includes a low-light and infrared night vision module configured to ensure accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving. The system also includes a smart steering wheel sensor connected to the cloud storage through the wireless communication network and being configured to detect variations in driving patterns and physiological responses of the driver, wherein the smart steering wheel sensor comprises the following. The system also includes a grip pressure monitoring module configured to detect reduced grip strength associated with fatigue and loss of alertness. The system also includes a steering pattern analysis module configured to identify irregular lane drifting and inconsistent steering behaviours associated with drowsiness. The system also includes a touch-based heart rate sensor configured to measure heart rate variability and assess physiological indicators of fatigue. The system also includes a processing unit connected to the cloud storage through the wireless communication network and being configured to evaluate real-time drowsiness data and generate proactive warnings, wherein the processing unit comprises the following. The system also includes a multi-sensor integration module configured to combine data from the artificial intelligence-enabled smart sunglasses, the in-car artificial intelligence camera unit, and the smart steering wheel sensor for comprehensive drowsiness analysis.
The system also includes an adaptive machine learning algorithm configured to analyse driver-specific fatigue patterns and improve prediction accuracy over time. The system also includes a real-time alert generation module configured to issue fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds. The system also includes an emergency intervention module configured to activate safety measures when extreme drowsiness is detected, including vehicle speed reduction and driver wake-up alerts. The system also includes an emergency notification module configured to transmit distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness. The system also includes an alarm unit connected to the processing unit and being configured to generate high-intensity auditory alerts upon detecting severe drowsiness. The system also includes a vibration enabled seat unit connected to the processing unit and being configured to provide haptic feedback to the driver for immediate alertness restoration. The system also includes an automated vehicle speed reduction unit connected to the processing unit and being configured to dynamically control the vehicle’s acceleration to minimize collision risks.
[0022] In one embodiment, the infrared eye-tracking sensor of the artificial intelligence-enabled smart sunglasses is continuously processing real-time eye movement data and transmitting the detected eye closure frequency and blinking patterns to the processing unit through the wireless communication network.
[0023] In one embodiment, the pupil dilation and gaze tracking sensor of the artificial intelligence-enabled smart sunglasses is continuously detecting gaze shifts and eye focus deviations, transmitting the fatigue-related parameters to the multi-sensor integration module through the wireless communication network for enhanced drowsiness detection accuracy.
[0024] In one embodiment, the haptic feedback mechanism of the artificial intelligence-enabled smart sunglasses is generating vibration alerts upon receiving a drowsiness detection signal from the processing unit via the wireless communication network, ensuring immediate driver response.
[0025] In one embodiment, the low-light and infrared night vision module of the in-car artificial intelligence camera unit is processing drowsiness detection data in varying lighting conditions and transmitting enhanced visual recognition data to the processing unit via the wireless communication network to maintain accurate monitoring.
[0026] In one embodiment, the grip pressure monitoring module of the smart steering wheel sensor is continuously detecting grip strength variations and transmitting real-time grip pressure data to the processing unit via the wireless communication network for fatigue pattern correlation.
[0027] In one embodiment, the real-time alert generation module of the processing unit is issuing customized warnings through multiple alert mechanisms, including vibration alerts via the smart sunglasses, auditory alarms through the alarm unit, and visual alerts on the vehicle’s dashboard display.
[0028] In one embodiment, the emergency intervention module of the processing unit is activating the automated vehicle speed reduction unit to dynamically adjust the vehicle’s acceleration and notifying emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness.
[0029] In one embodiment, the system further includes a driver respiration rate monitoring sensor connected to the smart steering wheel sensor, continuously detecting variations in breathing patterns, identifying irregular or slowed respiration as a potential fatigue indicator, and transmitting the respiration data to the processing unit via the wireless communication network for real-time correlation with eye-tracking, heart rate, and steering behaviour to enhance predictive drowsiness detection.
[0030] In light of the above, in one aspect of the present disclosure, a smart vehicle driver alertness system and method thereof is disclosed herein. The method comprises detecting eye closure frequency and blinking patterns indicative of drowsiness through an infrared eye-tracking sensor integrated into an artificial intelligence-enabled smart sunglasses. The method includes monitoring eye movement, gaze deviation, and focus shifts for early detection of fatigue through a pupil dilation and gaze tracking sensor integrated into the artificial intelligence-enabled smart sunglasses. The method also includes identifying mouth movements and yawning frequency through a real-time yawn detection sensor comprising an embedded microphone and motion sensors integrated into the artificial intelligence-enabled smart sunglasses. The method also includes providing vibration alerts upon detecting fatigue-related behaviours through a haptic feedback mechanism integrated into the artificial intelligence-enabled smart sunglasses. The method also includes transmitting real-time drowsiness detection data to a cloud storage through a wireless communication network integrated into the artificial intelligence-enabled smart sunglasses. The method also includes processing facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection through an in-car artificial intelligence camera unit connected to the cloud storage via the wireless communication network. The method also includes detecting head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness through a facial recognition module integrated into the in-car artificial intelligence camera unit. The method also includes analysing head position deviations that correlate with fatigue-related symptoms through a head pose estimation module integrated into the in-car artificial intelligence camera unit. The method also includes detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms through a driver posture monitoring module integrated into the in-car artificial intelligence camera unit. The method also includes ensuring accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving, through a low-light and infrared night vision module integrated into the in-car artificial intelligence camera unit. The method also includes detecting variations in grip strength, steering behaviour, and physiological responses of the driver through a smart steering wheel sensor connected to the cloud storage via the wireless communication network. The method also includes measuring reductions in grip strength associated with fatigue and loss of alertness through a grip pressure monitoring module integrated into the smart steering wheel sensor. The method also includes identifying irregular lane drifting and inconsistent steering behaviours associated with drowsiness through a steering pattern analysis module integrated into the smart steering wheel sensor. The method also includes measuring heart rate variability and assessing physiological indicators of fatigue through a touch-based heart rate sensor integrated into the smart steering wheel sensor. 
The method also includes aggregating real-time drowsiness detection data from multiple sources for comprehensive drowsiness analysis through a multi-sensor integration module integrated into a processing unit connected to the cloud storage via the wireless communication network. The method also includes analysing driver-specific fatigue patterns and improving prediction accuracy over time through an adaptive machine learning algorithm integrated into the processing unit. The method also includes issuing fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds through a real-time alert generation module integrated into the processing unit. The method also includes generating high-intensity auditory alerts upon detecting severe drowsiness through an alarm unit integrated into the emergency intervention module. The method also includes providing haptic feedback to the driver for immediate alertness restoration through a vibration-enabled seat unit integrated into the emergency intervention module. The method also includes dynamically controlling the vehicle’s acceleration to minimize collision risks through an automated vehicle speed reduction unit integrated into the emergency intervention module. The method also includes transmitting distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness through an emergency notification module integrated into the processing unit.
[0031] These and other advantages will be apparent from the present application of the embodiments described herein.
[0032] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0033] These elements, together with the other aspects of the present disclosure and various features are pointed out with particularity in the claims annexed hereto and form a part of the present disclosure. For a better understanding of the present disclosure, its operating advantages, and the specified object attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated exemplary embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034] To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the following briefly describes the accompanying drawings required for describing the embodiments or the prior art. The accompanying drawings in the following description merely show some embodiments of the present disclosure, and a person of ordinary skill in the art can derive other implementations from these accompanying drawings without creative efforts. All of the embodiments and implementations shall fall within the protection scope of the present disclosure.
[0035] The advantages and features of the present disclosure will become better understood with reference to the following detailed description taken in conjunction with the accompanying drawing, in which:
[0036] FIG. 1 illustrates a block diagram of a smart vehicle driver alertness system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0037] FIG. 2 illustrates a flow chart of a smart vehicle driver alertness system, in accordance with an exemplary embodiment of the present disclosure;
[0038] FIG. 3 illustrates a flow chart of a method for real-time, non-intrusive driver drowsiness detection and alert generation, in accordance with an exemplary embodiment of the present disclosure;
[0039] FIG. 4 illustrates a perspective view of a smart vehicle driver alertness system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0040] Like reference numerals refer to like parts throughout the description of the several views of the drawings.
[0041] The smart vehicle driver alertness system and method thereof is illustrated in the accompanying drawings, in which like reference letters indicate corresponding parts in the various figures. It should be noted that the accompanying figures are intended to present illustrations of exemplary embodiments of the present disclosure and are not intended to limit the scope of the present disclosure. It should also be noted that the accompanying figures are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE DISCLOSURE
[0042] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure.
[0043] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be apparent to one skilled in the art that embodiments of the present disclosure may be practiced without some of these specific details.
[0044] Various terms as used herein are shown below. To the extent a term is used, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0045] The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items.
[0046] The terms “having”, “comprising”, “including”, and variations thereof signify the presence of a component.
[0047] Reference is now made to FIG. 1 through FIG. 4 to describe various exemplary embodiments of the present disclosure. FIG. 1 illustrates a block diagram of a smart vehicle driver alertness system and method thereof 100, in accordance with an exemplary embodiment of the present disclosure.
[0048] The system 100 may include an artificial intelligence enabled smart sunglasses 102 configured to detect and analyse driver drowsiness parameters in real time, wherein the artificial intelligence enabled smart sunglasses 102 comprises, an infrared eye-tracking sensor 104 configured to detect eye closure frequency and blinking patterns indicative of drowsiness, a pupil dilation and gaze tracking sensor 106 configured to monitor eye movement, gaze deviation, and focus shifts for early detection of fatigue, a real-time yawn detection sensor 108 configured to detect mouth movements and yawning frequency through an embedded microphone and motion sensors, a haptic feedback mechanism 110 configured to generate vibration alerts upon detecting drowsiness-related behaviours, thereby providing immediate driver feedback, a wireless communication network 112 configured to transmit real-time drowsiness detection data to a cloud storage 114, an in-car artificial intelligence camera unit 116 connected to the cloud storage 114 through the wireless communication network 112 and being configured to process facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection, wherein the in-car artificial intelligence camera unit 116 comprises, a facial recognition module 118 configured to detect head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness, a head pose estimation module 120 configured to analyse head position deviations that correlate with fatigue-related symptoms, a driver posture monitoring module 122 configured for continuously detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms, a low-light and infrared night vision module 124 configured to ensure accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving, a smart steering wheel sensor 126 connected to the cloud storage 114 through the wireless communication network 112 and being configured to detect variations in driving patterns and physiological responses of the driver, wherein the smart steering wheel sensor 126 comprises, a grip pressure monitoring module 128 configured to detect reduced grip strength associated with fatigue and loss of alertness, a steering pattern analysis module 130 configured to identify irregular lane drifting and inconsistent steering behaviours associated with drowsiness, a touch-based heart rate sensor 132 configured to measure heart rate variability and assess physiological indicators of fatigue, a processing unit 134 connected to the cloud storage 114 through the wireless communication network 112 and being configured to evaluate real-time drowsiness data and generate proactive warnings, wherein the processing unit 134 comprises, a multi-sensor integration module 136 configured to combine data from the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126 for comprehensive drowsiness analysis, an adaptive machine learning algorithm 138 configured to analyse driver-specific fatigue patterns and improve prediction accuracy over time, a real-time alert generation module 140 configured to issue fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds, an emergency intervention module 142 configured to activate safety measures when extreme drowsiness is detected, including vehicle speed reduction and 
driver wake-up alerts, an emergency notification module 144 configured to transmit distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness, an alarm unit 146 connected to the processing unit 134 and being configured to generate high-intensity auditory alerts upon detecting severe drowsiness, a vibration enabled seat unit 148 connected to the processing unit 134 and being configured to provide haptic feedback to the driver for immediate alertness restoration, an automated vehicle speed reduction unit 150 connected to the processing unit 134 and being configured to dynamically control the vehicle’s acceleration to minimize collision risks.
[0049] The infrared eye-tracking sensor 104 of the artificial intelligence enabled smart sunglasses 102 is continuously processing real-time eye movement data and transmitting the detected eye closure frequency and blinking patterns to the processing unit 134 through the wireless communication network 112.
[0050] The pupil dilation and gaze tracking sensor 106 of the artificial intelligence enabled smart sunglasses 102 is continuously detecting gaze shifts and eye focus deviations, transmitting the fatigue-related parameters to the multi-sensor integration module 136 through the wireless communication network 112 for enhanced drowsiness detection accuracy.
[0051] The haptic feedback mechanism 110 of the artificial intelligence enabled smart sunglasses 102 is generating vibration alerts upon receiving a drowsiness detection signal from the processing unit 134 via the wireless communication network 112, ensuring immediate driver response.
[0052] The low-light and infrared night vision module 124 of the in-car artificial intelligence camera unit 116 is processing drowsiness detection data in varying lighting conditions and transmitting enhanced visual recognition data to the processing unit 134 via the wireless communication network 112 to maintain accurate monitoring.
[0053] The grip pressure monitoring module 128 of the smart steering wheel sensor 126 is continuously detecting grip strength variations and transmitting real-time grip pressure data to the processing unit 134 via the wireless communication network 112 for fatigue pattern correlation.
[0054] The real-time alert generation module 140 of the processing unit 134 is issuing customized warnings through multiple alert mechanisms, including vibration alerts via the smart sunglasses, auditory alarms through the alarm unit 146, and visual alerts on the vehicle’s dashboard display.
[0055] The emergency intervention module 142 of the processing unit 134 is activating the automated vehicle speed reduction unit 150 to dynamically adjust the vehicle’s acceleration and notifying emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness.
[0056] The system 100 further includes a driver respiration rate monitoring sensor connected to the smart steering wheel sensor 126, continuously detecting variations in breathing patterns, identifying irregular or slowed respiration as a potential fatigue indicator, and transmitting the respiration data to the processing unit 134 via the wireless communication network 112 for real-time correlation with eye-tracking, heart rate, and steering behaviour to enhance predictive drowsiness detection.
[0057] The method 100 may include detecting eye closure frequency and blinking patterns indicative of drowsiness through an infrared eye-tracking sensor 104 integrated into an artificial intelligence enabled smart sunglasses 102, monitoring eye movement, gaze deviation, and focus shifts for early detection of fatigue through a pupil dilation and gaze tracking sensor 106 integrated into the artificial intelligence enabled smart sunglasses 102, identifying mouth movements and yawning frequency through a real-time yawn detection sensor 108 comprising an embedded microphone and motion sensors integrated into the artificial intelligence enabled smart sunglasses 102, providing vibration alerts upon detecting fatigue-related behaviours through a haptic feedback mechanism 110 integrated into the artificial intelligence enabled smart sunglasses 102, transmitting real-time drowsiness detection data to a cloud storage 114 through a wireless communication network 112 integrated into the artificial intelligence enabled smart sunglasses 102, processing facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection through an in-car artificial intelligence camera unit 116 connected to the cloud storage 114 via the wireless communication network 112, detecting head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness through a facial recognition module 118 integrated into the in-car artificial intelligence camera unit 116, analysing head position deviations that correlate with fatigue-related symptoms through a head pose estimation module 120 integrated into the in-car artificial intelligence camera unit 116, detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms through a driver posture monitoring module 122 integrated into the in-car artificial intelligence camera unit 116, ensuring accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving, through a low-light and infrared night vision module 124 integrated into the in-car artificial intelligence camera unit 116, detecting variations in grip strength, steering behaviour, and physiological responses of the driver through a smart steering wheel sensor 126 connected to the cloud storage 114 via the wireless communication network 112, measuring reductions in grip strength associated with fatigue and loss of alertness through a grip pressure monitoring module 128 integrated into the smart steering wheel sensor 126, identifying irregular lane drifting and inconsistent steering behaviours associated with drowsiness through a steering pattern analysis module 130 integrated into the smart steering wheel sensor 126, measuring heart rate variability and assessing physiological indicators of fatigue through a touch-based heart rate sensor 132 integrated into the smart steering wheel sensor 126, aggregating real-time drowsiness detection data from multiple sources for comprehensive drowsiness analysis through a multi-sensor integration module 136 integrated into a processing unit 134 connected to the cloud storage 114 via the wireless communication network 112, analysing driver-specific fatigue patterns and improving prediction accuracy over time through an adaptive machine learning algorithm 138 integrated into the processing unit 134, issuing fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds through a real-time alert generation module 
140 integrated into the processing unit 134, activating safety measures upon detecting extreme drowsiness through an emergency intervention module 142 integrated into the processing unit 134, generating high-intensity auditory alerts upon detecting severe drowsiness through an alarm unit 146 integrated into the emergency intervention module 142, providing haptic feedback to the driver for immediate alertness restoration through a vibration-enabled seat unit 148 integrated into the emergency intervention module 142, dynamically controlling the vehicle’s acceleration to minimize collision risks through an automated vehicle speed reduction unit 150 integrated into the emergency intervention module 142, transmitting distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness through an emergency notification module 144 integrated into the processing unit 134.
[0058] The artificial intelligence enabled smart sunglasses 102 continuously monitor and analyze real-time driver drowsiness parameters using multiple integrated sensors and communication mechanisms. The artificial intelligence enabled smart sunglasses 102 operate as a non-intrusive wearable device designed to detect early signs of fatigue and alert the driver through haptic feedback. The artificial intelligence enabled smart sunglasses 102 integrate seamlessly with the cloud storage 114 via the wireless communication network 112 to transmit and process data for advanced drowsiness detection and prevention.
[0059] The infrared eye-tracking sensor 104 embedded within the artificial intelligence enabled smart sunglasses 102 actively detects eye closure frequency and blinking patterns that indicate drowsiness. The infrared eye-tracking sensor 104 processes real-time eye movement data and transmits detected eye closure frequency and blinking patterns to the processing unit 134 via the wireless communication network 112. The infrared eye-tracking sensor 104 ensures accurate and uninterrupted monitoring of eye activity regardless of external lighting conditions.
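By way of a worked illustration only: the disclosure does not fix a particular eye-closure metric, but a common choice for quantifying eye closure frequency in the drowsiness literature is PERCLOS, the fraction of time the eyelids are mostly closed over a rolling window. The minimal Python sketch below assumes the infrared eye-tracking sensor 104 reports a per-frame eyelid-openness value in [0, 1]; the 0.2 openness cutoff, 30 Hz frame rate, and 60-second window are illustrative values, not taken from the disclosure.

```python
from collections import deque

class PerclosMonitor:
    """Rolling PERCLOS estimate from per-frame eyelid-openness samples.

    Assumes the infrared eye-tracking sensor reports openness in [0, 1]
    at a fixed frame rate; all thresholds here are illustrative.
    """

    def __init__(self, frame_rate_hz=30, window_s=60, closed_below=0.2):
        self.window = deque(maxlen=frame_rate_hz * window_s)
        self.closed_below = closed_below

    def update(self, openness: float) -> float:
        """Add one sample and return the current PERCLOS value (0..1)."""
        self.window.append(openness)
        closed = sum(1 for v in self.window if v < self.closed_below)
        return closed / len(self.window)

# Example: a drowsy stretch of mostly-closed eyes raises PERCLOS.
monitor = PerclosMonitor()
for sample in [0.9] * 100 + [0.1] * 50:
    perclos = monitor.update(sample)
print(f"PERCLOS = {perclos:.2f}")  # ~0.33 after the closed run
```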
[0060] The pupil dilation and gaze tracking sensor 106 integrated into the artificial intelligence enabled smart sunglasses 102 continuously detects eye movement, gaze deviation, and focus shifts indicative of fatigue. The pupil dilation and gaze tracking sensor 106 processes gaze-related parameters in real-time and transmits data to the multi-sensor integration module 136 within the processing unit 134 through the wireless communication network 112. The pupil dilation and gaze tracking sensor 106 enhances drowsiness detection by correlating gaze stability and movement patterns with established fatigue indicators.
[0061] The real-time yawn detection sensor 108 comprising an embedded microphone and motion sensors within the artificial intelligence enabled smart sunglasses 102 actively detects mouth movements and yawning frequency. The real-time yawn detection sensor 108 continuously transmits yawning-related data to the processing unit 134 via the wireless communication network 112. The real-time yawn detection sensor 108 ensures that drowsiness assessments incorporate multiple physiological fatigue indicators for enhanced accuracy.
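As an illustrative sketch of how the embedded microphone and motion sensors of sensor 108 might be fused, the snippet below flags a candidate yawn when frame-level audio energy and a normalized mouth-motion amplitude exceed thresholds simultaneously. The signal representations and both thresholds are assumptions made for illustration; the disclosure names the two inputs but does not specify a fusion rule.

```python
import math

def yawn_detected(audio_frame, motion_amplitude,
                  rms_threshold=0.3, motion_threshold=0.6):
    """Flag a candidate yawn when audible exhalation energy coincides
    with a wide mouth-opening excursion.

    audio_frame: list of normalized microphone samples in [-1, 1].
    motion_amplitude: normalized mouth-opening estimate in [0, 1].
    Both thresholds are placeholders, not values from the disclosure.
    """
    rms = math.sqrt(sum(s * s for s in audio_frame) / len(audio_frame))
    return rms > rms_threshold and motion_amplitude > motion_threshold

# A loud exhalation with a wide mouth opening trips the detector.
print(yawn_detected([0.5, -0.6, 0.4, -0.5], motion_amplitude=0.8))  # True
```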
[0062] The haptic feedback mechanism 110 embedded in the artificial intelligence enabled smart sunglasses 102 generates vibration alerts upon detecting drowsiness-related behaviors. The haptic feedback mechanism 110 receives drowsiness detection signals from the processing unit 134 through the wireless communication network 112 and provides immediate driver feedback. The haptic feedback mechanism 110 ensures that the driver receives real-time tactile alerts to maintain alertness and reduce the risk of accidents.
[0063] The wireless communication network 112 enables seamless data transmission between the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, the smart steering wheel sensor 126, and the processing unit 134. The wireless communication network 112 establishes a continuous connection with the cloud storage 114 to ensure real-time analysis and storage of drowsiness detection data. The wireless communication network 112 facilitates uninterrupted system functionality by maintaining consistent communication across all integrated components.
[0064] The cloud storage 114 serves as the central repository for real-time drowsiness detection data collected from the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126. The cloud storage 114 enables advanced processing and analysis of driver fatigue patterns. The cloud storage 114 ensures secure storage and accessibility of historical drowsiness detection data for improved machine learning-driven accuracy over time.
[0065] The in-car artificial intelligence camera unit 116 processes facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection. The in-car artificial intelligence camera unit 116 continuously transmits data to the processing unit 134 through the wireless communication network 112. The in-car artificial intelligence camera unit 116 enhances drowsiness detection by integrating multiple visual fatigue indicators to identify early warning signs.
[0066] The facial recognition module 118 within the in-car artificial intelligence camera unit 116 detects head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness. The facial recognition module 118 continuously transmits facial recognition data to the processing unit 134 for comprehensive analysis. The facial recognition module 118 enhances the detection of behavioral fatigue patterns that contribute to real-time alert generation.
[0067] The head pose estimation module 120 embedded within the in-car artificial intelligence camera unit 116 analyzes head position deviations that correlate with fatigue-related symptoms. The head pose estimation module 120 continuously transmits head movement data to the processing unit 134 for integration with other fatigue parameters. The head pose estimation module 120 ensures that subtle variations in driver posture are accurately assessed for drowsiness detection.
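One hedged way to realize the nodding detection attributed to the head pose estimation module 120 is to scan a head-pitch time series for sustained forward droops. In the sketch below, the 20-degree droop angle, 15 Hz sampling rate, and one-second minimum duration are assumed values chosen purely for illustration.

```python
def detect_nod_episodes(pitch_deg, frame_rate_hz=15,
                        droop_deg=20.0, min_duration_s=1.0):
    """Return (start, end) frame indices where the head pitches forward
    past `droop_deg` for at least `min_duration_s`, a crude proxy for
    the nodding / micro-sleep episodes described for module 120.

    `pitch_deg` is a sequence of head-pitch angles (0 = level,
    positive = chin toward chest); all constants are illustrative.
    """
    min_frames = int(min_duration_s * frame_rate_hz)
    episodes, start = [], None
    for i, p in enumerate(pitch_deg):
        if p > droop_deg and start is None:
            start = i
        elif p <= droop_deg and start is not None:
            if i - start >= min_frames:
                episodes.append((start, i))
            start = None
    if start is not None and len(pitch_deg) - start >= min_frames:
        episodes.append((start, len(pitch_deg)))
    return episodes

# A 2-second droop at 15 Hz shows up as one episode.
trace = [5.0] * 30 + [30.0] * 30 + [5.0] * 15
print(detect_nod_episodes(trace))  # [(30, 60)]
```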
[0068] The driver posture monitoring module 122 within the in-car artificial intelligence camera unit 116 continuously detects unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms. The driver posture monitoring module 122 transmits real-time posture data to the processing unit 134 for integration with multi-sensor fatigue analysis. The driver posture monitoring module 122 enhances predictive drowsiness detection by incorporating full-body movement tracking.
[0069] The low-light and infrared night vision module 124 integrated into the in-car artificial intelligence camera unit 116 ensures accurate detection of fatigue-related behaviors in varying lighting conditions, including nighttime driving. The low-light and infrared night vision module 124 transmits enhanced visual recognition data to the processing unit 134 through the wireless communication network 112. The low-light and infrared night vision module 124 enables uninterrupted drowsiness detection in low-visibility conditions.
[0070] The smart steering wheel sensor 126 detects variations in driving patterns and physiological responses of the driver and transmits real-time data to the processing unit 134 through the wireless communication network 112. The smart steering wheel sensor 126 integrates multiple monitoring mechanisms to provide comprehensive fatigue assessments.
[0071] The grip pressure monitoring module 128 within the smart steering wheel sensor 126 detects reduced grip strength associated with fatigue and loss of alertness. The grip pressure monitoring module 128 transmits real-time grip pressure data to the processing unit 134 for fatigue pattern correlation. The grip pressure monitoring module 128 enhances the detection of drowsiness-related physiological responses.
[0072] The steering pattern analysis module 130 integrated into the smart steering wheel sensor 126 identifies irregular lane drifting and inconsistent steering behaviors associated with drowsiness. The steering pattern analysis module 130 continuously transmits steering behavior data to the processing unit 134 for real-time correlation with fatigue indicators. The steering pattern analysis module 130 ensures that abnormal driving patterns are promptly identified for alert generation.
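For concreteness, one statistic the steering pattern analysis module 130 could compute is the steering-reversal rate, which tends to fall during drowsy "drift-and-jerk" driving. The disclosure names the behaviours but not a metric; the gap threshold and sampling rate below are assumed values.

```python
def steering_reversal_rate(angles_deg, sample_rate_hz=10, gap_deg=2.0):
    """Direction changes per minute of the steering-wheel angle signal.

    Drowsy driving often shows long low-activity stretches punctuated
    by abrupt corrective swings, so a falling reversal rate is one
    plausible input for module 130. `gap_deg` suppresses sensor jitter
    and is an illustrative value.
    """
    reversals, direction = 0, 0
    last_extreme = angles_deg[0]
    for a in angles_deg[1:]:
        delta = a - last_extreme
        if direction >= 0 and delta <= -gap_deg:
            reversals, direction, last_extreme = reversals + 1, -1, a
        elif direction <= 0 and delta >= gap_deg:
            reversals, direction, last_extreme = reversals + 1, 1, a
        elif (direction > 0 and a > last_extreme) or \
             (direction < 0 and a < last_extreme):
            last_extreme = a
    minutes = len(angles_deg) / sample_rate_hz / 60.0
    return reversals / minutes if minutes else 0.0

alert = [0, 3, 0, -3, 0, 3, 0, -3] * 10   # frequent small corrections
drowsy = [0] * 70 + [15, -15] * 5         # long drift, abrupt swings
print(steering_reversal_rate(alert) > steering_reversal_rate(drowsy))  # True
```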
[0073] The touch-based heart rate sensor 132 embedded in the smart steering wheel sensor 126 measures heart rate variability and assesses physiological indicators of fatigue. The touch-based heart rate sensor 132 transmits heart rate data to the processing unit 134 for integration with other drowsiness detection parameters. The touch-based heart rate sensor 132 ensures that physiological fatigue markers contribute to comprehensive drowsiness detection.
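For reference, short-term heart rate variability is commonly summarized by RMSSD over beat-to-beat (RR) intervals, and depressed values relative to a driver's baseline are one physiological fatigue marker the touch-based heart rate sensor 132 could report. RMSSD is a standard statistic, not a formula given in the disclosure.

```python
def rmssd_ms(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms).

    Lower short-term variability relative to a rested baseline is a
    widely used physiological fatigue indicator.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Lower beat-to-beat variability in the second trace -> lower RMSSD.
print(rmssd_ms([820, 860, 810, 870, 830]))  # ~48 ms
print(rmssd_ms([840, 842, 841, 843, 842]))  # ~1.6 ms
```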
[0074] The processing unit 134 connects to the cloud storage 114 via the wireless communication network 112 and continuously evaluates real-time drowsiness data collected from multiple sensors. The processing unit 134 serves as the core computational hub, executing various algorithms for detecting and responding to driver fatigue. The processing unit 134 receives data streams from the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126. By processing this real-time data, the processing unit 134 determines the driver’s drowsiness level based on pre-established thresholds. The processing unit 134 integrates multiple sensor inputs, filters out noise, and ensures accurate fatigue detection by employing predictive modelling techniques. The processing unit 134 also categorizes the severity of drowsiness episodes and dynamically adjusts the system’s response accordingly. The processing unit 134 operates without delay, ensuring immediate driver feedback through various alert mechanisms and intervention protocols. The processing unit 134 utilizes its computational efficiency to continuously refine its analysis based on historical data, thereby enhancing the precision of its predictions over time.
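The disclosure says the processing unit 134 categorizes severity against pre-established thresholds without fixing the bands. A minimal sketch, assuming a fused drowsiness score in [0, 1] with illustrative band edges:

```python
from enum import Enum

class Severity(Enum):
    NONE = 0
    MILD = 1
    SEVERE = 2
    EXTREME = 3

def categorize(fused_score: float,
               mild=0.4, severe=0.7, extreme=0.9) -> Severity:
    """Map a fused drowsiness score in [0, 1] onto escalating severity
    bands. The band edges are placeholders; the disclosure only states
    that thresholds are pre-established and adapted over time."""
    if fused_score >= extreme:
        return Severity.EXTREME
    if fused_score >= severe:
        return Severity.SEVERE
    if fused_score >= mild:
        return Severity.MILD
    return Severity.NONE

print(categorize(0.75))  # Severity.SEVERE
```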
[0075] The multi-sensor integration module 136, housed within the processing unit 134, consolidates data from all sensors embedded in the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126. The multi-sensor integration module 136 cross-verifies different drowsiness indicators to prevent false positives and improve detection accuracy. The multi-sensor integration module 136 applies advanced sensor fusion techniques, ensuring seamless data synchronization and correlation. The multi-sensor integration module 136 optimizes power consumption by selectively prioritizing data streams that indicate high-risk drowsiness episodes. The multi-sensor integration module 136 continuously updates its analytical models by adapting to real-world driving conditions and driver-specific behavioural patterns. The multi-sensor integration module 136 enhances reliability by integrating redundant detection methods, minimizing system failure risks.
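A hedged sketch of the cross-verification idea attributed to module 136: fuse weighted per-channel scores, but damp the result unless at least two channels independently agree, which is one simple way to suppress single-sensor false positives. The weights, the agreement rule, and all constants below are assumptions.

```python
def fuse(indicators, weights=None, min_active=2, active_above=0.5):
    """Weighted fusion of per-channel drowsiness scores (each 0..1)
    with a simple cross-check: unless at least `min_active` channels
    independently exceed `active_above`, the fused score is damped.

    indicators: dict like {"perclos": 0.8, "steering": 0.7, ...};
    channel names are illustrative labels, not an API.
    """
    weights = weights or {k: 1.0 for k in indicators}
    total = sum(weights[k] for k in indicators)
    score = sum(weights[k] * v for k, v in indicators.items()) / total
    active = sum(1 for v in indicators.values() if v > active_above)
    return score if active >= min_active else score * 0.5

# One noisy channel alone is damped; two agreeing channels are not.
print(fuse({"perclos": 0.9, "steering": 0.1, "hrv": 0.2}))  # 0.2 (damped)
print(fuse({"perclos": 0.9, "steering": 0.8, "hrv": 0.2}))  # ~0.63
```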
[0076] The adaptive machine learning algorithm 138, embedded within the processing unit 134, refines drowsiness detection capabilities over time by learning from real-time driver behaviour patterns. The adaptive machine learning algorithm 138 employs pattern recognition techniques to identify subtle variations in eye movements, steering behaviours, and physiological parameters indicative of fatigue. The adaptive machine learning algorithm 138 continuously updates its detection models based on historical data, ensuring improved accuracy for different driving environments. The adaptive machine learning algorithm 138 dynamically adjusts drowsiness detection thresholds based on environmental factors, vehicle speed, and driving duration. The adaptive machine learning algorithm 138 enhances personalized driver monitoring by recognizing individual fatigue tendencies and issuing targeted alerts accordingly. The adaptive machine learning algorithm 138 ensures that the system becomes more efficient with prolonged use, reducing the likelihood of false alarms and improving overall safety outcomes.
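The disclosure does not name a learning method for the adaptive machine learning algorithm 138. One minimal way to realize the per-driver personalization it describes is an exponentially adapted baseline, so that identical raw scores are interpreted relative to each driver's own alert-state behaviour; the learning rate and update rule below are illustrative assumptions, not the claimed algorithm.

```python
class AdaptiveBaseline:
    """Per-driver baseline that drifts toward observed alert-state
    behaviour, one simple stand-in for the personalization attributed
    to algorithm 138. The learning rate is illustrative."""

    def __init__(self, initial=0.3, learning_rate=0.01):
        self.baseline = initial
        self.lr = learning_rate

    def update(self, score, driver_confirmed_alert):
        # Only adapt on samples the system trusts as non-drowsy,
        # e.g. after the driver dismisses an alert promptly.
        if driver_confirmed_alert:
            self.baseline += self.lr * (score - self.baseline)

    def normalized(self, score):
        """Score relative to this driver's own baseline."""
        return max(0.0, score - self.baseline)

model = AdaptiveBaseline()
for _ in range(200):                 # a driver who idles around 0.45
    model.update(0.45, driver_confirmed_alert=True)
print(round(model.baseline, 2))      # ~0.43, drifting toward 0.45
print(round(model.normalized(0.6), 2))
```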
[0077] The real-time alert generation module 140, integrated within the processing unit 134, issues immediate warnings based on drowsiness detection thresholds. The real-time alert generation module 140 triggers a combination of alert mechanisms, including vibration alerts via the haptic feedback mechanism 110 of the artificial intelligence enabled smart sunglasses 102, auditory alarms through the alarm unit 146, and visual notifications on the vehicle’s dashboard display. The real-time alert generation module 140 ensures that warnings are delivered promptly, allowing the driver to take corrective action before drowsiness compromises vehicle control. The real-time alert generation module 140 dynamically adjusts alert intensity based on the severity of detected drowsiness patterns. The real-time alert generation module 140 prioritizes non-intrusive alerts under mild fatigue conditions and escalates warning intensity in extreme drowsiness scenarios. The real-time alert generation module 140 ensures seamless driver interaction by maintaining an intuitive and responsive alert system that effectively prevents driver inattention.
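Paragraph [0077] implies an escalation ladder from non-intrusive to maximal alerts. A sketch with hypothetical channel labels standing in for the units of the disclosure; the exact mapping of severities to channels is an assumption:

```python
ESCALATION = {
    "mild":    ["sunglasses_vibration"],
    "severe":  ["sunglasses_vibration", "dashboard_visual", "alarm_unit"],
    "extreme": ["sunglasses_vibration", "dashboard_visual", "alarm_unit",
                "seat_vibration", "speed_reduction",
                "emergency_notification"],
}

def alerts_for(severity: str) -> list[str]:
    """Escalation ladder sketched from paragraph [0077]: non-intrusive
    channels for mild fatigue, every channel plus intervention for
    extreme drowsiness. Channel names are labels, not a real API."""
    return ESCALATION.get(severity, [])

print(alerts_for("severe"))
```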
[0078] The emergency intervention module 142, integrated into the processing unit 134, activates advanced safety measures upon detecting extreme drowsiness. The emergency intervention module 142 assesses prolonged drowsiness indicators and determines whether automated intervention is required to prevent accidents. The emergency intervention module 142 dynamically engages various countermeasures, including activating the automated vehicle speed reduction unit 150 to safely decelerate the vehicle, issuing immediate wake-up alerts, and notifying emergency contacts. The emergency intervention module 142 ensures driver safety by preventing potential collisions caused by uncontrolled fatigue-induced driving errors. The emergency intervention module 142 operates within milliseconds, ensuring swift corrective action without compromising vehicle stability.
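The "prolonged and extreme" qualifier suggests a dwell-time gate before intervention fires, so a single spurious extreme reading does not decelerate the vehicle. A minimal state-machine sketch, with an assumed five-second dwell (the disclosure does not fix one):

```python
class EmergencyIntervention:
    """Engages intervention only after extreme drowsiness persists,
    mirroring paragraph [0078]'s 'prolonged' qualifier. The 5-second
    dwell time is an assumed value."""

    def __init__(self, dwell_s=5.0):
        self.dwell_s = dwell_s
        self.extreme_since = None

    def step(self, severity: str, now: float) -> bool:
        """Feed one assessment; returns True when intervention
        (speed reduction, wake-up alerts, notification) should fire."""
        if severity != "extreme":
            self.extreme_since = None
            return False
        if self.extreme_since is None:
            self.extreme_since = now
        return now - self.extreme_since >= self.dwell_s

eim = EmergencyIntervention()
print(eim.step("extreme", 0.0))  # False: just started
print(eim.step("extreme", 6.0))  # True: persisted past the dwell
```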
[0079] The emergency notification module 144, embedded within the processing unit 134, transmits distress alerts to designated emergency contacts or road safety authorities upon detecting prolonged drowsiness episodes. The emergency notification module 144 ensures timely intervention by providing real-time location tracking and driver status updates. The emergency notification module 144 integrates seamlessly with the wireless communication network 112 to ensure reliable data transmission under all driving conditions. The emergency notification module 144 optimizes emergency response times by delivering precise information regarding the severity and duration of drowsiness-related incidents.
[0080] The alarm unit 146, connected to the processing unit 134, generates high-intensity auditory alerts upon detecting severe drowsiness. The alarm unit 146 ensures immediate driver awareness by producing distinct warning sounds that effectively interrupt drowsiness-induced inattention. The alarm unit 146 dynamically adjusts volume and frequency based on the severity of the detected fatigue levels. The alarm unit 146 maintains optimal alert effectiveness by utilizing sound patterns that maximize driver responsiveness.
[0081] The vibration enabled seat unit 148, connected to the processing unit 134, provides targeted haptic feedback to the driver for immediate alertness restoration. The vibration enabled seat unit 148 activates upon detecting drowsiness-related behaviours and delivers vibration pulses to prompt driver attention. The vibration enabled seat unit 148 enhances safety by ensuring that the driver remains engaged and responsive during prolonged journeys. The vibration enabled seat unit 148 maintains ergonomic comfort while delivering effective stimulation to counteract fatigue.
[0082] The automated vehicle speed reduction unit 150, connected to the processing unit 134, dynamically controls the vehicle’s acceleration to minimize collision risks associated with drowsy driving. The automated vehicle speed reduction unit 150 intervenes upon detecting extreme drowsiness episodes, ensuring a gradual and controlled deceleration to prevent abrupt braking. The automated vehicle speed reduction unit 150 synchronizes with other intervention measures, allowing for a coordinated response to fatigue-induced driving impairment. The automated vehicle speed reduction unit 150 maintains vehicular stability by integrating with the vehicle’s onboard control systems, ensuring that speed adjustments occur smoothly and safely.
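The gradual, capped deceleration this paragraph requires can be sketched as a simple speed ramp. The 40 km/h target and 3 km/h-per-second cap below are assumed values; a production unit would instead take them from the vehicle's onboard control systems, as the paragraph notes.

```python
def decel_profile(current_kmh, target_kmh=40.0,
                  max_decel_kmh_s=3.0, dt_s=0.5):
    """Yield a gradual speed ramp from the current speed down to a
    safer target, capping deceleration so the slowdown stays smooth
    and avoids abrupt braking. All constants are illustrative."""
    speed = current_kmh
    while speed > target_kmh:
        speed = max(target_kmh, speed - max_decel_kmh_s * dt_s)
        yield speed

# From 100 km/h: steps of 1.5 km/h every 0.5 s down to 40 km/h.
ramp = list(decel_profile(100.0))
print(ramp[:3], "...", ramp[-1])  # [98.5, 97.0, 95.5] ... 40.0
```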
[0083] The best mode of operation of the smart vehicle driver alertness system 100 begins with the artificial intelligence enabled smart sunglasses 102 continuously detecting and analysing driver drowsiness parameters in real time. The infrared eye-tracking sensor 104 detects eye closure frequency and blinking patterns indicative of drowsiness, transmitting the data to the processing unit 134 through the wireless communication network 112. Simultaneously, the pupil dilation and gaze tracking sensor 106 monitors eye movement, gaze deviation, and focus shifts, sending real-time fatigue indicators to the multi-sensor integration module 136 of the processing unit 134. The real-time yawn detection sensor 108 identifies mouth movements and yawning frequency through an embedded microphone and motion sensors, further contributing to the assessment of drowsiness levels. The haptic feedback mechanism 110, upon receiving drowsiness detection signals from the processing unit 134 via the wireless communication network 112, generates vibration alerts to provide immediate driver feedback and prevent potential accidents.
[0084] The wireless communication network 112 serves as the backbone of the system, enabling seamless data transmission between all components and ensuring that real-time drowsiness detection data reaches the cloud storage 114 without latency. The cloud storage 114 collects and stores all driver drowsiness-related parameters, allowing the processing unit 134 to access and analyse historical data for improved fatigue prediction accuracy. The in-car artificial intelligence camera unit 116, connected to the cloud storage 114 through the wireless communication network 112, processes facial recognition data, head movements, and eye-tracking information to enhance the overall drowsiness detection capabilities. The facial recognition module 118 detects head tilts, nodding patterns, and micro-sleep episodes, ensuring that even the slightest signs of fatigue-related symptoms are recognized. The head pose estimation module 120 analyses head position deviations to correlate movement inconsistencies with fatigue levels. The driver posture monitoring module 122 continuously detects unnatural body movements, shoulder slouching, and head displacement patterns that are indicative of fatigue. The low-light and infrared night vision module 124 ensures that the system maintains accurate detection performance in varying lighting conditions, including nighttime driving, by processing visual data in low-visibility environments.
[0085] The smart steering wheel sensor 126, also connected to the cloud storage 114 through the wireless communication network 112, plays a crucial role in detecting variations in driving patterns and physiological responses of the driver. The grip pressure monitoring module 128 detects reduced grip strength associated with fatigue and loss of alertness, transmitting real-time data to the processing unit 134 for drowsiness correlation. The steering pattern analysis module 130 identifies irregular lane drifting and inconsistent steering behaviours that are indicative of drowsiness, ensuring that deviations in driving performance are accounted for in the overall assessment. The touch-based heart rate sensor 132 continuously measures heart rate variability, assessing physiological indicators of fatigue and relaying this data to the multi-sensor integration module 136 of the processing unit 134 for further evaluation.
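As a non-limiting illustration of the physiological analysis described above, the sketch below computes RMSSD, a standard short-term heart rate variability statistic, from inter-beat intervals and combines it with a grip reading; the thresholds are placeholders introduced for illustration, not values disclosed here.

```python
# Minimal sketch, assuming inter-beat intervals in milliseconds and a grip
# force in newtons; both thresholds are placeholder assumptions.
import numpy as np

def rmssd_ms(ibi_ms: np.ndarray) -> float:
    """Root mean square of successive differences of inter-beat intervals."""
    diffs = np.diff(ibi_ms.astype(float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def fatigue_flag(ibi_ms: np.ndarray, grip_newtons: float) -> bool:
    """Treat very low HRV together with a weak grip as a fatigue indicator."""
    return rmssd_ms(ibi_ms) < 20.0 and grip_newtons < 15.0  # placeholder limits
```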
[0086] The processing unit 134 is the central computational hub of the smart vehicle driver alertness system 100, integrating data from all sensors and executing real-time drowsiness detection and intervention protocols. The multi-sensor integration module 136 combines data from the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126, ensuring a comprehensive drowsiness analysis that takes multiple fatigue indicators into account. The adaptive machine learning algorithm 138 continuously processes historical driver behaviour patterns to improve prediction accuracy over time, enabling the system to personalize its detection and intervention strategies for different individuals. The real-time alert generation module 140 issues fatigue warnings via multiple alert mechanisms, including vibration alerts through the haptic feedback mechanism 110, auditory alarms through the alarm unit 146, and visual notifications on the vehicle’s dashboard display.
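A minimal, non-limiting sketch of the kind of weighted fusion a multi-sensor integration module 136 could perform is given below; the indicator names, weights, and thresholds are assumptions introduced for illustration rather than parameters of the disclosure.

```python
# Illustrative sketch: fuse per-sensor indicators, each pre-scaled to [0, 1],
# into one drowsiness score and map it onto escalating responses.
WEIGHTS = {"eye_closure": 0.35, "gaze_deviation": 0.15, "yawn_rate": 0.15,
           "head_nod": 0.15, "grip_loss": 0.10, "hrv_drop": 0.10}

def drowsiness_score(indicators: dict) -> float:
    """Weighted sum of clamped indicator values."""
    return sum(WEIGHTS[k] * min(max(indicators.get(k, 0.0), 0.0), 1.0)
               for k in WEIGHTS)

def choose_response(score: float) -> str:
    """Map the fused score onto the disclosure's escalating responses."""
    if score >= 0.8:
        return "emergency_intervention"  # speed reduction, distress alert
    if score >= 0.5:
        return "alert"                   # vibration, alarm, visual warning
    return "monitor"
```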
[0087] When extreme drowsiness is detected, the emergency intervention module 142 activates critical safety measures to prevent accidents. The emergency intervention module 142 triggers the automated vehicle speed reduction unit 150, dynamically adjusting the vehicle’s acceleration to minimize collision risks and enhance road safety. The emergency notification module 144 transmits distress alerts to emergency contacts or road safety authorities, ensuring that external assistance is available when prolonged and extreme drowsiness is detected. The alarm unit 146 generates high-intensity auditory alerts, forcing the driver to regain alertness through an immediate and unmistakable stimulus. The vibration-enabled seat unit 148 provides additional haptic feedback to restore driver awareness, reinforcing the urgency of the detected drowsiness-related behaviours.
[0088] By integrating all these components into a single, cohesive system, the smart vehicle driver alertness system 100 ensures real-time, non-intrusive monitoring and proactive intervention to mitigate the risks associated with driver fatigue. The continuous operation of the artificial intelligence enabled smart sunglasses 102, the in-car artificial intelligence camera unit 116, and the smart steering wheel sensor 126 ensures that multiple data sources are analysed simultaneously, allowing the processing unit 134 to generate precise fatigue assessments. The use of adaptive machine learning through the adaptive machine learning algorithm 138 further refines the system’s ability to detect early signs of drowsiness and provide tailored interventions for different drivers. By leveraging cloud-based data storage through the cloud storage 114, the system maintains access to historical fatigue data, improving its ability to detect patterns and prevent accidents before they occur. The integration of real-time alerts, haptic feedback mechanisms, and automated vehicle speed reduction ensures that drivers receive immediate warnings while also enabling automatic intervention when necessary. The smart vehicle driver alertness system 100 effectively addresses the critical issue of driver fatigue by combining cutting-edge artificial intelligence technology, real-time monitoring, and proactive safety measures to ensure safer driving conditions for all road users.
[0089] FIG. 2 illustrates a flow chart of a smart vehicle driver alertness system, in accordance with an exemplary embodiment of the present disclosure.
[0090] At 202, the infrared eye-tracking sensor continuously monitors eye closure frequency and blinking patterns. The pupil dilation and gaze tracking sensor tracks eye movement, gaze deviation, and focus shifts. The real-time yawn detection sensor uses the embedded microphone and motion sensors to detect mouth movements and yawning frequency. If the sunglasses detect drowsiness-related behaviours, the haptic feedback mechanism generates vibration alerts to provide immediate feedback to the driver. The smart sunglasses transmit the captured and processed data to the cloud storage via a wireless communication network.
[0091] At 204, the in-car artificial intelligence camera unit continuously monitors the driver's facial expressions and head movements. The facial recognition module detects head tilts, nodding patterns, and micro-sleep episodes. The head pose estimation module analyses head position deviations. The driver posture monitoring module detects unnatural body movements. The low-light and infrared night vision module ensures accurate detection in all lighting conditions. The in-car artificial intelligence camera unit transmits the processed data to the cloud storage via the wireless communication network.
[0092] At 206, the smart steering wheel sensor continuously monitors the driver's driving patterns and physiological responses. The grip pressure monitoring module detects reduced grip strength. The steering pattern analysis module identifies irregular lane drifting and inconsistent steering behaviours. The touch-based heart rate sensor measures heart rate variability. The smart steering wheel sensor transmits the captured data to the cloud storage via the wireless communication network.
[0093] At 208, the processing unit retrieves the data from the cloud storage. The multi-sensor integration module combines the data from the smart sunglasses, in-car camera unit, and smart steering wheel sensor. The adaptive machine learning algorithm analyses the combined data to identify driver-specific fatigue patterns and improve prediction accuracy. The real-time alert generation module issues fatigue warnings based on detected drowsiness thresholds; warnings may include vibrations, alarms, and visual alerts. If extreme drowsiness is detected, the emergency intervention module activates safety measures, which may include vehicle speed reduction via the automated vehicle speed reduction unit. If prolonged and extreme drowsiness is detected, the emergency notification module transmits distress alerts to emergency contacts or road safety authorities.
[0094] At 210, the alarm unit generates high-intensity auditory alerts upon detecting severe drowsiness.
[0095] At 212, the vibration enabled seat unit provides haptic feedback to wake up the driver.
[0096] At 214, the automated vehicle speed reduction unit dynamically controls the vehicle’s acceleration to minimize collision risks.
[0097] FIG. 3 illustrates a flow chart of a method for real-time, non-intrusive driver drowsiness detection and alert generation, in accordance with an exemplary embodiment of the present disclosure.
[0098] At 302, detecting eye closure frequency and blinking patterns indicative of drowsiness through an infrared eye-tracking sensor integrated into an artificial intelligence-enabled smart sunglasses.
[0099] At 304, monitoring eye movement, gaze deviation, and focus shifts for early detection of fatigue through a pupil dilation and gaze tracking sensor integrated into the artificial intelligence-enabled smart sunglasses.
[0100] At 306, identifying mouth movements and yawning frequency through a real-time yawn detection sensor comprising an embedded microphone and motion sensors integrated into the artificial intelligence-enabled smart sunglasses.
[0101] At 308, providing vibration alerts upon detecting fatigue-related behaviours through a haptic feedback mechanism integrated into the artificial intelligence-enabled smart sunglasses.
[0102] At 310, transmitting real-time drowsiness detection data to a cloud storage through a wireless communication network integrated into the artificial intelligence-enabled smart sunglasses.
[0103] At 312, processing facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection through an in-car artificial intelligence camera unit connected to the cloud storage via the wireless communication network.
[0104] At 314, detecting head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness through a facial recognition module integrated into the in-car artificial intelligence camera unit.
[0105] At 316, analysing head position deviations that correlate with fatigue-related symptoms through a head pose estimation module integrated into the in-car artificial intelligence camera unit.
[0106] At 318, detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms through a driver posture monitoring module integrated into the in-car artificial intelligence camera unit.
[0107] At 320, ensuring accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving, through a low-light and infrared night vision module integrated into the in-car artificial intelligence camera unit.
[0108] At 322, detecting variations in grip strength, steering behaviour, and physiological responses of the driver through a smart steering wheel sensor connected to the cloud storage via the wireless communication network.
[0109] At 324, measuring reductions in grip strength associated with fatigue and loss of alertness through a grip pressure monitoring module integrated into the smart steering wheel sensor.
[0110] At 326, identifying irregular lane drifting and inconsistent steering behaviours associated with drowsiness through a steering pattern analysis module integrated into the smart steering wheel sensor.
[0111] At 328, measuring heart rate variability and assessing physiological indicators of fatigue through a touch-based heart rate sensor integrated into the smart steering wheel sensor.
[0112] At 330, aggregating real-time drowsiness detection data from multiple sources for comprehensive drowsiness analysis through a multi-sensor integration module integrated into a processing unit connected to the cloud storage via the wireless communication network.
[0113] At 332, analysing driver-specific fatigue patterns and improving prediction accuracy over time through an adaptive machine learning algorithm integrated into the processing unit.
[0114] At 334, issuing fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds through a real-time alert generation module integrated into the processing unit.
[0115] At 336, activating safety measures upon detecting extreme drowsiness through an emergency intervention module integrated into the processing unit.
[0116] At 338, generating high-intensity auditory alerts upon detecting severe drowsiness through an alarm unit integrated into the emergency intervention module.
[0117] At 340, providing haptic feedback to the driver for immediate alertness restoration through a vibration-enabled seat unit integrated into the emergency intervention module.
[0118] At 342, dynamically controlling the vehicle’s acceleration to minimize collision risks through an automated vehicle speed reduction unit integrated into the emergency intervention module.
[0119] At 344, transmitting distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness through an emergency notification module integrated into the processing unit.
[0120] FIG. 4 illustrates a flow chart of a drowsiness detection process of the smart vehicle driver alertness system, in accordance with an exemplary embodiment of the present disclosure.
[0121] The initialize libraries 402 step involves loading all necessary software libraries required for facial landmark detection, image processing, and drowsiness detection. These libraries provide essential functions for processing real-time video feeds and performing complex calculations, ensuring the efficient operation of the smart vehicle driver alertness system 100.
[0122] The initialize counter 404 step sets up a numerical counter to track the duration of detected drowsiness signs. This counter plays a crucial role in determining whether an alert should be triggered based on continuous detection of fatigue-related symptoms.
[0123] The initialize camera 406 step activates the visual input source, allowing real-time video feed acquisition for processing. A properly initialized camera is essential for the subsequent steps in the drowsiness detection process.
[0124] The is camera initialized 408 step verifies whether the camera has been successfully activated. If the camera does not initialize correctly, the abort the process and raise an exception 410 step halts the execution of the system and generates an error message, preventing the system from running without proper video input. Once the camera is initialized, the start capturing 412 step begins recording frames from the real-time video feed. These frames serve as the primary data source for analysing driver drowsiness indicators.
[0125] The extract frames 414 step isolates individual images from the continuous video feed, enabling frame-by-frame analysis for detecting fatigue-related signs.
[0126] The colour to gray conversion 416 step transforms the captured frames into grayscale images. Grayscale conversion reduces computational complexity and enhances contrast, improving the accuracy of facial landmark detection.
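For illustration only, the sketch below renders steps 406 to 416 with OpenCV: the camera is initialized, an exception is raised if initialization fails (steps 408 and 410), and captured frames are converted to grayscale for landmark detection; the device index is an assumption.

```python
# Minimal sketch of the capture pipeline, assuming camera device index 0.
import cv2

cap = cv2.VideoCapture(0)               # initialize camera 406
if not cap.isOpened():                  # is camera initialized 408
    raise RuntimeError("Camera failed to initialize")  # abort and raise 410

while True:
    ok, frame = cap.read()              # start capturing / extract frames 412-414
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # colour to gray conversion 416
    # ... facial landmark detection would proceed on `gray` (steps 418 onward)
```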
[0127] The apply facial landmark to face region 418 step identifies key points on the driver’s face, enabling precise monitoring of eye, mouth, and head movements.
[0128] The apply facial landmark to eye region 420 step focuses on detecting the driver’s eye position, tracking changes in eye openness and movement.
[0129] The apply facial landmark to open eye region 422 step determines whether the driver’s eyes are open, providing essential data for detecting signs of drowsiness.
[0130] The calculate eye aspect ratio (EAR) 424 step computes the ratio of the vertical to the horizontal distances of the eye landmarks. A decreasing EAR value suggests that the driver’s eyes are closing, indicating possible drowsiness.
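A minimal sketch of this computation is given below, assuming the common 68-point facial landmark convention in which each eye is described by six points p1 to p6; the disclosure itself does not fix a particular landmark model. The EAR falls toward zero as the eyelids close.

```python
# Illustrative sketch: EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: 6x2 array of landmarks p1..p6 around one eye."""
    v1 = np.linalg.norm(eye[1] - eye[5])  # vertical distance p2-p6
    v2 = np.linalg.norm(eye[2] - eye[4])  # vertical distance p3-p5
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal distance p1-p4
    return float((v1 + v2) / (2.0 * h))
```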
[0131] The compare with threshold 426 step assesses the calculated EAR value against a predefined threshold, helping determine whether the driver’s eyes remain closed beyond a safe limit. If the EAR >= threshold 428 condition is met, the reset counter 430 step ensures that the numerical counter does not increase, as the driver is not exhibiting signs of drowsiness. If the EAR >= threshold 428 condition is not met, the drowsiness alert 432 step generates an early warning, signalling that signs of fatigue are emerging.
[0132] The increment counter 434 step increases the numerical counter value each time a low EAR value is detected, tracking the duration of eye closure. If the counter >= eye frame threshold 436 condition is satisfied, the alarm on and alert driver 438 step activates an alarm system, notifying the driver to regain focus. The alarm system serves as an immediate intervention mechanism to prevent potential accidents. If the counter >= eye frame threshold 436 condition is not met, the system continues monitoring the driver’s eye movements.
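For illustration, the sketch below reproduces the counter logic of steps 426 to 438; the EAR and frame-count thresholds are placeholders rather than values fixed by the disclosure.

```python
# Minimal sketch of the per-frame counter state machine; thresholds are assumptions.
EAR_THRESHOLD = 0.25      # compare with threshold 426 (placeholder)
EYE_FRAME_THRESHOLD = 20  # consecutive low-EAR frames before the alarm (placeholder)

counter = 0               # initialize counter 404

def process_frame(ear: float, sound_alarm) -> None:
    global counter
    if ear >= EAR_THRESHOLD:
        counter = 0                     # reset counter 430
    else:
        counter += 1                    # increment counter 434
        if counter >= EYE_FRAME_THRESHOLD:
            sound_alarm()               # alarm on and alert driver 438
```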
[0133] The stop the alarm 440 step deactivates the alert system once the driver regains alertness and the EAR value returns to a normal range.
[0134] The apply facial landmark to mouth region 442 step identifies key points around the driver’s mouth, enabling real-time monitoring of yawning behaviour.
[0135] The calculate lip distance 444 step measures the distance between the upper and lower lips, detecting mouth openings indicative of yawning. The abs(lip distance) > 0 446 step evaluates whether the measured lip distance exceeds a predefined threshold. If the detected yawning frequency remains within acceptable limits, the system does not trigger an alarm. However, if excessive yawning is detected along with prolonged eye closure, the smart vehicle driver alertness system 100 enhances alert intensity to ensure driver safety.
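A minimal sketch of the lip distance computation is given below, again assuming the 68-point landmark convention for the mouth region; the yawn threshold is a placeholder, not a value fixed by the disclosure.

```python
# Illustrative sketch: mean vertical gap between upper and lower lip landmarks.
import numpy as np

YAWN_THRESHOLD = 20.0  # pixels; placeholder assumption

def lip_distance(shape: np.ndarray) -> float:
    """shape: 68x2 array of facial landmarks (assumed convention)."""
    top = np.concatenate((shape[50:53], shape[61:64]))     # upper lip points
    bottom = np.concatenate((shape[56:59], shape[65:68]))  # lower lip points
    return abs(float(np.mean(top[:, 1]) - np.mean(bottom[:, 1])))

def is_yawning(shape: np.ndarray) -> bool:
    return lip_distance(shape) > YAWN_THRESHOLD
```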
[0136] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it will be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0137] A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware, computer software, or a combination thereof.
[0138] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the present disclosure and its practical application, and to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the scope of the present disclosure.
[0139] Disjunctive language such as the phrase “at least one of X, Y, Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
[0140] In a case that no conflict occurs, the embodiments in the present disclosure and the features in the embodiments may be mutually combined. The foregoing descriptions are merely specific implementations of the present disclosure, but are not intended to limit the protection scope of the present disclosure. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims:
I/We Claim:
1. A smart vehicle driver alertness system (100), the system (100) comprises:
an artificial intelligence enabled smart sunglasses (102) configured to detect and analyse driver drowsiness parameters in real time, wherein the artificial intelligence enabled smart sunglasses (102) comprises:
an infrared eye-tracking sensor (104) configured to detect eye closure frequency and blinking patterns indicative of drowsiness;
a pupil dilation and gaze tracking sensor (106) configured to monitor eye movement, gaze deviation, and focus shifts for early detection of fatigue;
a real-time yawn detection sensor (108) configured to detect mouth movements and yawning frequency through an embedded microphone and motion sensors;
a haptic feedback mechanism (110) configured to generate vibration alerts upon detecting drowsiness-related behaviours, thereby providing immediate driver feedback;
a wireless communication network (112) configured to transmit real-time drowsiness detection data to a cloud storage (114);
an in-car artificial intelligence camera unit (116) connected to the cloud storage (114) through the wireless communication network (112) and being configured to process facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection, wherein the in-car artificial intelligence camera unit (116) comprises:
a facial recognition module (118) configured to detect head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness;
a head pose estimation module (120) configured to analyse head position deviations that correlate with fatigue-related symptoms;
a driver posture monitoring module (122) configured for continuously detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms;
a low-light and infrared night vision module (124) configured to ensure accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving;
a smart steering wheel sensor (126) connected to the cloud storage (114) through the wireless communication network (112) and being configured to detect variations in driving patterns and physiological responses of the driver, wherein the smart steering wheel sensor (126) comprises:
a grip pressure monitoring module (128) configured to detect reduced grip strength associated with fatigue and loss of alertness;
a steering pattern analysis module (130) configured to identify irregular lane drifting and inconsistent steering behaviours associated with drowsiness;
a touch-based heart rate sensor (132) configured to measure heart rate variability and assess physiological indicators of fatigue;
a processing unit (134) connected to the cloud storage (114) through the wireless communication network (112) and being configured to evaluate real-time drowsiness data and generate proactive warnings, wherein the processing unit (134) comprises:
a multi-sensor integration module (136) configured to combine data from the artificial intelligence enabled smart sunglasses (102), the in-car artificial intelligence camera unit (116), and the smart steering wheel sensor (126) for comprehensive drowsiness analysis;
an adaptive machine learning algorithm (138) configured to analyse driver-specific fatigue patterns and improve prediction accuracy over time;
a real-time alert generation module (140) configured to issue fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds;
an emergency intervention module (142) configured to activate safety measures when extreme drowsiness is detected, including vehicle speed reduction and driver wake-up alerts;
an emergency notification module (144) configured to transmit distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness;
an alarm unit (146) connected to the processing unit (134) and being configured to generate high-intensity auditory alerts upon detecting severe drowsiness;
a vibration enabled seat unit (148) connected to the processing unit (134) and being configured to provide haptic feedback to the driver for immediate alertness restoration;
an automated vehicle speed reduction unit (150) connected to the processing unit (134) and being configured to dynamically control the vehicle’s acceleration to minimize collision risks.
2. The system (100) as claimed in claim 1, wherein the infrared eye-tracking sensor (104) of the artificial intelligence enabled smart sunglasses (102) is continuously processing real-time eye movement data and transmitting the detected eye closure frequency and blinking patterns to the processing unit (134) through the wireless communication network (112).
3. The system (100) as claimed in claim 1, wherein the pupil dilation and gaze tracking sensor (106) of the artificial intelligence enabled smart sunglasses (102) is continuously detecting gaze shifts and eye focus deviations, transmitting the fatigue-related parameters to the multi-sensor integration module (136) through the wireless communication network (112) for enhanced drowsiness detection accuracy.
4. The system (100) as claimed in claim 1, wherein the haptic feedback mechanism (110) of the artificial intelligence enabled smart sunglasses (102) is generating vibration alerts upon receiving a drowsiness detection signal from the processing unit (134) via the wireless communication network (112), ensuring immediate driver response.
5. The system (100) as claimed in claim 1, wherein the low-light and infrared night vision module (124) of the in-car artificial intelligence camera unit (116) is processing drowsiness detection data in varying lighting conditions and transmitting enhanced visual recognition data to the processing unit (134) via the wireless communication network (112) to maintain accurate monitoring.
6. The system (100) as claimed in claim 1, wherein the grip pressure monitoring module (128) of the smart steering wheel sensor (126) is continuously detecting grip strength variations and transmitting real-time grip pressure data to the processing unit (134) via the wireless communication network (112) for fatigue pattern correlation.
7. The system (100) as claimed in claim 1, wherein the real-time alert generation module (140) of the processing unit (134) is issuing customized warnings through multiple alert mechanisms, including vibration alerts via the smart sunglasses, auditory alarms through the alarm unit (146), and visual alerts on the vehicle’s dashboard display.
8. The system (100) as claimed in claim 1, wherein the emergency intervention module (142) of the processing unit (134) is activating the automated vehicle speed reduction unit (150) to dynamically adjust the vehicle’s acceleration and notifying emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness.
9. The system (100) as claimed in claim 1, wherein the system (100) further includes a driver respiration rate monitoring sensor connected to the smart steering wheel sensor (126), continuously detecting variations in breathing patterns, identifying irregular or slowed respiration as a potential fatigue indicator, and transmitting the respiration data to the processing unit (134) via the wireless communication network (112) for real-time correlation with eye-tracking, heart rate, and steering behaviour to enhance predictive drowsiness detection.
10. A method for real-time, non-intrusive driver drowsiness detection and alert generation (100), the method (100) comprising:
detecting eye closure frequency and blinking patterns indicative of drowsiness through an infrared eye-tracking sensor (104) integrated into an artificial intelligence enabled smart sunglasses (102);
monitoring eye movement, gaze deviation, and focus shifts for early detection of fatigue through a pupil dilation and gaze tracking sensor (106) integrated into the artificial intelligence enabled smart sunglasses (102);
identifying mouth movements and yawning frequency through a real-time yawn detection sensor (108) comprising an embedded microphone and motion sensors integrated into the artificial intelligence enabled smart sunglasses (102);
providing vibration alerts upon detecting fatigue-related behaviours through a haptic feedback mechanism (110) integrated into the artificial intelligence enabled smart sunglasses (102);
transmitting real-time drowsiness detection data to a cloud storage (114) through a wireless communication network (112) integrated into the artificial intelligence enabled smart sunglasses (102);
processing facial recognition data, head movements, and eye-tracking information for advanced drowsiness detection through an in-car artificial intelligence camera unit (116) connected to the cloud storage (114) via the wireless communication network (112);
detecting head tilts, nodding patterns, and micro-sleep episodes indicative of drowsiness through a facial recognition module (118) integrated into the in-car artificial intelligence camera unit (116);
analysing head position deviations that correlate with fatigue-related symptoms through a head pose estimation module (120) integrated into the in-car artificial intelligence camera unit (116);
detecting unnatural body movements, shoulder slouching, and head displacement patterns indicative of fatigue-related symptoms through a driver posture monitoring module (122) integrated into the in-car artificial intelligence camera unit (116);
ensuring accurate detection of fatigue-related behaviours in varying lighting conditions, including nighttime driving, through a low-light and infrared night vision module (124) integrated into the in-car artificial intelligence camera unit (116);
detecting variations in grip strength, steering behaviour, and physiological responses of the driver through a smart steering wheel sensor (126) connected to the cloud storage (114) via the wireless communication network (112);
measuring reductions in grip strength associated with fatigue and loss of alertness through a grip pressure monitoring module (128) integrated into the smart steering wheel sensor (126);
identifying irregular lane drifting and inconsistent steering behaviours associated with drowsiness through a steering pattern analysis module (130) integrated into the smart steering wheel sensor (126);
measuring heart rate variability and assessing physiological indicators of fatigue through a touch-based heart rate sensor (132) integrated into the smart steering wheel sensor (126);
aggregating real-time drowsiness detection data from multiple sources for comprehensive drowsiness analysis through a multi-sensor integration module (136) integrated into a processing unit (134) connected to the cloud storage (114) via the wireless communication network (112);
analysing driver-specific fatigue patterns and improving prediction accuracy over time through an adaptive machine learning algorithm (138) integrated into the processing unit (134);
issuing fatigue warnings via vibrations, alarms, and visual alerts based on detected drowsiness thresholds through a real-time alert generation module (140) integrated into the processing unit (134);
activating safety measures upon detecting extreme drowsiness through an emergency intervention module (142) integrated into the processing unit (134);
generating high-intensity auditory alerts upon detecting severe drowsiness through an alarm unit (146) integrated into the emergency intervention module (142);
providing haptic feedback to the driver for immediate alertness restoration through a vibration-enabled seat unit (148) integrated into the emergency intervention module (142);
dynamically controlling the vehicle’s acceleration to minimize collision risks through an automated vehicle speed reduction unit (150) integrated into the emergency intervention module (142);
transmitting distress alerts to emergency contacts or road safety authorities upon detecting prolonged and extreme drowsiness through an emergency notification module (144) integrated into the processing unit (134).
