Abstract: A real-time system for detecting the drowsiness and fatigue condition of a driver of a vehicle using a video camera located inside the vehicle and focused on the eyes and mouth. The system also includes a processor for processing the acquired images. The processor monitors both eyes and determines whether each eye is in an open or closed state using the eye aspect ratio calculation and the duration of the closure. The processor also monitors the mouth and determines whether the driver is yawning. If drowsiness is detected, speed control is initiated, which disengages vehicle acceleration by cutting the accelerator control and gradually applying the brakes to bring the vehicle under control. If fatigue or a yawn is detected, an alarm is triggered. An emergency warning at the rear of the vehicle is triggered along with the vehicle speed reduction. The system works in real time.
FIELD OF INVENTION
[0001] The present invention generally relates to real-time eye and mouth monitoring using a video camera and, more particularly, relates to detecting a drowsiness and fatigue condition of a person from the images captured by the camera, and controlling the vehicle in response to the detected condition.
RELATED ART
[0002] India ranks first in the number of road accident deaths across the 199 countries reported in the World Road Statistics 2018, followed by China and the US. As per the WHO Global Report on Road Safety 2018, India accounts for almost 11% of the accident-related deaths in the world. Road traffic continues to be a major developmental issue and a public health concern, and is a leading cause of death and injury across the world, killing more than 1.35 million people globally, as reported in the Global Status Report on Road Safety 2018, with 90% of these casualties occurring in developing countries. As per the World Health Organization, accident-related deaths are the eighth leading cause of death overall and the leading cause of death among children aged 5-14 and adults aged 15-29.
[0003] Drowsy driving is an important, but often unrecognized, traffic safety problem. Every year, traffic accidents due to human error cause increasing numbers of deaths and injuries globally. Driver drowsiness is recognized as an important factor in vehicle accidents. It has been demonstrated that driving performance deteriorates with increased drowsiness, with the resulting crashes constituting more than 20% of all vehicle accidents. Drowsiness appears in situations of stress and fatigue in an unexpected and inopportune way and may be produced by sleep disorders, certain types of medication, and even boredom, for example when driving for long periods of time. The sensation of sleepiness reduces the level of vigilance, producing dangerous situations and increasing the probability of an accident occurring. Fatigued driving has always been an occupational hazard for drivers; it is a significant cause of road traffic accidents and has an important impact on road safety. A driver's attention and decision-making ability are impaired during fatigued driving, which can cause accidents.
[0004] Driver fatigue detection methods based on monitoring physiological parameters are closely related to the physiological status of the driver. Some physiological signals can be used as fatigue indicators, such as the electrocardiogram (ECG), electroencephalogram (EEG), electromyography (EMG) and electrooculogram (EOG). These methods have been shown to have good accuracy. However, to measure these parameters, the driver is required to wear the appropriate detection equipment while driving, which is intrusive and interferes with the driver's normal driving.
[0005] Fatigue detection by analyzing vehicle data is an indirect detection method. Data shared between vehicles is used to detect the driver's abnormal behavior. Time-series data such as vehicle speed and steering wheel angle have been analyzed for fatigue detection, which can predict fatigue-related lane departure six seconds in advance. However, obtaining and analyzing real-time vehicle data is susceptible to the driver's driving habits and the external environment, so the detection accuracy is closely tied to the driver and the driving environment.
[0006] Facial fatigue characteristics include head posture, yawning cycle, blink frequency, and the like. Approaches based on the adaptive integration of multiple models for detecting the eyes can be used to estimate a driver's fatigue state, but such multiple models are very complicated. Yawning is also used for driver fatigue detection, and facial features can be learned through deep learning techniques. The development of technologies for detecting and preventing drowsiness while driving is a major challenge in the field of accident-avoidance systems. Existing drowsiness detection systems employed in the automotive industry focus only on audio or visual alerts, which may not prevent accidents: a warning issued only one or two seconds in advance leaves too little reaction time for the driver, even for a vehicle moving at 60 km/h (16.67 m/s), and accidents therefore cannot be prevented.
SUMMARY
[0007] A real-time system for detecting the drowsiness and fatigue condition of a driver of a vehicle using a video camera located inside the vehicle, which generates images of the driver focusing mainly on the eyes and mouth. The system also includes a processor for processing the images acquired by the video camera. The processor monitors both eyes and determines whether each eye is in an open or closed state using the eye aspect ratio calculation and the duration of the closure.
[0008] The processor monitors the mouth and determines whether the driver is yawning using an aspect ratio calculation. If drowsiness is detected, speed control is initiated, which disengages vehicle acceleration by cutting the accelerator control and gradually applying the brakes to bring the vehicle under control. If fatigue or a yawn is detected, an alarm is triggered. Simultaneously, an emergency warning at the rear of the vehicle is triggered along with the vehicle speed reduction. The system allows the driver to regain control once the horn is blown or the brakes are applied. All methods, systems and steps work in real time.
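The following is a minimal Python sketch of the decision logic summarised above. The threshold values, frame counts and the `actuators` interface are illustrative assumptions for this sketch and are not taken from the disclosure.

```python
# Minimal sketch of the decision logic of [0007]-[0008].
# Thresholds, frame counts and the `actuators` interface are assumptions.

EAR_THRESHOLD = 0.25      # assumed eye-closure threshold on the mean E.A.R.
EAR_CONSEC_FRAMES = 20    # assumed closure duration (in frames) indicating drowsiness
YAWN_THRESHOLD = 20.0     # assumed lip-distance threshold (pixels) for a yawn

def process_frame(ear, lip_distance, state, actuators):
    """One decision step per video frame."""
    # Track how long the eyes have stayed closed.
    state["closed_frames"] = state["closed_frames"] + 1 if ear < EAR_THRESHOLD else 0

    if state["closed_frames"] >= EAR_CONSEC_FRAMES:
        # Drowsiness: cut the accelerator, brake gradually, warn vehicles behind.
        actuators.set_accelerator_cutoff(True)
        actuators.set_exhaust_brake(True)
        actuators.set_rear_emergency_warning(True)
        actuators.set_alarm(True)
    elif lip_distance > YAWN_THRESHOLD:
        # Fatigue / yawn: audible and visual warning only.
        actuators.set_alarm(True)

    # Driver regains control by blowing the horn or applying the brakes.
    if actuators.horn_pressed() or actuators.brake_pressed():
        for switch in (actuators.set_accelerator_cutoff,
                       actuators.set_exhaust_brake,
                       actuators.set_rear_emergency_warning,
                       actuators.set_alarm):
            switch(False)
        state["closed_frames"] = 0
    return state
```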
[0009] Several aspects are described below with reference to diagrams. It should be understood that numerous specific details, relationships, and methods are set forth to provide a full understanding of the present disclosure. One skilled in the relevant art, however, will readily recognize that the present disclosure can be practiced without one or more of the specific details, or with other methods. In other instances, well-known structures or operations are not shown in detail to avoid obscuring the features of the present disclosure.
BRIEF DESCRIPTION OF DRAWINGS
[0010] FIG.1 Eye Aspect Ratio Calculation
[0011] FIG.2 Yawn Detection
[0012] FIG.3 Functioning of the system
[0013] FIG.4 Yawn detection in real time
[0014] FIG.5 Drowsiness detection
[0015] FIG.6 Circuit Diagram of the system
[0016] FIG.7 Alarm installed
[0017] FIG.8 Exhaust Brake
[0018] FIG.9 Speed Limiter
DETAILED DESCRIPTION
[0019] The face images of the driver are recorded using a night vision camera. A 5-megapixel high-definition night vision camera is used as the input device and is installed on the windscreen of the car using a suitable vacuum cup. This camera is connected to a microcomputer acting as a programmable logic controller, and the Python programming language is used. Dlib, with a pre-trained shape predictor trained on the HELEN dataset, is used to detect human faces and locate the pre-defined 68 facial landmarks. After passing the video from the night vision camera feed to dlib frame by frame, the left eye, right eye, lower lip and upper lip features of the face are detected.
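A minimal Python sketch of this landmark pipeline follows, assuming the standard dlib 68-point shape predictor model file; the camera index, the model path and the eye/mouth index ranges follow common dlib conventions and are illustrative rather than taken from the disclosure.

```python
# Sketch of the face-landmark pipeline of [0019]; model path and camera
# index are assumptions for illustration.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Landmark index ranges in the 68-point scheme (common dlib convention)
LEFT_EYE = range(42, 48)
RIGHT_EYE = range(36, 42)
MOUTH = range(48, 68)   # includes the upper- and lower-lip points

cap = cv2.VideoCapture(0)   # night vision camera feed (index assumed)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray, 0):
        shape = predictor(gray, face)
        points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        left_eye = [points[i] for i in LEFT_EYE]
        right_eye = [points[i] for i in RIGHT_EYE]
        mouth = [points[i] for i in MOUTH]
        # left_eye, right_eye and mouth feed the E.A.R. and lip-distance
        # calculations described in [0020]
cap.release()
```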
[0020] Contours are drawn around these features and the mean 'Eye Aspect Ratio' (E.A.R.) is calculated, which is the mean over both eyes of the ratio of the two distinct vertical distances between the eyelids to the horizontal distance of the eye, as shown in Figure 1. Similarly, the vertical distance between the centres of the lower lip and upper lip is calculated for yawn detection, as shown in Figure 2. A yawn detected in real time is shown in Figure 4. If a yawn is detected, a warning in the form of an alarm with a warning lamp is given to the driver and the other passengers in the vehicle. The installed alarm is shown in Figure 7. Drowsiness detected in real time is shown in Figure 5. If drowsiness is detected, the system actuates the speed limiting device and the engine exhaust brake along with warnings such as the alarm with warning lamp, an "Emergency Braking" display on the rear side of the vehicle, the stop lamp and the hazard warning lamp. The speed limiting function (accelerator cut-off) works with the help of a speed governor.
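A short Python sketch of the E.A.R. and lip-distance calculations is given below. The specific lip landmark indices used to form the lip centres are an assumption for illustration; the thresholds applied to these quantities would be tuned for the actual camera geometry.

```python
# Sketch of the Eye Aspect Ratio (Figure 1) and lip-distance (Figure 2)
# calculations of [0020].
import numpy as np
from scipy.spatial import distance as dist

def eye_aspect_ratio(eye):
    """eye: the six (x, y) landmark points of one eye."""
    a = dist.euclidean(eye[1], eye[5])   # first vertical eyelid distance
    b = dist.euclidean(eye[2], eye[4])   # second vertical eyelid distance
    c = dist.euclidean(eye[0], eye[3])   # horizontal eye distance
    return (a + b) / (2.0 * c)

def mean_ear(left_eye, right_eye):
    """Mean E.A.R. over both eyes."""
    return (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0

def lip_distance(mouth):
    """Vertical distance between the centres of the upper and lower lip.
    `mouth` is the list of 20 mouth landmarks (points 48-67); the index
    choices for the lip centres are illustrative assumptions."""
    upper_lip = np.mean([mouth[i] for i in (2, 3, 4)], axis=0)   # around points 50-52
    lower_lip = np.mean([mouth[i] for i in (8, 9, 10)], axis=0)  # around points 56-58
    return abs(upper_lip[1] - lower_lip[1])
```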
[0021] An input signal is received from the microcomputer when drowsiness is detected. A vacuum-operated, butterfly-valve-type exhaust brake system is used for this purpose, as shown in Figure 8. It is activated when drowsiness is detected and ensures that no further acceleration is provided to the vehicle, as shown in Figure 9. The vacuum is taken from the engine inlet manifold itself and is stored in a vacuum reservoir with a check valve. The vacuum to the engine exhaust brake is controlled by a 12 V DC direct-acting solenoid-spring direction control valve. All warnings and actuators (the speed limiting function and the engine exhaust brake) are deactivated if a 12 V DC signal from the stop switch or horn switch is received. The system shuts down safely when the ignition is switched off. The functioning of the system is shown in Figure 3, and the circuit diagram of the whole system is shown in Figure 6.
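The following sketch illustrates this actuation and reset logic. It assumes a Raspberry Pi-class microcomputer driving the 12 V solenoid valve, alarm and lamps through relays, using the RPi.GPIO library; all pin numbers and the relay wiring are hypothetical and not specified in the disclosure.

```python
# Sketch of the actuation and reset logic of [0021], assuming a Raspberry
# Pi-class microcomputer with relay drivers for the 12 V circuits.
# All GPIO pin numbers are hypothetical.
import RPi.GPIO as GPIO

SOLENOID_PIN = 17      # 12 V DC solenoid valve of the exhaust brake (via relay)
SPEED_LIMIT_PIN = 27   # accelerator cut-off / speed governor input (via relay)
ALARM_PIN = 22         # alarm, warning lamps, rear "Emergency Braking" display
HORN_STOP_PIN = 5      # 12 V signal from horn switch / stop switch (level-shifted)
IGNITION_PIN = 6       # ignition sense input

GPIO.setmode(GPIO.BCM)
GPIO.setup([SOLENOID_PIN, SPEED_LIMIT_PIN, ALARM_PIN], GPIO.OUT, initial=GPIO.LOW)
GPIO.setup([HORN_STOP_PIN, IGNITION_PIN], GPIO.IN)

def on_drowsiness_detected():
    """Actuate the exhaust brake, speed limiter and all warnings."""
    GPIO.output(SOLENOID_PIN, GPIO.HIGH)
    GPIO.output(SPEED_LIMIT_PIN, GPIO.HIGH)
    GPIO.output(ALARM_PIN, GPIO.HIGH)

def on_horn_or_stop_switch():
    """Deactivate all warnings and actuators when the driver intervenes."""
    GPIO.output(SOLENOID_PIN, GPIO.LOW)
    GPIO.output(SPEED_LIMIT_PIN, GPIO.LOW)
    GPIO.output(ALARM_PIN, GPIO.LOW)

def control_step(drowsy):
    """Run once per frame; returns False once the ignition is switched off."""
    if drowsy:
        on_drowsiness_detected()
    if GPIO.input(HORN_STOP_PIN):
        on_horn_or_stop_switch()
    if not GPIO.input(IGNITION_PIN):
        GPIO.cleanup()          # shut down the system safely
        return False
    return True
```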
[0022] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-discussed embodiments, but should be defined only in accordance with the following claims and their equivalents.
| # | Name | Date |
|---|---|---|
| 1 | 202141040263-Form-3_As Filed_06-09-2021.pdf | 2021-09-06 |
| 2 | 202141040263-Written submissions and relevant documents [11-09-2024(online)].pdf | 2024-09-11 |
| 3 | 202141040263-Written submissions and relevant documents [14-01-2025(online)].pdf | 2025-01-14 |
| 4 | 202141040263-Correspondence to notify the Controller [17-12-2024(online)].pdf | 2024-12-17 |
| 5 | 202141040263-Form-1_As Filed_06-09-2021.pdf | 2021-09-06 |
| 6 | 202141040263-FORM-26 [23-08-2024(online)].pdf | 2024-08-23 |
| 7 | 202141040263-Correspondence to notify the Controller [20-08-2024(online)].pdf | 2024-08-20 |
| 8 | 202141040263-Form 2(Title Page)Complete_06-09-2021.pdf | 2021-09-06 |
| 9 | 202141040263-US(14)-ExtendedHearingNotice-(HearingDate-30-12-2024)-1100.pdf | 2024-12-10 |
| 10 | 202141040263-US(14)-HearingNotice-(HearingDate-29-08-2024).pdf | 2024-07-26 |
| 11 | 202141040263-Drawing_As Filed_06-09-2021.pdf | 2021-09-06 |
| 12 | 202141040263-FORM 13 [19-08-2022(online)].pdf | 2022-08-19 |
| 13 | 202141040263-Description Complete_As Filed_06-09-2021.pdf | 2021-09-06 |
| 14 | 202141040263-POA [19-08-2022(online)].pdf | 2022-08-19 |
| 15 | 202141040263-Correspondence_As Filed_06-09-2021.pdf | 2021-09-06 |
| 16 | 202141040263-RELEVANT DOCUMENTS [19-08-2022(online)].pdf | 2022-08-19 |
| 17 | 202141040263-Claims_As Filed_06-09-2021.pdf | 2021-09-06 |
| 18 | 202141040263-ABSTRACT [11-08-2022(online)].pdf | 2022-08-11 |
| 19 | 202141040263-Abstract_As Filed_06-09-2021.pdf | 2021-09-06 |
| 20 | 202141040263-CLAIMS [11-08-2022(online)].pdf | 2022-08-11 |
| 21 | 202141040263-Form-18_Examination request _18-11-2021.pdf | 2021-11-18 |
| 22 | 202141040263-COMPLETE SPECIFICATION [11-08-2022(online)].pdf | 2022-08-11 |
| 23 | 202141040263-Form 9_Early Publication_18-11-2021.pdf | 2021-11-18 |
| 24 | 202141040263-Correspondence_ Form-18_18-11-2021.pdf | 2021-11-18 |
| 25 | 202141040263-DRAWING [11-08-2022(online)].pdf | 2022-08-11 |
| 26 | 202141040263-FER.pdf | 2022-05-02 |
| 27 | 202141040263-FER_SER_REPLY [11-08-2022(online)].pdf | 2022-08-11 |
| 28 | 202141040263-OTHERS [11-08-2022(online)].pdf | 2022-08-11 |
| 29 | searchE_29-04-2022.pdf | |