Abstract: A system and a method are envisaged for detecting a driver falling asleep while driving. The present invention captures the facial image of the driver using an IR camera and further performs the steps of binarization, face detection, pupil detection and pupil tracking to determine whether the driver is sleeping. In particular, the invention determines the difference between the centroids of the pupils to detect whether the driver is sleeping; if the driver is found asleep, the invention raises an alarm to alert the driver and the passengers in the vehicle.
FORM -2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
PROVISIONAL
Specification
(See Section 10 and rule 13)
METHODS AND APPARATUS FOR DETECTING A DRIVER FALLING ASLEEP WHILE DRIVING
TATA CONSULTANCY SERVICES LTD.,
an Indian Company of Nirmal Building, 9th floor, Nariman Point, Mumbai 400 021, Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION
Field of the Invention:
This invention relates to methods and apparatus for detecting a driver falling asleep while driving.
Background of the Invention:
Several means have been suggested for detecting a driver's drowsiness while driving. Most of these detection devices rely on sensor technology. Although some approaches have been made using visual technology, these detection devices use complex methods in order to detect drowsiness and are also costly.
Devices disclosed in existing literature are listed in Table 1.
Table 1: Different approaches for sleep detection solutions
| No. | Title | Approach | Comments |
|---|---|---|---|
| 1 | US Patent 5689241, Sleep detection and driver alert apparatus | Monitors the driver through an infrared camera: thermal-image changes in pixel colour between open and closed eyes are detected via the temperature-sensitive infrared portion of the digitized photographic image passed through a video charge-coupling device. The combination of non-movement and a decrease in breath temperature, a physiological response to hypoventilation that initiates drowsiness, triggers the infrared camera to zoom onto the eye region of the driver. | Uses sensor technology other than vision computing; costly and not accurate. |
| 2 | US Patent 4297685, Apparatus and method for sleep detection | Detects the onset of sleep by a drop in brain temperature as measured within the auditory canal; the auditory-canal temperature is measured and its value used to activate an audible or visible alarm to prevent the subject from falling asleep. | Uses sensor technology. |
| 3 | US Patent 7189204, Sleep detection using an adjustable threshold | Adjusts a sleep threshold associated with a first sleep-related signal using a second sleep-related signal; the first sleep-related signal is compared to the adjusted threshold and sleep is detected based on the comparison. The sleep-related signals may be derived from implantable or external sensors. | Uses sensor technology. |
| 4 | US Patent 5900819, Drowsy driver detection system | Detects drowsiness by measuring vehicle behaviours, including speed, lateral position and turning angle. | Uses sensor technology. |
| 5 | US Patent 6243015, Driver's drowsiness detection method of drowsy driving warning system | Detects the driver's drowsiness by determining the vertical width of the driver's eye from a face image input from a CCD camera. | Uses visual computing technology, based on the vertical width of the driver's eye. |
| 6 | US Patent 6236968, Sleep prevention dialog based car system | Detects the driver's drowsiness by carrying on a conversation with the driver on various topics, analysing the driver's answers and their content together with his voice patterns to determine whether he is alert while driving. | An empirical method; not accurate. |
| 7 | US Patent 6822573, Drowsiness detection system | Detects the driver's drowsiness by measuring heart rate and monitoring head movements of the driver. | Uses sensor technology. |
| 8 | CA 2201694, Alertness and drowsiness detection and tracking system | Detects the driver's drowsiness by measuring the amplitude, energy or power contribution of an EEG signal collected by connecting electrodes to the human body. | Uses sensor technology. |
Summary of the Invention:
This invention aims at detecting sleep or drowsiness of a driver by computer vision technology. The invention consists of the series of steps mentioned below:
i. Binarization using mean subtraction and histogram thresholding;
ii. Face detection using the Euler number and template matching;
iii. Pupil detection in the face using simple binary operations;
iv. Pupil tracking using the nearest-neighbourhood concept.
In particular, this invention envisages a simple way of detecting the face region in an IR camera image using mean subtraction followed by histogram thresholding.
In particular, this invention envisages a simple way of detecting the pupils in the face region of an IR camera image by performing an XOR operation between the Euler-number component binary image and the thresholded binary image.
Still particularly, this invention envisages a simple way of tracking the pupils in an IR camera image focused on the face region using the nearest-neighbourhood concept.
Brief Description of the Accompanying Drawings:
Figure 1 shows the sleep detection means according to this invention;
Figure 2(a) shows a typical input frame, whose mean-subtracted image is shown in Figure 2(b);
Figure 2(c) shows the histogram plotted for the mean-subtracted image;
Figure 2(d) shows a thresholded image which contains a face along with some non-face regions;
Figure 3 shows only a face region, excluding pupils;
Figure 4 shows a typical template face model;
Figure 5 shows a typical template matching scenario where Figure 4 is imposed on Figure 2(a);
Figure 6 is the resultant image obtained after performing an XOR operation between Figure 2(d) and Figure 3;
Figure 7(a) shows typical examples where black filled circles represent stray images and white circles represent pupils;
Figure 7(b) shows the computation of the euclidean distance between the centroids of the circles;
Figure 7(c) shows that the difference between the pupils' centroids is always fixed, irrespective of the position of the driver.
Detailed Description of Invention:
The sleep detection solution implemented consists of the following components, as shown in Figure 1:
1) Binarization
In order to separate the driver's face from the background in the IR camera image, the mean is subtracted from the grayscale image. Dynamic thresholding is then applied to the mean-subtracted image M[i,j] using a histogram technique: the histogram of M[i,j] is plotted. Figure 2(a) shows a typical input frame whose mean-subtracted image is shown in Figure 2(b); the histogram of the mean-subtracted image is plotted as shown in Figure 2(c). In order to detect the threshold (T), the standard deviation (σ) is computed over the distances between all the peaks in the histogram. If any adjacent peak falls within ±3σ, that peak is discarded. The maximum remaining peak, which corresponds to the face, is taken as the threshold (T). If H[i,j] is the output image for an input image M[i,j], then
H[i,j] = 255, if M[i,j] > T
       = 0, otherwise.
Figure 2(d) shows the thresholded image, which contains the face along with some non-face regions.
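The binarization step can be summarized in a short sketch. This is a minimal illustration assuming OpenCV and NumPy; the local peak-finding and the ±3σ peak-spacing filter are a simplified reading of the description above, not the original implementation.

```python
import cv2
import numpy as np

def binarize_frame(gray):
    """Binarization by mean subtraction and histogram thresholding (sketch)."""
    # Mean subtraction: suppress the (darker) background of the IR frame.
    mean_sub = np.clip(gray.astype(np.int16) - int(gray.mean()), 0, 255)
    mean_sub = mean_sub.astype(np.uint8)

    # Histogram of the mean-subtracted image M[i,j].
    hist = cv2.calcHist([mean_sub], [0], None, [256], [0, 256]).ravel()

    # Local peaks of the histogram (assumed peak definition).
    peaks = [i for i in range(1, 255)
             if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]]

    # Discard peaks whose spacing to the previously kept peak falls
    # within 3 standard deviations of the peak spacings.
    gaps = np.diff(peaks)
    sigma = gaps.std() if len(gaps) else 0.0
    kept = peaks[:1]
    for p in peaks[1:]:
        if p - kept[-1] > 3 * sigma:
            kept.append(p)

    # The strongest surviving peak (the face) gives the threshold T.
    T = max(kept, key=lambda p: hist[p]) if kept else int(mean_sub.mean())

    # H[i,j] = 255 if M[i,j] > T, else 0.
    _, H = cv2.threshold(mean_sub, T, 255, cv2.THRESH_BINARY)
    return H, T
```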
2) Face Detection
After thresholding, face and non-face regions need to be distinguished. To detect the face region in the thresholded image, it is assumed that in the first frame of the video the driver is not sleeping, i.e. the eyes are open. In the face region, the eyes appear as holes, and these holes play a significant role in classifying a region as face or non-face. The Euler number is used to find the number of holes in a region.
2.1 Euler Number
The Euler number of an image is defined as
E = C − H
where
E: the Euler number;
C: the number of connected components;
H: the number of holes in a region.
As the face region has at least two holes for the eyes, any component having fewer than two holes is rejected. Figure 3 shows only a face region, excluding pupils.
The components having at least two holes in the image are selected. However, every component having two or more holes may not be a face; a template matching means is used to further classify whether the component is a face or a non-face region.
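As a sketch of the Euler-number test E = C − H, the hole count of each connected component can be read off the contour hierarchy. This assumes OpenCV's findContours; producing a hole-filled output mask is also an assumption, chosen to be consistent with the XOR step described later.

```python
import cv2
import numpy as np

def components_with_holes(binary, min_holes=2):
    """Keep only components with >= min_holes holes (candidate faces).

    For one external contour (a single component, C = 1), the number of
    child contours is its hole count H, so its Euler number is E = 1 - H.
    """
    contours, hierarchy = cv2.findContours(
        binary, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
    mask = np.zeros_like(binary)
    if hierarchy is None:
        return mask
    hierarchy = hierarchy[0]  # shape (n, 4): [next, prev, child, parent]
    for idx, h in enumerate(hierarchy):
        if h[3] != -1:            # skip the hole contours themselves
            continue
        # Count the direct children (holes) of this external contour.
        holes, child = 0, h[2]
        while child != -1:
            holes += 1
            child = hierarchy[child][0]
        if holes >= min_holes:    # Euler number E = 1 - holes <= -1
            cv2.drawContours(mask, contours, idx, 255, cv2.FILLED)
    return mask
```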
2.2 Template Matching
Template matching uses a pre-defined model, i.e. a human face. The model must be such that it rejects non-face regions. The template face has to be positioned and rotated in the same coordinates as the face image in order to achieve a higher degree of matching. A typical template face model is shown in Figure 4.
The template matching technique requires knowledge of the orientation and position of the region with which the template is to be matched. The position is characterized by the region centroid, given by equations (1) and (2):
x̄ = (1/A) Σi Σj j · B[i,j] ... (1)
ȳ = (1/A) Σi Σj i · B[i,j] ... (2)
where
B: matrix of size [n x m] of the blob;
A: area in pixels of the region.
After obtaining the position where the template should be imposed, another important criterion is the orientation of the blob, which is needed for a higher degree of matching with the template model. The orientation of the blob is calculated from its central second-order moments μ11, μ20 and μ02, as in equation (3):
θ = (1/2) arctan( 2μ11 / (μ20 − μ02) ) ... (3)
Figure 5 shows a typical template matching scenario where Figure 4 is imposed on Figure 2(a). After calculating the position and orientation, the cross-correlation between the template face model and the human face region is computed to determine the degree of matching between them. A good threshold value for classifying a region as a face is greater than or equal to 0.6.
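A sketch of the template matching step under stated assumptions: the centroid of equations (1)-(2) and the moment-based orientation of equation (3) are computed with OpenCV image moments, and the 0.6 normalized cross-correlation threshold from the text is applied. The rotation and cropping details are illustrative assumptions.

```python
import cv2
import numpy as np

def match_face_template(blob_mask, template):
    """Classify a candidate blob as face / non-face via template matching."""
    m = cv2.moments(blob_mask, binaryImage=True)
    A = m['m00']                              # area in pixels of the region
    if A == 0:
        return False
    cx, cy = m['m10'] / A, m['m01'] / A       # centroid, eqs (1) and (2)

    # Orientation from the central second-order moments, eq (3).
    theta = 0.5 * np.arctan2(2 * m['mu11'], m['mu20'] - m['mu02'])

    # Rotate the template face model to the blob's orientation.
    th, tw = template.shape
    R = cv2.getRotationMatrix2D((tw / 2, th / 2), np.degrees(theta), 1.0)
    rotated = cv2.warpAffine(template, R, (tw, th))

    # Crop a template-sized window centred on the blob centroid.
    x0, y0 = int(cx - tw / 2), int(cy - th / 2)
    window = blob_mask[max(y0, 0):y0 + th, max(x0, 0):x0 + tw]
    if window.shape != rotated.shape:
        return False                          # blob too close to the border

    # Normalized cross-correlation; >= 0.6 classifies the blob as a face.
    score = cv2.matchTemplate(window, rotated, cv2.TM_CCORR_NORMED)[0, 0]
    return score >= 0.6
```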
3 Pupil Detection
A simple technique is employed to obtain the pupils from the face region. The pupils can be obtained by performing an XOR operation between the thresholded binary image and the Euler-number component binary image. The result contains dark spots which indicate the pupils, but it also contains stray images. In order to distinguish pupils from stray images, the following rules are checked:
1) The centroids of the dark spots (pupils) must lie on the same horizontal line.
2) The dark spots (pupils) must have almost the same area.
Figure 6 shows the resultant image obtained after performing the XOR operation between Figure 2(d) and Figure 3. Some stray images that lie outside the bounding box of the Euler-number component are filtered out in this process.
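The XOR step and the two filtering rules can be sketched as follows; the tolerances y_tol and area_tol are assumed values, not taken from the specification.

```python
import cv2

def detect_pupils(thresholded, face_component, y_tol=5, area_tol=0.2):
    """Pupil detection by XOR of the two binary images (sketch)."""
    # XOR leaves the dark spots: pupil holes plus stray images.
    spots = cv2.bitwise_xor(face_component, thresholded)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(spots)

    cands = [(centroids[i], stats[i, cv2.CC_STAT_AREA]) for i in range(1, n)]
    for i in range(len(cands)):
        for j in range(i + 1, len(cands)):
            (c1, a1), (c2, a2) = cands[i], cands[j]
            same_line = abs(c1[1] - c2[1]) <= y_tol              # rule 1
            same_area = abs(a1 - a2) <= area_tol * max(a1, a2)   # rule 2
            if same_line and same_area:
                return c1, c2   # the pair of pupil centroids
    return None                 # no valid pupil pair found
```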
4 Pupil Tracking
A simple technique is employed for pupil tracking which has reasonable accuracy and is less computationally expensive. Once the pupils are detected, the left pupil area, the right pupil area, their centroids and the euclidean distance between these two centroids are noted. A bounding box is then created around the two pupils, extended by ±10% of the width and height of the image. In the remaining frames of the video, components are searched for, within the bounding box obtained from the pupil detection step, whose area is approximately two-thirds of the right-eye or left-eye area.
Consider the example in Figure 7(a), where black filled circles represent stray images and white circles represent pupils. The euclidean distance between the centroids of the circles is computed, as shown in Figure 7(b).
The distance between the pupils' centroids is always fixed, irrespective of the position of the driver; this identifies the pupils, as shown in Figure 7(c). This distance value is calculated in the first frame of the video. When the driver closes his eyes, this distance will not be found in any component around the previous pupil positions, and it can be concluded that the driver is sleeping.
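A sketch of the per-frame tracking test follows, assuming the reference inter-pupil distance, reference pupil area and bounding box were measured in the first frame as described; dist_tol is an assumed tolerance.

```python
import numpy as np

def is_driver_sleeping(centroids, areas, ref_dist, ref_area, box,
                       dist_tol=0.1, area_ratio=2 / 3):
    """Per-frame pupil tracking by the nearest-neighbourhood test (sketch)."""
    x0, y0, x1, y1 = box
    # Keep components inside the bounding box whose area is roughly
    # two-thirds of the reference pupil area or larger.
    cands = [c for c, a in zip(centroids, areas)
             if x0 <= c[0] <= x1 and y0 <= c[1] <= y1
             and a >= area_ratio * ref_area]

    # Look for a pair of centroids at the fixed inter-pupil distance.
    for i in range(len(cands)):
        for j in range(i + 1, len(cands)):
            d = np.linalg.norm(np.subtract(cands[i], cands[j]))
            if abs(d - ref_dist) <= dist_tol * ref_dist:
                return False    # pupil pair found: eyes open
    return True                 # no pair at the reference distance: eyes closed
```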
This invention therefore provides a sleep detection solution which is simple and gives better accuracy at a lower cost.
Although the invention has been described in terms of particular embodiments and applications, one of ordinary skill in the art, in light of this teaching, can generate additional embodiments and modifications without departing from the spirit of, or exceeding the scope of, the claimed invention. Accordingly, it is to be understood that the drawings and descriptions herein are offered by way of example to facilitate comprehension of the invention and should not be construed to limit the scope thereof.