
Sleep Detection System

Abstract: A system and method for detecting drowsiness of a driver while driving a vehicle has been disclosed wherein an inexpensive IR camera 102 is used to capture a facial image of the driver. The system 100 detects the outline of the driver's face from the captured image, followed by recognizing the dynamic structural elements of the face. The system 100 efficiently traces the shape of the sclera of the driver's eyes to deduce whether the driver is awake or asleep. A Support Vector Machine (SVM) / Artificial Neural Network 130 is employed by the present invention to automatically detect if the driver is awake or asleep.


Patent Information

Application #
1264/MUM/2009
Filing Date
19 May 2009
Publication Number
48/2010
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application
Patent Number
Legal Status
Grant Date
2021-02-10
Renewal Date

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI-400021, MAHARASHTRA, INDIA.

Inventors

1. BROJESHWAR BHOWMICK
TATA CONSULTANCY SERVICES, BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONICS COMPLEX, SECTOR -V, KOLKATA-700091, WEST BENGAL, INDIA.
2. KS CHIDANAND KUMAR
TATA CONSULTANCY SERVICES, BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP, SALT LAKE ELECTRONICS COMPLEX, SECTOR -V, KOLKATA-700091, WEST BENGAL, INDIA.

Specification

FORM-2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2006
PROVISIONAL SPECIFICATION
(See section 10; rule 13)


SLEEP DETECTION SYSTEM
TATA CONSULTANCY SERVICES LIMITED,
an Indian Company of Nirmal Building, 9th Floor, Nariman Point, Mumbai -400 021,
Maharashtra, India.

THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION


FIELD OF THE INVENTION
The present invention relates to the field of computer vision techniques.
Particularly, the present invention relates to sleep detection using computer vision techniques.
BACKGROUND OF THE INVENTION
Computer vision relates to the science and technology of machines that can see. As a scientific discipline, computer vision is concerned with the theory for building artificial systems that obtain information from images. The image data can take many forms, such as a video sequence, views from multiple cameras or multi-dimensional data from a medical scanner.
Examples of applications of computer vision include systems for:
• controlling processes (e.g. an industrial robot or an autonomous vehicle);
• detecting events (e.g. for visual surveillance or people counting);
• organizing information (e.g. for indexing databases of images and image sequences);
• modeling objects or environments (e.g. industrial inspection, medical image analysis or topographical modeling); and
• interaction (e.g. as the input to a device for computer-human interaction).
Computer vision can also be described as a complement (but not necessarily the opposite) of biological vision. In biological vision, the visual perception of humans and various animals is studied, resulting in models of how these systems operate in terms of physiological processes. Computer vision, on the other hand, studies and describes artificial vision systems that are implemented in software and/or hardware. Interdisciplinary exchange between biological and computer vision has proven increasingly fruitful for both fields.
Sub-domains of computer vision include scene reconstruction, event detection, tracking, object recognition, learning, indexing, motion estimation, and image restoration.
Sleep detection is one area where computer vision techniques find applicability. Sleep detection is commonly used for detecting the drowsiness of automobile drivers by following their eye movements. Most of the devices used for sleep detection rely heavily on hardware (i.e., sensor technology). Although some of these devices implement visual technology, they use complex methods to detect driver drowsiness and are also costly.
The different sleep detection methods currently available are described in the prior art documents given below.
The United States Patent US5689241 titled 'Sleep detection and driver alert apparatus' discloses a device which monitors the driver via an infrared camera. The thermal image registers a change in pixel color between open and closed eyes of the driver when the temperature-sensitive infrared portion of the digitized photographic image is passed through a video charge-coupled device. The combination of non-movement and a decrease in breath temperature, which is a physiological response to the hypoventilation initiating drowsiness, triggers the infrared camera to zoom onto the eye region of the driver.
Similarly, the United States Patent US4297685 titled 'Apparatus and method for sleep detection' discloses a method to detect the onset of sleep by a drop in brain temperature as measured within the auditory canal. The auditory canal temperature is measured and its value is used to activate an audible or visible alarm to prevent the subject from falling asleep.
Further, United States Patent US7189204 titled 'Sleep detection using an adjustable threshold' discloses a method to adjust a sleep threshold associated with a first sleep-related signal using a second sleep-related signal. The first sleep-related signal is compared to the adjusted threshold and sleep is detected based on the comparison. The sleep-related signals may be derived from implantable or external sensors.
Still further, United States Patent US5900819 titled 'Drowsy driver detection system' discloses a method to detect drowsiness by measuring vehicle behavior including speed, lateral position, turning angle and the like.
Still further, United States Patent US6243015 titled 'Driver's drowsiness detection method of drowsy driving warning system' discloses a driver's drowsiness detection method by determining the vertical width of a driver's eye from a driver's face image input obtained from a closed circuit camera.
Also the patent documents US6236968, US6822573, CA2201694, US2004193068, US4967186, US5570698, US5786765, US5682144 and US5878156 disclose different sleep detection techniques.

The devices described in the above-mentioned prior art documents use complex methods to detect driver drowsiness and are costly. Therefore, there is a need for a sleep detection system which:
• is simple;
• has better accuracy; and
• has a lower cost.
OBJECTS OF THE INVENTION
It is an object of the present invention to provide a simple sleep detection system.
It is another object of the present invention to provide a sleep detection system which has better accuracy.
It is yet another object of the present invention to provide a sleep detection system which has a lower cost.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The invention will now be described with reference to the accompanying
drawings, in which:
Figure 1 shows the flow chart of the functioning of the sleep detection
system.
DETAILED DESCRIPTION OF THE INVENTION
The drawings and the description thereto are merely illustrative of a sleep detection system and only exemplify the invention and in no way limit the
scope thereof.

The present invention aims at detecting sleep or drowsiness of an automobile driver using computer vision technology. The invention consists of the series of steps mentioned below.
i. Extracting the face region of the driver using binarization.
ii. Eye detection.
iii. Noise cleaning using morphological analysis.
iv. Classification of open or closed eye status using an SVM/ANN (Support Vector Machine / Artificial Neural Network).
Figure 1 shows the flow chart of the functioning of the sleep detection system. The face region of the driver is extracted using an optimum threshold, separating the image into two classes so that their combined spread (intra-class variance) is minimal.
The intra-class variance is defined as a weighted sum of the variances of the two classes:

\sigma_w^2(t) = w_1(t)\,\sigma_1^2(t) + w_2(t)\,\sigma_2^2(t)    ...(1)

where the weights are the probabilities of the two classes separated by the threshold t,

w_1(t) = \sum_{i=1}^{t} p(i), \qquad w_2(t) = \sum_{i=t+1}^{N} p(i)    ...(2)

and \sigma_1^2(t), \sigma_2^2(t) are the variances of the respective classes.
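By way of illustration, the threshold search described above may be sketched in Python as follows (illustrative code, not part of the original specification; the function name and the synthetic test image are assumptions):

```python
import numpy as np

def otsu_threshold(gray):
    """Return the threshold t minimizing the intra-class variance
    sigma_w^2(t) = w1(t)*sigma1^2(t) + w2(t)*sigma2^2(t) of equation (1)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, np.inf
    for t in range(1, 256):
        w1, w2 = prob[:t].sum(), prob[t:].sum()   # class probabilities, eq. (2)
        if w1 == 0 or w2 == 0:
            continue
        mu1 = (levels[:t] * prob[:t]).sum() / w1
        mu2 = (levels[t:] * prob[t:]).sum() / w2
        var1 = ((levels[:t] - mu1) ** 2 * prob[:t]).sum() / w1
        var2 = ((levels[t:] - mu2) ** 2 * prob[t:]).sum() / w2
        intra = w1 * var1 + w2 * var2             # weighted sum, eq. (1)
        if intra < best_var:
            best_t, best_var = t, intra
    return best_t

# Synthetic bimodal image: dark background (~40), brighter face region (~200).
rng = np.random.default_rng(0)
img = rng.normal(40, 10, (64, 64))
img[16:48, 16:48] = rng.normal(200, 10, (32, 32))
img = np.clip(img, 0, 255).astype(np.uint8)

t = otsu_threshold(img)
face_mask = img >= t  # binarized face region
```

The exhaustive scan over t is quadratic-free and fast for 256 gray levels; the threshold separating the two synthetic modes lands between them.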

Location of the eyes is done by finding the intensity changes on the face. The first step is to calculate the average intensity for each y-coordinate, as in equation (3) given below:

H(y) = \frac{1}{W} \sum_{x=1}^{W} I(x, y)    ...(3)

where I(x, y) is the image intensity and W is the image width. The valleys (dips) in the plot of these horizontal values indicate intensity changes. A smoothing method is used to remove speckle noise from the horizontal histogram. After obtaining the horizontal average data, the most significant valleys indicate the eye area. The valleys are found by finding the change in slope from negative to positive. The peaks are found by a change in slope from positive to negative.
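The projection, smoothing and valley-finding steps may be sketched as follows (illustrative; the moving-average window size and the synthetic intensity profile are assumptions):

```python
import numpy as np

def horizontal_projection(gray):
    """Average intensity for each y-coordinate, as in equation (3)."""
    return gray.mean(axis=1)

def smooth(signal, k=5):
    """Moving average to suppress speckle noise in the histogram."""
    kernel = np.ones(k) / k
    return np.convolve(signal, kernel, mode="same")

def find_valleys(signal):
    """Indices where the slope changes from negative to positive."""
    d = np.diff(signal)
    return [i + 1 for i in range(len(d) - 1) if d[i] < 0 and d[i + 1] > 0]

# Synthetic face strip: bright rows with a smooth dark dip near y = 43,
# standing in for the darker eye band.
y = np.arange(100)
profile = 180 - 120 * np.exp(-((y - 43) ** 2) / 30.0)
gray = np.tile(profile[:, None], (1, 80))

proj = smooth(horizontal_projection(gray))
valleys = find_valleys(proj)
```

The most significant valley of the smoothed row averages marks the candidate eye band.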

Similarly, a vertical intensity projection is made through the image. The peak in this graph is the possible centre (PC) of the face. From this curve, the two endpoints denote the left (L) and right (R) positions of the face. The final face centre FC is determined by

FC = \tfrac{1}{2}\left( PC + \tfrac{L + R}{2} \right)
Assuming a gray-level image M and structuring matrices g and h, equation (4) is used to preprocess the image in gray level, as a supremum (dilation) operation followed by an infimum (erosion) operation:

G(x) = \bigwedge_{y \in D} \Big[ \bigvee_{z \in D} \big( M(x + y - z) + g(z) \big) - h(y) \Big]    ...(4)

where \vee and \wedge are the supremum and infimum operators over a set and D is the neighbourhood defined by the structuring elements. The \wedge operation is done after the \vee operation has been computed over the full image.

The kernel matrices g and h are of size m \times n, where

n = \max(PC - L,\; R - PC), \qquad m = n/2
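A minimal sketch of this gray-level supremum/infimum preprocessing, assuming flat (all-zero) structuring elements and replicated-edge padding (both assumptions, since the specification does not fix them):

```python
import numpy as np

def gray_dilate(img, g):
    """Supremum (v) operation: gray-level dilation with structuring element g."""
    m, n = g.shape
    pad = np.pad(img, ((m // 2, m // 2), (n // 2, n // 2)), mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (pad[i:i + m, j:j + n] + g).max()
    return out

def gray_erode(img, h):
    """Infimum (^) operation: gray-level erosion with structuring element h."""
    m, n = h.shape
    pad = np.pad(img, ((m // 2, m // 2), (n // 2, n // 2)), mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (pad[i:i + m, j:j + n] - h).min()
    return out

# Kernel size from the text: n = max(PC - L, R - PC), m = n / 2
PC, L, R = 40, 10, 64
n = max(PC - L, R - PC)   # 30
m = n // 2                # 15
g = np.zeros((m, n))      # flat structuring elements (assumed)
h = np.zeros((m, n))

M = np.random.default_rng(1).integers(0, 256, (80, 80)).astype(float)
G = gray_erode(gray_dilate(M, g), h)  # ^ applied after v over the full image
```

With flat structuring elements the dilation is a local maximum and the erosion a local minimum, so the output stays within the original intensity range.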

Now equation (3) is again used on the matrix G(x) to get the nose position. The region surrounding the nose is removed once the nose is detected.
At this stage, the problem of localization is solved from a geometric perspective, where candidate points are grouped into clusters for further localization. The input for the clustering method is the set of all candidate feature points resulting from the feature segmentation. A k-means clustering method is implemented to classify the probable eye region. This method aims at minimizing an objective function J, a squared-error function:

J = \sum_{j=1}^{k} \sum_{i=1}^{n} \left\| x_i^{(j)} - c_j \right\|^2

where \| x_i^{(j)} - c_j \|^2, a chosen distance measure between a data point x_i^{(j)} and the cluster centre c_j, is an indicator of the distance of the n data points from their respective cluster centres. Let the cluster centres be c_1, c_2, \ldots, c_k; they are pre-assigned initial values. After calculating the objective function, the cluster centres (centroids) are updated: if c_j is the centre nearest to a data point x_i, then

c_j \leftarrow c_j + \eta\,(x_i - c_j)

where \eta is a small positive learning rate.
The method stops when the cluster centres stop moving (convergence is reached), or when no points move from one cluster to another. Here K = 2: one cluster for the foreground and another for the background. If B(x, y) is the output image for an input image I(x, y), then

B(x, y) = 1 \text{ if } I(x, y) \ge T; \qquad B(x, y) = 0 \text{ otherwise}

where T is the highest value obtained from the k-means cluster centres.
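The K = 2 clustering and thresholding may be sketched on one-dimensional intensity values as follows (illustrative; batch centroid updates and the min/max initialization are assumptions, standing in for the learning-rate update described above):

```python
import numpy as np

def kmeans_1d(values, k=2, iters=50):
    """Plain k-means on scalar intensities; returns the cluster centres."""
    # Initialize centres spread across the intensity range (assumed scheme).
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        # Assign each value to its nearest centre.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        # Move each centre to the mean of its assigned values.
        new = np.array([values[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):   # centres stopped moving: converged
            break
        centers = new
    return centers

# Synthetic eye-region patch: dark eye pixels on brighter skin.
rng = np.random.default_rng(2)
patch = np.concatenate([rng.normal(50, 5, 200),    # eye pixels
                        rng.normal(170, 5, 800)])  # skin pixels

centers = kmeans_1d(patch, k=2)
T = centers.max()                  # highest cluster centre, as in the text
binary = (patch >= T).astype(int)  # B = 1 where intensity >= T, else 0
```

The two centres converge to the two intensity modes, and T is taken as the larger one.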
The thresholded image obtained after clustering includes noise such as eyebrow and nose regions. The subsequent processing steps are:
1) Sometimes the nose image may merge with the eye image. Since the nose height is much larger than the eyes' height, these vertical components are separated from the nose image using a morphological opening operation with a structuring element whose height equals the eye-region height and whose width equals 1.
2) The nose image has vertical components, and these may not be removable via the aspect ratio alone because, after applying the k-means threshold method, the eye region may break apart; removing vertical components using the aspect ratio directly is therefore not a good technique. For that purpose, equation (5) given below can be utilized. It has been observed that if R_a is less than 0.5, the component is a long vertical component and needs to be removed:

R_a = W_c / H_c    ...(5)

where W_c and H_c are the width and height of the bounding box of a connected component.
3) The aspect ratio is then used to remove any long vertical components remaining after the opening, which in turn removes the nose.
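The aspect-ratio cleaning of steps 2) and 3) may be sketched as follows (illustrative; a simple flood fill stands in for connected-component labeling, and the morphological opening itself is omitted):

```python
import numpy as np

def aspect_ratio_filter(mask, min_ratio=0.5):
    """Remove connected components whose width/height ratio Ra < min_ratio
    (long vertical components such as the nose), per equation (5)."""
    mask = mask.astype(bool)
    visited = np.zeros_like(mask)
    out = np.zeros_like(mask)
    rows, cols = mask.shape
    for si in range(rows):
        for sj in range(cols):
            if mask[si, sj] and not visited[si, sj]:
                # Flood-fill one 4-connected component.
                stack, comp = [(si, sj)], []
                visited[si, sj] = True
                while stack:
                    i, j = stack.pop()
                    comp.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and mask[ni, nj] and not visited[ni, nj]):
                            visited[ni, nj] = True
                            stack.append((ni, nj))
                ii = [p[0] for p in comp]
                jj = [p[1] for p in comp]
                height = max(ii) - min(ii) + 1
                width = max(jj) - min(jj) + 1
                if width / height >= min_ratio:   # keep eye-like blobs
                    for i, j in comp:
                        out[i, j] = True
    return out

# Toy binary image: a wide "eye" blob and a tall thin "nose" line.
mask = np.zeros((40, 40), dtype=bool)
mask[10:14, 5:20] = True    # eye: 4 high, 15 wide -> Ra = 3.75, kept
mask[5:35, 30:32] = True    # nose: 30 high, 2 wide -> Ra < 0.5, removed
cleaned = aspect_ratio_filter(mask)
```

The wide blob survives the Ra test while the tall thin component is discarded.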
After getting the face image, equations (6) to (8) are computed to feed into a classifier to classify the status of the eyes.
Given a training set {(x_i, o_i)}, each o_i \in \{-1, +1\} indicates to which class the p-dimensional vector x_i belongs. Any hyperplane is a set of points x satisfying the equation

w \cdot x - b = 0

w and b are to be chosen so as to maximize the margin. Since it has been proven mathematically that the margin is 2/\|w\|, finding the hyperplane which separates the data with maximum margin leads to the following optimization problem:

\min_{w, b, \xi}\ \tfrac{1}{2}\|w\|^2 + C \sum_i \xi_i \quad \text{subject to} \quad o_i\,(w \cdot x_i - b) \ge 1 - \xi_i,\ \ \xi_i \ge 0

where the \xi_i are the upper bounds on the training error and C is the regularization parameter.
The dual formulation is:

Maximize

W(\alpha) = \sum_i \alpha_i - \tfrac{1}{2} \sum_i \sum_j \alpha_i \alpha_j o_i o_j\,(x_i \cdot x_j) \quad \text{subject to} \quad 0 \le \alpha_i \le C,\ \ \sum_i \alpha_i o_i = 0    ...(9)

For a non-linear SVM, the training vectors are mapped to some higher-dimensional space using a transformation function \phi. Then x_i \cdot x_j in equation (9) becomes \phi(x_i) \cdot \phi(x_j). By using a suitable kernel function K satisfying K(x_i, x_j) = \phi(x_i) \cdot \phi(x_j), \phi need not be calculated explicitly. A Gaussian kernel of the form below is used in the present invention:

K(x_i, x_j) = \exp\!\left( -\|x_i - x_j\|^2 / 2\sigma^2 \right)    ...(10)
TECHNICAL ADVANCEMENTS
The technical advancements of the present invention include realization of a sleep detection system which:
• is simple;
• has better accuracy; and
• has a lower cost.

While considerable emphasis has been placed herein on the particular features of this invention, it will be appreciated that various modifications can be made, and that many changes can be made in the preferred embodiments without departing from the principles of the invention. These and other modifications in the nature of the invention or the preferred embodiments will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.


NAIR M.R.
Of R. K.DEWAN&CO.
APPLICANTS' PATENT ATTORNEY

Documents

Application Documents

# Name Date
1 1264-MUM-2009-FORM 1(28-10-2009).pdf 2009-10-28
2 1264-MUM-2009-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30
3 1264-MUM-2009-CORRESPONDENCE(28-10-2009).pdf 2009-10-28
4 1264-MUM-2010- AFR.pdf 2022-12-03
5 1264-MUM-2010- U. S. PATENT DOCUMENT.pdf 2022-12-03
6 1264-MUM-2009-FORM 18(30-11-2010).pdf 2010-11-30
7 1264-MUM-2009-RELEVANT DOCUMENTS [26-09-2022(online)].pdf 2022-09-26
8 1264-MUM-2009-CORRESPONDENCE(30-11-2010).pdf 2010-11-30
9 1264-MUM-2009-US(14)-HearingNotice-(HearingDate-15-10-2020).pdf 2021-10-03
10 1264-MUM-2009-CORRESPONDENCE(IPO)-(FER)-(30-12-2015).pdf 2015-12-30
11 1264-MUM-2009-REPLY TO EXAMINATION REPORT-(18-04-2016).pdf 2016-04-18
12 1264-MUM-2009-RELEVANT DOCUMENTS [30-09-2021(online)].pdf 2021-09-30
13 1264-MUM-2009-IntimationOfGrant10-02-2021.pdf 2021-02-10
14 1264-MUM-2009-CLAIMS(MARKED COPY)-(18-04-2016).pdf 2016-04-18
15 1264-MUM-2009-PatentCertificate10-02-2021.pdf 2021-02-10
16 1264-MUM-2009-CLAIMS(AMENDED)-(18-04-2016).pdf 2016-04-18
17 1264-MUM-2009-SPECIFICATION(AMENDED)-(21-04-2016).pdf 2016-04-21
18 1264-MUM-2009-Written submissions and relevant documents [26-10-2020(online)].pdf 2020-10-26
19 1264-MUM-2009-Correspondence to notify the Controller [14-10-2020(online)].pdf 2020-10-14
20 1264-MUM-2009-CORRESPONDENCE-(21-04-2016).pdf 2016-04-21
21 1264-MUM-2009-ABSTRACT-(21-04-2016).pdf 2016-04-21
22 1264-MUM-2009-FORM-26 [14-10-2020(online)].pdf 2020-10-14
23 1264-MUM-2009-ABSTRACT SPECIFICATION(MARKED COPY)-(21-04-2016).pdf 2016-04-21
24 1264-MUM-2009-Response to office action [04-09-2020(online)].pdf 2020-09-04
25 1264-MUM-2009-ABSTRACT(18-5-2010).pdf 2018-08-10
26 abstract1.jpg 2018-08-10
27 1264-MUM-2009-CLAIMS(18-5-2010).pdf 2018-08-10
28 1264-MUM-2009_EXAMREPORT.pdf 2018-08-10
29 1264-MUM-2009-CORRESPONDENCE(18-5-2010).pdf 2018-08-10
30 1264-MUM-2009-FORM 5(18-5-2010).pdf 2018-08-10
31 1264-mum-2009-correspondence.pdf 2018-08-10
32 1264-mum-2009-form 3.pdf 2018-08-10
33 1264-mum-2009-form 26.pdf 2018-08-10
34 1264-MUM-2009-DESCRIPTION(COMPLETE)-(18-5-2010).pdf 2018-08-10
35 1264-mum-2009-form 2.pdf 2018-08-10
36 1264-mum-2009-description(provisional).pdf 2018-08-10
37 1264-MUM-2009-DRAWING(18-5-2010).pdf 2018-08-10
38 1264-mum-2009-form 2(title page).pdf 2018-08-10
39 1264-mum-2009-drawing.pdf 2018-08-10
40 1264-MUM-2009-FORM 2(TITLE PAGE)-(18-5-2010).pdf 2018-08-10
41 1264-mum-2009-form 1.pdf 2018-08-10
42 1264-mum-2009-form 2(18-5-2010).pdf 2018-08-10

ERegister / Renewals

3rd: 30 Apr 2021 (from 19/05/2011 to 19/05/2012)
4th: 30 Apr 2021 (from 19/05/2012 to 19/05/2013)
5th: 30 Apr 2021 (from 19/05/2013 to 19/05/2014)
6th: 30 Apr 2021 (from 19/05/2014 to 19/05/2015)
7th: 30 Apr 2021 (from 19/05/2015 to 19/05/2016)
8th: 30 Apr 2021 (from 19/05/2016 to 19/05/2017)
9th: 30 Apr 2021 (from 19/05/2017 to 19/05/2018)
10th: 30 Apr 2021 (from 19/05/2018 to 19/05/2019)
11th: 30 Apr 2021 (from 19/05/2019 to 19/05/2020)
12th: 30 Apr 2021 (from 19/05/2020 to 19/05/2021)
13th: 30 Apr 2021 (from 19/05/2021 to 19/05/2022)
14th: 02 May 2022 (from 19/05/2022 to 19/05/2023)
15th: 19 Apr 2023 (from 19/05/2023 to 19/05/2024)
16th: 01 May 2024 (from 19/05/2024 to 19/05/2025)
17th: 13 May 2025 (from 19/05/2025 to 19/05/2026)