
"A Personal Human Computer Interaction System Based On Eye Gaze Tracking"

Abstract: The present invention relates generally to an eye-tracking-technology-based interface system. More particularly, the invention relates to an eye-tracking-capable system that allows a user to create gaze patterns from his/her eye movements, to associate them with user-defined objects, and to execute the appropriate action of an object whenever a similar custom gaze pattern is detected. All of this is executed in the user's own environment as well as in a predefined object-action framework.


Patent Information

Application #:
Filing Date: 26 April 2012
Publication Number: 46/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-02-28
Renewal Date:

Applicants

INDIAN INSTITUTE OF INFORMATION TECHNOLOGY
DEOGHAT, JHALWA, ALLAHABAD - 211012, UTTAR PRADESH, INDIA

Inventors

1. BARNWAL SANTOSH KUMAR
C/O SANTOSH VASTRALAY, SWARDIH BASTI, SUDAMDIH, DHANBAD, JHARKHAND - 828126
2. TRIPATHI RAMESH CHANDRA
INDIAN INSTITUTE OF INFORMATION TECHNOLOGY DEOGHAT, JHALWA, ALLAHABAD - 211012, UTTAR PRADESH, INDIA.
3. TIWARI MURLI DHAR
INDIAN INSTITUTE OF INFORMATION TECHNOLOGY DEOGHAT, JHALWA, ALLAHABAD - 211012, UTTAR PRADESH, INDIA.

Specification

TECHNICAL FIELD OF THE INVENTION

The present invention relates generally to an eye-tracking-technology-based interface system. More particularly, the invention relates to an eye-tracking-capable system that allows a user to create gaze patterns from his/her eye movements, to associate them with user-defined objects, and to execute the appropriate action of an object whenever a similar custom gaze pattern is detected. All of this is executed in the user's own environment as well as in a predefined object-action framework.

BACKGROUND AND THE PRIOR ART

The human body perceives about 80% of its information through the eyes. Thus, over the last decades, researchers have been trying to develop a new type of interface between users and computer systems based on tracking eye movements. A system that tracks a user's eye movements upon a display device (screen) is called an eye-tracking system. Over the last few years, as the prices of the hardware and software required to implement an eye-tracking system have fallen, more robust, faster, and smaller embodiments have become available in the market. Several companies are now trying to make eye-tracking technology an input device for personal computers, laptops, PDAs, e-book readers, and even mobile handsets and televisions. Several applications based on this technology have been developed for different purposes, e.g. applications for disabled persons to interact with computer systems, for market research, for usability research, for web-based interaction, etc. In these applications, the eyes are tracked to determine where users are looking and how much time they have spent viewing a displayed item on the screen. Eye-tracking technology can determine with a high degree of accuracy which picture, word, menu item, or even pixel set a user is looking at in any given moment. Companies such as Tobii Technology and EyeTech Digital Systems produce several products that are small in size, low in cost, and can be easily connected to laptops, desktop computers, and even televisions.

A user's eye movements contain three types of event: eye blinks, eye saccades, and eye fixations. An eye blink is the rapid closing and opening of the eyelid; it is an essential function of the eye and happens both voluntarily and involuntarily. During the vision process, the interesting area of the visual field is placed on the fovea; fixation and saccade processes serve this purpose. An eye fixation represents a period of time during which the user's attention is fixed on a discrete portion of the visual field. When the fixation ends, the eyes perform a sudden motion called a saccade, moving the user's attention to another fixation point where the user examines another discrete portion of the visual field. In eye-tracking-based applications, fixations are determined and detected by defining thresholds that select the gaze points representing a fixation within a spatial-temporal region. The fixation tells when the user is looking at a particular region, and the saccade tells when the gaze is moving across regions. One important issue is that eye movements are not the same for all persons, due to several factors such as eye disease, age, mental state, and long-standing habituation to one's own eye movements. For example, some persons cannot concentrate on looking at a small region as well as others can. Similarly, eye movements are not always the same even for a single person, because of changes in mental state or in the surrounding environment. For example, in the morning, when a person is in a relaxed mood, he/she may concentrate on looking at a small region for a long time; but in the afternoon, when he/she is tired, he/she may not concentrate on looking at the same region for that long. The prior-art solutions for gaze-based interfaces provide a pre-loaded gaze-pattern-based human-computer interface.
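The spatial-temporal thresholding described above is commonly implemented as a dispersion-threshold (I-DT-style) fixation detector. The following is a minimal sketch under assumptions of my own: the threshold values, the `(t, x, y)` sample format, and the function name are illustrative, not taken from the patent.

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Dispersion-threshold fixation detection (I-DT-style sketch).

    samples: list of (t, x, y) gaze points, t in seconds, x/y in pixels.
    Returns a list of (start_t, end_t, cx, cy) fixations; the gaps between
    consecutive fixations correspond to saccades.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow a window until it spans at least min_duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        window = samples[i:j + 1]
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        # Dispersion = (max_x - min_x) + (max_y - min_y).
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n:
                xs.append(samples[j + 1][1])
                ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop()
                    ys.pop()
                    break
                j += 1
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            # No fixation starts here; slide the window forward (saccade).
            i += 1
    return fixations
```

Two dwells separated by a jump larger than the dispersion threshold would be reported as two fixations, with the jump itself treated as a saccade.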
These gaze patterns, assigned to pre-defined objects, are the same for all users operating the system and cannot be changed by the users. Such prior arts for implementing a gaze-based interface have the following limitations:
(1) These solutions do not allow users to personalize the gaze-pattern-based human-computer interface.
(2) These solutions do not allow users to modify existing pre-loaded gaze patterns according to their eye movements.
(3) These solutions do not allow users to create new gaze patterns from their eye movements, to assign these to user-defined objects, and to store the association in a database.
(4) These solutions do not allow users to execute more than one user-defined object by a single eye movement.
(5) In these solutions, gaze patterns cannot be applied universally on the system. Rather, each gaze pattern is bound to particular applications, so the gaze patterns are only recognized by those applications.

Prior arts which may be related to a part of the present invention are as follows. In the field of human-computer interfaces, a major step was the development of the Graphical User Interface (GUI), based on the use of a pointing device such as a mouse, windowed on-screen environments, and icons. This has encouraged a far greater degree of commonality between software packages from different sources and even across different operating systems, and has enabled users to access their systems more intuitively. One invention for producing a set of non-cursor-controlling event output signals, which influence the GUI components and which are based on an eye-tracking data signal that describes a user's point of regard on the display, is disclosed in Patent No. US 2007/0164990 A1, entitled "arrangement, method and computer program for controlling a computer apparatus based on eye-tracking", which is hereby incorporated by reference.
One invention for issuing commands to a computer by a user's gaze at a virtual button within a virtual reality environment is disclosed in U.S. Patent No. 5,859,642, entitled "virtual button interface", which is hereby incorporated by reference. One invention for allowing a user's reading to dictate the speed and position at which content is converted to audio in an audio reader is disclosed in U.S. Patent No. 6,195,640 B1, entitled "audio reader", which is hereby incorporated by reference. One invention for automatically providing reading place-markers to a user reading a textual document upon an electronic display is disclosed in U.S. Patent No. 7,429,108 B2, entitled "gaze-responsive interface to enhance on-screen user reading tasks", which is hereby incorporated by reference. One invention for interpreting eye actuations to indicate when to turn a page, when to provide a pronunciation of a word, when to provide a definition of a word, and when to mark a spot in the text, is disclosed in Patent No. US 2011/10205148 A1, entitled "facial tracking electronic reader", which is hereby incorporated by reference. One method for directing computers by eye gaze is described by Heiko Drewes and Albrecht Schmidt in a paper entitled "interacting with the computer using gaze gestures", which is hereby incorporated by reference. One invention for allowing users of touch-screen-based devices to create custom gestures on the touch screen that are associated with behaviors and recognized throughout the operation of the device is disclosed in Patent No. US 2011/10314427 A1, entitled "personalization using custom gestures", which is hereby incorporated by reference. One invention for executing an action analogous to a user-defined action in response to receipt of a gesture analogous to a user-defined gesture is disclosed in Patent No. US 2011/10279384 A1, entitled "automatic derivation of analogous touch gestures from a user-defined gesture", which is hereby incorporated by reference.

OBJECTS OF THE INVENTION

The first and foremost object of the present invention is to overcome the disadvantages/drawbacks of the prior art. A basic object of the present invention is to provide a method for supporting custom gaze patterns in an eye-tracking-enabled system for the purpose of customizing eye-tracking-based human-computer interaction. Another object of the present invention is to provide a device for supporting custom gaze patterns in an eye-tracking-enabled system for the same purpose. These and other advantages of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings.

SUMMARY OF THE INVENTION

The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the present invention. It is not intended to identify the key/critical elements of the invention or to delineate its scope. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description presented later. To address the above-discussed limitations of the prior art, a primary objective is to provide a novel user-interface framework in which users create gaze patterns and assign these gaze patterns to user-defined objects. The users can also modify or delete existing associations stored in the gaze pattern database. The objects mean GUI (graphical user interface) components, texts, characters, commands, images, passwords, physical components, or anything which can be associated with a custom gaze pattern by the user.
Accordingly, the present invention provides a method for supporting custom gaze patterns in an eye-tracking-enabled system.
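The pattern/object/action framework summarized above can be sketched as a small pattern database. The following is a minimal, hypothetical Python illustration in which a recorded gaze pattern is reduced to a sequence of coarse screen-region tokens and each stored pattern maps to a user-defined object and action; the class name, the region tokens, and the exact-match similarity rule are assumptions for illustration, not specified by the patent.

```python
class GazePatternDB:
    """Hypothetical store of user-created gaze-pattern -> object/action
    associations, supporting create, delete, and dispatch."""

    def __init__(self):
        # Pattern (tuple of region tokens) -> (object name, action callable).
        self._patterns = {}

    def associate(self, pattern, object_name, action):
        """Store a user-created pattern -> object/action association."""
        self._patterns[tuple(pattern)] = (object_name, action)

    def remove(self, pattern):
        """Let the user delete an existing association."""
        self._patterns.pop(tuple(pattern), None)

    def dispatch(self, observed):
        """Execute the action of the object whose stored pattern matches
        the observed gaze pattern; return the action's result, or None."""
        entry = self._patterns.get(tuple(observed))
        if entry is None:
            return None
        object_name, action = entry
        return action(object_name)


db = GazePatternDB()
# Example: gazing top-left -> top-right -> bottom-right opens a "mailbox"
# object (region tokens and object are invented for this sketch).
db.associate(["TL", "TR", "BR"], "mailbox", lambda obj: f"opened {obj}")
```

A real system would replace the exact-match rule with a tolerance-based comparison of fixation sequences, since, as the background notes, the same person's eye movements vary over time.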

Documents

Application Documents

# Name Date
1 1294-del-2012-Form-1-(30-05-2012).pdf 2012-05-30
2 1294-del-2012-Correspondence-Others-(30-05-2012).pdf 2012-05-30
3 1294-del-2012-GPA-(18-06-2012).pdf 2012-06-18
4 1294-del-2012-Correspondence Others-(18-06-2012).pdf 2012-06-18
5 1294-del-2012-Form-5-(04-03-2013).pdf 2013-03-04
6 1294-del-2012-Form-2-(04-03-2013).pdf 2013-03-04
7 1294-del-2012-Drawings-(04-03-2013).pdf 2013-03-04
8 1294-del-2012-Correspondence Others-(04-03-2013).pdf 2013-03-04
9 1294-del-2012-Form-3.pdf 2013-04-04
10 1294-del-2012-Form-1.pdf 2013-04-04
11 1294-del-2012-Drawings.pdf 2013-04-04
12 1294-del-2012-Description (Provisional).pdf 2013-04-04
13 1294-del-2012-Correspondence-others.pdf 2013-04-04
14 1294-del-2012-Abstract.pdf 2013-04-04
15 1294-del-2012-Correspondence-Others-(24-06-2013).pdf 2013-06-24
16 1294-del-2012-Correspondence Others-(12-01-2016).pdf 2016-01-12
17 1294-del-2012-Abstract-(12-01-2016).pdf 2016-01-12
18 1294-DEL-2012-FER.pdf 2019-05-02
19 1294-DEL-2012-FORM 3 [10-09-2019(online)].pdf 2019-09-10
20 1294-DEL-2012-ABSTRACT [31-10-2019(online)].pdf 2019-10-31
21 1294-DEL-2012-CLAIMS [31-10-2019(online)].pdf 2019-10-31
22 1294-DEL-2012-FER_SER_REPLY [31-10-2019(online)].pdf 2019-10-31
23 1294-DEL-2012-OTHERS [31-10-2019(online)].pdf 2019-10-31
24 1294-DEL-2012-FORM 3 [13-10-2020(online)].pdf 2020-10-13
25 1294-DEL-2012-FORM 3 [30-11-2020(online)].pdf 2020-11-30
26 1294-DEL-2012-FORM 3 [23-04-2021(online)].pdf 2021-04-23
27 1294-DEL-2012-US(14)-HearingNotice-(HearingDate-21-12-2021).pdf 2021-11-29
28 1294-DEL-2012-PETITION u-r 6(6) [04-01-2022(online)].pdf 2022-01-04
29 1294-DEL-2012-Written submissions and relevant documents [04-01-2022(online)].pdf 2022-01-04
30 1294-DEL-2012-FORM 13 [04-01-2022(online)].pdf 2022-01-04
31 1294-DEL-2012-Covering Letter [04-01-2022(online)].pdf 2022-01-04
32 1294-DEL-2012-Annexure [04-01-2022(online)].pdf 2022-01-04
33 1294-DEL-2012-AMMENDED DOCUMENTS [04-01-2022(online)].pdf 2022-01-04
34 1294-DEL-2012-PatentCertificate28-02-2022.pdf 2022-02-28
35 1294-DEL-2012-IntimationOfGrant28-02-2022.pdf 2022-02-28

Search Strategy

1 search_02-05-2019.pdf
