
Interactive Gaze Controlled Projected Display

Abstract: A multi-modal interface system for an automotive/aviation environment that mitigates the operator's distraction from his primary task is disclosed. It simplifies human-machine interaction by alleviating the need to look down towards a display and physically touch the interface. The disclosed system incorporates an interactive projected display on a semi-transparent sheet pasted on the windscreen, an eye gaze tracker and a finger tracking device. A pointer is moved on the interactive display based on detected eye gaze and finger movement. The system accounts for inaccuracies in the eye gaze tracker by providing the operator an alternative that causes the least distraction: in case the eye gaze tracker is unable to track eye gaze accurately, the user can correct the pointer position by finger movement. The system gives precedence to the signal from the finger tracker over the signal from the eye gaze tracker, such that when a finger is detected, input from the gaze tracker is not considered for pointer movement.


Patent Information

Application #: 201641037828
Filing Date: 05 November 2016
Publication Number: 19/2018
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Email: docket@khuranaandkhurana.com
Grant Date: 2023-05-03

Applicants

Indian Institute of Science
C V Raman Road, Bangalore-560012, Karnataka, India.

Inventors

1. BISWAS, Pradipta
Centre for Product Design and Manufacturing (CPDM), Indian Institute of Science, C V Raman Road, Bangalore-560012, Karnataka, India.

Specification

FIELD OF THE INVENTION
[0001] The present disclosure relates generally to the field of man-machine interface systems. In particular, it pertains to a multimodal human-computer interaction system for automotive and aviation environments.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Rapid advances in the field of in-car information systems (both for driver assistance and driver information) contribute to an increasing cognitive workload. Similarly, pilots of military aircraft (both fast jets and transport aircraft) need to undertake many secondary mission control tasks in addition to primary flying tasks. The secondary tasks have a considerable chance of distracting a driver/pilot from his primary driving/flying task, thereby increasing cognitive workload and affecting safety. Thus, easing human-machine interaction (HMI) between operators and electronic user interfaces in automotive and aviation environments can improve safety and help leverage the true potential of those systems.
[0004] In the aviation environment, aircraft use hardware switches, joysticks, auditory feedback and small-screen displays (Primary Flight Display or Multi-Function Display) as the main input and output modalities. Touchscreen displays and head trackers are being explored for fighter aircraft.
[0005] In the automotive environment, to facilitate human-machine interaction, new modalities of interaction such as eye-gaze [Kern; 2010], head and hand movement tracking systems [Poitschke; 2011 and Ohn-Bar; 2014], haptic interfaces [Chang; 2011], and personalized instrument displays and predictive models [Feld; 2013 and Normark; 2015] have been explored to help solve problems faced by drivers in regular driving tasks.
[0006] Recent advancement in infra-red based eye gaze trackers has significantly increased research and industrial use of gaze tracking technology. Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Research on eye tracking dates back to the late 19th century, when Louis Émile Javal investigated saccadic movements in a reading task. An early eye tracker, built by Edmund Huey, consisted of a contact lens connected to an aluminium pointer.
[0007] With progress in processor speed and image processing algorithms, it is possible to use gaze tracking technology in real time to control a screen pointer in a direct manipulation interface. Gaze controlled interfaces have been used in assistive technology and in automotive and aviation environments. User studies involving simulated driving tasks comparing an eye gaze controlled interface with a traditional touchscreen system have been reported by Kern et al. in their paper "Making Use of Drivers' Glances onto the Screen for Explicit Gaze-Based Interaction" [presented in Proceedings of the Second International Conference on Automotive User Interfaces and Interactive Vehicular Applications, November 11-12, 2010, Pittsburgh, Pennsylvania, USA] and by Poitschke et al. in their paper "Gaze-based interaction on multiple displays in an automotive environment" [published in IEEE International Conference on Systems, Man and Cybernetics (SMC), 2011, pp. 543-548, ISSN: 1062-922X].
[0008] Eye based control of man-machine interfaces has many advantages over physical interfaces such as touchscreen displays, switches, mouse devices, keypads, and keyboards for controlling electronic systems. In such controls, a gaze tracking apparatus monitors operator eye orientation while the operator views a video screen. From the operator's eye orientation, the computer calculates the operator's gaze position, positions a cursor on the video screen and selects a target. However, one problem faced with gaze tracking devices is the accuracy of gaze trackers. It is not difficult to move a screen pointer based on eye gaze, but focusing the screen pointer on a screen element remains a challenge in a gaze controlled interface. This is because visual search by a user on a two-dimensional screen consists of saccadic and small pursuit eye gaze movements. The saccadic movements take 250 to 350 msec to complete and are ballistic in nature, whereas the small pursuit movements keep the eye gaze moving around the point of interest. If a pointer directly follows eye gaze, the small pursuit movements create jitter, and it becomes difficult to select a target because the pointer is not stable.
[0009] The best accuracy of eye-gaze trackers presently available is about 0.4° of visual angle, which translates to approximately 18 pixels on a standard desktop screen at a viewing distance of 65 cm. A gaze controlled interface may therefore occasionally require the user to make a special effort to move the cursor/pointer to a desired location; for example, the user may have to focus slightly off-target to bring the cursor onto a screen element. Existing gaze controlled software solves this issue by designing special interfaces with big screen elements to compensate for the variation and limited accuracy. However, an interaction system should not constrain interface design, and should work with existing interfaces without limiting the size of screen elements.
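By way of illustration only, assuming a typical 96 DPI desktop display (an assumed figure, not stated in this disclosure), the conversion works out roughly as:

$$
650\,\mathrm{mm} \times \tan(0.4^{\circ}) \approx 4.5\,\mathrm{mm}, \qquad
4.5\,\mathrm{mm} \times \frac{96\ \mathrm{px}}{25.4\,\mathrm{mm}} \approx 17\text{ to }18\ \mathrm{px}.
$$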
[0010] Efforts have been made in the art to overcome the above limitation of eye gaze tracking based human-machine interaction systems by integrating them with other means such as gesture recognition/hand tracking. For example, PCT publication number WO2013036632 discloses an in-flight entertainment system that includes an eye gaze controlled smart display for passengers. Users may point at and select icons on the display by staring at the appropriate portion of the screen for at least a threshold time. It further discloses navigation through various menus, or selection of a displayed item, by moving one of the hands/fingers in a defined manner. As would be evident, the proposed system requires staring at the display for at least a certain time. It is thus unsuitable for drivers of fast moving vehicles, as selection of a target has to be faster than the time a driver can safely look away from the road.
[0011] Another reference, PCT publication number WO 2014015521, discloses a gaze controlled system where the cursor moves based on the position of eye gaze, and when the gaze is still or moving only slightly, a hand gesture of the user is detected by a second camera for executing a further operation that depends on the detected gesture. As can be seen, the disclosed system requires specific gestures for different operations, which have to be remembered by the user and must match the hand gesture template signals stored in memory.
[0012] Yet another reference, United States patent application number US 20120174004, discloses use of a transparent windscreen as a graphic projection display for drivers, who can select objects on the projection display using different input modalities including eye gaze, with a registered graphic representing the selected feature determined for display based upon a monitored hand gesture. The application does not address the problem of accuracy of gaze tracking and accordingly does not provide any method of improving the accuracy of a gaze tracking based system.
[0013] There is, therefore, a need in the art for a multi-modal interface system that leverages the advantages of each modality and overcomes the problem of low accuracy of eye gaze trackers by providing the user an alternative means to move the pointer/cursor on the display screen.
[0014] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
[0015] In some embodiments, the numbers expressing quantities of ingredients, properties such as concentration, reaction conditions, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term “about.” Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[0016] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0017] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0018] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all Markush groups used in the appended claims.
OBJECTS OF THE INVENTION
[0019] A general objective of the present disclosure is to mitigate the problem of the operator's distraction from his primary task of driving/flying while interacting with a display for a secondary task in an automobile/aircraft, by alleviating the need to look down at the display and physically touch the interface.
[0020] An object of the present disclosure is to provide an Interactive Gaze Controlled Projected Display System in automotive and aviation environments that alleviates the need to physically touch the interface.
[0021] An object of the present disclosure is to provide an Interactive Gaze Controlled Projected Display System with displays on the windscreen that do not obstruct the view of the road or the sky ahead.
[0022] Another object of the present disclosure is to provide an Interactive Gaze Controlled Projected Display System that takes into account inaccuracies in the eye gaze tracker.
[0023] Another object of the present disclosure is to provide a Projected Display System that integrates finger tracking as an alternative to gaze tracking based control.
[0024] Another object of the present disclosure is to provide a system that integrates dual control for movement of the cursor/pointer, based on the position of eye gaze as well as finger tracking, such that inaccuracies in the eye gaze tracker are taken care of.

SUMMARY
[0025] Aspects of the present disclosure relate to a system and method for a multi-modal interface for interaction with a human-machine interface in an automotive/aviation environment. In particular, the disclosed method and system mitigate the driver's distraction from the driving task by simplifying human-machine interaction, alleviating the need to look down at the display and physically touch the interface. More specifically, the system and method of the present disclosure take into account inaccuracies in the eye gaze tracker by providing the driver/pilot an alternative that causes the least distraction.
[0026] In an aspect, the disclosed system includes an interactive projected display on a semi-transparent sheet, an eye gaze tracker and a finger tracking device. In an aspect, the semi-transparent sheet providing the interactive projected display can be pasted on the windscreen of the automobile/aircraft, allowing the driver/pilot an unhindered view of the road/sky ahead while also providing information in the form of a display on the transparent sheet. The information displayed on the sheet can be, for example, various options for selection by the driver as part of the infotainment system of the automobile.
[0027] In an aspect, selection of the various options on the display can be based on a pointer or cursor that moves over the options; once positioned on a desired option, the option can be selected by clicking a selection button, such as a mouse button, located at a position convenient for the driver to actuate without taking his eyes off the primary task of driving.
[0028] In an aspect, the pointer/cursor can move on the display based on eye gaze as tracked by the eye gaze tracker. Thus the operator needs only to look at a desired option on the windscreen, i.e. without having to look down or look up from the windscreen, causing minimum distraction from his primary task of driving.
[0029] In an aspect, the eye gaze tracker records the eye gaze positions of the operator continuously, and the median of the pixel locations is taken every 300 msec to estimate the region of interest or saccadic focus points. The position of the gaze determines the position of the pointer on the display screen, and the pointer is moved accordingly for a subsequent selection by the operator.
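By way of illustration only, the following is a minimal Python sketch of this median filtering step, assuming a generic tracker that delivers timestamped (x, y) gaze samples; the class and method names are illustrative, not part of the disclosure:

```python
from collections import deque
from statistics import median
import time

class GazeMedianFilter:
    """Estimate saccadic focus points by taking the median of raw gaze
    samples collected over a sliding 300 msec window (a sketch; the
    tracker API and sample rate are assumed, not specified here)."""

    WINDOW_SEC = 0.300  # the 300 msec window described above

    def __init__(self):
        self.samples = deque()  # (timestamp, x, y) tuples

    def add_sample(self, x, y, t=None):
        t = time.monotonic() if t is None else t
        self.samples.append((t, x, y))
        # Drop samples older than the 300 msec window.
        while self.samples and t - self.samples[0][0] > self.WINDOW_SEC:
            self.samples.popleft()

    def focus_point(self):
        """Median of each coordinate; less susceptible to outliers than
        the arithmetic mean if the tracker briefly loses the eyes."""
        if not self.samples:
            return None
        xs = [s[1] for s in self.samples]
        ys = [s[2] for s in self.samples]
        return (median(xs), median(ys))
```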
[0030] In an embodiment, the disclosed system also tracks the finger movement of the operator through the finger movement tracking device as and when the hand of the operator is detected.
[0031] In an aspect, the disclosed system caters to the eventuality of the eye gaze tracker failing to position the pointer at the desired location on account of the low accuracy of eye gaze tracking. In case the eye gaze tracker is unable to track eye gaze accurately, the user can correct the pointer position by finger movement. In this respect, the operator can simply position his hand within the field of view of the finger tracker and move his finger to correct the pointer position. In an aspect, on detection of the hand the system not only starts tracking the finger of the operator as stated earlier, but also gives precedence to the signal from the finger tracker over the signal from the eye gaze tracker. Thus input from finger movement overrides input from eye gaze to provide a correct pointer position, overcoming the deficiency of low accuracy of the gaze tracker.
[0032] In an aspect, the disclosed system can further incorporate a Target Prediction system, which highlights the target nearest to the eye gaze or finger position, and a hardware switch, such as a left mouse button, whose push selects the target even if the pointer is not exactly on the target button itself.
[0033] In an aspect, the disclosed concept has been tested with a driving simulator to assess the efficacy of the proposed system and method. The test was done by asking volunteers to carry out the ISO 26022 lane changing driving task, and the deviation of the simulated car from the centre of the lane was observed to quantify performance degradation. Each participant drove once without the proposed projected display and once with the projected display while undertaking pointing and selection tasks in the display. The collected data showed that the mean deviations when driving with and without the projected display were not statistically significantly different.
[0034] In an aspect, studies were also done to assess the cognitive load on participants while using the projected display, using the NASA TLX scale, and their preference, using the System Usability Scale (SUS). The average cognitive load was below 40 on a scale of 0 to 100, and the average SUS score was 55.25 against the standard SUS score of 68 for a user-friendly system, with seven out of eight participants rating the system above 68.
[0035] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0036] The accompanying drawings are included to provide further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0037] FIG. 1 illustrates an exemplary block diagram showing relationship between eye gaze tracking based pointer movement and finger tracking based pointer movement in accordance with embodiments of the present disclosure.
[0038] FIG. 2 illustrates an exemplary block diagram showing working of the proposed Gaze Controlled Projected Display System in accordance with embodiments of the present disclosure.
[0039] FIG. 3 illustrates an exemplary image of the experimental set-up for validating efficacy of the proposed projected display in accordance with embodiments of the present disclosure.
[0040] FIG. 4 illustrates an exemplary interface with hotspots in accordance with embodiments of the present disclosure.
[0041] FIG. 5 illustrates an exemplary bar chart showing comparison of mean deviation using a common reference path with and without the projected display in accordance with embodiments of the present disclosure.
[0042] FIG. 6 illustrates an exemplary bar chart showing comparison of mean deviation using ISO 26022 (Annex E) with and without the projected display in accordance with embodiments of the present disclosure.
[0043] FIG. 7 illustrates an exemplary bar chart showing comparison of cognitive loads of participants while using the projected display for one of the experimental conditions in accordance with embodiments of the present disclosure.
[0044] FIG. 8 illustrates an exemplary bar chart showing preference of users for using the proposed projected display using System Usability Scale (SUS) in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION
[0045] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0046] Each of the appended claims defines a separate invention, which for infringement purposes is recognized as including equivalents to the various elements or limitations specified in the claims. Depending on the context, all references below to the "invention" may in some cases refer to certain specific embodiments only. In other cases it will be recognized that references to the "invention" will refer to subject matter recited in one or more, but not necessarily all, of the claims.
[0047] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0048] Embodiments explained herein relate to a multi-modal interface system for an automotive environment. In particular, a method and a system are disclosed to mitigate the driver's distraction from the driving task by simplifying human-machine interaction in automotive environments, alleviating the need to look down towards the display and physically touch the interface. More specifically, the system and method of the present disclosure take into account inaccuracies in the eye gaze tracker by providing the driver/pilot an alternative that causes the least distraction.
[0049] It is to be appreciated that though various embodiments have been explained here with reference to application of the system and method of the present disclosure in an automobile environment for operating an infotainment system, they can, with suitable modifications that would be apparent to those skilled in the art, be applied to other similar applications, such as in aircraft, and all such applications are well within the scope of the present disclosure without any limitations.
[0050] In an embodiment, the system of the present disclosure can incorporate an interactive projected display, an eye gaze tracker and a finger tracking device. The interactive projected display can be configured on a semi-transparent sheet that can be pasted on the windscreen of a vehicle, allowing the driver an unhindered view of the road ahead while also providing information in the form of a display on the transparent sheet. The information displayed on the sheet can be, for example, various options for selection by the driver as part of the infotainment system of the automobile. Thus the driver of the vehicle can have access to the displayed information with minimum distraction from the driving task while interacting with the in-vehicle communication and infotainment systems.
[0051] In an embodiment, the projected display can also include a pointer or cursor that moves over the various options; selection of an option on the display can be based on positioning the pointer within the zone of the desired option and thereafter clicking a selection button, such as a mouse button, located at a position convenient for the driver to actuate without taking his eyes off the primary task of driving.
[0052] In an embodiment, the pointer/cursor can move on the display based on a combination of eye gaze tracking by the eye gaze tracker and finger tracking by the finger tracking device. Thus the operator needs only to look at a desired option on the windscreen, i.e. without having to look down or look up from the windscreen, causing minimum distraction from his primary task of driving.
[0053] In an embodiment, while moving the pointer based on eye gaze tracking, the disclosed system caters to the eventuality of the eye gaze tracker failing to position the pointer at the desired location on account of limitations of the eye gaze tracking process, such as low accuracy. In case the eye gaze tracker is unable to track eye gaze accurately, the user can correct the pointer position by finger movement. In this respect, the operator can simply position his hand within the field of view of the finger tracker and move his finger to correct the pointer position. In an aspect, on detection of the hand the system not only starts tracking the finger of the operator as stated earlier, but also gives precedence to the signal from the finger tracker over the signal from the eye gaze tracker. Thus input from finger movement overrides input from eye gaze to provide a corrected pointer position, overcoming the deficiency of low accuracy of the gaze tracker.
[0054] FIG. 1 illustrates an exemplary block diagram 100 showing the relationship between eye gaze tracking based pointer movement and finger tracking based pointer movement in accordance with embodiments of the present disclosure. As shown, if finger movement is found simultaneously with the eye gaze signal, the system gives precedence to the finger movement and stops moving the pointer based on eye gaze. On the other hand, when the finger tracker does not locate a hand within its field, the system resumes moving the pointer based on the eye gaze of the user.
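By way of illustration only, a minimal sketch of this arbitration rule in Python; the `hand_detected` flag stands in for whatever hand-presence signal the finger tracking device exposes, and all names here are illustrative assumptions:

```python
def resolve_pointer(gaze_xy, finger_xy, hand_detected):
    """Arbitrate between modalities as in FIG. 1: while a hand is in the
    finger tracker's field of view, finger input drives the pointer and
    gaze input is ignored; otherwise gaze resumes control."""
    if hand_detected and finger_xy is not None:
        return finger_xy  # finger signal takes precedence
    return gaze_xy        # fall back to (median-filtered) eye gaze
```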
[0055] Thus a user, on finding that the pointer is not correctly positioned, can simply place his hand over the finger tracker, move a finger such as the index finger to correct the position, and thereafter remove his hand from the finger tracker. The resulting system thus reduces the number of times a user needs to take his eyes off his primary task of driving, as he can interact with the projected display simply by looking at it or by moving his index finger.
[0056] FIG. 2 illustrates an exemplary block diagram 200 showing the working of the proposed Gaze Controlled Projected Display System in accordance with embodiments of the present disclosure. As shown, input from the eye gaze tracker, which records the eye gaze positions continuously, goes through a median filter 202 configured to take the median of the pixel locations (X, Y) every 300 msec to estimate the region of interest or saccadic focus points. In an embodiment, the median of the tracked eye gaze positions is less susceptible to outliers than the arithmetic mean in case the eye gaze tracker briefly loses signal. Simultaneously, the signal from the finger tracker goes through a 2D orthogonal projection 204 to get another set of pixel locations (X, Y). As stated earlier, if finger movement is found simultaneously with the eye gaze signal, the system gives precedence to the finger movement; accordingly, at 206 it is detected whether the finger tracker has located a hand within its field, and if so the pixel locations (X, Y) are corrected based on the finger tracker, else they are based on the eye gaze tracker. At 208 the pointer is moved based on the corrected pixel locations (X, Y).
[0057] In an embodiment, the disclosed system can further incorporate a Target Prediction system which highlights, as shown at 210, the target nearest to the corrected pixel locations (X, Y) arrived at based on the combination of eye gaze and finger position. The predicted target can then be selected by the operator by clicking/pushing a hardware switch, similar to a mouse button, even if the pointer is not exactly on the target button itself.
[0058] FIG. 3 illustrates an exemplary image of the experimental set-up 300 arranged for validating the efficacy of the proposed projected display in accordance with embodiments of the present disclosure. The experimental set-up 300 incorporated a semi-transparent sheet 302, an eye gaze tracker, a finger movement tracker, an off-the-shelf computer, a projector, and a driving simulator.
[0059] The semi-transparent sheet 302 was configured to function as an interactive projected display positioned on a stand/holder in front of the operator, and was sized similarly to an existing dashboard to maintain legibility of the onscreen labels. The semi-transparent sheet 302 caused minimal occlusion of the scene in front. A driving simulator having a LogiTech GT29 gaming wheel 304 and a large screen 306 was used to simulate driving conditions, with the semi-transparent sheet 302 kept in front of the large screen 306 to provide an interactive display even as the operator carried out his primary driving task through the simulator.
[0060] The desktop computer was configured to control an onscreen pointer using the eye gaze and finger movement trackers and to operate the projected display. A Leap Motion controller was used for tracking the finger and a Tobii EyeX tracker was used for eye gaze tracking. The Leap Motion controller was used to make corrective movements through finger movement when the eye gaze tracker alone could not bring the pointer on target. If the user put his hand over the Leap Motion sensor, the pointer stopped moving based on eye gaze. When the user removed his hand from the top of the Leap Motion sensor, the pointer resumed moving based on the eye gaze of the user, as explained earlier. The left mouse button was used for selection, and other functions of the mouse were blocked.
[0061] In an embodiment, the eye gaze tracker was calibrated using a 9-point calibration on a 2-dimensional screen, and the following set of equations was used to take an orthogonal projection of the 3-dimensional finger position, measured using the Leap Motion controller, onto the 2-dimensional screen.

The constants a, b, c, d, w and h were calculated based on the relative screen position with respect to the Leap Motion sensor.
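The equations themselves do not survive in the published text. By way of illustration only, the following Python sketch shows one plausible form of such a scaled orthogonal projection, assuming the constants act as per-axis offsets (a, c) and spans (b, d) that map the Leap Motion's millimetre coordinates onto a screen of w by h pixels; this is a hypothetical reconstruction, not the patent's own equations:

```python
def project_finger_to_screen(X, Y, Z, a, b, c, d, w, h):
    """Assumed orthogonal projection of a 3-D Leap Motion finger position
    (millimetres, origin at the sensor) onto a 2-D screen of w x h pixels.
    The depth coordinate Z is discarded; a, b, c, d translate and scale
    the remaining axes according to where the screen sits relative to the
    sensor. Hypothetical form, consistent with but not taken from the
    description above."""
    screen_x = w * (X - a) / b   # a: horizontal offset (mm), b: span (mm)
    screen_y = h * (c - Y) / d   # c: vertical offset; Leap Y points up,
                                 # screen y grows downward; d: span (mm)
    # Clamp so the pointer never leaves the display.
    return (min(max(screen_x, 0), w), min(max(screen_y, 0), h))
```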
[0062] The set-up recorded eye gaze positions continuously and took the median of the pixel locations every 300 msec to estimate the region of interest or saccadic focus points. The median was less susceptible to outliers than the arithmetic mean in case the eye gaze tracker briefly lost signal. In accordance with the configuration, if finger movement was found simultaneously with the eye gaze signal, the system gave precedence to the finger movement signal. A pointer was drawn on the screen 302 based on either the eye gaze or the finger location. The pointer worked as feedback to the user, and in case the eye gaze tracker could not track accurately, the user could make a corrective eye gaze or finger movement to select the target.
[0063] The set-up draws a set of spots (referred to as hotspots hereafter) on all clickable objects on a graphical user interface. The set-up randomly selects a clickable object and also randomly selects a point on the object as its new hotspot. If the new hotspot reduces the value of the cost function defined as

$$C = \sum_{i \neq j} \frac{1}{d_{ij}},$$

where $d_{ij}$ is the distance between the hotspots on clickable objects/buttons $i$ and $j$, then it is selected and updated. However, even if the new hotspot increases the value of the cost function, it may still be selected based on the following condition:

$$\exp\left(\frac{C_{old} - C_{new}}{T}\right) > \mathrm{rand}(0, 1).$$

In the above equation, the value of T runs from 5000 to 1 and is reduced by 1 in each iteration. Finally, a set of hotspots is selected that minimizes the value of the cost function. In subsequent user studies, users were instructed to focus on the hotspot on their desired clickable object. FIG. 4 shows an exemplary interface with hotspots.
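By way of illustration only, a compact Python sketch of this annealing-style search, using the cost function as reconstructed above; `objects[i]` is a hypothetical callable returning a random candidate point on clickable object i, an interface assumed for this sketch:

```python
import math
import random

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def cost(hotspots):
    """Sum of inverse pairwise distances: minimising it pushes the
    hotspots of different clickable objects apart."""
    return sum(1.0 / max(dist(hotspots[i], hotspots[j]), 1e-9)
               for i in range(len(hotspots))
               for j in range(i + 1, len(hotspots)))

def optimise_hotspots(objects, initial):
    """Annealing-style hotspot placement as described above: T counts
    down from 5000 to 1, a random object gets a random candidate hotspot
    each iteration, and a worsening move may still be accepted with a
    probability that shrinks as T falls."""
    hotspots = list(initial)
    current = cost(hotspots)
    for T in range(5000, 0, -1):
        i = random.randrange(len(objects))
        candidate = objects[i]()            # random point on object i
        old = hotspots[i]
        hotspots[i] = candidate
        new = cost(hotspots)
        if new < current or math.exp((current - new) / T) > random.random():
            current = new                   # accept (better, or lucky)
        else:
            hotspots[i] = old               # reject and revert
    return hotspots
```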
[0064] The set-up included a Target Prediction system that records the instantaneous velocity, bearing angle and acceleration of the cursor drawn on screen based on either the eye gaze or the finger movement of the user. When the velocity, bearing angle and acceleration of the cursor indicate that the user is trying to select a target, the set-up enlarges the clickable object whose hotspot is nearest to the eye gaze or finger position.
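By way of illustration only, a much simplified sketch of this prediction step using cursor speed alone; the disclosure also uses bearing angle and acceleration, and the threshold value below is an illustrative assumption:

```python
import math

def predict_target(cursor_trail, hotspots, speed_eps=50.0):
    """When the cursor's instantaneous speed falls below a threshold
    (suggesting the user is homing in on a target), return the index of
    the clickable object whose hotspot is nearest to the cursor so it can
    be enlarged/highlighted. `cursor_trail` is a list of (t, x, y)
    samples; `speed_eps` (px/s) is an assumed, illustrative threshold."""
    if len(cursor_trail) < 2:
        return None
    (t0, x0, y0), (t1, x1, y1) = cursor_trail[-2], cursor_trail[-1]
    speed = math.hypot(x1 - x0, y1 - y0) / max(t1 - t0, 1e-6)
    if speed > speed_eps:
        return None  # cursor still travelling; no prediction yet
    # Nearest hotspot to the current cursor position.
    return min(range(len(hotspots)),
               key=lambda i: math.hypot(hotspots[i][0] - x1,
                                        hotspots[i][1] - y1))
```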
[0065] In an embodiment, the experimental set-up 300 was used to measure the quality of driving while participants used the gaze controlled projected display. The lane change task in accordance with ISO 26022, a widely used evaluation tool with simple driving scenarios primarily for evaluation of secondary tasks, was used for this purpose. The participants mimicked driving using the LogiTech GT29 gaming wheel 304. Drivers were seated at a desktop computer with the projected display 302 in front and were required to repeatedly perform lane changes. In the lane changing task, participants were regularly instructed to change to one of three lanes, and logging software automatically measured the deviation of the simulated car from the centre of the lane. Each participant drove once without the projected display and once with the projected display while undertaking pointing and selection tasks in the display. Data was collected from eleven participants (8 male, 3 female, average age 29.2 years) who completed the procedure.
[0066] The lane change driving task on the driving simulator provided the metric for comparison. Lane change performance under the dual task conditions of driving and using the telematics system of interest was evaluated against a normative model of single task performance. The mean deviation was measured while participants used the projected display with and without hotspots, and a touchscreen, according to the ISO/TC22/SC13/WG8 Lane Change Test documentation.
[0067] FIG. 5 illustrates an exemplary bar chart showing the comparison of mean deviation using a common reference path for the projected display with and without hotspots and the touchscreen. Driving performance in terms of mean deviation from the designated lane was significantly different in a Kruskal-Wallis H-test [χ²(2,28) = 10.56, p < 0.05]. A pairwise signed rank test also found that driving performance was significantly different between the projected gaze controlled systems with and without hotspots. It may be noted that using hotspots, the mean deviation from the designated driving lane was reduced by 41% for the projected gaze controlled interface, and the mean deviation for the hotspot-equipped projected gaze controlled interface was even lower than for the touchscreen based system.
[0068] FIG. 6 illustrates an exemplary bar chart showing the average response time, which was still lowest for the touchscreen based system but only 2% higher for the hotspot-equipped projected screen. A one-way ANOVA found a significant difference among the reaction times [F(2,257) = 4.84, p < 0.05]. A set of unequal variance t-tests found that the touchscreen had significantly lower response times [p < 0.05] than the projected screen without hotspots, while the difference between the touchscreen and the hotspot-equipped projected screen was not significant.
[0069] In an embodiment, to assess the visual and mental distraction caused by the secondary interaction tasks while using the projected display, participants' cognitive load was studied using the NASA TLX scale, and their preference using the System Usability Scale (SUS), a method for evaluating the usability of a system.
[0070] FIG. 7 illustrates an exemplary bar chart showing the comparison of cognitive loads of participants while using the projected display for one of the experimental conditions. TLX scores were highest for the projected gaze controlled system and lowest for the touchscreen. The hotspots reduced the average cognitive load by approximately 6% relative to the projected gaze controlled system without hotspots. However, no significant difference was found among the different components of the TLX scores for the hotspot-equipped gaze controlled projected system.
[0071] FIG. 8 illustrates an exemplary bar chart showing the preference of users for the proposed projected display using the System Usability Scale (SUS). The SUS scores were greater than 68 for all cases and highest for the hotspot-equipped gaze controlled system, which means all systems were usable, with the hotspot-equipped gaze controlled interface being most preferred by users.
[0072] Therefore it can be concluded that driving performance was better with the hotspot-equipped projected gaze control system than with the touchscreen. The average response time was only 2% higher than for the touchscreen system. The SUS scores also indicated that users did not face any serious trouble in using this system. The cognitive load was still higher than for touchscreens, but it should also be noted that the participants use touchscreen-enabled devices every day, whereas they were using the eye gaze controlled interface for the first time during the trials.
[0073] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION
[0074] The present disclosure mitigates the problem of the operator's distraction from his primary task of driving/flying while interacting with a display for a secondary task in an automobile/aircraft, by alleviating the need to look down at the display and physically touch the interface.
[0075] The present disclosure provides an Interactive Gaze Controlled Projected Display System in automotive and aviation environments that alleviates the need to look down at the display and physically touch the interface.
[0076] The present disclosure provides an Interactive Gaze Controlled Projected Display System with displays on the windscreen that do not obstruct the view of the road or the sky ahead.
[0077] The present disclosure provides an Interactive Gaze Controlled Projected Display System that takes into account inaccuracies in the eye gaze tracker.
[0078] The present disclosure provides a Projected Display System that integrates finger tracking as an alternative to gaze tracking based control.
[0079] The present disclosure provides a system that integrates dual control for movement of the cursor/pointer, based on the position of eye gaze as well as finger tracking, such that inaccuracies in the eye gaze tracker are taken care of.
CLAIMS:
1. A system for man-machine interaction, the system comprising:
an interactive projected display;
an eye gaze tracker for tracking eye gaze direction of a user, wherein based on detected direction of the eye gaze of the user, a cursor on the display is moved to an area of interest on the display; and
a finger tracking device to detect presence of a finger of the user and track movement of the finger;
wherein, a precedence is given to signal from the finger tracker over signal from eye gaze tracker and position of the cursor on the display is corrected based on detected movement of the finger by moving the cursor in direction of detected movement of the finger.

2. The system as claimed in claim 1, wherein the interactive projected display is on a semi-transparent sheet pasted on a windscreen of an automobile or an aircraft.

3. The system as claimed in claim 1, wherein when the finger tracker does not locate a hand within its field, the system resumes moving the cursor based on the detected direction of the eye gaze of the user.

4. The system as claimed in claim 1, wherein the system further includes a median filter, configured to take the median of pixels of continuously recorded eye gaze positions on the display, wherein the median is taken after every 300 milliseconds to estimate the region of interest for the user.

5. The system as claimed in claim 1, wherein the signal from the finger tracker goes through a 2D orthogonal projection to get pixel locations on the display.

6. The system as claimed in claim 1, wherein the system further incorporates a target prediction system that highlights a clickable object nearest to pixel locations arrived at based on combination of the eye gaze and the detected finger movement.

7. The system as claimed in claim 6, wherein highlighting of a clickable object is based on hotspots associated with each of the clickable objects, wherein a clickable object whose hotspot is closest to the pixel location arrived at based on combination of the eye gaze and the detected finger movement is highlighted.

8. The system as claimed in claim 7, wherein the hotspots associated with different clickable objects are selected based on minimizing the value of the cost function:

$$C = \sum_{i \neq j} \frac{1}{d_{ij}},$$

where $d_{ij}$ is the distance between new hotspots on clickable objects i and j.

9. The system as claimed in claim 8, wherein a new hotspot is selected, even if it increases the value of the cost function, based on the following condition:

$$\exp\left(\frac{C_{old} - C_{new}}{T}\right) > \mathrm{rand}(0, 1),$$

where T runs from 5000 to 1 and is reduced by 1 in each iteration.

10. The system as claimed in claim 7, wherein a user is instructed to focus on hot spots on desired clickable object.

Documents

Application Documents

# Name Date
1 Form5_As Filed_05-11-2016.pdf 2016-11-05
2 Form3_As Filed_05-11-2016.pdf 2016-11-05
3 Form2 Title Page_Provisional_05-11-2016.pdf 2016-11-05
4 Drawings_As Filed_05-11-2016.pdf 2016-11-05
5 Description Provisional_As Filed_05-11-2016.pdf 2016-11-05
6 Abstract_As Filed_05-11-2016.pdf 2016-11-05
7 abstract 201641037828.jpg 2016-12-22
8 Other Patent Document [02-01-2017(online)].pdf 2017-01-02
9 Form 26 [02-01-2017(online)].pdf 2017-01-02
10 201641037828-FORM 18 [13-10-2017(online)].pdf 2017-10-13
11 201641037828-DRAWING [13-10-2017(online)].pdf 2017-10-13
12 201641037828-COMPLETE SPECIFICATION [13-10-2017(online)].pdf 2017-10-13
13 201641037828-FORM-26 [14-10-2020(online)].pdf 2020-10-14
14 201641037828-FER_SER_REPLY [14-10-2020(online)].pdf 2020-10-14
15 201641037828-DRAWING [14-10-2020(online)].pdf 2020-10-14
16 201641037828-CORRESPONDENCE [14-10-2020(online)].pdf 2020-10-14
17 201641037828-CLAIMS [14-10-2020(online)].pdf 2020-10-14
18 201641037828-FER.pdf 2021-10-17
19 201641037828-US(14)-HearingNotice-(HearingDate-28-12-2021).pdf 2021-11-29
20 201641037828-Correspondence to notify the Controller [24-12-2021(online)].pdf 2021-12-24
21 201641037828-Written submissions and relevant documents [12-01-2022(online)].pdf 2022-01-12
22 201641037828-Annexure [12-01-2022(online)].pdf 2022-01-12
23 201641037828-Response to office action [03-05-2023(online)].pdf 2023-05-03
24 201641037828-PatentCertificate03-05-2023.pdf 2023-05-03
25 201641037828-IntimationOfGrant03-05-2023.pdf 2023-05-03
26 201641037828-Annexure [03-05-2023(online)].pdf 2023-05-03
27 201641037828-OTHERS [26-07-2023(online)].pdf 2023-07-26
28 201641037828-EDUCATIONAL INSTITUTION(S) [26-07-2023(online)].pdf 2023-07-26

Search Strategy

1 2021-07-1513-52-41AE_15-07-2021.pdf
2 187THFILETPOSEARCHSTRATEGYE_28-08-2020.pdf

ERegister / Renewals

3rd: 27 Jul 2023 (covering 05/11/2018 to 05/11/2019)
4th: 27 Jul 2023 (covering 05/11/2019 to 05/11/2020)
5th: 27 Jul 2023 (covering 05/11/2020 to 05/11/2021)
6th: 27 Jul 2023 (covering 05/11/2021 to 05/11/2022)
7th: 27 Jul 2023 (covering 05/11/2022 to 05/11/2023)
8th: 27 Jul 2023 (covering 05/11/2023 to 05/11/2024)
9th: 27 Jul 2023 (covering 05/11/2024 to 05/11/2025)
10th: 27 Jul 2023 (covering 05/11/2025 to 05/11/2026)