
Method And Device For Tracking Eye Movement Of A Subject

Abstract: The present disclosure relates to a method and an eye movement tracking device (100) for tracking eye movement of a subject. The eye movement tracking device (100) comprises a support structure (101) attached to a plate (103) at one end, and the plate is attached to an elastomer layer (105). The support structure (101) is affixed with a Fiber Bragg Grating (FBG) sensor (109) for acquiring strain variations. A probe (107) is attached to the other end of the support structure (101) and rests adjacent to the bottom eyelid of a subject undergoing eye movement tracking. The probe (107) creates strain variations in the support structure (101) due to movement of the bottom eyelid of the subject while gazing at a predefined pattern displayed on a display unit. Thereafter, the FBG sensor (109) acquires the strain variations from the support structure (101). The strain variations are transmitted to a computing system for processing to identify one or more characteristics associated with the eye movement of the subject.


Patent Information

Application #
Filing Date: 19 January 2018
Publication Number: 30/2019
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status
Email: bangalore@knspartners.com
Parent Application
Patent Number
Legal Status
Grant Date: 2023-11-23
Renewal Date

Applicants

INDIAN INSTITUTE OF SCIENCE
C V Raman Avenue, Bangalore

Inventors

1. Sharath Umesh
C/o Department of Instrumentation & Applied Physics, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
2. Shweta Pant
C/o Department of Instrumentation & Applied Physics, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
3. Srivani Padma Goggi
C/o Department of Instrumentation & Applied Physics, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
4. Sundarrajan Asokan
C/o Department of Instrumentation & Applied Physics, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
5. Sumitash Jana
Centre for Neurological Sciences, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
6. Varsha Vasudevan
Centre for Neurological Sciences, C V Raman Avenue, Indian Institute of Science, Bangalore-560012
7. Aditya Murthy
Centre for Neurological Sciences, C V Raman Avenue, Indian Institute of Science, Bangalore-560012

Specification

FORM 2

THE PATENTS ACT 1970
[39 OF 1970]
&
THE PATENTS RULES, 2003

COMPLETE SPECIFICATION

[See section 10; rule 13]

TITLE: “A DEVICE FOR TRACKING EYE MOVEMENT OF A SUBJECT AND METHOD THEREOF”

Name & Address of the applicant:
INDIAN INSTITUTE OF SCIENCE (IISc), C V Raman Avenue, Bangalore 560012

Nationality: India

The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD

Embodiments of the present disclosure relate, in general, to ophthalmology and more particularly, but not exclusively, to a device and method for tracking eye movement of a subject.
BACKGROUND
Eye movement evaluation is vital for the diagnosis of various ophthalmological and neurological disorders. Eye movement is routinely investigated for the assessment of ocular motor functioning and is widely used to study covert processes that are not traceable otherwise. Though there are different types of eye movements, such as saccades, smooth pursuit, vergence and vestibulo-ocular movements and the like, research is largely focussed on saccadic eye movements. Saccades are rapid eye movements which focus a target image on the fovea centralis, a region at the centre of the retina possessing the highest visual acuity. Today, many covert processes such as attention, decision making, planning of movements, etc., have been studied based on saccades. Tracking of saccadic movements is used for the detection of the onset and evolution of many psychological and cognitive disorders. For example, saccades are known to be perturbed in numerous neuro-developmental and neuro-psychiatric disorders. Therefore, saccadic eye movements are used as a simple and non-invasive clinical diagnostic tool for studying various disorders.

Previously, clinicians used to rely upon direct observation of eye movements of a subject. However, recent advancement in various measurement methodologies has enabled clinicians to acquire precise quantitative eye movement characteristics using various types of eye movement trackers. Typically, eye movement trackers are divided into contact type and non-contact type based on the sensing methodology employed. Contact type eye movement trackers include electrodes mounted around the eye or head-mounted devices, whereas non-contact type eye movement trackers use cameras to track the movement of the pupil. Some of the existing eye movement detection techniques include electro-oculography, limbal tracking, video-oculography and the magnetic search coil. Search coils are invasive and hence constitute an uncomfortable method of eye tracking. While camera-based systems are easy to use, one potential problem is that the tracking is accurate only if the eyes are wide open; it performs poorly when the subject looks down, which makes the eye aperture smaller. Further, camera-based methods cannot be used to track eye movements during rapid eye movement sleep.

The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.

SUMMARY
Disclosed herein is an eye movement tracking device for tracking eye movements of a subject. The eye movement tracking device comprises a support structure attached to a plate at one end, and the plate is attached to an elastomer layer. The support structure is affixed with a Fiber Bragg Grating (FBG) sensor for acquiring strain variations. A probe is attached to the other end of the support structure. The probe rests adjacent to the bottom eyelid of a subject undergoing eye movement tracking. The probe creates the strain variations in the support structure due to movement of the bottom eyelid of the subject while gazing at a predefined pattern displayed on a display unit. The FBG sensor acquires the strain variations from the support structure. The acquired strain variations are transmitted to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject.

Further, disclosed herein is a method for tracking eye movements of a subject. The method comprises obtaining a displacement movement from the bottom eyelid of a subject undergoing eye movement tracking. The displacement movement is obtained while the subject gazes at a predefined pattern displayed on a display unit. The method further comprises creating strain variations in a support structure attached to a plate due to the displacement movement and acquiring the strain variations. The acquired strain variations are transmitted to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject.

The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of device or system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:

Fig.1a illustrates an exemplary block diagram of an eye movement tracking device in accordance with some embodiments of the present disclosure;

Fig.1b illustrates an exemplary eye movement tracking device in accordance with some embodiments of the present disclosure;

Fig.1c illustrates an exemplary embodiment for transmitting the strain variations in accordance with some embodiments of the present disclosure;

Fig.2a illustrates an exemplary embodiment for placement of FBG probes along with the plane demonstration in accordance with some embodiments of the present disclosure;

Fig.2b illustrates an exemplary embodiment of obtaining eye movement displacement of a subject in accordance with some embodiments of the present disclosure;

Fig.3a illustrates an exemplary experiment setup of validating the eye movement tracking device using an infrared based pupil tracker device along with a display in accordance with some embodiments of the present disclosure;

Fig.3b and Fig.3c illustrate an exemplary embodiment of possible locations of the target at three eccentricities in different planes in accordance with some embodiments of the present disclosure;

Fig.4 illustrates a flowchart showing a method for tracking eye movement of a subject in accordance with some embodiments of the present disclosure;

Fig.5a shows a graph illustrating a shift in wavelength response from right eye of a subject while following a pattern along plane 1 in accordance with some embodiments of the present disclosure;

Fig.5b and Fig.5c show graphs illustrating responses of the eye movement tracking device for eye movement along plane 1 at the right eye and left eye respectively in accordance with some embodiments of the present disclosure;

Fig.6a and Fig.6b show graphs illustrating angular movement acquired from the right eye for movement along plane 1 using an infrared based pupil tracker device and the eye movement tracking device respectively in accordance with some embodiments of the present disclosure;

Fig.6c shows a graph illustrating comparison of angular movement obtained from infrared based pupil tracker device and the eye movement tracking device simultaneously for plane 1 in accordance with some embodiments of the present disclosure;

Fig.7a and Fig.7b show graphs illustrating responses of the eye movement tracking device for eye movement along plane 2 at the left eye and right eye respectively in accordance with some embodiments of the present disclosure;

Fig.7c and Fig.7d show graphs illustrating the angular movement obtained from the right eye of a subject by the infrared based pupil tracker device and the eye movement tracking device respectively, with respect to the position displayed on the screen, in accordance with some embodiments of the present disclosure;

Fig.7e shows a graph illustrating comparison of angular movement obtained from infrared based pupil tracker device and eye movement tracking device simultaneously for plane 2 in accordance with some embodiments of the present disclosure;

Fig.8a and Fig.8b show graphs illustrating velocity of eye movement for varying saccades acquired from the eye movement tracking device and the infrared based pupil tracker device respectively in accordance with some embodiments of the present disclosure;

Fig.8c and Fig.8d show graphs illustrating time duration recorded for varying saccades acquired from the eye movement tracking device and the infrared based pupil tracker device respectively in accordance with some embodiments of the present disclosure;

Fig.8e and Fig.8f show graphs illustrating peak velocity recorded for varying saccades acquired from the eye movement tracking device and the infrared based pupil tracker device respectively in accordance with some embodiments of the present disclosure; and

Fig.9 shows a graph illustrating FBG wavelength obtained from eye movement tracking device starting from 0 degree and 5-degree position in accordance with some embodiments of the present disclosure.

It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.

DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.

While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.

The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.

In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.

Embodiments of the present disclosure relate to an eye movement tracking device and a method for tracking eye movement of a subject. In an embodiment, the eye movement tracking device is a contact type wearable device. In order to measure the eye movement of a subject, the eye movement tracking device is placed near the cheeks of the subject. The eye movement tracking device comprises a support structure attached at one end to a plate, which is in turn attached to an elastomer layer. In an embodiment, the support structure is a stainless-steel cantilever. The other end of the support structure is attached to a probe. In an embodiment, the eye movement tracking device is placed near the cheeks of the subject such that the probe rests adjacent to the bottom eyelid of the subject. In an embodiment, any movement in the bottom eyelid of the subject is initiated by muscles surrounding the eye. Essentially, the subject is instructed to follow a predefined pattern displayed on a display unit placed in front of the subject. On occurrence of any movement in the bottom eyelid of the subject while gazing at the predefined pattern, the probe creates strain variations in the support structure, which are in turn acquired by an FBG sensor affixed on the support structure. The strain variations are transmitted to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject. In an embodiment, the acquired strain variations are processed spatially and temporally to identify the one or more characteristics. The present disclosure thus provides an eye movement tracking device that is electrically passive (i.e., no electric power is required at the sensor end), wearable, compact, light-weight and portable.


Fig.1a illustrates an exemplary block diagram of an eye movement tracking device in accordance with some embodiments of the present disclosure.

Fig.1a illustrates an eye movement tracking device 100. The eye movement tracking device 100 includes a support structure 101 attached to a plate 103 at one end and to a probe 107 at the other end. The plate 103 is attached to an elastomer layer 105, for example, rubber. Further, the support structure 101 is affixed with a Fiber Bragg Grating (FBG) sensor 109 (hereinafter referred to as the FBG sensor 109) to acquire strain variations resulting from an eye movement of a subject. In an embodiment, the support structure 101 is a stainless-steel cantilever. In an embodiment, a cantilever is a rigid structural element, such as a beam or a plate, anchored at one end to a support. For example, a stainless-steel cantilever of dimensions 30mm in length, 3mm in width and 0.1mm in thickness may be used as the support structure 101. In an embodiment, the cantilever may also be constructed with any other dimensions not mentioned explicitly in the present disclosure. In an embodiment, the probe 107 may be a plastic probe. The eye movement tracking device 100 is a contact type, standalone device. Fig.1b illustrates an exemplary eye movement tracking device in accordance with some embodiments of the present disclosure. The subject undergoing eye movement tracking is seated and the eye movement tracking device 100 is placed on the cheeks of the subject with the assistance of the elastomer layer 105. The eye movement tracking device 100 is placed such that the probe 107 rests adjacent to the bottom eyelid of the subject. To track the eye movements, the subject may be instructed to follow a predefined pattern shown on a display unit (not shown explicitly in Fig.1a) placed in front of the subject. In one embodiment, the display unit may include any device, for example, computer monitors, laptops and the like, for displaying images, videos and the like. While gazing at the predefined pattern, the eye movement tracking device 100 captures gaze movements associated with the eye of the subject.
In an embodiment, the movement of the eye is initiated by muscles surrounding the eye. In particular, the bottom eyelid of the subject moves because the eyeball swivels in its socket with every movement.

Consequently, the probe 107 resting on the bottom eyelid moves, thereby creating the strain variations over the support structure 101. The FBG sensor 109 affixed on the support structure 101 acquires the strain variations from the support structure 101 and transmits them to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject. Fig.1c illustrates an exemplary embodiment for transmitting the strain variations in accordance with some embodiments of the present disclosure. The strain variations from the eye movement tracking device 100 are received by the computing system 111 for processing. As shown in Fig.1c, the eye movement tracking device 100 is connected to the computing system 111 through a communication interface 113. In an embodiment, the communication interface may be a wired interface or a wireless interface. The strain variations are transmitted to the computing system 111 through the communication interface 113. The computing system 111 may include an I/O interface 115, a memory 117 and a processor 119. The I/O interface 115 may be configured to receive the strain variations from the FBG sensor 109. The strain variations received from the I/O interface 115 may be stored in the memory 117. The memory 117 may be communicatively coupled to the processor 119 of the computing system 111. The memory 117 may also store processor instructions which may cause the processor 119 to execute the instructions for processing the strain variations to identify one or more characteristics associated with the eye movement of the subject. In an embodiment, the one or more characteristics associated with the eye movement comprise saccades, fixations, blinks and the like. A person skilled in the art would understand that the one or more characteristics may include any other characteristics, not mentioned explicitly, which may be identified from the processing of the eye movement of the subject.
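One common way to identify saccades from such a signal is velocity thresholding. The sketch below assumes a hypothetical linear calibration from Bragg-wavelength shift to visual angle; neither the calibration factor nor the velocity threshold is taken from the present disclosure.

```python
import numpy as np

def detect_saccades(wavelength_shift_nm, fs_hz, angle_per_nm=10.0,
                    velocity_threshold_dps=50.0):
    """Segment candidate saccades from an FBG wavelength-shift trace.

    angle_per_nm (degrees of visual angle per nm of Bragg shift) and the
    velocity threshold are illustrative assumptions, not disclosed values.
    """
    # Convert the wavelength shift to visual angle and differentiate to velocity.
    angle_deg = np.asarray(wavelength_shift_nm) * angle_per_nm
    velocity = np.gradient(angle_deg) * fs_hz  # deg/s
    moving = np.abs(velocity) > velocity_threshold_dps
    # Collapse runs of supra-threshold samples into (start, end) index intervals.
    edges = np.diff(moving.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends = list(np.where(edges == -1)[0] + 1)
    if moving[0]:
        starts.insert(0, 0)
    if moving[-1]:
        ends.append(len(moving))
    return list(zip(starts, ends))
```

A fixation then corresponds to the stretches between detected intervals, and blinks would appear as large transient excursions that can be screened by amplitude.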
Returning to Fig.1a, in an embodiment, the eye movement tracking device 100 may be validated by comparing the outcome from the eye movement tracking device 100 with the outcome of an infrared based pupil tracker. The FBG sensor 109 is bonded on the stainless-steel cantilever to obtain the swivel of the eyeball of the subject during eye gaze movement, in the form of displacement.

Fig.2a illustrates an exemplary embodiment for placement of FBG probes along with the plane demonstration in accordance with some embodiments of the present disclosure.

Fig.2a shows exemplary eyes of a subject with placement of the probe 107. In an embodiment, the eye movement may be measured from each eye of the subject, i.e., the right eye and the left eye, sequentially one at a time. In another embodiment, the eye movement may be measured simultaneously from both eyes of the subject by placing two separate eye movement tracking devices 100 on the cheeks of the subject. For instance, in order to measure eye movement from both eyes, two separate eye movement tracking devices are placed on both sides of the face. A probe 201 of the eye movement tracking device 100 placed on the right cheek rests on the bottom eyelid of the right eye. Similarly, a probe 203 of the eye movement tracking device 100 placed on the left cheek rests adjacent to the bottom eyelid of the left eye, as shown in Fig.2a. In an embodiment, the movement of the eyes is initiated by the muscles surrounding the eyes. In an embodiment, each eyeball swivels in its socket with every movement, thus making the bottom eyelids of both eyes move along with the movement. Consequently, the probe 201 and the probe 203 resting on the bottom eyelids of the right eye and the left eye also move, creating the strain variations. Further, the strain variations created at the bottom eyelid of the right eye are acquired by the FBG sensor 109 associated with the eye movement tracking device 100 placed on the right cheek. Similarly, the strain variations created at the bottom eyelid of the left eye are acquired by the FBG sensor 109 associated with the eye movement tracking device 100 placed on the left cheek. In an embodiment, the probe 201 on the right eye of the subject may capture the gaze movement in plane 1 with better sensitivity. Similarly, the probe 203 on the left eye of the subject may capture the gaze movement in plane 2 with better sensitivity.

In an embodiment, the FBG sensor 109 comprises a periodic modulation of the refractive index of the core of a single-mode photosensitive optical fiber along its axis. Whenever broadband light is launched into the FBG sensor 109, a narrow band of wavelengths satisfying the Bragg condition is reflected back while the rest of the spectrum is transmitted. The reflected Bragg wavelength (λB) of the FBG sensor 109 is given by the equation below.

λB = 2 neff Λ …………………………………………………………………… (1)

Where, Λ is the periodicity of the grating;
neff is the effective refractive index of the fiber core.

In an embodiment, any external perturbation such as strain, temperature, etc., at the grating site of the FBG sensor 109 may alter the periodicity of the grating, which in turn shifts the reflected Bragg wavelength. By interrogating the shift in the Bragg wavelength, the external perturbation may be quantified. For example, the strain effect on the FBG sensor 109 is expressed by the equation below.

ΔλB/λB = {1 − (neff²/2)[p12 − ν(p11 + p12)]} ε ..….....……………………… (2)

Where, p11 and p12 are components of the strain-optic tensor;
ν is the Poisson's ratio;
ε is the axial strain change.
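Equations (1) and (2) can be evaluated numerically. The sketch below assumes typical fused-silica values for neff, p11, p12 and the Poisson's ratio, which are not specified in the disclosure.

```python
def bragg_wavelength(n_eff, grating_period_nm):
    """Equation (1): lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * grating_period_nm

def strained_wavelength(lambda_b_nm, axial_strain, n_eff,
                        p11=0.113, p12=0.252, poisson=0.16):
    """Equation (2): Bragg wavelength under axial strain.

    The strain-optic constants p11, p12 and the Poisson's ratio default
    to typical fused-silica values, assumed here for illustration only.
    """
    # Effective strain-optic coefficient p_e
    p_e = (n_eff ** 2 / 2.0) * (p12 - poisson * (p11 + p12))
    return lambda_b_nm * (1.0 + (1.0 - p_e) * axial_strain)

# A grating period of 535 nm in fiber with n_eff = 1.45 reflects near 1551.5 nm;
# one microstrain then shifts the Bragg wavelength by roughly 1.2 pm.
lam = bragg_wavelength(1.45, 535.0)
shift_nm = strained_wavelength(lam, 1e-6, 1.45) - lam
```

This ~1.2 pm/µε sensitivity near 1550 nm is why a high-resolution interrogator is needed to resolve the small eyelid-induced strains on the cantilever.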

Fig.3a illustrates an exemplary experimental setup for validating the eye movement tracking device 100 using an infrared based pupil tracker device along with a display in accordance with some embodiments of the present disclosure. An infrared based pupil tracker device is mounted on a desk in front of the subject and monitors the eye movements at a predefined frequency, for example, 240 Hz, and a spatial resolution of, for example, ~1° of visual angle. Validation of the eye movement measured by the eye movement tracking device 100 is performed by comparing the results with the infrared based pupil tracker device, which tracks the movement of the eyes. Fig.3b and Fig.3c illustrate exemplary embodiments of possible locations of the target at three eccentricities in different planes in accordance with some embodiments of the present disclosure. In an embodiment, a screen is placed, during validation, at a predefined distance, for example, 57 cm from the subject, such that a one-centimetre shift on the screen subtends roughly 1 degree of visual angle at the eyes. The head of the subject is locked in place in order to ensure that no lateral movement of the head occurs. In an embodiment, calibration of the infrared based pupil tracker device is carried out such that the subject looks at coloured squares displayed successively at multiple locations on the screen and the eye gains of the subject are adjusted. Each trial begins with the subject fixating the eyes at the centre of the screen on a white fixation square. Further, after a predefined delay, for example, a delay of 500±50 ms, a target is displayed at the periphery, to which the subject makes a saccade. The sequence of steps in the setup is shown in Fig.3b. In an embodiment, the targets are displayed at an eccentricity of 5°, 9° and 13° from the centre. The possible locations of the target at the three eccentricities in the two planes displayed during the setup are shown in Fig.3c.
In an embodiment, the pattern is displayed such that 5°, 9° and 13° angular movements of the eye of the subject are initiated from the fixated centre.
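The screen geometry used in the setup (a 57 cm viewing distance, so that a one-centimetre shift subtends roughly 1° of visual angle) follows from simple trigonometry, as in this small sketch:

```python
import math

def visual_angle_deg(displacement_cm, viewing_distance_cm=57.0):
    """Visual angle subtended at the eye by a target shifted on the screen."""
    return math.degrees(math.atan2(displacement_cm, viewing_distance_cm))

# At 57 cm, 1 cm on the screen is very nearly 1 degree of visual angle;
# at the largest eccentricity used (13 cm), the 1-cm-per-degree rule
# deviates only slightly from the exact arctangent.
```

The exact angle for 13 cm is about 12.85°, which is why "roughly 1 degree per centimetre" is an adequate approximation at these eccentricities.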

Further, the two planes considered are perpendicular to each other as shown in Fig.3b, where the up-gaze or adduction and down-gaze or abduction with respect to the right eye of the subject is referred to as plane 1, while the up-gaze or adduction and down-gaze or abduction with respect to the left eye is referred to as plane 2. Fig.6a and Fig.6b show graphs illustrating angular movement acquired from the right eye for movement along plane 1 using the infrared based pupil tracker device and the eye movement tracking device 100 respectively in accordance with some embodiments of the present disclosure. During the exemplary setup shown above in Fig.3b and Fig.3c, which consisted of eye movements to three eccentricities along two planes, the saccadic eye movement is recorded simultaneously by the infrared based pupil tracker device and the eye movement tracking device 100. Fig.6a shows an exemplary result obtained from the infrared based pupil tracker device for a few subjects. Similarly, the results obtained by the eye movement tracking device 100 for the same trials are shown in Fig.6b. Fig.6c shows a graph illustrating a comparison of the angular movement obtained from the infrared based pupil tracker device and the eye movement tracking device 100 simultaneously for plane 1 in accordance with some embodiments of the present disclosure. Particularly, the mean amplitude of the angular movement obtained from the eye movement tracking device 100 is compared with the corresponding mean amplitude of the angular movement obtained from the pupil tracker device, as shown in Fig.6c. In an embodiment, the angular movements obtained from the pupil tracker device and the eye movement tracking device 100 are in agreement, with a correlation coefficient of 0.99.
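The agreement between the two trackers is quantified with a Pearson correlation over paired mean amplitudes. A minimal sketch follows, with illustrative amplitude pairs standing in for the measured data, which are not reproduced in this disclosure:

```python
import numpy as np

# Hypothetical paired mean saccade amplitudes (degrees) from the FBG-based
# eye movement tracking device and the infrared pupil tracker; illustrative
# numbers only, not measurements from the disclosure.
fbg_amplitude = np.array([5.1, 8.8, 12.9, 4.9, 9.2, 13.2])
pupil_amplitude = np.array([5.0, 9.0, 13.0, 5.0, 9.1, 13.1])

# Pearson correlation coefficient between the two trackers' readings.
r = np.corrcoef(fbg_amplitude, pupil_amplitude)[0, 1]
```

A coefficient near 1 indicates that the two devices report essentially the same angular amplitudes across trials.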

Fig.7a and Fig.7b show graphs illustrating responses of the eye movement tracking device 100 for eye movement along plane 2 at the left eye and right eye respectively in accordance with some embodiments of the present disclosure. Along the same lines as the trial carried out on plane 1, Fig.7a and Fig.7b show another trial with the eye movement pattern along plane 2. The trial in plane 2 consists of eye movements of up-gaze or adduction and down-gaze or abduction with respect to the left eye. The response obtained from the eye movement tracking device 100 for the eye movement along plane 2 is analysed using the same procedure as for plane 1. Further, a linear response is obtained between the shifts in wavelength of the eye movement tracking device 100 and the angular movement from both eyes, as shown in Fig.7a and Fig.7b. Further, the angular movement obtained from the right eye by the pupil tracker device and the eye movement tracking device 100, with respect to the position displayed on the screen, is shown in Fig.7c and Fig.7d respectively. Further, the angular movements obtained from the eye movement tracking device 100 and the pupil tracker device show good agreement, as shown in Fig.7e, which proves the efficacy of the eye movement tracking device 100 as an eye tracker.

Fig.8a and Fig.8b shows a graph illustrating velocity of eye movement for varying saccades acquired from eye movement tracking device 100 and infrared based pupil tracker device respectively in accordance with some embodiments of the present disclosure. In an embodiment, kinematic profile of saccades is highly stereotypical, where the saccade peak velocity and duration show a monotonic relationship with saccade amplitude, which referred as a main sequence. The main sequence is evaluated from both pupil tracker device and eye movement tracking device 100 from right eye moving along plane 1. In an embodiment, three different saccades of 50, 90, 130 angular movement are chosen for comparison. The responses from pupil tracker device and eye movement tracking device 100 are recorded simultaneously. The eye movement velocity obtained for each of these saccades from eye movement tracking device 100 and pupil tracker device are shown in Fig.8a and Fig.8b respectively. Fig.8c and Fig.8d shows a graph illustrating time duration recorded for varying saccades acquired from eye movement tracking device 100 and infrared based pupil tracker device respectively in accordance with some embodiments of the present disclosure. A small variation in duration may be attributed to the difference in the sampling rate of acquisition between eye movement tracking device 100 and pupil tracker device. Further, a small variation in the velocity profile can be observed from both, for 130 saccades at around 0.058s with eye movement tracking device 100 and 0.07s with pupil tracker device as shown in Fig.8c and Fig.8d. Further, the main sequence graphs of peak velocity with respect to amplitude is compared for five trials carried out by the subject with varying amplitudes as shown in Fig.8e for eye movement tracking device 100 and Fig.8f for pupil tracker device respectively. 
As shown in Fig.8e and Fig.8f, the peak velocity responses from the eye movement tracking device 100 and the pupil tracker device are observed to be similar. Further, a comparison of peak velocity with respect to amplitude is carried out for the same five trials performed by the subject. The responses of the eye movement tracking device 100 and the pupil tracker device are similar. Also, the slopes obtained from the comparison are in accordance with each other, i.e. a slope of 15.45 with the eye movement tracking device 100 and a slope of 16.99 with the pupil tracker device. Further, to validate the efficacy of the eye movement tracking device 100 in acquiring the angular motion from off-centre positions, a test is carried out where the targets are displayed starting with a 5° offset. Two trials are carried out to validate the eye movement tracking device 100 for offset trials: firstly, a sequence of eye movement starting from the 0° position (indicated with a square box), and secondly, a sequence of eye movement starting from the 5° position (indicated with a circle). Based on the validation test, good concurrence is obtained between the two curves obtained from the eye movement tracking device 100, as shown in Fig.9.
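The main sequence evaluation described above can be sketched as follows. This is an illustrative outline only: the `peak_velocity` and `main_sequence_slope` helpers, the sample trace, and the fitted values are assumptions, not the disclosed processing chain.

```python
# Sketch of a "main sequence" evaluation: extract the peak angular
# velocity of each saccade from a sampled angle trace, then fit the
# slope of peak velocity against saccade amplitude.

def peak_velocity(angles_deg, fs_hz):
    """Peak absolute angular velocity (deg/s) of a sampled angle trace."""
    dt = 1.0 / fs_hz
    return max(abs(b - a) / dt for a, b in zip(angles_deg, angles_deg[1:]))

def main_sequence_slope(amplitudes_deg, peak_velocities):
    """Least-squares slope through the origin of peak velocity vs amplitude."""
    num = sum(a * v for a, v in zip(amplitudes_deg, peak_velocities))
    den = sum(a * a for a in amplitudes_deg)
    return num / den

# Hypothetical 5-degree saccade sampled at 1 kHz (the interrogator
# sampling rate mentioned in the description).
trace = [0.0, 0.5, 1.5, 3.0, 4.2, 4.8, 5.0]
v_peak = peak_velocity(trace, 1000.0)  # peak velocity in deg/s
```

Fitting `main_sequence_slope` over the three saccade amplitudes (5°, 9°, 13°) and their peak velocities would yield a figure comparable to the 15.45 and 16.99 slopes reported above; the values in this sketch are fabricated for illustration only.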

Fig.4 illustrates a flowchart showing a method for tracking eye movement of a subject in accordance with some embodiments of the present disclosure.

As illustrated in Fig.4, the method 400 includes one or more blocks for tracking eye movement of a subject. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.

The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.

At block 401, the displacement movement from the bottom eyelid of the subject undergoing eye movement tracking is obtained by the probe 107. In an embodiment, the displacement movement is obtained while the subject gazes at the predefined pattern displayed on the display unit. Fig.2b illustrates an exemplary embodiment of obtaining eye movement displacement of a subject in accordance with some embodiments of the present disclosure. Fig.2b shows a subject, i.e., a female, for obtaining the displacement in the eye movement. The eye movement tracking devices 100 are attached to the cheeks of the subject on either side, such that the probe 107 of each eye movement tracking device 100 is positioned adjacent to the lower eyelid, as shown in Fig.2b. The subject is required to fixate at the centre, which is followed by an eye movement to the target as it appears. Further, the responses from both eyes are recorded. In an embodiment, the swivel of the eyeball of the subject, in the form of displacement, is recorded for both eyes by the eye movement tracking device 100 simultaneously through an interrogator, for example, an SM 130-700 FBG Micron Optics Interrogator (MOI). The FBG MOI acquires the displacement with a sampling rate of 1 kHz and with a resolution of 1 pm shift in Bragg wavelength, which corresponds to a 0.81 µε strain variation. The strain variation response with respect to both eyes of the subject is recorded simultaneously by the eye movement tracking device 100. In an embodiment, the complete pattern of strain variation is divided between two planes. In an embodiment, the two planes are chosen based on prior knowledge that the right eye and the left eye have dominant movement along plane 1 and plane 2 respectively. Plane 1 consists of eye movements of up-gaze or adduction and down-gaze or abduction with respect to the right eye.
The strain variation obtained from the right eye is greater than the strain variation obtained from the left eye when both eyes follow the movement along plane 1. The shift in wavelength response obtained from the right eye while following the pattern along plane 1 is shown in Fig.5a. Fig.5a shows a graph illustrating the shift in wavelength response from the right eye of a subject while following a pattern along plane 1. Fig.5b and Fig.5c show graphs illustrating responses of the eye movement tracking device 100 along plane 1 at the right eye and the left eye respectively. As shown, Fig.5b and Fig.5c show the reflected Bragg wavelength (i.e., strain variation = 1.22 x shift in wavelength) obtained for each saccadic movement from 0° to 5°, 9° and 13° for ten different trials performed by the subject with the right and left eye respectively. Further, a slope obtained from the data is utilized as the scaling factor to convert the wavelength shift obtained from the FBG sensor 109 into an angular movement of the eye. A high correlation coefficient is obtained between the amplitude of eye movement at the target and the resultant shift in wavelength experienced by the eye movement tracking device 100. The wavelength shift exhibits a linear response for both eyes.
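The scaling step described above can be sketched as follows, assuming a simple least-squares fit through the origin. The helper names, the sample calibration data, and the resulting slope are hypothetical; the 0.81 µε-per-pm figure is taken from the description.

```python
# Sketch of the calibration described above: a Bragg wavelength shift
# from the FBG sensor is converted to strain, and the slope of a linear
# fit (wavelength shift vs. target angle) is used as the scaling factor
# to recover the angular movement of the eye.

def wavelength_shift_to_strain(delta_lambda_pm):
    # The description states a 1 pm Bragg wavelength shift corresponds
    # to 0.81 microstrain.
    return 0.81 * delta_lambda_pm

def fit_slope(angles_deg, shifts_pm):
    # Ordinary least-squares slope through the origin:
    # slope = sum(x*y) / sum(x*x), in pm per degree.
    num = sum(a * s for a, s in zip(angles_deg, shifts_pm))
    den = sum(a * a for a in angles_deg)
    return num / den

def wavelength_shift_to_angle(delta_lambda_pm, slope_pm_per_deg):
    # Invert the calibration: angular movement = shift / slope.
    return delta_lambda_pm / slope_pm_per_deg

# Hypothetical calibration data: the saccade targets from the trials
# (0, 5, 9, 13 degrees) against fabricated, perfectly linear shifts.
angles = [0.0, 5.0, 9.0, 13.0]
shifts = [0.0, 50.0, 90.0, 130.0]  # pm, illustrative only
slope = fit_slope(angles, shifts)
angle = wavelength_shift_to_angle(90.0, slope)
```

In practice the slope would come from the ten calibration trials per eye mentioned above, and a fit with an intercept (or per-eye slopes) might be preferred; the disclosure only states that a slope is used as the scaling factor.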

At block 403, the probe 107 creates the strain variations due to the displacement movement. In an embodiment, the probe 107 transduces the muscular displacement of the eyes into strain variations over the support structure 101.

At block 405, the strain variations are acquired by the FBG sensor 109 affixed on the support structure 101. In an embodiment, the strain variations are transmitted to the computing system 111 for processing to identify one or more characteristics associated with the eye movement of the subject. In an embodiment, the strain variations for both eyes are processed spatially and temporally to identify eye movement characteristics. In an embodiment, the eye movement characteristics may comprise blinks, saccades, and fixations.
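The spatio-temporal processing mentioned above is not detailed in the disclosure; one common way to separate fixations from saccades in an angular-velocity trace is a simple velocity threshold (I-VT). A hedged sketch, with an illustrative threshold:

```python
# Velocity-threshold (I-VT) classification of eye movement samples.
# The threshold value is a common illustrative choice, not a value
# taken from the disclosure.

SACCADE_THRESHOLD = 30.0  # deg/s

def classify(velocities_deg_per_s):
    """Label each velocity sample as 'saccade' or 'fixation'.

    Blinks are not modelled here; in a two-channel setup such as the
    one described (one probe per eye), a blink could plausibly be
    flagged as a large simultaneous transient on both channels, but
    that heuristic is an assumption, not the disclosed method.
    """
    return ["saccade" if abs(v) > SACCADE_THRESHOLD else "fixation"
            for v in velocities_deg_per_s]
```

Runs of consecutive "fixation" samples then give fixation durations, and runs of "saccade" samples give saccade events whose amplitude and peak velocity feed the main-sequence analysis described earlier.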

In an embodiment, the eye movement tracking device is a standalone system that is electrically passive (no electric power is required at the sensor end), wearable, compact, light-weight and portable, thereby serving as a point-of-care testing device.

An embodiment of the present disclosure provides an effective eye movement tracking device which is insensitive to electromagnetic interference, exhibits low fatigue, and provides an ultra-fast response.

The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.

The terms "including", "comprising", "having" and variations thereof mean "including but not limited to", unless expressly specified otherwise.

The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.

A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention. When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.

The illustrated operations of Figure 4 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.

Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

REFERRAL NUMERALS:

Reference number Description
100 Eye movement tracking device
101 Support structure
103 Plate
105 Elastomer layer
107 Probe
109 FBG sensor
111 Computing system
113 Communication interface
115 I/O interface
117 Memory
119 Processor
201 Probe on right eye
203 Probe on left eye

CLAIMS:

We claim:

1. An eye movement tracking device (100) for tracking eye movements of a subject, the eye movement tracking device (100) comprises:

a support structure (101) attached to a plate (103) at one end and the plate (103) is attached to an elastomer layer (105), wherein the support structure (101) is affixed with a Fiber Bragg Grating (FBG) sensor (109) for acquiring strain variations; and
a probe (107) attached to another end of the support structure (101), wherein the probe (107) is capable of resting adjacent to a bottom eyelid of a subject undergoing eye movement tracking, wherein the probe (107) creates the strain variations in the support structure (101) due to movement of the bottom eyelid of the subject while gazing at a predefined pattern displayed on a display unit;
the FBG sensor (109) acquires the strain variations from the support structure (101), wherein the acquired strain variations are transmitted to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject.

2. The eye movement tracking device (100) as claimed in claim 1 is placed on the cheeks of the subject with the assistance of the elastomer layer (105).

3. The eye movement tracking device (100) as claimed in claim 1, wherein the strain variations are created when the bottom eyelid of the subject moves due to the eyeball swivelling in its socket, simultaneously moving the probe (107) placed on the bottom eyelid.

4. The eye movement tracking device (100) as claimed in claim 1 captures gaze movements associated with the eyes of the subject.

5. The eye movement tracking device (100) as claimed in claim 1, wherein the one or more characteristics associated with the eye movement comprise saccades, fixations and blinks.

6. The eye movement tracking device (100) as claimed in claim 1 is validated by comparing outcome from the eye movement tracking device (100) with outcome of an infrared based pupil tracker.

7. The eye movement tracking device (100) as claimed in claim 1, wherein the support structure (101) is a cantilever.

8. A method for tracking eye movements of a subject, the method comprising:
obtaining, by a probe (107) of an eye movement tracking device (100), displacement movement from a bottom eyelid of a subject undergoing eye movement tracking, wherein the displacement movement is obtained while the subject gazes at a predefined pattern displayed on a display unit, and wherein the probe (107) is capable of resting adjacent to the bottom eyelid of the subject undergoing eye movement tracking;
creating, by the probe (107), strain variations in a support structure (101) attached to the probe (107) due to the displacement movement; and
acquiring, by an FBG sensor (109) affixed to the support structure (101), the strain variations, wherein the acquired strain variations are transmitted to an external computing system for processing to identify one or more characteristics associated with the eye movement of the subject.

9. The method as claimed in claim 8, wherein the eye movement tracking device (100) is placed on the cheeks of the subject with the assistance of an elastomer layer (105).

10. The method as claimed in claim 8, wherein the strain variations are created when the bottom eyelid of the subject moves due to the eyeball swivelling in its socket, simultaneously moving the probe (107) placed on the bottom eyelid.

11. The method as claimed in claim 8, wherein the eye movement tracking device (100) captures gaze movements associated with the eyes of the subject.

12. The method as claimed in claim 8, wherein the one or more characteristics associated with the eye movement comprise saccades, fixations and blinks.

13. The method as claimed in claim 8, further comprising validating the outcome from the eye movement tracking device (100) against the outcome of an infrared based pupil tracker.

14. The method as claimed in claim 8, wherein the support structure (101) is a cantilever.

Dated this 17th Day of January 2019


MADHUSUDHAN S.T.
Of K&S Partners
IN/PA-1297
Agent for the Applicant

Documents

Application Documents

# Name Date
1 201841002316-EDUCATIONAL INSTITUTION(S) [04-06-2024(online)].pdf 2024-06-04
2 201841002316-OTHERS [04-06-2024(online)].pdf 2024-06-04
3 201841002316-IntimationOfGrant23-11-2023.pdf 2023-11-23
4 201841002316-PatentCertificate23-11-2023.pdf 2023-11-23
5 201841002316-Written submissions and relevant documents [21-11-2023(online)].pdf 2023-11-21
6 201841002316-FORM-26 [06-11-2023(online)].pdf 2023-11-06
7 201841002316-Correspondence to notify the Controller [30-10-2023(online)].pdf 2023-10-30
8 201841002316-US(14)-HearingNotice-(HearingDate-06-11-2023).pdf 2023-10-05
9 201841002316-Written submissions and relevant documents [19-09-2023(online)].pdf 2023-09-19
10 201841002316-Correspondence to notify the Controller [22-08-2023(online)].pdf 2023-08-22
11 201841002316-FORM-26 [22-08-2023(online)].pdf 2023-08-22
12 201841002316-US(14)-HearingNotice-(HearingDate-04-09-2023).pdf 2023-08-03
13 201841002316-FER.pdf 2021-10-17
14 201841002316-FER_SER_REPLY [29-04-2021(online)].pdf 2021-04-29
15 201841002316-ABSTRACT [29-04-2021(online)].pdf 2021-04-29
16 201841002316-CLAIMS [29-04-2021(online)].pdf 2021-04-29
17 201841002316-DRAWING [29-04-2021(online)].pdf 2021-04-29
18 201841002316-OTHERS [29-04-2021(online)].pdf 2021-04-29
19 201841002316-COMPLETE SPECIFICATION [17-01-2019(online)].pdf 2019-01-17
20 201841002316-DRAWING [17-01-2019(online)].pdf 2019-01-17
21 201841002316-FORM 18 [17-01-2019(online)].pdf 2019-01-17
22 Correspondence by Agent_Form1_21-05-2018.pdf 2018-05-21
23 201841002316-Proof of Right (MANDATORY) [17-05-2018(online)].pdf 2018-05-17
24 201841002316-FORM-26 [01-03-2018(online)].pdf 2018-03-01
25 201841002316-DECLARATION OF INVENTORSHIP (FORM 5) [19-01-2018(online)].pdf 2018-01-19
26 201841002316-DRAWINGS [19-01-2018(online)].pdf 2018-01-19
27 201841002316-FORM 1 [19-01-2018(online)].pdf 2018-01-19
28 201841002316-PROVISIONAL SPECIFICATION [19-01-2018(online)].pdf 2018-01-19
29 201841002316-STATEMENT OF UNDERTAKING (FORM 3) [19-01-2018(online)].pdf 2018-01-19

Search Strategy

1 201841002316E_20-10-2020.pdf

ERegister / Renewals

3rd: 22 Feb 2024 (from 19/01/2020 to 19/01/2021)
4th: 22 Feb 2024 (from 19/01/2021 to 19/01/2022)
5th: 22 Feb 2024 (from 19/01/2022 to 19/01/2023)
6th: 22 Feb 2024 (from 19/01/2023 to 19/01/2024)
7th: 22 Feb 2024 (from 19/01/2024 to 19/01/2025)
8th: 22 Feb 2024 (from 19/01/2025 to 19/01/2026)
9th: 22 Feb 2024 (from 19/01/2026 to 19/01/2027)
10th: 22 Feb 2024 (from 19/01/2027 to 19/01/2028)