
Method And System For Removing A Plurality Of Noises From Eye Gaze Data

Abstract: A method and system for removing a plurality of noises from eye gaze data have been provided. A low resolution eye tracking sensor is used for capturing the eye gaze data, which mainly contains systematic noise and variable noise. The eye gaze data of the person is captured when a stimulus is provided to the person. Initial smoothing of the gaze data is performed using Graph Signal Processing (GSP), followed by a time varying Kalman filter for tracking the dynamic eye movement trajectory. Unsupervised learning is then performed to further remove the variable noise in the static segments. Finally, a Gaussian weighted transformation is applied to remove the systematic noise.


Patent Information

Application #
Filing Date
19 December 2016
Publication Number
25/2018
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ip@legasis.in
Parent Application
Patent Number
Legal Status
Grant Date
2023-03-10
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai-400021, Maharashtra, India

Inventors

1. SINHA, Aniruddha
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700160, West Bengal, India
2. GAVAS, Rahul Dasharath
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700160, West Bengal, India
3. CHAKRAVARTY, Kingshuk
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700160, West Bengal, India
4. TRIPATHY, Soumya Ranjan
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700160, West Bengal, India
5. CHATTERJEE, Debatri
Tata Consultancy Services Limited, Building 1B, Ecospace Plot - IIF/12, New Town, Rajarhat, Kolkata - 700160, West Bengal, India
6. LAHIRI, Uttama
Indian Institute of Technology, Gandhinagar, Palaj Campus, Gandhinagar - 382355, Gujarat, India

Specification

Claims:

1. A method for removing a plurality of noises from eye gaze data of a person, the method comprising:
providing a stimulus for capturing the eye gaze data;
capturing the eye gaze data using an eye tracking sensor;
applying an outlier removal on the captured eye gaze data to get rid of the missing data and estimating the missing data by interpolation, wherein the outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data;
applying, by a processor, a graph signal processing on the final eye gaze data to remove fluctuation due to a variable noise, wherein the variable noise is part of the plurality of noises;
tracking, by the processor, the graph processed eye gaze data using a Kalman filter;
applying, by the processor, an unsupervised clustering approach to the tracked eye gaze data to remove additional artifacts from the tracked signal; and
applying, by the processor, a weighted spatial transformation to the unsupervised clustered eye gaze data to remove the systematic noise, wherein the systematic noise is part of the plurality of noises.

2. The method of claim 1, wherein the Kalman filter is filtering the noise generated due to the dynamic movement of the eyes of the person.

3. The method of claim 1, wherein the unsupervised clustering is used to remove the spatial artifacts generated due to the eye blinks and head movement of the person.

4. The method of claim 1, wherein the Kalman filter is a time varying Kalman filter.

5. The method of claim 1, wherein the method of interpolation is a linear interpolation.

6. The method of claim 1, wherein the graph signal processing comprises smoothening the eye gaze data with respect to an underlying graph structure present in the eye gaze data.

7. The method of claim 1 further comprising validation of the eye gaze data using root mean square error (RMSE).

8. The method of claim 1 further comprising the step of calibrating of the eye tracking sensor before capturing the eye gaze data of the person.

9. The method of claim 1, wherein the stimulus is provided on a display screen.

10. The method of claim 1, further comprising derivation of transformation matrices for nine static points on a screen.

11. A system for removing a plurality of noises from eye gaze data of a person, the system comprising:
a display screen for providing a stimulus for capturing the eye gaze data;
an eye tracking sensor for capturing the eye gaze data;
a memory; and
a processor in communication with the memory, the processor further comprising,
an outlier removal module for applying an outlier removal on the captured eye gaze data to get rid of missing data and estimating the missing data by interpolation, wherein the outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data;
a graph signal processing module for applying a graph signal processing on the final eye gaze data to remove fluctuations due to a variable noise, wherein the variable noise is part of the plurality of noises;
a Kalman filtering module for tracking the graph processed eye gaze data using a Kalman filter;
an unsupervised clustering module for applying an unsupervised clustering approach to the tracked eye gaze data to remove additional artifacts from the tracked signal; and
a spatial transformation module for applying weighted spatial transformation to the unsupervised clustered eye gaze data to remove the systematic noise, wherein the systematic noise is part of the plurality of noises.
Description: FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
METHOD AND SYSTEM FOR REMOVING A PLURALITY OF NOISES FROM EYE GAZE DATA

Applicant:
Tata Consultancy Services Limited
A company incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.

FIELD OF THE INVENTION

[0001] The present application generally relates to the field of capturing and processing of eye gaze data of a person. More particularly, but not specifically, the invention provides a system and method for removing a plurality of noises from the eye gaze data of the person.

BACKGROUND OF THE INVENTION

[0002] Eye tracking refers to the process of identifying the gaze or movement of an eye. Of late, the accessibility of eye tracking technology has resulted in a proliferation of its use in various fields such as medicine, human computer interaction (HCI), biological engineering and psychology. For healthcare related applications, trained clinicians often identify oculomotor abnormalities by diagnosing a person's eye movements, which reflect disabilities related to dizziness or imbalance, namely Meniere's disease and neuritis. Again, in response to computer based tasks, one's gaze-related indices derived from the static eye gaze pattern (fixation) and the scan path between successive fixations (saccades) can be used as indicators of one's attention, stress and other neurological disorders. Thus, accurate measurement of one's gaze fixation points on the screen where the stimulus is presented is critical for clinical applications.

[0003] There are mainly two approaches by which one's eye gaze data can be captured, namely wearable devices and nearable devices. Among the wearable devices, the Electrooculogram (EOG) is one of the popular approaches, where eye movements are detected by measuring the weak electrical potentials generated by the eye muscles controlling eye movement. However, it suffers from environmental noise and drift errors. Contact lens based eye trackers are quite robust; however, they are neither user friendly nor cost-effective.

[0004] Among the nearable devices, Infrared (IR) based remote eye tracking systems are currently getting popular due to their cost effectiveness. However, they suffer from low resolution and low signal to noise ratio while extracting one’s gaze fixation coordinates, thereby adversely affecting user experience in HCI applications and limiting its usage in medical domain.

[0005] There are various other eye tracking devices available in the market. The raw X-Y location of the gaze provided by these devices gives the estimated location where the eye is focused. Calibration of eye trackers is an important phase but is often a cumbersome process. Efforts have been made to ease it, but these require complex hardware setups. Even when these eye tracking devices are calibrated, noise still persists due to IR interference, eye blinks and head movements of the subjects. There are mainly two types of noise, namely variable noise, manifested as temporal fluctuations in the time series signal obtained from the device, and systematic noise, due to an offset that the device is unable to correct.

[0006] Given the number of current and potential uses for eye gaze data, improvements in the accuracy and efficiency of eye tracking techniques are critical to ensuring that eye tracking functionality can be easily incorporated into various types of devices.

SUMMARY OF THE INVENTION

[0007] The following presents a simplified summary of some embodiments of the disclosure in order to provide a basic understanding of the embodiments. This summary is not an extensive overview of the embodiments. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the embodiments. Its sole purpose is to present some embodiments in a simplified form as a prelude to the more detailed description that is presented below.

[0008] In view of the foregoing, an embodiment herein provides a system for removing a plurality of noises from eye gaze data of a person. The system comprises a display screen, an eye tracking sensor, a memory and a processor. The display screen provides a stimulus for capturing the eye gaze data. The eye tracking sensor captures the eye gaze data. The processor further comprises an outlier removal module, a graph signal processing module, a Kalman filtering module, an unsupervised clustering module and a spatial transformation module. The outlier removal module applies an outlier removal on the captured eye gaze data to get rid of missing data and estimates the missing data by interpolation. The outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data. The graph signal processing module applies a graph signal processing on the final eye gaze data to remove fluctuations due to a variable noise. The variable noise is part of the plurality of noises. The Kalman filtering module tracks the graph processed eye gaze data using a Kalman filter. The unsupervised clustering module applies an unsupervised clustering approach to the tracked eye gaze data to remove additional artifacts from the tracked signal. The spatial transformation module applies a weighted spatial transformation to the unsupervised clustered eye gaze data to remove the systematic noise. The systematic noise is part of the plurality of noises.

[0009] Another embodiment provides a method for removing a plurality of noises from eye gaze data of a person. Initially, a stimulus is provided for capturing the eye gaze data. The eye gaze data is then captured using an eye tracking sensor. In the next step, an outlier removal is applied on the captured eye gaze data to get rid of missing data, and the missing data is estimated by interpolation. The outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data. In the next step, a graph signal processing is applied on the final eye gaze data to remove fluctuations due to a variable noise. The variable noise is part of the plurality of noises. The graph processed eye gaze data is then tracked using a Kalman filter. In the next step, an unsupervised clustering approach is applied to the tracked eye gaze data to remove additional artifacts from the tracked signal. Finally, a weighted spatial transformation is applied to the unsupervised clustered eye gaze data to remove the systematic noise, wherein the systematic noise is part of the plurality of noises.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:

[0011] Fig. 1 illustrates a block diagram of a system for removing a plurality of noises from the eye gaze data of a person, in accordance with an embodiment of the disclosure;

[0012] Fig. 2 illustrates a sequence of training and validation stimulus provided on a display screen, in accordance with an embodiment of the present disclosure;

[0013] Fig. 3 shows variable noise and systematic noise associated with the eye tracking sensors, in accordance with an embodiment of the disclosure;

[0014] Fig. 4 is a flowchart illustrating the steps involved in removing the plurality of noises from the eye gaze data of the person, in accordance with an embodiment of the disclosure; and

[0015] Fig. 5 is a graphical representation of different types of noises present in the eye gaze coordinates with respect to the ground truth.

DETAILED DESCRIPTION OF THE INVENTION

[0016] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.

[0017] Referring now to the drawings, and more particularly to FIG. 1, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.

[0018] According to an embodiment of the disclosure, a system 100 for removing a plurality of noises from eye gaze data of a person is shown in the block diagram of Fig. 1. The invention provides a method to capture eye tracker data using an off-the-shelf eye tracking device and to correct the error in mapping the eye gaze with respect to the stimulus screen coordinates. The invention provides an algorithmic chain to remove the plurality of noises using both supervised and unsupervised approaches. The present disclosure also aims at increasing the efficiency of eye trackers by detecting and removing the noise associated with them.

[0019] According to an embodiment of the disclosure, a block diagram of the system 100 is shown in Fig. 1. The system 100 includes a display screen 102, an eye tracking sensor 104, a memory 106 and a processor 108 in communication with the memory 106. The memory 106 is configured to store a plurality of algorithms. The processor 108 further includes a plurality of modules for performing various functions. The plurality of modules access the plurality of algorithms stored in the memory 106 to perform various functions. The plurality of modules include an outlier removal module 110, a graph signal processing (GSP) module 112, a Kalman filtering module 114, an unsupervised clustering module 116 and a spatial transformation module 118.

[0020] According to an embodiment of the disclosure, the person is provided with a stimulus on the display screen 102. In an example, the display screen 102 is a computer screen. The person is seated in front of the display screen 102, kept at a distance of approximately 60 cm, with their chin placed on a chin rest in order to minimize head movements during the eye gaze data capture. Before the onset of every stimulus, a calibration is performed using the calibration application of the eye tracking sensor 104 to minimize the setup error.

[0021] In an embodiment of the present disclosure, specific training and testing stimuli are designed for analysis of static and dynamic movements of the eyes, as shown in Fig. 2. In the training phase, a black ball of diameter 20 pixels moves on the screen either horizontally, vertically or diagonally, as shown in Fig. 2 (a), (b), (c) and (d). For each case, there are three static positions (marked with a dark spot) of duration 5 seconds, and the ball moves gradually between these points at a rate of 60 pixels/sec on a display of 1600 × 900 pixels resolution. The person follows the ball and the eye gaze data is collected from the eye tracking sensor 104. The noise characteristics of these 9 static positions (S1 to S9) are modeled using the training stimulus. It is to be noted that the stimulus has been designed to ensure jerk free eye movements. In the test phase, a set of numbers, 1 to 9, is displayed in a 3 × 3 matrix format. The distance between the numbers is varied to create three test sets, namely Large, Medium and Small. The average spacing (in pixels) between numbers for these three sets is 157, 120 and 94 in the X direction and 120, 92 and 72 in the Y direction, respectively. This design enables evaluation of the performance of the proposed method in the horizontal, vertical and diagonal directions. The upper and lower limits for these spacings are chosen such that they cover the desired spectrum in order to evaluate the performance of the proposed algorithm. At one extreme, for large spacing, there is 100% detection during the testing of both the proposed algorithm and the existing techniques. On the other hand, as the spacing is reduced, the rate of reduction in accuracy for the proposed algorithm is much less than that of the existing one. Beyond the small spacing, the accuracy of both algorithms reduces exponentially. The font size is kept fixed at 40 for all the sets to consider a realistic scenario representing an object.

[0022] It should be appreciated that, in an embodiment, EyeTribe can be used as the eye tracking sensor 104, though the use of any other low cost eye tracking sensor is well within the scope of this disclosure. According to an embodiment of the disclosure, the eye tracking sensor 104 mainly involves two types of noise: variable noise and systematic noise. A graphical representation of the variable noise and the systematic noise is shown in Fig. 3 (a) and Fig. 3 (b) respectively. The variable noise is manifested as temporal fluctuations in the time series signal obtained from the device, and the systematic noise is due to an offset that the device is unable to correct. The center containing the dark ball is the desired fixation point and the 'x' marks represent the actual eye gaze points. The variable noise, Fig. 3 (a), is the dispersion of the gaze data around the actual fixation point, whereas the systematic noise, Fig. 3 (b), is the shift or disparity between the desired fixation and the average gaze point location. In addition to the above mentioned noises, the eye gaze data captured from the eye tracking sensor 104 may also include additional noises, such as artifacts generated due to eye blinks, head movements of the person or dynamic eye movement of the person.

[0023] According to an embodiment of the disclosure, the system 100 uses the outlier removal module 110 on the captured eye gaze data to get rid of missing data and to estimate the missing data. The eye gaze data may include erroneous samples in between. Such samples are removed from the eye gaze data and the missing data is estimated by an interpolation method using the outlier removal module 110. In an example, linear interpolation has been used, though the use of any other interpolation method is well within the scope of this disclosure. The outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data. The final eye gaze data is used for further processing.
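The outlier removal step above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the z-score rule, its threshold and the function name are assumptions; the specification only states that erroneous samples are removed and that missing samples are filled by linear interpolation.

```python
import numpy as np

def remove_outliers_and_interpolate(coords, z_thresh=3.0):
    """Flag samples far from the mean as erroneous (hypothetical
    z-score rule), then fill every missing sample (NaN) by linear
    interpolation over the sample index."""
    coords = np.asarray(coords, dtype=float).copy()
    valid = ~np.isnan(coords)
    mu, sigma = coords[valid].mean(), coords[valid].std()
    outlier = np.zeros(len(coords), dtype=bool)
    if sigma > 0:
        outlier[valid] = np.abs(coords[valid] - mu) > z_thresh * sigma
    coords[outlier] = np.nan  # treat outliers as missing data
    missing = np.isnan(coords)
    idx = np.arange(len(coords))
    coords[missing] = np.interp(idx[missing], idx[~missing], coords[~missing])
    return coords
```

Applied independently to the X and Y gaze channels, this yields the final eye gaze data that feeds the GSP stage.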

[0024] According to an embodiment of the disclosure, the fluctuations in the final eye gaze data are removed by Graph Signal Processing (GSP) based signal de-noising using the graph signal processing module 112. The eye gaze data is obtained in the form of coordinates from the eye tracking sensor 104 as S = {sx_k, sy_k} for every k = 1, …, N, where N is the number of data points. The sudden fluctuations in S, as mentioned above, are removed using graph signal processing (GSP), which smoothens the data with respect to the underlying graph structure present in the data, unlike other low pass filtering methods. In order to apply GSP, the data is divided into a number of non-overlapping windows of size l (<= N) and GSP is applied on each of these windows. The graph signal G (v, e, A) is formed by taking each of the (sx_k, sy_k) in a particular window as vertices v, connecting edges e between the vertices, and having a weighted adjacency matrix A. An edge is formed if the Euclidean distance d(n, m) between two vertices n and m is less than a threshold value T, as given in equation (1) below:

d(n, m) = sqrt((sx_n - sx_m)^2 + (sy_n - sy_m)^2) < T        (1)

[0025] In this type of range based searching, each of the vertices will have a different number of neighbors in a particular window, which introduces a dynamicity in the graph formation and provides an edge over other filtering methods. The weighted adjacency matrix A is formed by assigning weights to edges depending on the Euclidean distance between the vertices, indicating the closeness between them. The weight of the connection between two vertices n and m is defined by a Gaussian kernel, for a constant σ, as given in equation (3):

A(n, m) = exp(-d(n, m)^2 / (2σ^2))        (3)

[0026] The graph signal G formed in each window is noisy and can be written as:

G = St + e

where St is the clean signal and e is the noise added to it. In order to obtain a clean signal which is smooth as well as follows the track of the original data, a multiobjective optimization can be formed in a quadratic form as in equation (4):

St^ = arg min over St of ||G - St||^2 + α St* L St        (4)

where L is the graph Laplacian derived from the weighted adjacency matrix A. Here the value of α controls the degree of smoothness desired in the estimation of the clean signal St. The closed form solution of the above optimization problem (equation (4)) is given in equation (5):

St^ = (I + αL)^-1 G        (5)

Here * is the Hermitian (conjugate transpose) of the matrix. The solution provided in equation (5) is applied on the dataset to solve the denoising task. During the formation of the graph, a tradeoff in the window size is required, as the subtle movement of the eye gaze is lost for large windows, whereas the efficiency of smoothing reduces for small windows. Here, windows of size 30 samples of the signal have been taken heuristically, which is also the sampling rate of the eye tracking sensor 104.
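The windowed GSP denoising described above can be sketched as follows. This is a minimal sketch under stated assumptions: the threshold T, kernel width σ and smoothing weight α are illustrative values, the combinatorial Laplacian L = D - A is assumed, and the closed form of equation (5) is solved directly per window.

```python
import numpy as np

def gsp_denoise_window(window, T=50.0, sigma=25.0, alpha=0.5):
    """Denoise one window of gaze samples (l x 2 array of (x, y)).
    Builds a distance-thresholded graph with Gaussian edge weights,
    forms the Laplacian L = D - A, and returns (I + alpha*L)^-1 G."""
    pts = np.asarray(window, dtype=float)
    # pairwise Euclidean distances between the gaze samples in the window
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    # Gaussian-weighted adjacency, keeping edges with d < T (no self loops)
    A = np.where((d < T) & (d > 0), np.exp(-d**2 / (2 * sigma**2)), 0.0)
    L = np.diag(A.sum(axis=1)) - A  # combinatorial graph Laplacian
    return np.linalg.solve(np.eye(len(pts)) + alpha * L, pts)
```

Applied to each non-overlapping window of 30 samples, this pulls every sample toward its graph neighbors while staying close to the observed data; larger α gives smoother output.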

[0027] According to an embodiment of the disclosure, the system 100 is further configured to track the graph processed eye gaze data using the Kalman filtering module 114. The Kalman filtering module 114 applies the Kalman filter to track and filter the graph processed eye gaze data, in order to further reduce the noise. Given the GSP filtered eye gaze position {sx, sy}, the state vector at time k is expressed as Rk = [sx_k, sy_k, vx_k, vy_k], where vx_k and vy_k denote the velocity of the eye ball in the X and Y directions respectively. Since the instantaneous eye movement depends smoothly upon previous velocities (i.e. the velocities at times j < k), in this work the velocity is modeled as a weighted sum of previous velocities. Hence each position of the eye gaze is assumed to be governed by the following set of dynamic equations (6):

sx_k = sx_{k-1} + vx_{k-1};  vx_k = a_{k-1} vx_{k-1} + a_{k-2} vx_{k-2} + a_{k-3} vx_{k-3} + e        (6)

with analogous equations for sy_k and vy_k.

[0028] The velocity of the eye gaze movement is found to follow an AR(3) or ARIMA(3,0,0) model; thus the coefficients a_{k-1}, a_{k-2}, a_{k-3} and the noise term e are derived from the ARIMA model. It is to be mentioned that the coefficients need to be estimated separately for each subject. A linear stochastic difference equation (at time instance k) is used to describe the discrete state space model for tracking the eye gaze position, as given in equation (7):

R_k = F R_{k-1} + w_{k-1};  z_k = H R_k + v_k        (7)

where F is the state transition matrix obtained from equation (6), and z_k is the actual observation made at time k. The noiseless connection between the measurement vector and the state vector is designated by H. The v_k and w_k are the measurement and process noise, respectively.
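A generic predict/update recursion for the state space model of equation (7) can be sketched as below. This is a textbook Kalman filter, not the time varying, ARIMA-driven filter of the specification: the matrices F, H, Q, R are supplied by the caller, and the constant-position model in the usage note is purely illustrative.

```python
import numpy as np

def kalman_track(z, F, H, Q, R, x0, P0):
    """Run the standard Kalman predict/update recursion over the
    measurement sequence z (n x m array); returns the filtered
    state estimates, one row per time step."""
    x, P = np.asarray(x0, float).copy(), np.asarray(P0, float).copy()
    estimates = []
    for zk in np.asarray(z, float):
        # predict: propagate state and covariance through the model
        x = F @ x
        P = F @ P @ F.T + Q
        # update: correct with the new measurement via the Kalman gain
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (zk - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

In the specification the transition matrix F changes over time as the AR(3) velocity coefficients are re-estimated per subject; making F a function of the time step turns this sketch into the time varying variant.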

[0029] According to an embodiment of the disclosure, the system 100 is further configured to apply an unsupervised clustering approach to the filtered or tracked eye gaze data using the unsupervised clustering module 116. The unsupervised clustering approach removes additional artifacts from the tracked signal. After GSP and Kalman filtering, the data in the static segment is further clustered to retain only the desired gaze coordinates and to remove the outliers. K-means clustering is used for this purpose owing to its low computational overhead. The number of clusters in the data is found using the Xie-Beni index. Finally, the cluster with the maximum number of points is selected for further analysis.
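The cluster-and-select step can be sketched as below. This is a simplified illustration: the k-means here uses a deterministic farthest-point initialization (an assumption, chosen to keep the sketch reproducible), and in practice the Xie-Beni index would be evaluated over several candidate cluster counts to choose the best one.

```python
import numpy as np

def kmeans(points, k, iters=50):
    """Plain k-means with deterministic farthest-point initialization.
    Assumes every cluster keeps at least one member (true for
    well-separated gaze clusters)."""
    centers = [points[0]]
    for _ in range(k - 1):
        d = np.min([((points - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(points[np.argmax(d)])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([points[labels == i].mean(axis=0) for i in range(k)])
    return centers, labels

def xie_beni(points, centers, labels):
    """Xie-Beni index: total within-cluster scatter divided by
    N times the minimum squared center separation (lower is better)."""
    compact = sum(np.sum((points[labels == i] - c) ** 2)
                  for i, c in enumerate(centers))
    sep = min(np.sum((a - b) ** 2)
              for i, a in enumerate(centers) for b in centers[i + 1:])
    return compact / (len(points) * sep)

def dominant_cluster(points, k):
    """Cluster the static-segment gaze points and keep only the
    cluster with the maximum number of points."""
    _, labels = kmeans(points, k)
    best = max(range(k), key=lambda i: int(np.sum(labels == i)))
    return points[labels == best]
```

Keeping only the dominant cluster discards the stray gaze points produced by blinks and head movements while retaining the fixation itself.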

[0030] According to an embodiment of the disclosure, the system 100 is further configured to apply a weighted spatial transformation on the unsupervised clustered eye gaze data to remove the systematic noise using the spatial transformation module 118. The data in the optimum cluster is subjected to a linear transformation to remove the systematic noise. This is based on the minimization of the error between the ground truth position and the observed data points. In the training and validation phase, the systematic noise is learnt as a transformation matrix associated with each of the 9 static points at the locations shown in Fig. 2 (a-d). During testing, based on the location of the cluster center, a weighted spatial transformation (TG of dimension 2 × 2) is derived using a Gaussian weighting function. The transformation is then applied on the filtered test data (the GSP and Kalman filtered output) to derive the new data points for estimating the eye gaze position.
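The Gaussian-weighted blending of the nine learned transformation matrices can be sketched as below. This is a minimal sketch under stated assumptions: the kernel width σ, the function name and the blend-then-apply formulation are illustrative; the specification only states that a 2 × 2 transformation TG is derived from the per-point matrices with a Gaussian weighting function.

```python
import numpy as np

def gaussian_weighted_transform(point, calib_points, transforms, sigma=100.0):
    """Blend the per-calibration-point 2x2 transforms with Gaussian
    weights based on the distance from `point` to each of the nine
    static calibration points, then apply the blended TG to `point`."""
    point = np.asarray(point, dtype=float)
    d2 = np.sum((np.asarray(calib_points, dtype=float) - point) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    w /= w.sum()  # normalize the Gaussian weights
    TG = np.tensordot(w, np.asarray(transforms, dtype=float), axes=1)  # 2 x 2
    return TG @ point
```

If every learned matrix is the identity (no systematic offset was observed at any static point), the blended TG is also the identity and the gaze point passes through unchanged.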

[0031] In operation, a flowchart 200 for removing the plurality of noises from the eye gaze data of the person is shown in Fig. 4, according to an embodiment of the disclosure. The plurality of noises may include systematic noise, variable noise and other fluctuations due to movement of the eyes. Initially, at step 202, the stimulus is provided for capturing the eye gaze data of the person. The stimulus is designed for the training and testing phases for analysis of static and dynamic movements of the eyes. At step 204, the eye gaze data is captured using the eye tracking sensor 104. In an embodiment, EyeTribe is used as the eye tracking sensor 104, though the use of any other low cost sensor is well within the scope of this disclosure.

[0032] At step 206, an outlier removal is applied on the eye gaze data to get rid of missing data in the eye gaze data. The missing data is estimated by linear interpolation. The outlier removal results in the generation of a final eye gaze data comprising captured data and the estimated missing data. At the next step 208, a graph signal processing is applied by the processor 108 on the final eye gaze data to remove fluctuations due to the variable noise. The graph signal processing comprises smoothening the eye gaze data with respect to an underlying graph structure present in the eye gaze data. At the next step 210, the graph processed eye gaze data is tracked by the processor using the Kalman filter. In an embodiment of the disclosure, a time varying Kalman filter is used. The Kalman filter filters the noise generated due to the dynamic movement of the eyes of the person.

[0033] At step 212, an unsupervised clustering approach is applied to the tracked eye gaze data to remove additional artifacts from the tracked signal. The unsupervised clustering is used to remove the spatial artifacts generated due to the eye blinks and head movement of the person. And finally at step 214, a weighted spatial transformation is applied by the processor 108 to the unsupervised clustered eye gaze data to remove the systematic noise.
[0034] According to an embodiment of the disclosure, the removal of the plurality of noises from the eye gaze data can also be explained with the help of the following experimental findings. The experiments were performed on twenty participants (including 5 females; age: mean 29.2 years, standard deviation 7.27). All the participants hail from a similar educational and cultural background. Twelve participants have normal vision and the remaining eight have vision corrected to normal using glasses.

[0035] The results are presented for the training and testing stimuli, where for training and validation the performance is reported based on the root mean square error (RMSE) as given by equation (9), and for the testing phase the accuracy in detecting a number is computed using the corrected eye gaze data. The RMSE is derived from the difference between the gaze coordinates and the ground truth targets over N gaze points:

RMSE = sqrt( (1/N) Σ_k ||g_k - t_k||^2 )        (9)

where g_k is the k-th gaze coordinate and t_k is the corresponding ground truth target.
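As a quick check of the metric, equation (9) amounts to the following few lines (the function name is illustrative):

```python
import numpy as np

def rmse(gaze, targets):
    """Root mean square error between gaze coordinates and their
    ground-truth targets over N gaze points (equation (9))."""
    gaze = np.asarray(gaze, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return np.sqrt(np.mean(np.sum((gaze - targets) ** 2, axis=-1)))
```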

[0036] For the training stimulus, even in the static segments, the temporal fluctuations in the time series signal obtained from the device for both the X and Y co-ordinates are evident from the red colored plot in Fig. 5. This plot is of the Y co-ordinate of the eye gaze data for one of the diagonal stimuli. This signal is smoothed using GSP, followed by filtering and tracking using the Kalman filter, as shown in green. The ground truth position is shown in blue. It can be seen that, due to a motion artifact marked with a green circle in Fig. 5, the filtered data also fluctuates; hence the clustering is performed. In order to correct the systematic noise, a transformation matrix is learned by K-fold validation using the training and validation data obtained from each of the static segments for the horizontal, vertical and diagonal stimuli. Table 1 gives the summary of the 5-fold validation computed using RMSE for the 9 different static positions, marked as S1 to S9. It can be seen that there is a significant improvement from the use of the filtering followed by unsupervised clustering to handle the variable noise, and eventually the transformation to handle the systematic noise.

[0037] Finally, the testing is performed on the number stimulus using the learned transformation matrix. Table 2 gives the summary of the results where it can be seen that the accuracy in correctly detecting a number improves significantly as we incorporate the GSP filter and Kalman tracking followed by clustering and spatial transformation.

[0038] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims. The embodiments thus provide a system and method for removing a plurality of noises from the eye gaze data of a person captured using a low resolution eye tracking sensor.

[0039] It is, however, to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed, including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.

[0040] The embodiments herein can comprise hardware and software elements. Embodiments implemented in software include, but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.

[0041] The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.

[0042] A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.

[0043] Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.

[0044] A representative hardware environment for practicing the embodiments may include a hardware configuration of an information handling/computer system in accordance with the embodiments herein. The system herein comprises at least one processor or central processing unit (CPU). The CPUs are interconnected via system bus to various devices such as a random access memory (RAM), read-only memory (ROM), and an input/output (I/O) adapter. The I/O adapter can connect to peripheral devices, such as disk units and tape drives, or other program storage devices that are readable by the system. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments herein.

[0045] The system further includes a user interface adapter that connects a keyboard, mouse, speaker, microphone, and/or other user interface devices such as a touch screen device (not shown) to the bus to gather user input. Additionally, a communication adapter connects the bus to a data processing network, and a display adapter connects the bus to a display device which may be embodied as an output device such as a monitor, printer, or transmitter, for example. The preceding description has been presented with reference to various embodiments. Persons having ordinary skill in the art and technology to which this application pertains will appreciate that alterations and changes in the described structures and methods of operation can be practiced without meaningfully departing from the principle, spirit and scope.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 Form 3 [19-12-2016(online)].pdf 2016-12-19
2 Form 20 [19-12-2016(online)].jpg 2016-12-19
3 Form 18 [19-12-2016(online)].pdf 2016-12-19
4 Form 18 [19-12-2016(online)].pdf_276.pdf 2016-12-19
5 Drawing [19-12-2016(online)].pdf 2016-12-19
6 Description(Complete) [19-12-2016(online)].pdf 2016-12-19
7 Description(Complete) [19-12-2016(online)].pdf_275.pdf 2016-12-19
8 Form 26 [21-01-2017(online)].pdf 2017-01-21
9 Other Patent Document [23-01-2017(online)].pdf 2017-01-23
10 ABSTRACT1.JPG 2018-08-11
11 201621043341-ORIGINAL UNDER RULE 6(1A) OTHERS-240117.pdf 2018-08-11
12 201621043341-ABSTRACT [28-04-2021(online)].pdf 2021-04-28
13 201621043341-CLAIMS [28-04-2021(online)].pdf 2021-04-28
14 201621043341-COMPLETE SPECIFICATION [28-04-2021(online)].pdf 2021-04-28
15 201621043341-FER_SER_REPLY [28-04-2021(online)].pdf 2021-04-28
16 201621043341-OTHERS [28-04-2021(online)].pdf 2021-04-28
17 201621043341-FER.pdf 2021-10-18
18 201621043341-US(14)-HearingNotice-(HearingDate-23-02-2023).pdf 2023-01-17
19 201621043341-Correspondence to notify the Controller [17-02-2023(online)].pdf 2023-02-17
20 201621043341-FORM-26 [17-02-2023(online)].pdf 2023-02-17
21 201621043341-FORM-26 [17-02-2023(online)]-1.pdf 2023-02-17
22 201621043341-FORM-26 [17-02-2023(online)]-2.pdf 2023-02-17
23 201621043341-Written submissions and relevant documents [28-02-2023(online)].pdf 2023-02-28
24 201621043341-IntimationOfGrant10-03-2023.pdf 2023-03-10
25 201621043341-PatentCertificate10-03-2023.pdf 2023-03-10

Search Strategy

1 2020-10-2714-06-01E_27-10-2020.pdf

ERegister / Renewals

3rd: 02 Jun 2023 (From 19/12/2018 To 19/12/2019)

4th: 02 Jun 2023 (From 19/12/2019 To 19/12/2020)

5th: 02 Jun 2023 (From 19/12/2020 To 19/12/2021)

6th: 02 Jun 2023 (From 19/12/2021 To 19/12/2022)

7th: 02 Jun 2023 (From 19/12/2022 To 19/12/2023)

8th: 02 Jun 2023 (From 19/12/2023 To 19/12/2024)

9th: 19 Dec 2024 (From 19/12/2024 To 19/12/2025)