
System For Hands Free Access Of Electronic Device

Abstract: A system (100) for hands-free access of an electronic device (118) is disclosed. The system (100) comprises a headset (102). The headset (102) comprises an electroencephalogram signal acquisition hardware (104) for capturing sensorimotor rhythm (SMR) brain activity, cameras (108) for binocular convergence-based 3D gaze tracking, and a focus camera (110) for capturing a screen (120) of the electronic device (118). A processing unit (112) generates scripts based on the SMR brain activity, the gaze tracking, and the screen (120) location. Markers (114a-114d) placed on the electronic device (118) enhance gaze estimation accuracy. An interpreter unit (116) receives and interprets the scripts, learning user-intended functions. It subsequently manipulates a pointer (124) on the device's interface, with pointer movement directly proportional to the user's eye movements. The system (100) enables physically disabled persons to interact with and use the electronic device (118). Claims: 10, Figures: 8


Patent Information

Application #
Filing Date
04 December 2023
Publication Number
01/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR University
SR University, Ananthasagar, Warangal, Telangana-506371, India (IN) Email ID: patent@sru.edu.in Mb: 08702818333

Inventors

1. Ravichander Janapati
3-93/4, Po & vill: Valbhapur, Via: Kabadi, Dist: Karimnagar(T.S)-505129
2. Jagrit Kumar Chandrakar
Jagrit Chandrakar, In front of marketing society gariyaband road chhura, Gariyaband 493996
3. Rakesh Sengupta
Assistant professor, Department of creative cognition, SR University, Ananthasagar, Warangal, Telangana-506371, India

Specification

Description:
BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a system and particularly to a system for hands-free access of an electronic device.
Description of Related Art
[002] In modern days, staying connected with electronic devices such as mobiles and computers is necessary. Almost all day-to-day tasks are in some way linked with the utilization of electronic devices. However, it becomes very hard and cumbersome for people with bodily disabilities and impairments to communicate and interact with electronic devices.
[003] A stream of computing known as Brain-Computer Interface (BCI) technology is being developed to assist the physically disabled in communicating with electronic devices. The Brain-Computer Interface (BCI) technology uses sensorimotor rhythms (SMR) produced from a brain of the user upon Motor Imagery (MI) or Motor Execution (ME) tasks. Further, these sensorimotor rhythms (SMR) are translated and fed into electronic devices as input signals to perform a set of actions.
[004] The potential of the Brain-Computer Interface (BCI) technology to provide individuals with disabilities access to the expansive internet is significant. However, various barriers, such as elevated costs and operational complexities, currently limit the widespread adoption of systems integrated with the Brain-Computer Interface (BCI) technology, restricting their utilization predominantly to academic environments.
[005] There is thus a need for an improved and advanced system for hands-free access of an electronic device that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a system for hands-free access of an electronic device. The system comprising: a headset. The headset comprising: an electroencephalogram signal acquisition hardware to capture sensorimotor rhythm (SMR) brain activity during motor imagery or motor execution tasks. The headset further comprising: cameras positioned on the headset to be in front of eyes of the user. The cameras are adapted to track a motion of pupils of the eyes of the user using a binocular convergence-based three-dimensional (3D) gaze tracking. The headset further comprising: a focus camera positioned on the headset to focus outwards. The focus camera is adapted to capture a location of a screen of the electronic device being gazed at by the user. The headset further comprising: a processing unit adapted to generate scripts based on the captured sensorimotor rhythm (SMR) brain activity, the tracked motion of the pupils, and the gazed location of the screen of the electronic device. The system further comprising: markers arranged at locations selected from boundaries, edges, and/or corners of the electronic device for precise gaze estimation. The system further comprising: an interpreter unit installed in the electronic device, and communicatively connected to the processing unit. The interpreter unit is configured to: receive the generated scripts from the processing unit; interpret the received script for learning user-intended functions associated with the script; and control or manipulate a pointer in a user interface on the screen of the electronic device according to the learned user-intended functions. A movement of the pointer on the screen is directly proportional to a movement of the pupils of the eyes of the user.
[007] Embodiments in accordance with the present invention further provide a method of hands-free access of an electronic device using a system. The method comprising steps of: receiving generated scripts from a processing unit; interpreting the received script for learning user-intended functions associated with the script; and controlling or manipulating a pointer on a screen of the electronic device according to the learned user-intended functions.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a system for hands-free access of an electronic device.
[009] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that is cost-effective and affordable.
[0010] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that is user-friendly and has a lower order of complexity.
[0011] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that requires less time for calibration and very brief training for operation.
[0012] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that is accurate and reliable.
[0013] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that supports multi-directional head movements and eye tracking.
[0014] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that is user-friendly and ergonomic.
[0015] Next, embodiments of the present application may provide a system for hands-free access of an electronic device that has a wide spectrum of compatible combinations of electronic devices.
[0016] These and other advantages will be apparent from the present application of the embodiments described herein.
[0017] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0019] FIG. 1A illustrates a block diagram of a system for hands-free access of an electronic device, according to an embodiment of the present invention;
[0020] FIG. 1B illustrates a hands-free access of an electronic device using the system, according to an embodiment of the present invention;
[0021] FIG. 1C illustrates a training of the system, according to an embodiment of the present invention;
[0022] FIG. 1D illustrates a capture of sensorimotor rhythm (SMR) brain activity, according to an embodiment of the present invention;
[0023] FIG. 1E illustrates a tracking a motion of pupils of eyes of a user using cameras of the system, according to an embodiment of the present invention;
[0024] FIG. 1F illustrates a placement of markers around a screen of the electronic device, according to an embodiment of the present invention;
[0025] FIG. 2 illustrates a block diagram of an interpreter unit of the system, according to an embodiment of the present invention; and
[0026] FIG. 3 depicts a flowchart of a method of hands-free access of the electronic device using the system, according to an embodiment of the present invention.
[0027] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0028] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0029] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0030] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0031] FIG. 1A illustrates a block diagram of a system 100 for hands-free access to an electronic device 118, according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may enable a physically impaired user to operate and use the electronic device 118 by employing a brain-computer interface (BCI) based technology. According to embodiments of the present invention, the physical impairment may be of, but not limited to, a motor impairment in a body part, a temporary fracture in the body part, an amputation of the body part, a paralysis, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the physical impairment in the user, including known, related art, and/or later developed technologies.
[0032] According to embodiments of the present invention, the electronic device 118 may be, but not limited to, a desktop computer, a laptop computer, a handheld computer, a gaming console, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the electronic device 118, including known, related art, and/or later developed technologies.
[0033] According to embodiments of the present invention, the system 100 may comprise a headset 102, an electroencephalogram signal acquisition hardware 104, semi-dry polymer sensors 106, cameras 108, a focus camera 110, a processing unit 112, markers 114a-114d (hereinafter referred individually to as the marker 114, and plurally to as the markers 114), an interpreter unit 116, the electronic device 118, a screen 120, a user interface 122, and a pointer 124.
[0034] In an embodiment of the present invention, the headset 102 may be adapted to be worn on a head of the physically impaired user. The headset 102 may have a flexible plastic body and may be designed to fit a wide range of head shapes and sizes, in an embodiment of the present invention. The headset 102 may feature a strap (not shown) with a locking mechanism (not shown) to secure and align the headset 102 on the head of the physically impaired user, in an embodiment of the present invention. In an embodiment of the present invention, the headset 102 may comprise the electroencephalogram signal acquisition hardware 104, the cameras 108, the focus camera 110, and the processing unit 112.
[0035] In an embodiment of the present invention, the electroencephalogram signal acquisition hardware 104 may be adapted to capture sensorimotor rhythm (SMR) brain activity during motor imagery or motor execution tasks. In a preferred embodiment of the present invention, the electroencephalogram signal acquisition hardware 104 may be an Emotiv Insight headband. Embodiments of the present invention are intended to include or otherwise cover any electroencephalogram signal acquisition hardware 104, including known, related art, and/or later developed technologies.
[0036] In an embodiment of the present invention, the electroencephalogram signal acquisition hardware 104 may establish a wireless connection to the electronic device 118 via Bluetooth version 5.0. Through this connection, the dedicated software, Emotiv Pro, may facilitate real-time monitoring of the electroencephalogram signal quality. Furthermore, the software may display the electroencephalogram signal visually and present five ranges of band power. The electroencephalogram signal acquisition hardware 104 may measure the electroencephalogram signal using 5 electrodes corresponding to AF3, AF4, T7, T8, and Pz positions. The electroencephalogram signal acquisition hardware 104 may have a sampling rate of 128 samples per second (SPS) (2048 Hertz (Hz) internal), a resolution of 16 bits (1 LSB = 0.128 microvolt, 16-bit ADC), and a bandwidth range from 0.5 Hertz (Hz) to 45 Hertz (Hz).
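As a rough illustration of the stated resolution and sampling rate, raw ADC counts can be converted to microvolts using the 0.128 µV/LSB figure. The helpers below are a hypothetical sketch for exposition, not part of the disclosed system or the Emotiv SDK:

```python
# Illustrative sketch (assumption: these helper names are invented here),
# using the resolution and sampling rate stated in the specification.
LSB_MICROVOLTS = 0.128   # 1 LSB = 0.128 microvolt (16-bit ADC)
SAMPLING_RATE_SPS = 128  # 128 samples per second after internal downsampling

def counts_to_microvolts(raw_counts):
    """Convert a sequence of raw ADC counts to microvolts."""
    return [c * LSB_MICROVOLTS for c in raw_counts]

def samples_to_seconds(n_samples):
    """Duration in seconds represented by n_samples at the stated rate."""
    return n_samples / SAMPLING_RATE_SPS

print(counts_to_microvolts([0, 100]))  # [0.0, 12.8]
print(samples_to_seconds(128))         # 1.0
```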
[0037] The sensorimotor rhythm (SMR) brain activity of the user may be captured using the semi-dry polymer sensors 106 that may be pre-arranged in the electroencephalogram signal acquisition hardware 104, in an embodiment of the present invention. In an embodiment of the present invention, the semi-dry polymer sensors 106 may be adjusted and placed at multiple locations on the head of the user to capture the sensorimotor rhythm (SMR) brain activity. The sensorimotor rhythm (SMR) brain activity may be elicited from a brain of the user upon Motor Imagery (MI) or Motor Execution (ME) tasks, in an embodiment of the present invention.
[0038] In an embodiment of the present invention, the semi-dry polymer sensors 106 may reduce the hassle of extensive preparation of gels, as the semi-dry polymer sensors 106 may require a minimal amount of moisture for adhesion on the head of the user because of their hydrophilic nature. The Emotiv Pro software may display a connection quality and an electroencephalogram quality of each individual semi-dry polymer sensor, in an embodiment of the present invention.
[0039] The semi-dry polymer sensors 106 may capture the sensorimotor rhythm (SMR) brain activity in a mu-rhythm within a frequency range of 8 Hertz (Hz) to 12 Hertz (Hz), in an embodiment of the present invention. In an embodiment of the present invention, the semi-dry polymer sensors 106 may further capture the sensorimotor rhythm (SMR) brain activity within a beta range around 20 Hertz (Hz) and a gamma range around 40 Hertz (Hz).
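The band-based capture described above can be sketched as a band-power estimate over the mu range. This is a minimal illustration using NumPy on a synthetic trace, under the assumption of the hardware's 128 SPS sampling rate; the specification does not prescribe this particular spectral method:

```python
import numpy as np

# Sketch (assumption: FFT-based band power, not the patent's exact pipeline)
# estimating power in the mu rhythm (8-12 Hz) from an EEG trace at 128 SPS.
FS = 128  # sampling rate in Hz

def band_power(signal, lo_hz, hi_hz, fs=FS):
    """Mean spectral power of `signal` between lo_hz and hi_hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return power[band].mean()

# Synthetic trace: a strong 10 Hz mu component plus weak broadband noise.
t = np.arange(0, 2, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))

mu = band_power(eeg, 8, 12)     # mu rhythm band
beta = band_power(eeg, 18, 22)  # beta band around 20 Hz
assert mu > beta  # the 10 Hz component dominates the mu band
```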
[0040] In an embodiment of the present invention, the cameras 108 may be positioned on the headset 102 to be in front of eyes of the user. In an embodiment of the present invention, the cameras 108 may be adapted to track a motion of pupils of the eyes of the user. The cameras 108 may track the motion of the pupils of the eyes of the user using a binocular convergence-based three-dimensional (3D) gaze tracking, in an embodiment of the present invention. In an embodiment of the present invention, the cameras 108 may further be explained in conjunction with FIG. 1E.
[0041] In an embodiment of the present invention, the focus camera 110 may be positioned on the headset 102 to focus outwards. The focus camera 110 may be adapted to capture a location of the screen 120 of the electronic device 118 being gazed at by the user, in an embodiment of the present invention.
[0042] In an embodiment of the present invention, the focus camera 110 may support sampling frequencies such as, but not limited to, 30 Hertz (Hz) at 1080 pixels, 60 Hertz (Hz) at 720 pixels, and 120 Hertz (Hz) at 480 pixels. According to embodiments of the present invention, the focus camera 110 may be, but not limited to, an ultra-wide camera, a macro camera, a telephoto camera, a color-balancing camera, an infrared camera, a night-vision camera, a thermal camera, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the focus camera 110, including known, related art, and/or later developed technologies.
[0043] In an embodiment of the present invention, the processing unit 112 may be configured to generate scripts. The scripts generated by the processing unit 112 may be based on the captured sensorimotor rhythm (SMR) brain activity, the tracked motion of the pupils, and the gazed location of the screen 120 of the electronic device 118, in an embodiment of the present invention. In a preferred embodiment of the present invention, the script may be coded in a Python programming language. Embodiments of the present invention are intended to include or otherwise cover any programming language for coding the script generated by the processing unit 112, including known, related art, and/or later developed technologies.
[0044] In an embodiment of the present invention, the scripts generated by the processing unit 112 may comprise eye gaze data streamed in real-time to map the eye movements to the pointer movements, employing a velocity threshold, an acceleration threshold, and a minimum deflection threshold for saccade identification.
[0045] In a preferred embodiment of the present invention, the threshold velocity of the pointer 124 may be 30 degrees per second (°/sec), the threshold acceleration of the pointer 124 may be 8000 degrees per second squared (°/sec²), and the threshold for minimum deflection of the pointer 124 may be 0.1 degrees (°).
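The threshold-based saccade identification described above can be sketched as a simple velocity-threshold classifier over gaze angles. This is a minimal illustration using the stated thresholds; the sampling rate of 200 Hz matches the eye cameras described later, and the function name is an invention of this sketch, not the patent's script:

```python
# Sketch (assumption: a minimal velocity/acceleration-threshold classifier,
# not the patent's exact script) using the thresholds stated above.
VEL_THRESH = 30.0     # threshold velocity, degrees per second
ACC_THRESH = 8000.0   # threshold acceleration, degrees per second squared
MIN_DEFLECTION = 0.1  # minimum deflection, degrees

def classify_saccades(angles_deg, fs_hz=200):
    """Label each sample-to-sample step as saccade (True) or fixation (False)."""
    dt = 1.0 / fs_hz
    steps = [b - a for a, b in zip(angles_deg, angles_deg[1:])]
    vels = [s / dt for s in steps]
    labels = []
    prev_v = 0.0
    for s, v in zip(steps, vels):
        acc = abs(v - prev_v) / dt  # finite-difference acceleration
        labels.append(abs(v) > VEL_THRESH
                      and abs(s) > MIN_DEFLECTION
                      and acc > ACC_THRESH)
        prev_v = v
    return labels

# A sudden 0.5 degree jump in one sample is a saccade; slow drift is not.
print(classify_saccades([0.0, 0.0, 0.5, 0.51, 0.51]))
# [False, True, False, False]
```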
[0046] In an embodiment of the present invention, the scripts generated by the processing unit 112 may comprise a smoothing algorithm to smoothen out and remove jagged and jerky movements in the movement of the pointer 124 on the screen 120. In a preferred embodiment of the present invention, the smoothing algorithm may be a Kalman filter and a moving average. Embodiments of the present invention are intended to include or otherwise cover any smoothing algorithm for smoothing out the movement of the pointer 124 on the screen 120, including known, related art, and/or later developed technologies. The Kalman filter is a recursive algorithm that estimates the current state from the estimate at the previous time step and the observation at the current time step.
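The two named smoothers can be sketched as follows. The noise variances in the Kalman filter are placeholder values chosen for illustration; the specification does not state tuning parameters:

```python
# Sketch (assumption: 1-D constant-state Kalman model and trailing moving
# average; parameter values are illustrative placeholders).
def moving_average(xs, window=5):
    """Trailing moving average over up to the last `window` samples."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

def kalman_1d(xs, process_var=1e-3, meas_var=1e-1):
    """Recursive estimate: predict from the previous step, then update
    with the current observation (one pointer coordinate at a time)."""
    est, err = xs[0], 1.0
    out = [est]
    for z in xs[1:]:
        err += process_var             # predict: state assumed near-constant
        gain = err / (err + meas_var)  # Kalman gain
        est += gain * (z - est)        # update with current observation
        err *= (1 - gain)
        out.append(est)
    return out
```

Applied to a jittery pointer trace, both functions pull each sample toward its neighbors, removing the jagged movements the paragraph describes.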
[0047] According to embodiments of the present invention, the processing unit 112 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 112 including known, related art, and/or later developed technologies.
[0048] In an embodiment of the present invention, the markers 114 may be arranged at locations such as, but not limited to, boundaries, edges, and/or corners of the electronic device 118 for precise gaze estimation. The markers 114 may be adapted to adjust a relative change in a location of the screen 120 of the electronic device 118 with respect to the eyes of the user in real-time to produce the precise gaze estimation of gaze positions, in an embodiment of the present invention. In an embodiment of the present invention, the markers 114 may further be explained in conjunction with FIG. 1F.
[0049] In an embodiment of the present invention, the interpreter unit 116 may be installed in the electronic device 118. The interpreter unit 116 may be connected to the processing unit 112, in an embodiment of the present invention. The interpreter unit 116 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. According to embodiments of the present invention, the interpreter unit 116 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the interpreter unit 116 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the interpreter unit 116 may further be explained in conjunction with FIG. 2.
[0050] In an embodiment of the present invention, the electronic device 118 may be the device that may be adapted to be accessed hands-free by the user using the system 100. The electronic device 118 may provide the user interface 122, enabling the user to interact with the electronic device 118, in an embodiment of the present invention. Further, the interaction of the user with the electronic device 118 may be facilitated by the pointer 124 provided in the user interface 122.
[0051] In an embodiment of the present invention, the pointer 124 may enable the user to perform the user-intended functions on the user interface 122. According to embodiments of the present invention, the user-intended functions performed on the user interface 122 may be, but not limited to, a click action, a scroll up action, a scroll down action, a scroll right action, a scroll left action, a pinch in action, a pinch out action, a single click action, a double click action, a dragging and dropping action, a selection action, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user-intended functions that may be performed using the pointer 124 on the user interface 122, including known, related art, and/or later developed technologies. Further, the pointer 124 may be resizable so that the pointer 124 may be easily visible and easy to spot for the user.
[0052] FIG. 1B illustrates the hands-free access of the electronic device 118 using the system 100, according to an embodiment of the present invention. In an exemplary embodiment of the present invention, the electroencephalogram signal acquisition hardware 104 may capture the sensorimotor rhythm (SMR) brain activity during the motor imagery or the motor execution tasks. The cameras 108 may be positioned on the headset 102 and may be adapted to track the motion of pupils of the eyes of the user using the binocular convergence-based three-dimensional (3D) gaze tracking. The focus camera 110 may further be positioned on the headset 102 to focus outwards and may be adapted to capture the location of the screen 120 of the electronic device 118 through the markers 114. In an embodiment of the present invention, the tracked motion of the pupils of the eyes of the user and the captured location of the screen 120 of the electronic device 118 may be combined to generate pointer coordinates. The pointer coordinates may further be fed into the electronic device 118 via the scripts, and the pointer 124 in the user interface 122 of the electronic device 118 may be controlled.
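The combination of gaze position and captured screen location into pointer coordinates can be sketched as a proportional mapping. This illustration assumes the screen region found by the focus camera is an axis-aligned rectangle and a 1920x1080 display; neither assumption is fixed by the specification:

```python
# Sketch (assumptions: axis-aligned screen rectangle, hypothetical 1920x1080
# resolution) mapping a gaze point to pointer coordinates, with pointer
# movement directly proportional to eye movement.
def gaze_to_pointer(gaze_xy, screen_box, resolution=(1920, 1080)):
    """Map a gaze point in camera coordinates to pixel coordinates on screen.

    gaze_xy    : (x, y) gaze position in the focus camera's frame
    screen_box : (left, top, right, bottom) of the screen in the same frame
    """
    left, top, right, bottom = screen_box
    u = (gaze_xy[0] - left) / (right - left)  # proportional horizontal position
    v = (gaze_xy[1] - top) / (bottom - top)   # proportional vertical position
    u = min(max(u, 0.0), 1.0)                 # clamp to the screen bounds
    v = min(max(v, 0.0), 1.0)
    return (u * resolution[0], v * resolution[1])

# Gazing at the centre of the detected screen lands at the display's centre.
print(gaze_to_pointer((50, 50), (0, 0, 100, 100)))  # (960.0, 540.0)
```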
[0053] In an embodiment of the present invention, as shown in the FIG. 1B, the sensorimotor rhythm (SMR) brain activity in the form of the electroencephalogram (EEG) data may be combined or correlated with the tracked motion of the pupils, and the captured location of the screen 120. The combined or correlated data may be passed through the filtering, feature extraction, and classification processes to generate executive commands in the form of the scripts. In an embodiment of the present invention, the scripts may be composed using a code language that may be selected from a machine language, Java, JSON, Python, and so forth. These languages may offer different functionalities and may be suited for specific purposes based on their syntax and capabilities. Embodiments of the present invention are intended to include or otherwise cover any code language, including known, related art, and/or later developed technologies.
[0054] FIG. 1C illustrates a training 126 of the system 100 for hands-free access to the electronic device 118, according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may harvest electroencephalogram data from the electroencephalogram signal acquisition hardware 104. The electroencephalogram data harvested may undergo the filtering and the feature extraction. Upon extraction of the electroencephalogram features, the electroencephalogram features may be trained on event data retrieved from an experiment. After training, the electroencephalogram features may be classified into models.
[0055] FIG. 1D illustrates a capture 128 of sensorimotor rhythm (SMR) brain activity using the electroencephalogram signal acquisition hardware 104, according to an embodiment of the present invention. In an exemplary embodiment of the present invention, the system 100 may be trained by collection of the electroencephalogram data corresponding to motor imagery activities. The experiment may enable the recording of the subsequent electroencephalogram data along with the corresponding event files. While recording the electroencephalogram signal, the user may be shown images on the screen 120 of the electronic device 118 consecutively at an interval of 5 seconds. The images may be chosen randomly from a set of four images with the texts: “right hand”, “left hand”, “right leg” and “left leg” with their corresponding symbols. The user may then be asked to imagine their corresponding hands and legs moving while the visual cues get deployed on the screen 120. The recording of the electroencephalogram signal may take 25 minutes, in which a total of 300 images with visual cues may be displayed, with 75 trials for each category. The electroencephalogram data may be recorded using a MATLAB toolbox and the Psychtoolbox.
[0056] FIG. 1E illustrates a tracking of the motion of the pupils of the eyes of the user using the cameras 108 of the system 100, according to an embodiment of the present invention. In an embodiment of the present invention, the cameras 108 may comprise two separate cameras, one assigned to each eye (a left eye and a right eye) of the user. The cameras 108 may have a gaze accuracy of 0.60 degrees with a precision of 0.02 degrees, in an embodiment of the present invention. In an embodiment of the present invention, the cameras 108 may have an internal latency of 8.5 milliseconds (ms).
[0057] In an embodiment of the present invention, the cameras 108 may have a sampling frequency of 200 Hertz (Hz) at 192 pixels by 192 pixels. According to embodiments of the present invention, the cameras 108 may be, but not limited to, an ultra-wide camera, a macro camera, a telephoto camera, a color-balancing camera, an infrared camera, a night-vision camera, a thermal camera, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the cameras 108, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the cameras 108 may be lightweight, placing less weight and strain on the eyes of the user.
[0058] FIG. 1F illustrates a placement of the markers 114 around the screen 120 of the electronic device 118, according to an embodiment of the present invention. In an embodiment of the present invention, the markers 114 may be placed such that the markers may always be in a field of view of the focus camera 110. The markers 114 may be stationary and placed steadily, and may provide a visual fiducial system around the screen 120 of the electronic device 118, in an embodiment of the present invention. In an embodiment of the present invention, the markers 114 may reduce the need for the user to be very steady while using the electronic device 118, making the entire system 100 more practical.
[0059] In an exemplary embodiment of the present invention, if the user is not very steady while using the electronic device 118, the focus camera 110 may capture a deviation in the markers 114 in every frame as the user moves their head. Further, the deviation in the placement of the markers 114 may be calculated frame by frame by the processing unit 112 and may be included in the generated scripts. The interpreter unit 116 may compensate for the deviation, as information relating to the deviation may be included in the scripts transmitted from the processing unit 112 to the interpreter unit 116.
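The frame-by-frame deviation compensation described above can be sketched as follows. The sketch reduces the detected markers to their centroid and ignores rotation and perspective, which the specification does not rule out but also does not require; marker detection itself (e.g. AprilTag decoding) is outside this snippet:

```python
# Sketch (assumptions: markers reduced to a centroid; pure translation model)
# compensating head motion by the frame-to-frame shift of the fiducial
# markers in the focus camera's view.
def marker_centroid(markers):
    """Centroid of the detected marker positions [(x, y), ...]."""
    xs = [m[0] for m in markers]
    ys = [m[1] for m in markers]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def compensate(gaze_xy, markers_prev, markers_now):
    """Shift the gaze point by the markers' apparent motion so the pointer
    stays put when only the head (not the eyes) moved."""
    cx0, cy0 = marker_centroid(markers_prev)
    cx1, cy1 = marker_centroid(markers_now)
    return (gaze_xy[0] - (cx1 - cx0), gaze_xy[1] - (cy1 - cy0))

# A head movement that shifts all four markers by (5, -3) is cancelled out.
prev = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
now = [(x + 5, y - 3) for x, y in prev]
print(compensate((100, 100), prev, now))  # (95.0, 103.0)
```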
[0060] According to embodiments of the present invention, the markers 114 may be, but not limited to, a barcode, a Quick Response (QR) code, and so forth. In a preferred embodiment of the present invention, the markers may be AprilTags. Embodiments of the present invention are intended to include or otherwise cover any type of the markers 114, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the markers 114 may be black and white. In an embodiment of the present invention, the markers 114 may be arranged at the locations such as, but not limited to, the boundaries, the edges, and/or the corners of the electronic device 118 for precise gaze estimation.
[0061] FIG. 2 illustrates a block diagram of the interpreter unit 116 of the system 100, according to an embodiment of the present invention. The interpreter unit 116 may comprise the computer-executable instructions in the form of programming modules such as a data receiving module 200, a data interpretation module 202, and a pointer control module 204.
[0062] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the generated scripts from the processing unit 112. The data receiving module 200 may further be configured to transmit the received scripts to the data interpretation module 202, in an embodiment of the present invention.
[0063] In an embodiment of the present invention, the data interpretation module 202 may be configured to be activated upon receipt of the scripts from the data receiving module 200. The data interpretation module 202 may be configured to interpret the received script for learning user-intended functions associated with the script, in an embodiment of the present invention. Upon interpretation of the user-intended functions associated with the script, the data interpretation module 202 may transmit a signal corresponding to the user-intended functions associated with the script to the pointer control module 204, in an embodiment of the present invention.
[0064] In an embodiment of the present invention, the pointer control module 204 may be activated upon receipt of the signal corresponding to the user-intended functions associated with the script from the data interpretation module 202. The pointer control module 204 may be configured to control or manipulate the pointer 124 in the user interface 122 on the screen 120 of the electronic device 118 according to the learned user-intended functions, in an embodiment of the present invention. In an embodiment of the present invention, the movement of the pointer 124 on the screen 120 may be directly proportional to the movement of the pupils of the eyes of the user.
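The receive-interpret-control pipeline of paragraphs [0062]-[0064] can be sketched as follows. The script format and the proportionality gain are assumptions made purely for illustration; the actual contents of the generated scripts are not specified in this description:

```python
# Hypothetical script format: {"action": "move", "dx": ..., "dy": ...},
# where dx/dy stand in for measured pupil displacement.
GAIN = 2.0  # assumed gain: pointer motion directly proportional to pupil motion


class InterpreterUnit:
    """Sketch of interpreter unit 116: receive -> interpret -> control pointer."""

    def __init__(self):
        self.pointer = [0.0, 0.0]  # pointer 124 position on screen 120

    def receive(self, script):
        # Data receiving module 200: accept a script and pass it on.
        return self.interpret(script)

    def interpret(self, script):
        # Data interpretation module 202: learn the user-intended function.
        if script.get("action") == "move":
            return self.move_pointer(script["dx"], script["dy"])
        return self.pointer

    def move_pointer(self, dx, dy):
        # Pointer control module 204: displacement proportional to pupil motion.
        self.pointer[0] += GAIN * dx
        self.pointer[1] += GAIN * dy
        return self.pointer
```

A usage example: feeding `{"action": "move", "dx": 3, "dy": -1}` into `receive()` moves the pointer by `(6.0, -2.0)` under the assumed gain.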
[0065] FIG. 3 depicts a flowchart of a method 300 of the hands-free access of the electronic device 118 using the system 100, according to an embodiment of the present invention.
[0066] At step 302, the system 100 may receive generated scripts from the processing unit 112.
[0067] At step 304, the system 100 may interpret the received script for learning user-intended functions associated with the script.
[0068] At step 306, the system 100 may control or manipulate the pointer 124 on the screen 120 of the electronic device 118 according to the learned user-intended functions.
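The saccade thresholds recited in claims 2-5 below (velocity 30 °/sec, acceleration 8000 °/sec², minimum deflection 0.1°) suggest a threshold-based saccade identification along these lines. This is an illustrative sketch under those stated thresholds, not the claimed implementation:

```python
VEL_T = 30.0     # degrees/sec, velocity threshold (claim 3)
ACC_T = 8000.0   # degrees/sec^2, acceleration threshold (claim 4)
DEFL_T = 0.1     # degrees, minimum deflection threshold (claim 5)


def is_saccade(angles, dt):
    """Return True if a short gaze-angle trace (degrees, one sample
    every dt seconds) qualifies as a saccade under all three thresholds."""
    vels = [(b - a) / dt for a, b in zip(angles, angles[1:])]
    accs = [(b - a) / dt for a, b in zip(vels, vels[1:])]
    deflection = abs(angles[-1] - angles[0])
    return (deflection >= DEFL_T
            and max(map(abs, vels), default=0.0) >= VEL_T
            and max(map(abs, accs), default=0.0) >= ACC_T)
```

Samples that fail all three tests would be treated as fixation or drift; claims 6-7 then apply a smoothing step (a Kalman filter) to remove jagged pointer motion.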
[0069] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0070] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims:CLAIMS
I/We Claim:
1. A system (100) for hands-free access of an electronic device (118), the system (100) comprising:
a headset (102) comprising:
an electroencephalogram signal acquisition hardware (104) to capture sensorimotor rhythm (SMR) brain activity during motor imagery or motor execution tasks;
cameras (108) positioned on the headset (102) to be in front of the eyes of the user, and adapted to track a motion of pupils of the eyes of the user using a binocular convergence-based three-dimensional (3D) gaze tracking;
a focus camera (110) positioned on the headset (102) to focus outwards, and adapted to capture a location of a screen (120) of the electronic device (118) being gazed at by the user; and
a processing unit (112), characterized in that the processing unit (112) is adapted to generate scripts based on the captured sensorimotor rhythm (SMR) brain activity, the tracked motion of the pupils, and the gazed location of the screen (120) of the electronic device (118);
markers (114a-114d) arranged at locations selected from boundaries, edges, and/or corners of the electronic device (118) for precise gaze estimation; and
an interpreter unit (116) installed in the electronic device (118), and communicatively connected to the processing unit (112), characterized in that the interpreter unit (116) is configured to:
receive the generated scripts from the processing unit (112);
interpret the received script for learning user-intended functions associated with the script; and
control or manipulate a pointer (124) in a user interface (122) on the screen (120) of the electronic device (118) according to the learned user-intended functions, wherein a movement of the pointer (124) on the screen (120) is directly proportional to a movement of the pupils of the eyes of the user.
2. The system (100) as claimed in claim 1, wherein the scripts generated by the processing unit (112) comprise eye gaze data streamed in real-time to map the eye movements to the pointer (124) movements, employing a velocity threshold, an acceleration threshold, and a minimum deflection threshold for saccade identification.
3. The system (100) as claimed in claim 2, wherein the threshold velocity is 30 degrees per second (°/sec).
4. The system (100) as claimed in claim 2, wherein the threshold acceleration is 8000 degrees per second squared (°/sec²).
5. The system (100) as claimed in claim 2, wherein the threshold for minimum deflection is 0.1 degree (°).
6. The system (100) as claimed in claim 1, wherein the processing unit (112) executes a smoothing algorithm to smoothen out and remove jagged and jerky movements in the movement of the pointer (124) on the screen (120).
7. The system (100) as claimed in claim 6, wherein the smoothing algorithm employs a Kalman filter.
8. The system (100) as claimed in claim 1, wherein the markers (114a-114d) are adapted to compensate in real-time for a relative change in a location of the screen (120) of the electronic device (118) with respect to the eyes of the user, to produce precise estimation of gaze positions.
9. The system (100) as claimed in claim 1, wherein the electroencephalogram signal acquisition hardware (104) comprises semi-dry polymer sensors (106) adapted to be placed at multiple locations on a head of the user to capture the sensorimotor rhythm (SMR) brain activity.
10. A method (300) of hands-free access of an electronic device (118) using a system (100), the method (300) characterized by steps of:
receiving generated scripts from a processing unit (112);
interpreting the received script for learning user-intended functions associated with the script; and
controlling or manipulating a pointer (124) on a screen (120) of the electronic device (118) according to the learned user-intended functions.
Date: December 02, 2023
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202341082264-STATEMENT OF UNDERTAKING (FORM 3) [04-12-2023(online)].pdf 2023-12-04
2 202341082264-REQUEST FOR EARLY PUBLICATION(FORM-9) [04-12-2023(online)].pdf 2023-12-04
3 202341082264-POWER OF AUTHORITY [04-12-2023(online)].pdf 2023-12-04
4 202341082264-OTHERS [04-12-2023(online)].pdf 2023-12-04
5 202341082264-FORM-9 [04-12-2023(online)].pdf 2023-12-04
6 202341082264-FORM FOR SMALL ENTITY(FORM-28) [04-12-2023(online)].pdf 2023-12-04
7 202341082264-FORM 1 [04-12-2023(online)].pdf 2023-12-04
8 202341082264-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-12-2023(online)].pdf 2023-12-04
9 202341082264-EDUCATIONAL INSTITUTION(S) [04-12-2023(online)].pdf 2023-12-04
10 202341082264-DRAWINGS [04-12-2023(online)].pdf 2023-12-04
11 202341082264-DECLARATION OF INVENTORSHIP (FORM 5) [04-12-2023(online)].pdf 2023-12-04
12 202341082264-COMPLETE SPECIFICATION [04-12-2023(online)].pdf 2023-12-04
13 202341082264-Proof of Right [07-02-2024(online)].pdf 2024-02-07