
Hybrid EEG BCI with Eye Tracker for Cursor Control in Computer Navigation

Abstract: A hybrid EEG BCI with an eye tracker for cursor control in computer navigation, comprising an eye tracker device, an EEG cap, a microprocessor, cloud storage, and a web application, wherein the user wears the EEG cap and positions the eye tracker for accurate readings, and the system then calibrates by collecting baseline EEG and eye-tracking data to tailor the interface. In another embodiment, after the navigation task and instructions are presented, the user engages in brief practice sessions to become familiar with controlling the cursor through brain activity and gaze, setting the stage for effective interaction. In another embodiment, the microcontroller processes the raw data, and the process concludes once the user successfully completes the task, receiving confirmation through visual or auditory feedback. In another embodiment, the execution step involves the design and development of prototypes that integrate brain-computer interface (BCI) and eye-tracking technologies, wherein user testing is conducted to gather feedback, leading to iterative improvements and system integration with the eye tracker device. In another embodiment, during closure, the system's effectiveness is evaluated, necessary adjustments are made, and comprehensive documentation is prepared. In another embodiment, the web application is used to display the results from the eye tracker device, wherein the web application is connected to the eye tracker device via a Bluetooth connection, and wherein the cloud storage is used to collect historical data for further work.


Patent Information

Application #
Filing Date
04 December 2024
Publication Number
1/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. RAVICHANDER JANAPATI
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
2. ARKALA DIVYA JYOTHI
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
3. SIDDA SINDHUJA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
4. KORRA MANASA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
5. REBELLI SHIVANI
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
6. USHA DESAI
S.E.A COLLEGE OF ENGINEERING AND TECHNOLOGY, BANGALORE-560049, INDIA

Specification

Description: FIELD OF THE INVENTION
This invention relates to Hybrid EEG BCI with eye tracker for cursor control in computer navigation.
BACKGROUND OF THE INVENTION
The proposed solution integrates a hybrid brain-computer interface (BCI) utilizing electroencephalography (EEG) and eye-tracking technology to enhance cursor control in computer navigation. This system aims to provide an intuitive and efficient means of interaction for users, particularly benefiting individuals with mobility impairments. To enable individuals with severe motor impairments to control a computer cursor, we propose a hybrid system that integrates an EEG-based brain-computer interface with eye-tracking technology. This approach allows users to manipulate the cursor through brain signals and gaze direction, facilitating intuitive and independent computer navigation. By combining these non-invasive methods, the system enhances accessibility and empowers users to interact with technology more effectively.
US12001602B2 Brain-computer interface with adaptations for high speed, accurate, and intuitive user interaction
Research Gap: A hybrid EEG BCI with an eye tracker combines brain signals and eye movements for precise cursor control, while a traditional BCI focuses solely on interpreting brain activity for fast and intuitive interactions. The hybrid offers enhanced navigation precision, whereas the dedicated BCI prioritizes rapid signal processing.
US20220404910A1 Brain Computer Interface with High-Speed eye tracking features
Research Gap: A Brain-Computer Interface (BCI) with high-speed eye tracking integrates real-time eye movement data to enhance user interaction. This technology allows for precise tracking of where a user is looking, enabling faster and more intuitive control of devices or applications. It can be used in gaming, assistive technology for individuals with disabilities, and research settings, providing a seamless connection between visual attention and brain signals to improve responsiveness and accuracy.
US10712820B2 Systems and methods for a hybrid brain interface for robotic swarms using EEG signals and an input device
Research Gap: Emotiv's system primarily focuses on mind control through EEG, and while it provides basic control over digital devices, it is not tailored to smart home environments. Our system not only offers thought-based control but also adds gaze-directed precision, making it more functional for a broader range of devices and home environments.
SUMMARY OF THE INVENTION
This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention.
This summary is neither intended to identify key or essential inventive concepts of the invention, nor is it intended for determining the scope of the invention.
To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
The hybrid EEG BCI with an eye tracker integrates brainwave activity and eye movement data to enhance cursor control in computer navigation. EEG captures neural signals related to intention, while the eye tracker provides precise gaze direction. This combination allows for smoother and more intuitive control, enabling users to select items or navigate interfaces by focusing on targets and using brain signals to execute commands. The ultimate goal is to improve accessibility for users with mobility impairments, creating a more seamless interaction experience.
BRIEF DESCRIPTION OF THE DRAWINGS
The illustrated embodiments of the subject matter will be understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and methods that are consistent with the subject matter as claimed herein, wherein:
FIGURE 1: Block Diagram of the proposed system
The figures depict embodiments of the present subject matter for the purposes of illustration only. A person skilled in the art will easily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.

DETAILED DESCRIPTION OF THE INVENTION
The detailed description of various exemplary embodiments of the disclosure is described herein with reference to the accompanying drawings. It should be noted that the embodiments are described herein in such details as to clearly communicate the disclosure. However, the amount of details provided herein is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present disclosure. Moreover, all statements herein reciting principles, aspects, and embodiments of the present disclosure, as well as specific examples, are intended to encompass equivalents thereof.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes” and/or “including,” when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
In addition, the descriptions of "first", "second", “third”, and the like in the present invention are used for the purpose of description only, and are not to be construed as indicating or implying their relative importance or implicitly indicating the number of technical features indicated. Thus, features defining "first" and "second" may include at least one of the features, either explicitly or implicitly.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The hybrid EEG BCI with an eye tracker integrates brainwave activity and eye movement data to enhance cursor control in computer navigation. EEG captures neural signals related to intention, while the eye tracker provides precise gaze direction. This combination allows for smoother and more intuitive control, enabling users to select items or navigate interfaces by focusing on targets and using brain signals to execute commands. The ultimate goal is to improve accessibility for users with mobility impairments, creating a more seamless interaction experience.

WHOLE PROCESS IN A NUTSHELL:
The process consists of three major parts:
1. Initiation
2. Execution
3. Closure
Initiation: In the initiation phase of using a hybrid EEG-BCI and eye tracker, the user wears the EEG cap and positions the eye tracker for accurate readings. The system then calibrates by collecting baseline EEG and eye-tracking data to tailor the interface. After presenting the navigation task and instructions, the user may engage in brief practice sessions to familiarize themselves with controlling the cursor through their brain activity and gaze, setting the stage for effective interaction.
Execution: A hybrid EEG brain-computer interface (BCI) and eye tracker enhance cursor navigation by combining brainwave activity with eye movements. The EEG captures signals related to cursor movement intentions, while the eye tracker identifies gaze direction. Real-time algorithms process these inputs to translate neural and visual cues into cursor movements, enabling intuitive navigation. The system requires initial calibration for accuracy and provides continuous feedback for users to refine their control.
Closure: In the closure phase of using a hybrid EEG-BCI and eye tracker for cursor navigation, the process concludes once the user successfully completes the task, receiving confirmation through visual or auditory feedback. Users can review their performance metrics, gaining insights into their interaction. After removing the EEG cap and eye tracker, the system saves relevant settings and calibration data for future sessions, ensuring a personalized experience. This phase allows users to reflect on their performance and prepare for subsequent tasks.

Process: The development process for the hybrid EEG BCI with eye tracker for cursor control in computer navigation begins with initiation, where objectives are defined, market research is conducted, and a project plan is established. Next is execution, involving the design and development of prototypes that integrate brain-computer interface (BCI) and eye-tracking technologies. User testing is conducted to gather feedback, leading to iterative improvements and system integration with eye tracker devices. Finally, during closure, the system's effectiveness is evaluated, necessary adjustments are made, and comprehensive documentation is prepared. The project culminates in the launch, ensuring user training and ongoing support for optimal operation.
The hybrid EEG BCI system works by combining brainwave activity and eye-tracking data for more accurate and intuitive cursor control. When a user wears the EEG cap and positions the eye tracker, the system starts by calibrating to baseline readings, capturing the user’s brainwave patterns and eye movement at rest. This allows the system to tailor itself to the user’s unique brain activity and gaze characteristics.
Once calibration is complete, the user can begin interacting with the system. The EEG cap detects brain signals related to the user’s intent to move the cursor, such as when they think about moving the cursor in a certain direction. Simultaneously, the eye tracker tracks the user's gaze and determines which part of the screen they are focusing on. The data from both devices is sent to the microprocessor for processing.

The microprocessor then combines the brainwave activity (which represents intention) with the gaze direction (which identifies the target), and translates this into corresponding cursor movements on the computer screen. For example, if the user focuses on a particular icon and simultaneously intends to select it (as indicated by a change in brain activity), the system will move the cursor to that icon and execute a click or selection.
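The fusion step described above can be sketched in a few lines. This is an illustrative simplification only, not the claimed implementation: the function name `fuse_inputs`, the probability-threshold scheme, and the dictionary layout are all hypothetical, standing in for whatever classifier and mapping the microprocessor actually runs.

```python
# Hypothetical sketch of gaze/EEG fusion: the eye tracker supplies the
# target coordinates, while an EEG classifier supplies a selection
# probability. The cursor moves to the gaze point; a "click" is issued
# only when the EEG-derived intent crosses a threshold.
def fuse_inputs(gaze_xy, eeg_select_prob, threshold=0.8):
    """Combine a gaze point (intent target) with EEG intent strength."""
    x, y = gaze_xy
    action = "click" if eeg_select_prob >= threshold else "move"
    return {"x": x, "y": y, "action": action}

# The user fixates an icon at (640, 360) while intending to select it:
event = fuse_inputs((640, 360), eeg_select_prob=0.92)
print(event)  # the cursor moves to the icon and a click is executed
```

The key design point is that neither signal alone triggers a selection: gaze provides *where*, and the EEG signal provides *whether*, which is what reduces accidental activations.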
The calibration phase is crucial to ensure the system is accurately tuned to the user’s brain activity and gaze direction. During this phase, the user is instructed to focus on specific areas of the screen while the EEG and eye-tracking devices collect baseline data. This process ensures that the system understands the user's unique patterns and compensates for any inconsistencies, allowing for a more personalized experience.
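As a rough illustration of how baseline data could personalize the system, the sketch below computes per-channel statistics from resting EEG samples and uses them to normalize live readings. The function names and the simple mean/standard-deviation scheme are assumptions for exposition; the specification does not fix a particular calibration algorithm.

```python
# Hypothetical calibration sketch: derive per-channel (mean, std) from
# baseline samples, then z-score live samples against that baseline so
# the system adapts to each user's resting brain activity.
import statistics

def calibrate(baseline_samples):
    """baseline_samples: list of per-channel readings, e.g. [[ch1, ch2], ...]."""
    channels = list(zip(*baseline_samples))  # regroup by channel
    return [(statistics.mean(ch), statistics.pstdev(ch)) for ch in channels]

def normalize(sample, baseline):
    """Z-score one live sample using the calibration baseline."""
    return [(v - mean) / (sd or 1.0) for v, (mean, sd) in zip(sample, baseline)]

baseline = calibrate([[1, 2], [3, 4]])   # two samples, two channels
live = normalize([4, 5], baseline)       # live reading expressed in baseline units
```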
Once calibrated, the user can begin interacting with the system by focusing on specific items on the screen. The hybrid nature of the system—combining both eye tracking and EEG signals—ensures that the system remains responsive to the user’s needs and intentions. Continuous feedback is provided to help the user refine their cursor control, making the system progressively more intuitive as the user becomes more accustomed to using both brain signals and gaze direction to control the computer.
Execution Phase
In the execution phase, the system operates continuously, processing the EEG and eye-tracking data in real-time. The combination of brain activity and gaze direction allows for smooth and accurate cursor control. For example, if the user wants to move the cursor to a particular button, they can look at the button (tracked by the eye tracker) while thinking about moving the cursor, which the EEG cap detects. The system processes these inputs and moves the cursor to the target.
During this phase, the system is also designed to provide real-time feedback to the user, showing the movement of the cursor on the screen and confirming actions, such as clicks or selections. This continuous interaction allows users to refine their control over time, making the system increasingly efficient and intuitive.
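The continuous read-fuse-act-feedback cycle of the execution phase can be outlined as a loop. This is a schematic sketch under stated assumptions: `read_gaze`, `read_eeg_intent`, and `apply_cursor` are hypothetical callables standing in for the real device drivers and display output.

```python
# Illustrative real-time execution loop: each tick reads the gaze point
# and the EEG intent, fuses them into a cursor action, applies it, and
# records the action so it can be shown back to the user as feedback.
def run_session(read_gaze, read_eeg_intent, apply_cursor, ticks=3):
    log = []
    for _ in range(ticks):
        gaze = read_gaze()                              # where the user is looking
        action = "click" if read_eeg_intent() else "move"  # what the user intends
        apply_cursor(gaze, action)                      # update the on-screen cursor
        log.append((gaze, action))                      # feedback/performance record
    return log
```

In practice the loop would run at the device sampling rate and the log would feed the performance metrics reported during closure.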
Once the user completes a task, such as navigating to a specific location or selecting an item, the system enters the closure phase. During this phase, feedback is provided to confirm the successful completion of the task, either through visual or auditory signals. The system may display metrics regarding the user's performance, such as the accuracy of their cursor control or the time taken to complete the task.
The system then saves any relevant data, including calibration settings and interaction history, to the cloud storage. This allows for future sessions to be personalized based on past interactions. Users can also review the historical data to analyze their performance or identify areas for improvement. Once the task is completed and the data is stored, the system can be shut down or reconfigured for the next user.
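The closure-phase save can be sketched as serializing the session into a record suitable for upload to cloud storage. The payload layout below is purely illustrative; the specification does not fix a schema or a storage API, so `build_session_record` and its fields are assumptions.

```python
# Hedged sketch of the closure step: bundle calibration settings and the
# interaction history into a JSON record for cloud storage, so future
# sessions can be personalized from past data.
import json
import time

def build_session_record(user_id, calibration, events):
    return json.dumps({
        "user": user_id,
        "saved_at": int(time.time()),   # timestamp for session ordering
        "calibration": calibration,     # e.g. per-channel (mean, std) pairs
        "history": events,              # cursor actions logged during the session
    })

record = build_session_record("user-1", [(2, 1.0)], [{"action": "click"}])
```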
ADVANTAGES OF THE INVENTION
1. Improved User Accuracy: Users can select targets more accurately by combining gaze direction with neural signals, reducing the likelihood of mis-clicks.
2. Enhanced Speed of Interaction: The integration allows for quicker navigation through interfaces, as users can execute commands with minimal delay after focusing on a target.
3. Increased Accessibility: Individuals with mobility impairments may experience a greater sense of independence, as they can interact with technology more easily and efficiently.
4. User Satisfaction and Engagement: Positive feedback from users regarding the intuitive nature of the system may lead to higher satisfaction levels and greater engagement with digital content.
5. Learning Curve Reduction: With a more natural interaction method, users may find it easier to learn and adapt to the technology, leading to faster onboarding.
6. Real-Time Feedback and Adaptation: The system could provide real-time feedback, helping users adjust their focus and intention more effectively during use.
7. Broader Application Potential: Beyond accessibility, this technology could find applications in gaming, virtual reality, and other interactive environments, broadening its impact.
8. Data Collection for Personalization: Continuous use may generate valuable data that can be analyzed to further personalize user experiences and improve the system over time.
9. Multi-tasking Capability: Users might be able to perform multiple tasks more efficiently, as they can control the cursor and execute commands simultaneously through their gaze and brain signals.
10. Research Opportunities: The system could open avenues for further research into brain-computer interactions, neural processing, and the relationship between eye movements and cognitive intent.  
Claims:
1. A hybrid EEG BCI with an eye tracker for cursor control in computer navigation, comprising an eye tracker device, an EEG cap, a microprocessor, cloud storage, and a web application, wherein the user wears the EEG cap and positions the eye tracker for accurate readings, and the system calibrates by collecting baseline EEG and eye-tracking data to tailor the interface.
2. The hybrid EEG BCI as claimed in claim 1, wherein the system presents a navigation task and instructions to the user, who then engages in brief practice sessions to familiarize themselves with controlling the cursor through brain activity and gaze, setting the stage for effective interaction.
3. The hybrid EEG BCI as claimed in claim 1, wherein the microcontroller processes the raw EEG and eye-tracking data to determine cursor movement and executes actions based on the combined brain signals and gaze direction.
4. The hybrid EEG BCI as claimed in claim 1, wherein the system provides confirmation of task completion through visual or auditory feedback once the user successfully completes the task.
5. The hybrid EEG BCI as claimed in claim 1, wherein the execution phase involves the design and development of prototypes integrating brain-computer interface (BCI) and eye-tracking technologies, followed by user testing to gather feedback and iterative improvements.
6. The hybrid EEG BCI as claimed in claim 1, wherein during the closure phase, the system’s effectiveness is evaluated, necessary adjustments are made, and comprehensive documentation is prepared to improve future interactions.
7. The hybrid EEG BCI as claimed in claim 1, wherein the web application is used to display the results from the eye tracker device and is connected to the eye tracker device via a Bluetooth connection for real-time data transmission.
8. The hybrid EEG BCI as claimed in claim 1, wherein cloud storage is used to collect and store historical data for future analysis, enabling continued refinement of the system’s performance.
9. The hybrid EEG BCI as claimed in claim 1, wherein the system integrates a user interface that allows for real-time tracking and adjustments based on the brain activity and gaze direction to enhance the accuracy of cursor control.
10. The hybrid EEG BCI as claimed in claim 1, wherein the system allows users with mobility impairments to navigate computer interfaces with enhanced precision by using both brainwave signals and eye movement, thus improving overall user accessibility and interaction.

Documents

Application Documents

# Name Date
1 202441101346-STATEMENT OF UNDERTAKING (FORM 3) [04-12-2024(online)].pdf 2024-12-04
2 202441101346-REQUEST FOR EARLY PUBLICATION(FORM-9) [04-12-2024(online)].pdf 2024-12-04
3 202441101346-POWER OF AUTHORITY [04-12-2024(online)].pdf 2024-12-04
4 202441101346-FORM-9 [04-12-2024(online)].pdf 2024-12-04
5 202441101346-FORM FOR SMALL ENTITY(FORM-28) [04-12-2024(online)].pdf 2024-12-04
6 202441101346-FORM 1 [04-12-2024(online)].pdf 2024-12-04
7 202441101346-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-12-2024(online)].pdf 2024-12-04
8 202441101346-EVIDENCE FOR REGISTRATION UNDER SSI [04-12-2024(online)].pdf 2024-12-04
9 202441101346-EDUCATIONAL INSTITUTION(S) [04-12-2024(online)].pdf 2024-12-04
10 202441101346-DRAWINGS [04-12-2024(online)].pdf 2024-12-04
11 202441101346-DECLARATION OF INVENTORSHIP (FORM 5) [04-12-2024(online)].pdf 2024-12-04
12 202441101346-COMPLETE SPECIFICATION [04-12-2024(online)].pdf 2024-12-04
13 202441101346-FORM 18 [18-02-2025(online)].pdf 2025-02-18