Abstract: A brain-computer interface system for understanding brain signals, comprising: a brain signal sensing unit associated with the system that collects brain signals during imagined movement tasks; a signal cleaning unit that removes noise and extracts the salient signal components; a prediction unit that predicts movement actions from the signal components; a Shapley value unit that quantifies how each signal component influences the predictions; a display unit that presents clear explanations of the predictions to experts; a feedback unit that updates the prediction model as new brain signals are collected to improve its accuracy; and an interface unit that lets experts adjust system settings based on the Shapley value explanations.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to a brain-computer interface system for understanding brain signals that accurately captures and interprets brain signals related to movement, translating them into reliable commands. This improves the quality of life of individuals with motor disabilities by enhancing brain signal clarity through the identification of key neural patterns, leading to more precise assistive-device control and advancing neuroscience.
BACKGROUND OF THE INVENTION
[0002] The Brain-Computer Interface (BCI) system establishes a direct communication pathway between the brain and an external device. It works by acquiring brain signals, typically electrical activity from neurons, using sensors either placed on the scalp (non-invasive like EEG) or implanted directly in the brain (invasive). These raw signals are then processed to filter out noise and extract relevant features related to a user's intent, such as imagined movements or cognitive states. Sophisticated protocols translate these features into actionable commands for external devices, like robotic limbs or computer cursors. Feedback to the user is crucial, allowing them to adapt and refine their brain activity for more effective control. BCIs aim to bypass traditional muscle-based control, offering new possibilities for individuals with motor disabilities and advancing neuroscientific research.
[0003] Traditional brain signal analysis methods suffer from a low signal-to-noise ratio, particularly with non-invasive techniques like EEG, making it difficult to discern true brain activity from noise. They also provide poor spatial resolution, struggling to pinpoint specific brain regions involved in activity, and often exhibit limited temporal resolution, hindering real-time understanding. Significant inter-subject variability necessitates extensive individual calibration, and the non-stationary nature of brain signals further complicates consistent model performance. Moreover, a lack of transparency in prediction outcomes and the inherent invasiveness risks of certain high-resolution techniques limit their broader adoption and trustworthy application.
[0004] CN219039708U discloses an electroencephalogram simulation system for a brain-computer interface system, comprising a connecting device for externally connecting to the brain-computer interface system; a transmission device for transmitting an externally input brain-wave file; a signal conversion device for converting the externally input brain-wave file into a first analog signal; a control output device for outputting the first analog signal to the brain-computer interface system through the connecting device; and a display device for displaying waveform information of the first analog signal. Compared with common laboratory signal generators on the market, this signal generator can output both analog signals and brain-wave signals of a real human brain.
[0005] CN114841215A discloses a brain-computer interface intelligent-equipment control system and method. A camera installed on an augmented reality device collects environment images in real time and transmits them to a scene understanding module, which interprets the scene information using deep learning and passes the interpretation to an augmented reality display device. An eye tracker built into the display device acquires the user's gaze position in real time and, combined with an SSVEP paradigm, selects a target in the scene; a control or information-acquisition instruction is then sent to the target, and feedback is presented on the augmented reality device, improving the user experience. Combining a brain-computer interface with augmented reality enhances cognition, vision, perception, and other capabilities, providing an intuitive and efficient way for paralyzed patients to control external equipment.
[0006] Conventionally, many systems for understanding brain signals are available in the market, but existing systems often suffer from inaccurate signal capture, poor clarity in brain signal analysis, and a lack of transparent explanations for predictions. This limits reliable control, hinders research, and ultimately reduces user trust in and adoption of brain-computer interfaces.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that offers accurate brain signal interpretation for precise control of external devices, clear analysis of neural patterns, and visual explanations for experts, leading to enhanced reliability and improved user quality of life.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a system capable of accurately capturing and interpreting brain signals related to movement intentions, enabling reliable translation into actionable commands for external devices and thus improving the quality of life for individuals with motor disabilities.
[0010] Another object of the present invention is to develop a system capable of improving the clarity of brain signal analysis by identifying and highlighting the most influential neural patterns contributing to movement predictions, thereby enabling more precise and effective control over assistive devices and fostering advances in neuroscientific research.
[0011] Yet another object of the present invention is to develop a system capable of providing experts with clear, visual explanations of prediction outcomes, facilitating informed decision-making in clinical and research applications and thus enhancing the reliability and trustworthiness of brain-computer interface technologies.
[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0013] The present invention relates to a brain-computer interface system for understanding brain signals that enhances the clarity of brain signal analysis by identifying the key neural patterns behind movement predictions, enabling precise assistive device control and advancing neuroscience. The system also provides experts with clear visual explanations of prediction outcomes, fostering informed decision-making and increasing the reliability of brain-computer interface technologies.
[0014] According to an embodiment of the present invention, a brain-computer interface system for understanding brain signals comprises: a brain signal sensing unit associated with the system that collects brain signals during imagined movement tasks; a signal cleaning unit that removes noise and extracts the salient signal components; a prediction unit that predicts movement actions from the signal components; a Shapley value unit that quantifies how each signal component influences the predictions; a display unit that presents clear explanations of the predictions to experts, using visualizations such as charts to show which brain signals matter most; a feedback unit that updates the prediction model as new brain signals are collected to improve its accuracy; an interface unit that lets experts adjust system settings based on the Shapley value explanations; and a feedback loop that adjusts the model to match the user's unique brain patterns for better results.
[0015] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a block diagram depicting workflow of a brain-computer interface system for understanding brain signals.
DETAILED DESCRIPTION OF THE INVENTION
[0017] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments and that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0018] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0019] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0020] The present invention relates to a brain-computer interface system for understanding brain signals that accurately captures and interprets brain signals related to movement intentions, translating them into reliable commands for external devices. This will improve the quality of life for individuals with motor disabilities by providing experts with clear, visual explanations of prediction outcomes, which facilitates informed decision-making in clinical and research applications.
[0021] Referring to Figure 1, a block diagram depicting workflow of a brain-computer interface system for understanding brain signals is illustrated. The present invention pertains to a brain-computer interface (BCI) system designed to interpret brain signals for understanding and predicting movement intentions with high accuracy and transparency. This system facilitates seamless interaction between the human brain and external devices by translating neural activity into actionable commands, offering significant advancements in neurotechnology applications, such as assistive devices for individuals with motor impairments, neurorehabilitation, and human-machine collaboration. The system integrates signal processing, predictive modeling, and interpretable analytics to provide experts with clear, actionable insights into brain signal contributions, thereby enhancing trust and usability in clinical and research settings.
[0022] The BCI system comprises five core technical modules: a brain signal sensing unit, a signal cleaning unit, a prediction unit, a Shapley value unit, and a display unit, with an additional feedback loop and interface unit to optimize performance and adaptability. These modules work synergistically to capture, process, interpret, and visualize brain signals, ensuring both precision in movement prediction and transparency in how predictions are derived. By leveraging cutting-edge computational techniques, the system addresses challenges such as noise in neural data, variability in individual brain patterns, and the need for interpretable outputs, making it a robust solution for real-world applications.
[0023] The brain signal sensing unit is the first module, responsible for collecting high-fidelity brain signals during tasks involving motor imagery or intended movements. Utilizing non-invasive techniques such as electroencephalography (EEG) or invasive methods like electrocorticography (ECoG), this unit captures raw neural data with high temporal resolution. The sensing unit is designed to be adaptable to various electrode configurations, ensuring compatibility with diverse user needs and clinical setups.
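As an illustrative stand-in for the sensing unit (actual acquisition reads from EEG or ECoG hardware, which cannot be reproduced here), a synthetic sensorimotor-rhythm signal can be generated; the 10 Hz mu-band frequency, 250 Hz sampling rate, and noise level below are assumptions for illustration only, not parameters of the claimed system:

```python
import math
import random

def synthetic_eeg(n_samples, fs=250.0, mu_hz=10.0, noise=0.5, seed=0):
    """Stand-in for the sensing unit: a mu-band (10 Hz) oscillation,
    typical of sensorimotor rhythm, plus Gaussian noise, sampled at
    fs Hz. A real system reads samples from EEG/ECoG hardware."""
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * mu_hz * t / fs) + rng.gauss(0, noise)
            for t in range(n_samples)]
```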
[0024] The signal cleaning unit processes the raw brain signals to eliminate noise and artifacts, which are common in neural recordings due to muscle activity, eye blinks, or environmental interference. This module employs sophisticated signal processing protocols, such as band-pass filtering, independent component analysis (ICA), and wavelet transforms, to isolate relevant signal components associated with movement tasks. By enhancing signal quality, this unit ensures that subsequent modules receive clean, reliable data, thereby improving overall system performance.
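The band-pass idea above can be sketched, under simplifying assumptions, as a difference of two moving averages: the fast average suppresses high-frequency noise and subtracting the slow average removes low-frequency drift. This is a minimal dependency-free sketch, not the Butterworth/ICA pipeline a production system would use; function names and window sizes are illustrative:

```python
from statistics import mean

def moving_average(signal, window):
    """Simple moving average; pads by repeating edge values.
    Use odd window sizes so the padding is symmetric."""
    half = window // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [mean(padded[i:i + window]) for i in range(len(signal))]

def crude_bandpass(signal, slow_window, fast_window):
    """Difference of two moving averages: removes slow drift
    (high-pass) and fast noise (low-pass), keeping the mid band."""
    return [f - s for f, s in zip(moving_average(signal, fast_window),
                                  moving_average(signal, slow_window))]
```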
[0025] The prediction unit forms the core of the system’s decision-making capability, employing machine learning models, such as deep neural networks or support vector machines, to interpret cleaned brain signals and predict intended movement actions. This unit is trained on large datasets of neural activity correlated with specific motor tasks, enabling it to generalize across users while maintaining high accuracy. The prediction unit continuously refines its models through a feedback loop, adapting to new data to improve predictive performance over time.
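The prediction unit may be realized by any trained classifier; as a hedged, dependency-free sketch of the classification step, a nearest-centroid model over extracted feature vectors is shown below (the feature values and class labels in the test are invented for illustration, and a deployed system would use the deep networks or support vector machines named above):

```python
import math

def train_centroids(samples, labels):
    """Average the feature vectors of each movement class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict(centroids, x):
    """Label of the nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda y: math.dist(x, centroids[y]))
```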
[0026] A key innovation of the system is the Shapley value unit, which enhances interpretability by quantifying the contribution of each signal component to the prediction outcome. Using cooperative game theory principles, this module calculates Shapley values to assign importance scores to specific brain regions or signal features, such as frequency bands or electrode channels. This allows experts to understand which neural patterns drive the system’s predictions, fostering trust and enabling targeted interventions in clinical applications.
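The Shapley attribution described above can be made concrete with a brute-force computation over all orderings of the features (tractable only for a handful of features; practical systems approximate, for example by sampling permutations). The channel/band names and the additive toy value function in the test are illustrative assumptions:

```python
from itertools import permutations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley values: each feature's marginal contribution to
    value_fn, averaged over every possible ordering of the features.
    There are n! orderings, so this is feasible only for small n."""
    phi = {f: 0.0 for f in features}
    for order in permutations(features):
        coalition = set()
        for f in order:
            before = value_fn(frozenset(coalition))
            coalition.add(f)
            phi[f] += value_fn(frozenset(coalition)) - before
    return {f: total / factorial(len(features)) for f, total in phi.items()}
```

For an additive value function the Shapley value of each feature equals its individual weight, and the values always sum to the value of the full coalition (the efficiency property), which is what makes them useful as importance scores for experts.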
[0027] The display unit presents the system’s outputs in an intuitive format, using visualizations such as heat maps, bar charts, and topographic maps to highlight the most influential brain signals and regions. These visual aids are tailored for experts, providing clear, concise explanations of the prediction process and the role of specific neural features. The display unit ensures that complex computational results are accessible, facilitating informed decision-making in real-time clinical or research environments.
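Heat maps and topographic maps require plotting libraries, but the ranking idea behind the bar-chart view can be sketched as a plain-text rendering that sorts features by influence (the channel/band names below are illustrative, not fixed by the system):

```python
def importance_bars(scores, width=40):
    """Render per-feature importance scores as horizontal text bars,
    sorted so the most influential feature appears first."""
    top = max(abs(v) for v in scores.values()) or 1.0
    rows = []
    for name, v in sorted(scores.items(), key=lambda kv: -abs(kv[1])):
        rows.append(f"{name:>10} {v:+.3f} " + "#" * round(abs(v) / top * width))
    return "\n".join(rows)
```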
[0028] Additionally, the system incorporates a feedback loop that dynamically updates the prediction model as new brain signals are collected, adapting to the user's unique neural patterns for personalized performance. An interface unit allows experts to adjust system parameters, such as signal thresholds or model hyperparameters, based on insights from the Shapley value unit. This interactivity ensures that the system remains flexible and optimized for diverse applications.
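One simple way to realize such an adaptive feedback loop is an exponential moving average that nudges the stored class representation toward each newly confirmed sample, so the model tracks slow drift in the user's neural patterns; the learning rate of 0.1 is an assumed value for illustration:

```python
def update_centroid(centroid, confirmed_sample, rate=0.1):
    """Exponential moving average: move the stored class centroid a
    small step toward each newly confirmed sample, so the model
    follows slow drift in the user's neural patterns."""
    return [(1.0 - rate) * c + rate * x
            for c, x in zip(centroid, confirmed_sample)]
```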
[0029] The present invention works best in the manner where the system interprets brain signals to predict movement intentions with high accuracy and transparency, enabling seamless human-machine interaction for applications like assistive devices and neurorehabilitation. The system comprises five core modules (the brain signal sensing unit, signal cleaning unit, prediction unit, Shapley value unit, and display unit), augmented by the feedback loop and interface unit for enhanced performance. The brain signal sensing unit captures high-fidelity neural data during motor imagery tasks using techniques like EEG or ECoG, ensuring adaptability to various electrode configurations. The signal cleaning unit removes noise and artifacts from raw signals using protocols such as band-pass filtering and ICA, delivering clean data to subsequent modules. The prediction unit, powered by machine learning models like deep neural networks, interprets these signals to predict movement actions, continuously refining its accuracy through the feedback loop, which adapts to user-specific neural patterns. The Shapley value unit enhances interpretability by quantifying each signal component's contribution to predictions, identifying key brain regions and features. The display unit visualizes these insights through intuitive heat maps, bar charts, and topographic maps, enabling experts to understand prediction processes clearly. The interface unit allows experts to adjust system parameters based on Shapley value insights, ensuring flexibility. This integrated approach ensures precise, transparent, and adaptable translation of brain signals into actionable commands, advancing neurotechnology applications.
[0030] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A brain-computer interface system for understanding brain signals, comprising:
i) a brain signal sensing unit associated with the system that collects brain signals during imagined movement tasks;
ii) a signal cleaning unit that removes noise and extracts the salient signal components;
iii) a prediction unit that predicts movement actions from the signal components;
iv) a Shapley value unit that quantifies how each signal component affects the predictions; and
v) a display unit that presents clear explanations of the predictions to experts.
2) The system as claimed in claim 1, wherein the display unit uses visualizations such as charts to show which brain signals matter most to the predictions.
3) The system as claimed in claim 1, wherein the Shapley value unit identifies the specific brain areas that contribute to the predictions, making the system more interpretable.
4) The system as claimed in claim 1, further comprising a feedback unit that updates the prediction model as new brain signals are collected to improve its accuracy.
5) The system as claimed in claim 1, further comprising an interface unit that lets experts adjust system settings based on the Shapley value explanations.
6) The system as claimed in claim 1, further comprising a feedback loop that adjusts the model to match the user's unique brain patterns for better results.