Abstract: A DEVICE AND METHOD FOR REALTIME COGNITIVE ASSESSMENT OF A USER. A device 100 and method for real time cognitive assessment is disclosed. The device 100 is configured for receiving health data, inertial motion data, and environmental data, and for processing the health data, inertial motion data, and environmental data to determine occurrence of an event, a type of the event, and a severity of the event. Further, the device 100 is configured for triggering one or more tests from a set of tests based on the type of the event and the severity of the event. Further, the device 100 is configured for receiving user inputs from the user on the one or more tests from the set of tests, wherein the user inputs are received in the form of haptic, audio, video, text, or a combination thereof, and for processing the user inputs to generate one or more alerts. [To be published with figure 1]
Claims:WE CLAIM:
1. A device 100 for real time cognitive assessment, the device comprising:
a memory 101; and
a processor 102, wherein the processor 102 is configured for processing programmed instructions stored in the memory 101 for,
receiving
health data of a user from a set of health monitoring sensors 104,
inertial motion data from a set of motion sensors 105, and
environmental data from a set of environment monitoring sensors 106;
processing the health data, inertial motion data, and environmental data to determine occurrence of an event, type of the event and severity of the event;
triggering one or more tests from a set of tests based on the type of the event and severity of the event;
receiving user inputs from the user on the one or more tests from the set of tests, wherein the user inputs are received in the form of haptic, audio, video, text, or a combination thereof; and
processing the user inputs to generate one or more alerts.
2. The device 100 as claimed in claim 1 further comprises an audio module, a haptic feedback module, and a display screen.
3. The device 100 as claimed in claim 1 further comprises an interactive touch and facial muscle movement monitoring device.
4. The device 100 as claimed in claim 1 further configured to capture decision making skills of the user, wherein the set of tests comprise one or more tests to assess colour recognition capabilities, pattern recognition capabilities, and simple arithmetic computation capabilities of the user.
5. The device 100 as claimed in claim 1, wherein the set of environment monitoring sensors capture motion, orientation, displacement, and atmospheric parameters, wherein the atmospheric parameters include Atmospheric Pressure, Ambient Temperature, Humidity, Air-resistance, Air Quality, Elevation, and VOC.
6. The device 100 as claimed in claim 1, wherein the set of health monitoring sensors 104 capture heart rate, Heart Rate Variability (HRV), Respiration Rate, Oxygen Saturation, Electrocardiograph (ECG), Blood pressure, pulse rate and PPG.
7. The device 100 as claimed in claim 1 further configured for
receiving input to at least one test from the set of tests from the user;
assessing the input of the user corresponding to the at least one test; and
altering one or more remaining tests in the set of tests based on the input received from the user.
8. The device 100 as claimed in claim 7, wherein the one or more tests in the set of tests is altered to include a test that is more difficult than the previous test when the user is able to provide appropriate inputs to the previous test, and wherein the one or more tests in the set of tests is altered to include a test that is easier than the previous test when the user is unable to provide appropriate inputs to the previous test.
9. The device 100 as claimed in claim 1, wherein the set of tests are periodically repeated to test the mental stability and decision making capability of the user.
10. The device 100 as claimed in claim 1, wherein the type of event is at least one of blunt trauma, projectile injury, shockwave injury and fall damage.
11. The device 100 as claimed in claim 1, wherein the severity of the event is one of mild, moderate, severe, and fatal.
12. The device 100 as claimed in claim 1, wherein the set of tests comprise one or more tests that are automated to determine vision, motor, aural, and memory capabilities of the user.
13. A method for real time cognitive assessment, the method comprising:
receiving
health data of a user from a set of health monitoring sensors 104,
inertial motion data from a set of motion sensors 105, and
environmental data from a set of environment monitoring sensors 106;
processing the health data, inertial motion data and environmental data to determine occurrence of an event, a type of the event and a severity of the event;
triggering one or more tests from a set of tests based on the type of the event and the severity of the event;
receiving user inputs from the user on the one or more tests from the set of tests, wherein the user inputs are received in the form of haptic, audio, video, text, or a combination thereof; and
processing the user inputs to generate one or more alerts.
Dated this 23rd Day of June, 2021
Priyank Gupta
Agent for the Applicant
IN/PA- 1454
Description: FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
TITLE OF INVENTION:
A DEVICE AND METHOD FOR REALTIME COGNITIVE ASSESSMENT OF A USER
APPLICANT:
LVL ALPHA PRIVATE LIMITED
Having address:
A-3/1, Ground Floor, TDS Colony, Pune-411001, Maharashtra, India.
The following specification describes the invention and the manner in which it is to be performed.
RELATED APPLICATIONS
The present application does not relate to any other patent application filed in India or abroad.
TECHNICAL FIELD
The present invention relates to the field of real-time cognitive assessment of a user. More specifically, the present invention relates to real-time cognitive assessment of a user post injury.
BACKGROUND
The subject matter discussed in the background section should not be assumed to be prior art merely because of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also correspond to implementations of the claimed technology.
With the development of science and technology, new technological inventions have eased the everyday existence of the human population. However, in scenarios such as an accident or active combat, it is important to rapidly assess the physical and mental state of the injured victim so that an immediate medical response can be provided. In cases where a medic is not instantly available, a battery of real time physiological and psychological assessments can be performed sequentially. This method is able to adjudicate, post injury, the physical health condition, motor function, decision making, and memory recall and retention of the injured victim. The current state of the art uses electrodes on the user's scalp for neurophysiological measurements and categorisation of only mild traumatic brain injury (mTBI), and only after the accident. Moderate and severe TBI have varied neurological and biological consequences such as focal contusions, hematomas, diffuse axonal injury, edema, cellular dysfunction, impaired synaptic transmission, cell death, and axonal degeneration.
Since the effects of mild TBI are generally not diagnosable on CT or other brain imaging systems, injury recording together with a series of periodic tests and health vitals recordings, during and post activity, allows the mid-term cognitive state or degradation of the user's psychological condition to be understood. This allows better focus on the rest and recuperation of the individual and proper treatment over the recovery cycle.
The current state of the art can measure, in real time, impact trauma to the cranial system, including the head and neck, using standardized equations that measure total impact force over a fixed time interval. Using well-established scales, the impact injury is classified as mild, moderate, severe, or fatal. This information is then used to analyse the cranial trauma victim using the well-defined Glasgow Coma Scale (GCS), which requires manual assessment by a medic. A medic is also required to assess the victim on a predefined scale for eye-opening, verbal response, and motor response.
In case of a lack of medical personnel or access to the injured victim, this assessment cannot be completed, delaying medical attention and stalling any judgement on the cognitive and mental fitness of the trauma victim.
The claimed invention overcomes this limitation by automating the cognitive assessment process. The claimed invention is able to measure the injury incident, grade the trauma level, and then run cognitive and consciousness assessment tests while monitoring the health vitals of the injured user. All these measurements are made while the user is on active on-field duty, without requiring the assistance of any off-field external device or software-based analysis.
The method allows the user wearing the device to understand their cognitive state and decision-making levels. In case the user is incapable of operating the simple assessment tests or fails to record correct responses, responsible first-aid or medical personnel or supervisory staff are quickly engaged, decreasing the response time to such injuries and potentially saving the injured victim's life.
The system accounts for assessment of individuals with the whole spectrum of post-traumatic conditions, such as disorientation, loss of haptic function, loss of memory function, slow cognitive decline, slower reaction rates, loss of cognitive understanding and basic situational awareness, and, in extreme cases, unconsciousness, paralysis, coma, or fatality.
The system has an aural, visual, and tactile assessment system for on-field post-traumatic understanding of the mental state of the victim in near real time. The current state-of-the-art systems are only capable of recording specific inputs from the trauma victim related to the traumatic brain injury. The proposed solution allows for recording user responses across the whole spectrum of post-injury conditions mentioned above.
SUMMARY
This summary is provided to introduce concepts related to a system and a method for real time cognitive assessment of user post traumatic brain injury and the concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
In one embodiment, a device for real time cognitive assessment is illustrated in accordance with an embodiment of the present invention. The device may comprise a memory and a processor. The processor is configured for processing programmed instructions stored in the memory. The processor may execute programmed instructions stored in the memory for receiving health data of a user from a set of health monitoring sensors, inertial motion data from a set of motion sensors, and environmental data from a set of environment monitoring sensors. The processor may execute programmed instructions stored in the memory for processing the health data, inertial motion data, and environmental data to determine occurrence of an event, a type of the event and a severity of the event. The processor may execute programmed instructions stored in the memory for triggering one or more tests from a set of tests based on the type of the event and the severity of the event. Further, the processor may execute programmed instructions stored in the memory for receiving user inputs from the user on the one or more tests from the set of tests, wherein the user inputs are received in the form of haptic, audio, video, text, or a combination thereof. Further, the processor may execute programmed instructions stored in the memory for processing the user inputs to generate one or more alerts.
In another embodiment, a method for real time cognitive assessment is illustrated in accordance with an embodiment of the present invention. The method comprises steps for receiving health data of a user from a set of health monitoring sensors, inertial motion data from a set of motion sensors, and environmental data from a set of environment monitoring sensors. The method comprises steps for processing the health data, inertial motion data, and environmental data to determine occurrence of an event, a type of the event and a severity of the event. The method comprises steps for triggering one or more tests from a set of tests based on the type of the event and the severity of the event. Further, the method comprises steps for receiving user inputs from the user on the one or more tests from the set of tests, wherein the user inputs are received in the form of haptic, audio, video, text, or a combination thereof. Further, the method comprises steps for processing the user inputs to generate one or more alerts.
BRIEF DESCRIPTION OF DRAWINGS
The detailed description is described with reference to the accompanying Figures. In the Figures, the leftmost digit(s) of a reference number identifies the Figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to features and components.
Figure 1 illustrates a device 100 positioned with reference to the centre of mass (COM) of the human head, in accordance with an embodiment of the present invention.
Figure 2 illustrates functionalities of the device 100, in accordance with an embodiment of the present invention.
Figure 3 illustrates a top view of the device 100 and capacitive touch and pattern recognition screen 107, in accordance with an embodiment of the present disclosure.
Figure 4 illustrates a flow chart for different types of cognitive tests, in accordance with an embodiment of the present disclosure.
Figure 4.1 illustrates a pattern recognition screen 201 enabled on a smartwatch 200 or a handheld device 300, in accordance with an embodiment of the present disclosure.
Figure 4.2 illustrates the cognitive test of tracing a falling ball on screen 201 enabled on the smartwatch 200 or the handheld device 300, in accordance with an embodiment of the present disclosure.
Figures 4.3 and 4.4 illustrate two levels of the colour test in either the smartwatch 200 or the handheld device 300, in accordance with an embodiment of the present disclosure.
Figure 4.5 illustrates the memory test of digit recognition in either the smart watch 200 or the handheld device 300, in accordance with an embodiment of the present disclosure.
Figure 4.6 illustrates a symbol test in either the smartwatch 200 or the handheld device 300, in accordance with an embodiment of the present disclosure.
Figure 4.7 illustrates a haptic test in either the smartwatch 200 or the handheld device 300, in accordance with an embodiment of the present disclosure.
Figure 5 illustrates the classification of different types of events depending on the changes in Health, Environmental and Inertial parameters, in accordance with an embodiment of the present disclosure.
Figure 6 illustrates the algorithm for detecting the type of event, in accordance with an embodiment of the present disclosure.
Figure 7 illustrates a flowchart of the proposed method, in accordance with an embodiment of the present disclosure.
Figure 8 illustrates the flow chart of all mathematical calculations in order to obtain the severity of the event, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
Reference throughout the specification to “various embodiments,” “some embodiments,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in various embodiments,” “in some embodiments,” “in one embodiment,” or “in an embodiment” in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
In one implementation, a device for performing real time cognitive assessment is illustrated. The device may be an earpiece, a wristwatch, or any other body mounted device. The device may comprise an audio module, a haptic feedback module, a display screen, and an interactive touch and facial muscle movement monitoring device. The device may be a wearable device, a non-wearable device, or a combination of both. The device may be enabled with a set of inertial motion sensors, a set of environmental monitoring sensors, and a set of health monitoring sensors. The inertial, environmental, and health monitoring sensors may be communicatively coupled with the device.
In one embodiment, the inertial motion data may be received by the device from the set of inertial motion sensors. The set of inertial motion sensors may comprise, but is not limited to, an accelerometer, a gyroscope, a magnetometer, a digital compass, and the like. Furthermore, environmental data may be received from the set of environment monitoring sensors. The set of environment monitoring sensors may comprise, but is not limited to, a pressure sensor, an ambient temperature sensor, an air-flow sensor, an air quality sensor, an air resistance sensor, a humidity sensor, a volatile organic compounds (VOC) sensor, and the like. The set of environment monitoring sensors captures motion, orientation, displacement, and atmospheric parameters, wherein the atmospheric parameters include atmospheric pressure, ambient temperature, humidity, air resistance, air quality, elevation, and VOC. Further, health data of the user may be received from the set of health monitoring sensors. The health monitoring sensors may comprise, but are not limited to, a photoplethysmogram, an oxygen saturation level detection sensor, an electrocardiogram, a pulse rate sensor, a heart rate sensor, a body temperature sensor, a galvanic skin response sensor, an electrodermal activity detection sensor, and the like. Upon receiving the inertial motion data, environmental data, and health data of the user, one or more wearable devices may analyse the inertial motion data, environmental data, and health data to determine the occurrence of an event, the type of the event, and the severity of the event. The event may be any type of injury to the user of the device. The health data may be further processed to determine the type of the event and the severity of the event. In one embodiment, the type of event is at least one of blunt trauma, projectile injury, shockwave injury, and fall damage. Further, the severity of the event may be classified as mild, moderate, severe, or fatal.
In one embodiment, based on the occurrence of the event, one or more tests from a set of tests may be triggered on the device to assess the health of the user. The set of tests is determined based on the type and severity of the event. In one embodiment, the set of tests is enabled to obtain the psychological state of the user associated with the device. The set of tests comprises tests which are automated to determine the vision, motor, aural, and memory capabilities of the user post occurrence of the event. The set of tests may comprise one or more tests to assess the colour recognition capabilities, pattern recognition capabilities, and simple arithmetic computation capabilities of the user. The set of tests is periodically repeated to test the mental stability and decision-making capability of the user. For this purpose, the wearable devices may generate periodic tests, capture responses to the tests from the user, analyse the responses, and report whether the user is cognitively sound or not over a period of time. The present invention is further elaborated with reference to figure 1.
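By way of illustration only, the following is a minimal sketch of how a test battery might be selected from the type and severity of a detected event. The specification does not prescribe a particular mapping; the event-to-test assignment, the severity rule, and all identifiers below are hypothetical.

```python
# Hypothetical mapping of event types to cognitive test batteries; the
# specification leaves the exact assignment open.
EVENT_TESTS = {
    "shockwave_injury":  ["aural", "pattern", "colour"],   # e.g. blast: hearing first
    "blunt_trauma":      ["memory", "symbol", "pattern"],
    "projectile_injury": ["vision", "memory", "haptic"],
    "fall_damage":       ["motor", "haptic", "arithmetic"],
}

# Assumed rule: more severe events trigger more tests from the battery.
SEVERITY_TEST_COUNT = {"mild": 1, "moderate": 2, "severe": 3}

def select_tests(event_type: str, severity: str) -> list[str]:
    """Return the one or more tests to trigger for a detected event."""
    battery = EVENT_TESTS.get(event_type, ["pattern", "memory"])
    return battery[:SEVERITY_TEST_COUNT.get(severity, len(battery))]

print(select_tests("shockwave_injury", "moderate"))  # ['aural', 'pattern']
```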
Referring to figure 1, a device 100, such as an ear-phone, for performing and recording test results is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the device 100 may record inertial motion data, environmental data, and health data from a set of health monitoring sensors, a set of motion sensors, and a set of environment monitoring sensors associated with the device 100. The device 100 may be enabled with a memory and a processor in order to process the data received from the set of inertial motion sensors, health monitoring sensors, and environmental monitoring sensors. Based on the analysis of the data, the processor may determine the occurrence of an event, such as an accident, a fall from a building, a high speed projectile collision with the head, a strike to the head with a blunt object, or a bomb explosion, which may cause a traumatic brain injury to the user. The data may also be analysed in order to determine the type of the event and the severity of the event. The type of event may be at least one of blunt trauma, projectile injury, shockwave injury, and fall damage. The device 100 may also be able to determine the severity of the event, classified as mild, moderate, severe, or fatal. These classifications are substantially helpful for quick assessment using cognitive tests and for specifying to first responding medical units what type and severity of event has occurred, so they can respond with precision and swiftness. Upon detection of such an event, the processor 102 may generate or identify a set of tests for cognitive assessment that are relevant to the type and severity of the event associated with the user.
For example, if a bomb blast has occurred, the processor 102 may identify a set of cognitive tests to assess the hearing capabilities of the user. On the other hand, if an accident has occurred, the set of tests may include one or more tests to assess the movement of body parts of the user. In one embodiment, a common set of tests may be used to assess all types and severity of the event. The individual components in the ear-phone are illustrated with reference to figure 2.
Referring now to figure 2, a block diagram of the device 100 in the form of an ear-phone is illustrated. The ear-phone may comprise a set of modules. The set of modules may comprise a memory 101, a processor 102, a haptic motor 103, a health monitoring sensor 104, inertial motion sensors (accelerometer and gyroscope) 105, environmental sensors 106, a touchpad and pattern recognition interface 107, and a wireless bone conduction communication module 108. In one embodiment, the wireless communication module 108 is configured to communicate with the haptic motor 103, the external environmental sensors 106, the trauma monitoring sensors (accelerometer and gyroscope) 105, and the in-ear health monitoring sensor 104, as well as the touchpad and pattern recognition interface 107. According to one embodiment, an aural, tactile, haptic, and visual feedback system is enabled on the device 100 with the help of the set of modules.
In one embodiment, the sensors on the device 100 may be configured to capture inertial motion data, health data, and environmental data. The device 100 may be configured to process the health data and environmental data to determine an event such as a fall from a cliff or a bomb blast/gun shot in the vicinity of the user. For determining such an event, the acceleration of the user's body or body parts, sudden changes in temperature or pressure in the environment, shock waves, and sound waves may be detected based on the inertial motion data, health data, and environmental data. Also, health data of the user such as heart rate, heart rate variability, pulse rate, oxygen saturation, respiration rate, blood pressure, body temperature, and the like may be analysed in order to determine the severity of the event on the user. Furthermore, based on the severity of the event, the processor 102 may generate or identify a set of tests for cognitive assessment of the user. The cognitive assessment tests may be conducted on the device 100 or any other wearable or handheld device associated with the user. One such additional device is illustrated with reference to figure 3.
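A minimal sketch of such event detection is given below. The threshold values and function names are purely illustrative assumptions; a real device would calibrate thresholds per sensor, user, and mounting position.

```python
import math

# Illustrative thresholds only; not taken from the specification.
ACCEL_SPIKE_G = 8.0      # linear acceleration spike suggesting an impact
PRESSURE_JUMP_HPA = 5.0  # abrupt atmospheric pressure change (shock wave)
ELEVATION_DROP_M = 2.0   # sudden loss of elevation (possible fall)

def detect_event(accel_g: tuple[float, float, float],
                 d_pressure_hpa: float,
                 d_elevation_m: float) -> bool:
    """Flag a candidate event when any monitored channel changes abruptly."""
    accel_mag = math.sqrt(sum(a * a for a in accel_g))
    return (accel_mag > ACCEL_SPIKE_G
            or abs(d_pressure_hpa) > PRESSURE_JUMP_HPA
            or d_elevation_m < -ELEVATION_DROP_M)

print(detect_event((9.5, 1.0, 0.5), 0.2, 0.0))  # True: acceleration spike
```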
Referring now to figure 3, the device 100, such as an ear-phone, is illustrated. According to an embodiment, the ear-phone may be enabled with a screen 107 and may generate a pattern identification test to verify the decision-making skills of the user post occurrence of the event, by asking the user to identify or reproduce patterns on the display screen 107 of the ear-phone. In a similar manner, multiple cognitive tests can be conducted using the device 100. A few of the tests for assessing the physical and mental state of the user are illustrated with reference to the flow chart of figure 4.
Referring to figure 4, the flowchart defines different types of cognitive tests, including aural, visual, and tactile tests. Pattern recognition and touch input tests are triggered by a tactile assessment system enabled on the device 100, wherein pattern identification tests may verify the decision-making skills of the user post occurrence of the event by asking the user to identify patterns. Visual input tests are triggered by the visual assessment system and include tracking the movement shown on the interactive screen by tracing a falling ball, a symbol test, and colour and digit recognition. These tests check for coherency, memory, and attention post traumatic brain injury of the user. A graphical presentation of the data is provided depending on the test responses of the user. The user may mute or postpone any tests by special acknowledgement, in scenarios where it is difficult to engage in extended testing or threat levels are still high for the user. The device 100 is configured to continuously monitor the health and environmental parameters during this period.
Referring now to figure 4.1, a device such as a smartwatch 200 or a handheld device 300 is illustrated. The smartwatch 200 or the handheld device 300 may be communicatively coupled with the device 100 and may be utilized for activating the set of tests to monitor the health and proper motor functioning of the user. The set of tests is determined based on the type of event. For example, if the event is related to a bomb blast, then the visual and decision-making skills may be tested using the set of tests. For this purpose, the smartwatch 200 or the handheld device 300 can display pattern identification tests to verify the decision-making skills of the user post occurrence of the event, and receive inputs by asking the user to identify patterns on the smartwatch 200 or the handheld device 300. In another example, the device 100 itself may be used for conducting the set of tests, as represented in figure 3.
Referring to figure 4.2, the device may be a smart watch 200 or a handheld device 300. The device 200 or 300 can generate a test for tracking the movement shown on the interactive screen by tracing the falling ball. In this test, both the balls glow once the path traced by the user on the interactive screen 201 of the device 200 or 300 matches the original zone. In another embodiment of the invention, the test for tracking the movement shown on the interactive screen is performed on the touchpad 107 configured on the device 200 or 300. Furthermore, according to another embodiment of the present invention, in figure 4.5 the device 200 or 300 generates the test for digit recognition. In this test, the user has to memorise the number displayed on the screen 201 and then select the number displayed before on the screen as an answer to the test. The results include a red highlight indicating a wrong entry and a green highlight indicating a correct entry.
Figures 4.3 and 4.4 illustrate two levels of the colour test in either a smart watch 200 or a handheld device 300, in accordance with an embodiment of the present disclosure. In this test, the user has to select a colour from a plurality of colours displayed on the screen based on the question displayed on the screen.
Figure 4.5 illustrates the memory test of digit recognition in either a smart watch 200 or a handheld device 300, in accordance with an embodiment of the present disclosure. In this test, the user may be instructed to memorise numbers or alphabets and reproduce or select the same in a next screen displayed on the device 200 or 300.
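The logic of such a memory test is straightforward; the sketch below is an assumption-laden console stand-in (display time, digit count, and a text prompt in place of the device screen) rather than the device's actual implementation.

```python
import random
import time

def digit_memory_test(n_digits: int = 4, display_time_s: float = 2.0) -> bool:
    """Show a random digit string, wait, then compare the user's recall.
    Returns True for a correct (green highlight) response."""
    target = "".join(random.choice("0123456789") for _ in range(n_digits))
    print(f"Memorise: {target}")
    time.sleep(display_time_s)   # on-device, the screen would be cleared here
    response = input("Enter the digits you saw: ").strip()
    return response == target
```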
Referring to figure 4.6, according to another embodiment of the present invention, the device 200 or 300 may be configured to generate the test for symbol recognition. In this test, a symbol is displayed on the screen 201. The user has to memorise and draw the symbol on the screen 201 as a response.
In another embodiment of the present invention, referring to figure 4.7, the device 200 or 300 may generate a test for haptic feedback. The device 200 or 300 comprises a haptic feedback device configured with a linear vibration motor fixed inside the device. According to this embodiment, for assessing the haptic feedback, the user has to count the number of vibrations and respond by selecting a number equivalent to the number of vibrations.
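A sketch of this vibration-counting test follows. The pulse timing and the pulse_fn stand-in for the vibration motor driver are illustrative assumptions.

```python
import random
import time

def haptic_count_test(pulse_fn, max_pulses: int = 5) -> bool:
    """Emit a random number of vibration pulses via pulse_fn (a stand-in
    for the linear vibration motor driver), then check the user's count."""
    pulses = random.randint(1, max_pulses)
    for _ in range(pulses):
        pulse_fn()         # one short vibration
        time.sleep(0.5)    # assumed gap between pulses
    answer = int(input("How many vibrations did you feel? "))
    return answer == pulses

# Console stand-in for the motor:
# haptic_count_test(lambda: print("bzz"))
```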
Referring now to figure 5, the classification of different types of events based on the analysis of inertial motion parameters (linear and angular acceleration), health parameters, and external environmental parameters is illustrated. The linear and angular acceleration parameters are detected based on detection of cranial displacement and acceleration. The types of event include blunt injury, fall injury, projectile injury, and shockwave injury. As shown in figure 5, a blunt injury is detected when the cranial acceleration and displacement is of a damping nature over a short interval of time. A fall injury follows either a change in cranial orientation, displacement, and acceleration, or a sudden change in elevation of the human body. A projectile injury occurs when a high impulse impact is obtained in a small fraction of time. A shockwave injury can happen due to a sudden change in atmospheric pressure and air resistance. Finally, the state of the user (concussed, unconscious, or conscious) is determined from the health parameters and the severity of the event.
Furthermore, depending on the data received from the set of sensors and upon analysing the data, the user state is determined and categorised as concussed, unconscious, or conscious. It must be noted that the environmental data is processed to determine an adverse event, and the health monitoring data is processed to trigger cognitive assessment tests through the one or more wearable devices with the user.
Referring to figure 6, an algorithm for detecting the type of event (blunt injury, fall injury, projectile injury, or shockwave injury) and assessing the health of the user post traumatic brain injury is illustrated as a flowchart.
At step 601, the sensor data is analysed for detection of cranial motion and orientation.
At step 602, if there is any change in cranial motion and orientation, the sensor data is further analysed to detect changes in environmental and health parameters.
At step 603, the probability of different kinds of injuries is calculated based on the sensor data.
Tables 1, 2, and 3 below may be used for determining the final probability of each type of event by using the data of the different parameters mentioned above; an illustrative scoring sketch follows the tables. The type of event having the highest probability may be considered as the injury caused to the user.
Table 1: Changes in motion parameters for evaluating the initial probability of each type of event.

| Motion Parameter | Blunt Trauma | Projectile Injury | Shockwave Injury | Fall Damage |
|---|---|---|---|---|
| Cranial displacement and acceleration | Damped | High | NA | Sudden |
| Velocity impact | High | High | Low | Low |
| Angular velocity | Yes | Yes | No | No |
| Type of motion | Angular + Linear | Angular | Linear | Linear |
Table 2: Changes in vital health parameters.

| Vital Health Parameter | Blunt Trauma | Projectile Injury | Shockwave Injury | Fall Damage |
|---|---|---|---|---|
| Body temperature | Increase | Increase | No change | No change |
| SpO2 | Decrease | Decrease | Decrease | No change |
| Pulse rate | Increase | Increase | Increase | No change |
| Blood pressure | Increase | Increase | Increase | No change |
| Respiration rate | No change | No change | Decrease | No change |
Table 3: Changes in environmental parameters.

| Environmental Parameter | Blunt Trauma | Projectile Injury | Shockwave Injury | Fall Damage |
|---|---|---|---|---|
| Atmospheric temperature | Increase | Increase | Increase | No change |
| Ambient temperature | No change | No change | Changes | No change |
| Humidity | No change | No change | Changes | No change |
| Air quality | No change | No change | Changes | Might change |
| Elevation | No change | No change | No change | Changes |
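One possible reading of Tables 1, 2, and 3 is a simple voting scheme: each observed parameter change adds a vote to every event type whose expected signature matches it, and the highest-scoring type is taken as the injury. The equal weighting, the encoding of the table entries, and all identifiers below are assumptions for illustration; only a subset of the table rows is transcribed.

```python
# Hypothetical scoring over Tables 1-3: each observed change votes for the
# event types it is consistent with (subset of table rows shown).
EVENT_TYPES = ["blunt_trauma", "projectile_injury", "shockwave_injury", "fall_damage"]

SIGNATURES = {
    "body_temperature": {"increase": ["blunt_trauma", "projectile_injury"]},
    "spo2":             {"decrease": ["blunt_trauma", "projectile_injury", "shockwave_injury"]},
    "pulse_rate":       {"increase": ["blunt_trauma", "projectile_injury", "shockwave_injury"]},
    "humidity":         {"changes":  ["shockwave_injury"]},
    "elevation":        {"changes":  ["fall_damage"]},
}

def classify(observations: dict[str, str]) -> str:
    """observations maps a parameter name to its observed change."""
    scores = {t: 0 for t in EVENT_TYPES}
    for param, change in observations.items():
        for event in SIGNATURES.get(param, {}).get(change, []):
            scores[event] += 1
    return max(scores, key=scores.get)  # event type with the highest probability

print(classify({"spo2": "decrease", "pulse_rate": "increase",
                "humidity": "changes"}))  # shockwave_injury
```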
At step 604, the system is configured to enable the cognitive assessment tests and capture the user inputs on the cognitive assessment tests. In one embodiment, if the user is not able to attempt a test in the set of tests, then the complexity of the remaining tests in the set of tests may be reduced. On the other hand, if the user is able to attempt the set of tests without much difficulty, the complexity of the remaining tests in the set of tests may be increased.
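A minimal sketch of this difficulty adjustment (also recited in claim 8) is shown below; the three discrete levels are an assumption, since the specification only requires that subsequent tests become harder after appropriate inputs and easier otherwise.

```python
# Assumed discrete difficulty ladder; the specification only requires
# harder tests after correct inputs and easier tests after failures.
DIFFICULTY_LEVELS = ["easy", "normal", "hard"]

def adjust_difficulty(current: str, answered_correctly: bool) -> str:
    """Move one level up after a correct response, one level down otherwise."""
    i = DIFFICULTY_LEVELS.index(current)
    i = min(i + 1, len(DIFFICULTY_LEVELS) - 1) if answered_correctly else max(i - 1, 0)
    return DIFFICULTY_LEVELS[i]

level = "normal"
for ok in [True, True, False]:   # simulated responses to successive tests
    level = adjust_difficulty(level, ok)
print(level)                     # 'normal': up to 'hard', then back down
```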
Figure 7 illustrates a flowchart of the proposed method, in accordance with an embodiment of the present disclosure.
At step 701, the device 100 is configured to capture data from a set of motion detection sensors and analyse the data to determine occurrence of an event.
At step 702, upon occurrence of the event, the device 100 is configured to capture changes in the health and environmental data captured from the set of health monitoring sensors 104 and the set of environmental monitoring sensors 106.
At step 703, the device 100 is configured to determine the probability of different types of events and the severity of the event based on the change in the parameters of the health data and environmental data.
At step 704, a set of tests are triggered on the device 100, 200, or 300 based on the type of the event and the severity of the event. Furthermore, the device 100, 200, or 300 may receive user inputs from the user on the one or more tests from a set of tests. The user inputs are received in the form of haptic, audio, video, text, or combination thereof. Further, the device 100, 200, or 300 may process the user inputs to generate one or more alerts. These alerts may be transmitted to a medical team or a nearby hospital using wireless communication.
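To illustrate step 704, the sketch below folds the user's test responses into an alert payload suitable for wireless transmission. The payload fields, the pass-rate threshold, and the JSON encoding are all hypothetical choices, not requirements of the specification.

```python
import json

def build_alert(event_type: str, severity: str, responses: list[bool],
                pass_threshold: float = 0.5):
    """Return a JSON alert when the pass rate falls below the threshold,
    else None (user assessed as cognitively sound for now)."""
    pass_rate = sum(responses) / len(responses) if responses else 0.0
    if pass_rate >= pass_threshold:
        return None
    return json.dumps({
        "alert": "cognitive_impairment_suspected",
        "event_type": event_type,
        "severity": severity,
        "pass_rate": round(pass_rate, 2),
    })

print(build_alert("fall_damage", "moderate", [False, True, False]))
```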
Referring to figure 8, a flowchart for determining the severity of the event is shown along with all the relevant mathematical calculations. The angular and linear motion parameters are used to calculate parameters such as the acceleration at the point of interest, the Gadd Severity Index, the Head Injury Index, the General Acceleration Model, and the Weighted Principal Component Score, which together provide information on the severity of the event, the direction of the impact, and the user state. The figure also shows the user receiving recorded audio for the cognitive assessments post occurrence of the event, guiding them through the rules of each test. Whether or not the user reacts to the audio, in turn, helps to determine the consciousness level of the user.
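For context, the Gadd Severity Index is conventionally computed as GSI = ∫ a(t)^2.5 dt with a(t) in g, and the closely related Head Injury Criterion (HIC) maximises (t2 - t1) times the mean acceleration raised to the power 2.5 over a bounded time window. The sketch below computes the standard GSI and HIC from sampled linear acceleration; whether the "Head Injury Index" named above corresponds exactly to the HIC is an assumption, not a statement from the specification.

```python
def gadd_severity_index(accel_g: list[float], dt: float) -> float:
    """GSI = integral of a(t)^2.5 dt, with a in g and t in seconds
    (rectangle-rule approximation over uniform samples)."""
    return sum(a ** 2.5 for a in accel_g) * dt

def head_injury_criterion(accel_g: list[float], dt: float,
                          max_window_s: float = 0.015) -> float:
    """HIC = max over windows of (t2 - t1) * (mean acceleration)^2.5,
    with the window length capped at max_window_s (HIC15 by default)."""
    n = len(accel_g)
    max_w = max(1, int(max_window_s / dt))
    best = 0.0
    for i in range(n):
        total = 0.0
        for j in range(i, min(n, i + max_w)):
            total += accel_g[j] * dt
            duration = (j - i + 1) * dt
            best = max(best, duration * (total / duration) ** 2.5)
    return best

# 3 ms rectangular pulse of 100 g sampled at 1 kHz:
pulse = [100.0] * 3
print(gadd_severity_index(pulse, 0.001))    # 300.0
print(head_injury_criterion(pulse, 0.001))  # 300.0 for this flat pulse
```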
Although the system and method for real time cognitive assessment of a user, post traumatic brain injury, have been described in language specific to structural features and methods, it must be understood that the claims are not limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for the system and the method for real time cognitive assessment of a user, post traumatic brain injury.