
Real Time Emotion Recognition And Rehabilitation System

Abstract: A real-time emotion recognition and rehabilitation system comprises an L-shaped frame 101 fitted with a touch-interactive display panel 102 through which a user enters personal details; an artificial intelligence-based imaging unit 103 installed on the frame 101 to determine the user's facial expressions; a wrist band 104 worn by the user while answering a questionnaire, allowing a sensing module integrated on the wrist band 104 to monitor the user's health conditions; a virtual reality-based headset 105 worn in front of the user's eyes to play a game that provides relaxation to the user; and a platform 106 arranged with the frame 101 and integrated with a plurality of motorized iris lids 107 that allow motorized pop-out objects to pop out for calming and providing rehabilitation to the user.


Patent Information

Application #
Filing Date
19 November 2024
Publication Number
50/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Marwadi University
Rajkot – Morbi Road, Rajkot 360003 Gujarat India.

Inventors

1. Dolly Shah
Department of Computer Engineering – Artificial Intelligence, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
2. Uday Nandaniya
Department of Computer Engineering – Artificial Intelligence, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
3. Dr. Madhu Shambhu Shukla
Professor, Head of the Department, Department of Computer Science Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
4. Simrin Fathima Syed
Assistant Professor, Department of Computer Science Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
5. Vipul Ladva
Assistant Professor, Department of Computer Science Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
6. Akshay Ranpariya
Assistant Professor, Department of Computer Science Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.
7. Neel Dholakia
Assistant Professor, Department of Computer Science Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot – Morbi Road, Rajkot 360003 Gujarat India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to a real-time emotion recognition and rehabilitation system designed to assess a user's emotional state and offer rehabilitation based on that analysis. Additionally, the proposed system evaluates the user's current mood and provides means, such as playing games or entertainment, to shift the user's state of mind from anger, sadness, or depression to a more joyful and relaxed state.

BACKGROUND OF THE INVENTION

[0002] Rehabilitation through real-time emotion recognition has become increasingly important as mental health and emotional well-being are recognized as essential components of overall health. Many individuals today experience emotional challenges such as stress, anxiety, anger, sadness, or depression, which can significantly impact daily functioning and quality of life. Early detection and intervention are key to improving emotional health and preventing further deterioration. Real-time emotion recognition technologies, powered by AI and machine learning, offer a promising solution to identifying these emotional states as they occur, allowing for timely and targeted interventions. By continuously monitoring facial expressions, voice tone, and physiological data such as heart rate or skin conductivity, these systems can assess a person's emotional condition in real time. Such emotional recognition systems enable personalized rehabilitation methods, which can be tailored to the user’s specific needs at any given moment.

[0003] Real-time emotion recognition systems are increasingly being used in rehabilitation programs to support users in managing their emotional well-being. These systems rely on various equipment such as wearable devices (e.g., smartwatches, EEG headsets), facial recognition software, and speech analysis tools. Wearable devices can monitor physiological indicators like heart rate, skin conductance, and body temperature, providing real-time data that can be used to assess a user's emotional state. Facial recognition software analyzes micro-expressions and facial muscle movements to detect emotions, while speech analysis tools evaluate tone, pitch, and speech patterns to assess emotional changes. Combined, these technologies can offer valuable insights into a user’s emotional state, helping therapists and rehabilitation specialists tailor interventions more effectively. However, despite their potential, these systems have several drawbacks. For one, they may not always accurately interpret emotions, as individuals express feelings in unique ways, and cultural or personal differences can affect emotional cues. Wearable devices may also be uncomfortable or intrusive, reducing user compliance over time. Additionally, privacy concerns arise, especially regarding sensitive emotional data that could be misused or accessed by unauthorized parties. Lastly, the reliance on technology may overlook the need for human empathy and personalized care, which is often crucial in emotional rehabilitation.

[0004] CN210205593U discloses an auxiliary appliance for rehabilitation nursing. The device comprises a medicine-applying cylinder and a massager; a cover is arranged at the top of the medicine-feeding cylinder, which is composed of a liquid-medicine pipe and a ball. A connector is arranged at the bottom of the medicine-feeding cylinder; a medicine-adding opening is formed in one side of the bottom end of the medicine-feeding barrel; an end cover is arranged at the end of the medicine-adding opening and is fixedly connected with it through threads. The ball is embedded in the top of the medicine-feeding cylinder and movably connected with it in a clamping manner; inner partition plates are arranged at the bottom of the ball. The device has the advantages that medicine can be conveniently applied to a traumatic-injury site while massaging it, the hands are not contaminated by the medicine, the massaging strength can be maintained, and recovery of traumatic injuries is aided.

[0005] CN103473631A discloses a rehabilitation therapy management system which comprises a rehabilitation therapy server unit, a rehabilitation therapy evaluating and diagnosing workstation unit, a rehabilitation equipment unit, a communication unit and a power supply unit; the rehabilitation therapy server unit, the communication unit, the rehabilitation therapy evaluating and diagnosing workstation unit and the rehabilitation equipment unit are connected in sequence; the power supply unit is respectively connected with electric equipment in the rehabilitation therapy server unit, the communication unit, the rehabilitation therapy evaluating and diagnosing workstation unit and the rehabilitation equipment unit. According to the rehabilitation therapy management system disclosed by the invention, the disadvantages of low working efficiency, fussy therapy process, waste of rehabilitation resource, poor reliability, small application range and the like in the prior art can be overcome; the advantages of high working efficiency, convenience in therapy, shared resource, good reliability and large application range can be realized.

[0006] Conventionally, many systems have been developed to provide rehabilitation support; however, the systems described in the prior art are limited in their analysis of a user's emotions and fail to entertain the user in order to change the user's state of mind.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art for a system capable of analyzing a user's emotions to provide effective rehabilitation. The developed system assesses the user's emotional condition and responds by offering games or activities that help transition the user's mood from negative feelings such as anger, sadness, or depression to a positive, relaxed state.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a system that is capable of analyzing a user's emotions for providing rehabilitation to the user.

[0010] Another object of the present invention is to develop a system that is capable of evaluating the current state of mind of the user and accordingly enables the user to play a game and be entertained, in view of changing the user's state of mind from anger, sadness, or depression to a joyful and relaxed state.

[0011] Yet another object of the present invention is to develop a system that is capable of monitoring health parameters of the user.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to a real-time emotion recognition and rehabilitation system that evaluates a user's emotional state and provides rehabilitation support tailored to the user's needs. By assessing the user's mood, the system enables the user to play games for entertainment that help shift negative emotions, such as anger or sadness, into a more positive, joyful mindset.

[0014] According to an embodiment of the present invention, a real-time emotion recognition and rehabilitation system comprises an L-shaped frame developed to be installed on a ground surface, wherein the frame is arranged with a touch-interactive display panel that is accessed by a user for providing input regarding details of the user, which is stored in a database linked with an inbuilt microcontroller; an artificial intelligence-based imaging unit installed on the frame and integrated with a processor for capturing and processing multiple images in the vicinity of the frame to determine facial expressions of the user, in accordance with which the microcontroller prompts a questionnaire for the user on the panel, based on which the microcontroller evaluates a current state of mind of the user; and a wrist band associated with the system and developed to be worn by the user while answering the questionnaire, allowing a sensing module integrated on the wrist band to monitor health conditions of the user, wherein the microcontroller is pre-fed with an OpenCV Haar Cascade Classifier protocol that is implemented by the microcontroller to evaluate the required test or activity to be performed by the user as per the user's current state of mind.

[0015] According to another embodiment of the present invention, the proposed system further comprises a virtual reality-based headset that is accessed by the user, who wears the headset in front of the user's eyes; based on the evaluated test or activity, the microcontroller actuates a screen integrated inside the headset to allow the user to play a game, providing relaxation from the current state of mind in case that state corresponds to anger, sadness, or depression. The system further comprises a platform arranged with the frame, laid on the surface, and integrated with a plurality of motorized iris lids that are actuated by the microcontroller to open, allowing motorized pop-out objects integrated underneath each of the lids to be deployed so that the user can crush the objects using the user's feet, performing the activity for calming and providing rehabilitation in case the current state of mind corresponds to irritability or frustration. A speaker mounted on the frame is actuated by the microcontroller to produce music for calming the user, and a microphone is integrated on the frame for receiving a voice command of the user while answering any particular question in the prompted questionnaire.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a real-time emotion recognition and rehabilitation system.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to a real-time emotion recognition and rehabilitation system that is capable of analyzing and interpreting a user's emotional state and offering rehabilitation through targeted interventions. The proposed system is further capable of determining the user's mental condition and offering games or entertainment to transition emotions from anger, sadness, or depression to a calmer, more joyful state.

[0022] Referring to Figure 1, an isometric view of a real-time emotion recognition and rehabilitation system is illustrated. The system comprises an L-shaped frame 101 arranged with a touch-interactive display panel 102, an artificial intelligence-based imaging unit 103 installed on the frame 101, a wrist band 104 associated with the system, a virtual reality-based headset 105 associated with the system, a platform 106 arranged with the frame 101, a plurality of motorized iris lids 107 integrated with the platform 106, motorized pop-out objects with a slider 108 integrated underneath each of the lids, a speaker 109 mounted on the frame 101, and a microphone 110 integrated on the frame 101.

[0023] The proposed invention includes a frame 101, preferably L-shaped, incorporating the various components associated with the system and developed to be positioned on a ground surface. The frame 101 is made of any material selected from, but not limited to, metal or plastic that ensures rigidity of the frame 101 for longevity of the system.

[0024] A user is required to press a switch button arranged on the frame 101 to activate the system and its associated processes. The switch button, when pressed by the user, closes an electrical circuit and allows current to flow, powering the associated microcontroller of the system so that all linked components can perform their respective functions upon actuation.

[0025] The microcontroller mentioned herein is preferably an Arduino microcontroller, which controls the overall functionality of the components linked to it. Arduino is an open-source electronics programming platform.

[0026] After activation of the system, the user accesses a touch-interactive display panel 102 installed on the frame 101 to provide input regarding the user's details. When the user touches the surface of the display panel 102 to enter details, the panel's internal circuitry senses the touched option and converts the physical touch into an electrical signal. The microcontroller processes the received signal to determine the user's selection and stores the response in a linked database for further functions related to the user input.
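As a concrete illustration of the storage step, the sketch below persists the user's details with Python's sqlite3 module. This is an assumed implementation: the specification only states that the input is stored in a database linked with the inbuilt microcontroller, and the table name and fields here are hypothetical.

```python
import sqlite3

def store_user_details(conn: sqlite3.Connection, name: str, age: int) -> None:
    """Persist one user record entered on the touch panel (illustrative schema)."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT, age INTEGER)")
    # Parameterized query: the values come straight from the panel input
    conn.execute("INSERT INTO users VALUES (?, ?)", ("test_user", 30) if False else (name, age))
    conn.commit()

# In-memory database stands in for the microcontroller-linked storage
conn = sqlite3.connect(":memory:")
store_user_details(conn, "test_user", 30)
```

A real deployment on an Arduino-class board would more likely log over serial to a host database, but the flow (sense touch, decode selection, persist) is the same.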

[0027] Upon receiving the user input, the microcontroller generates a command to activate an artificial intelligence-based imaging unit 103 integrated on the frame 101 for capturing multiple images in the vicinity of the frame 101 to determine facial expressions of the user. The imaging unit 103 incorporates a processor embedded with an artificial intelligence protocol. The artificial intelligence protocol operates by following a set of predefined instructions to process data and perform tasks autonomously. Initially, data is collected and input into a database, which then employs the protocol to analyze and interpret the captured images. The processor of the imaging unit 103, via the artificial intelligence protocol, processes the captured images and sends the resulting signal to the microcontroller.

[0028] The microcontroller assesses the facial expressions of the user and accordingly prompts a questionnaire for the user on the display panel 102. The user is required to provide input by answering the questionnaire. The system is associated with a wrist band 104 which is required to be worn by the user while answering the questionnaire. The wrist band 104 is interconnected with the microcontroller via a wireless communication module, which includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, or a GSM (Global System for Mobile Communication) module.

[0029] The user is enabled to provide voice input, via a microphone 110 mounted on the frame 101, for answering any particular question in the prompted questionnaire. The microphone 110 converts the sound energy emitted by the user into electrical energy. The sound waves created by the user carry energy toward the microphone 110, inside which a plastic diaphragm moves back and forth as the sound waves hit it. A coil attached to the diaphragm moves in the same way, and the magnetic field produced by a permanent magnet cuts through the coil; as the coil moves, an electric current flows. The current from the coil flows to an amplifier, which converts the sound into an electrical signal. The microcontroller linked to the microphone 110 recognizes the voice and performs operations according to the command given by the user when answering any particular question in the prompted questionnaire.

[0030] The wrist band 104 incorporates a sensing module to monitor health conditions of the user. The sensing module includes FBG (Fiber Bragg Grating) and PPG (photoplethysmography) sensors. The FBG sensor reflects a wavelength of light that shifts in response to variations in vital health parameters; by tracking this shift in the periodic variation of the grating's refractive index, the FBG sensor detects the user's vital health parameters. The PPG sensor works by obtaining a plethysmogram of the user to detect blood-volume changes in the microvascular bed of tissue; it uses a pulse oximeter that illuminates the skin and measures changes in light absorption to monitor the user's health parameters.
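To make the PPG principle concrete, the sketch below estimates pulse rate from a sampled PPG waveform by counting rising threshold crossings. This is an illustrative calculation, not from the patent; a real sensing module would apply filtering and hardware-specific calibration.

```python
def estimate_bpm(samples, sample_rate_hz):
    """Estimate beats per minute from raw PPG samples by counting
    upward crossings of the signal's midpoint threshold."""
    threshold = (max(samples) + min(samples)) / 2
    beats = 0
    above = samples[0] > threshold
    for s in samples[1:]:
        now_above = s > threshold
        if now_above and not above:
            beats += 1          # one rising crossing ~ one heartbeat
        above = now_above
    duration_s = len(samples) / sample_rate_hz
    return 60.0 * beats / duration_s
```

On a synthetic 1.2 Hz sine sampled at 50 Hz for 10 seconds, this yields roughly 72 bpm, within one beat of the true rate.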

[0031] The microcontroller collects and processes the combined data from the sensing module. The microcontroller is pre-fed with an OpenCV Haar Cascade Classifier protocol. In accordance with the user's responses, the microcontroller evaluates the current state of mind of the user and implements the OpenCV Haar Cascade Classifier protocol to evaluate the required test or activity to be performed by the user as per the user's current state of mind.
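The mapping from the evaluated state of mind to a test or activity, elaborated in paragraphs [0032] to [0037], can be sketched as simple routing logic. All names below are hypothetical; the patent does not disclose the microcontroller firmware.

```python
# Anger/sadness/depression -> VR game; irritability/frustration -> the
# pop-out-object crushing activity; anything else -> calming music.
ACTIVITY_BY_STATE = {
    "anger": "vr_game",
    "sadness": "vr_game",
    "depression": "vr_game",
    "irritability": "pop_out_objects",
    "frustration": "pop_out_objects",
}

def select_activity(state_of_mind: str) -> str:
    """Return the rehabilitation activity the microcontroller would actuate."""
    return ACTIVITY_BY_STATE.get(state_of_mind.lower(), "calming_music")
```

The fallback to calming music for unlisted states is an assumption for completeness; the specification describes the speaker as used alongside the other activities.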

[0032] The system is associated with a virtual reality-based headset 105 that is accessed by the user, who wears the headset 105 in front of the user's eyes. In case the evaluated current state of mind corresponds to anger, sadness, or depression, the microcontroller actuates the headset 105 to display a game to be played by the user.

[0033] The virtual reality (VR) headset 105 works by immersing the user in a simulated environment through a combination of high-resolution displays and motion sensors. The headset 105 tracks head movements and adjusts the visual display accordingly, enabling the user to play the game displayed on the screen of the headset 105 and provide further head movements as required. The screen of the VR headset 105 enables the user to play the game in an interactive manner, thereby providing relaxation from the current state of mind.

[0034] The bottom portion of the frame 101 is arranged with a platform 106 configured with a plurality of motorized iris lids 107. A motorized pop-out object, driven via a slider 108, is incorporated underneath each of the lids. In case the evaluated current state of mind corresponds to irritability or frustration, the microcontroller actuates the motorized iris lids 107 and the slider 108 simultaneously, deploying the pop-out objects out of the lids.

[0035] Each of the motorized iris lids consists of a ring at the bottom configured with multiple slots along its periphery, multiple blades, and a blade-actuating ring on top. The blades are pivotally jointed with the blade-actuating ring, and the base plate is hooked over the blades. The blade-actuating ring is rotated clockwise and anticlockwise by a DC motor embedded in the blade-actuating ring, which results in the opening of the holes for popping out of the objects.

[0036] The slider 108 is associated with a pair of sliding rails fabricated with grooves in which the wheel of the slider 108 is positioned; the wheel is further connected with a bi-directional motor via a shaft. The microcontroller actuates the bi-directional motor to rotate in a clockwise or anti-clockwise direction, rotating the shaft; the motor converts electrical energy into rotational energy, allowing the wheel to translate over the sliding rail with a firm grip on the grooves. The movement of the slider 108 translates the objects outward from the lids.
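A minimal sketch of the slider's position bookkeeping is shown below, assuming millimetre travel limits and string direction commands. The patent specifies neither a motor driver nor firmware, so every name and unit here is hypothetical.

```python
class SliderActuator:
    """Tracks a pop-out object's travel along the sliding rail."""

    def __init__(self, rail_length_mm: int):
        self.rail_length_mm = rail_length_mm
        self.position_mm = 0  # 0 = fully retracted under the iris lid

    def step(self, direction: str, distance_mm: int) -> int:
        # Clockwise rotation deploys the object; anticlockwise retracts it.
        # Travel is clamped to the physical rail length.
        if direction == "clockwise":
            self.position_mm = min(self.rail_length_mm,
                                   self.position_mm + distance_mm)
        elif direction == "anticlockwise":
            self.position_mm = max(0, self.position_mm - distance_mm)
        else:
            raise ValueError("direction must be clockwise or anticlockwise")
        return self.position_mm

    @property
    def deployed(self) -> bool:
        return self.position_mm >= self.rail_length_mm
```

Clamping to the rail bounds mirrors the physical limit stops a real grooved-rail slider would need, so repeated motor commands cannot drive the position out of range.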

[0037] The user is enabled to play with the pop-out objects by crushing them using the user's feet, performing the activity for calming and providing rehabilitation to the user. The frame 101 is integrated with a speaker 109 that is actuated by the microcontroller to play music in order to calm the user. The speaker 109 takes the input signal from the microcontroller, processes and amplifies the received signal through a series of stages within the speaker 109, and produces music as output for calming the user.

[0038] A battery (not shown in the figure) is associated with the system to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, a cathode and an anode, and uses an oxidation/reduction chemical reaction to do work on charge and produce a voltage between the anode and cathode, thereby producing the electrical energy used to do work in the system.

[0039] The present invention works best in the following manner, where the L-shaped frame 101 as disclosed in the invention is developed to be installed on the surface, with the touch-interactive display panel 102 for user input, linked to the database via the inbuilt microcontroller. The imaging unit 103 captures facial expressions to assess the user's mood, triggering the questionnaire on the display. The system also includes the wristband with the sensing module (featuring FBG and PPG sensors) to monitor the user’s health conditions. The microcontroller processes this data, using the OpenCV Haar Cascade Classifier to evaluate the user’s mental state and determine the appropriate activity. If the user's mood suggests stress or negative emotions (e.g., anger, sadness), the virtual reality (VR) headset 105 is activated to display the game, promoting relaxation. For users displaying irritability or frustration, the motorized platform 106 under the frame 101 opens iris lids, releasing pop-out objects for the user to crush with their feet, serving as the physical activity for rehabilitation. Additionally, the system is equipped with the speaker 109 to play calming music and the microphone 110 for voice commands during the questionnaire, further enhancing the user's interaction and therapeutic experience. The entire process is managed by the microcontroller, which ensures the tailored, interactive, and effective approach to mental and emotional wellness.

[0040] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) A real-time emotion recognition and rehabilitation system, comprising:

i) an L-shaped frame 101 developed to be installed on a ground surface, wherein said frame 101 is arranged with a touch-interactive display panel 102 that is accessed by a user for providing input regarding details of said user, which is stored in a database linked with an inbuilt microcontroller;
ii) an artificial intelligence-based imaging unit 103 installed on said frame 101 and integrated with a processor for capturing and processing multiple images in vicinity of said frame 101, respectively to determine facial expressions of said user, in accordance to which said microcontroller prompts a set of questionnaire for said user on said panel, based on which said microcontroller evaluates a current state of mind of said user;
iii) a wrist band 104 associated with the system and developed to be worn by said user while answering said questionnaire, in view of allowing a sensing module integrated on said wrist band 104 to monitor health conditions of said user, wherein said microcontroller is pre-fed with an OpenCV Haar Cascade Classifier protocol that is implemented by said microcontroller to evaluate the required test or activity to be performed by said user as per said user's current state of mind;
iv) a virtual reality-based headset 105 associated with said system that is accessed by said user for wearing said headset 105 in front of said user’s eyes, wherein based on said evaluated test or activity, said microcontroller actuates a screen integrated inside said headset 105 to allow said user to play a game in view of providing relaxation to said user from current state of mind, in case said current state of mind corresponds to anger, sadness or depression; and
v) a platform 106 arranged with said frame 101, laid on said surface, and integrated with a plurality of motorized iris lids 107 that are actuated by said microcontroller to open, allowing motorized pop-out objects integrated underneath each of said lids 107 to be deployed so that said user can crush said objects using said user's feet, in view of performing said activity for calming and providing rehabilitation to said user, in case said current state of mind corresponds to irritability or frustration.

2) The system as claimed in claim 1, wherein said sensing module includes FBG (Fiber Bragg Grating) and PPG (photoplethysmography) sensors.

3) The system as claimed in claim 1, wherein a speaker 109 is mounted on said frame 101 that is actuated by said microcontroller to produce music for calming said user.

4) The system as claimed in claim 1, wherein a microphone 110 is integrated on said frame 101 for receiving voice command of said user while answering any particular question in said prompted questionnaire.

Documents

Application Documents

# Name Date
1 202421089543-STATEMENT OF UNDERTAKING (FORM 3) [19-11-2024(online)].pdf 2024-11-19
2 202421089543-REQUEST FOR EXAMINATION (FORM-18) [19-11-2024(online)].pdf 2024-11-19
3 202421089543-REQUEST FOR EARLY PUBLICATION(FORM-9) [19-11-2024(online)].pdf 2024-11-19
4 202421089543-PROOF OF RIGHT [19-11-2024(online)].pdf 2024-11-19
5 202421089543-POWER OF AUTHORITY [19-11-2024(online)].pdf 2024-11-19
6 202421089543-FORM-9 [19-11-2024(online)].pdf 2024-11-19
7 202421089543-FORM FOR SMALL ENTITY(FORM-28) [19-11-2024(online)].pdf 2024-11-19
8 202421089543-FORM 18 [19-11-2024(online)].pdf 2024-11-19
9 202421089543-FORM 1 [19-11-2024(online)].pdf 2024-11-19
10 202421089543-FIGURE OF ABSTRACT [19-11-2024(online)].pdf 2024-11-19
11 202421089543-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-11-2024(online)].pdf 2024-11-19
12 202421089543-EVIDENCE FOR REGISTRATION UNDER SSI [19-11-2024(online)].pdf 2024-11-19
13 202421089543-EDUCATIONAL INSTITUTION(S) [19-11-2024(online)].pdf 2024-11-19
14 202421089543-DRAWINGS [19-11-2024(online)].pdf 2024-11-19
15 202421089543-DECLARATION OF INVENTORSHIP (FORM 5) [19-11-2024(online)].pdf 2024-11-19
16 202421089543-COMPLETE SPECIFICATION [19-11-2024(online)].pdf 2024-11-19
17 Abstract.jpg 2024-12-06
18 202421089543-FORM-26 [03-06-2025(online)].pdf 2025-06-03