Abstract: An artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD), comprising a body 101 configured with a chamber 102 for storing multiple wearable bands 103 associated with the system, a primary imaging unit 104 that assesses facial details of a user positioned in close proximity to the body 101, an LED (Light Emitting Diode) 201 that emits a blinking light for indicating the correct band 103 for the user, a microphone 205 that captures voice signals of the user to detect speech patterns, a touch interactive display panel 105 that displays relevant therapy exercises and recommendations to the user, and an inflatable spherical unit 106 linked with an inflating unit 107, which is accessed by the user to apply pressure onto the spherical unit 106.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD) that monitors, analyses, and improves communication, emotional, and behavioural skills. More specifically, the system provides real-time support by continuously assessing a user’s facial expressions, physiological conditions, and behaviour patterns, and offering tailored interventions and therapy recommendations in order to foster social interaction, enhance communication skills, and alleviate stress for individuals on the autism spectrum.
BACKGROUND OF THE INVENTION
[0002] Autism Spectrum Disorder (ASD) is a neurodevelopmental condition characterized by challenges in social interaction, communication, and repetitive behaviours. Traditional methods of assisting individuals with ASD typically involve in-person therapies, behavioural assessments, and speech training. These interventions often rely on manual observations and face-to-face interactions, requiring caregivers or therapists to monitor and guide the user through exercises. However, these methods are often time-consuming, limited in real-time feedback, and unable to provide continuous support. The existing systems lack personalization and consistent monitoring of behavioural progress, leading to delays in appropriate intervention. Additionally, these traditional approaches cannot track a user’s daily fluctuations in emotional or physiological states, resulting in missed opportunities for timely intervention.
[0003] Initially, therapeutic interventions for children with autism were predominantly based on behavioural therapies such as Applied Behavioural Analysis (ABA), which relied heavily on one-on-one interaction with therapists and teachers. These methods often involved using physical rewards or punishments to encourage or discourage certain behaviours. However, these traditional therapies, while effective, are often slow and do not offer immediate feedback to caregivers, therapists, or users. This results in a lack of timely intervention when a user’s behaviour changes unexpectedly, leaving critical moments of improvement or de-escalation missed. Consequently, tools such as computers, specialized modules, and interactive learning programs began to be integrated into therapeutic practices. Equipment such as communication boards, touchscreens, and audio-visual aids was used to help children with ASD develop social skills, manage their emotions, and enhance their communication abilities. However, some of the tools and modules used for ASD interventions are complex to operate, requiring significant training for caregivers and therapists. This complexity creates barriers to use, particularly for families without access to technical support.
[0004] US2021228130A1 discloses autism spectrum disorder (“ASD”) diagnostics and disease management. Specifically, the disclosure teaches a method of assessing schizophrenia or ASD in a subject in which a subject's usage data for a mobile device is collected over a first predefined time window. A usage behaviour parameter is determined from the usage data, and the determined usage behaviour parameter is compared to a reference. From the comparison it may be determined whether the schizophrenia or ASD in the subject is improving, persisting, or worsening. A system including a mobile device having sensors recording usage data and a remote device operatively linked to the mobile device is also disclosed.
[0005] WO2023056419A1 discloses a system for evaluating a subject, including: a processor in communication with a carbon monoxide (CO) detector, and a memory in communication with the processor having stored thereon a set of instructions which, when executed by the processor, cause the processor to: receive, from the CO detector, a measure of CO in a subject suspected of having at least one of autism spectrum disorder (ASD), autoimmunity, or inflammation; obtain a level of at least one biomarker associated with the subject based on receiving the measure of CO in the subject; and generate a report based on obtaining the level of the at least one biomarker.
[0006] Conventionally, many systems have been used for managing health for individuals with Autism Spectrum Disorder (ASD). However, these existing systems are incapable of adapting their therapeutic exercises, recommendations, and interventions based on the user’s needs and behavioural progress. Additionally, these existing systems also lack the ability to track and display the user’s progress, including speech development, behavioural patterns, and emotional responses, which leads to a deterioration in health management.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that adapts and personalizes its therapeutic exercises, recommendations, and interventions based on the user’s unique needs and behavioural progress, thereby allowing caregivers to access up-to-date reports and adjust therapy plans accordingly. In addition, the developed system also needs to facilitate automatic tracking and displaying of the user’s progress, including speech development, behavioural patterns, and emotional responses, in view of allowing caregivers to monitor improvements and provide timely interventions when necessary, ensuring better care management.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a system that is capable of offering personalized therapeutic interventions based on real-time monitoring of the user's physiological and behavioural parameters, such as heart rate, motion, speech patterns, and emotional state, thereby improving communication skills and social interaction for individuals with Autism Spectrum Disorder (ASD).
[0010] Another object of the present invention is to develop a system that continuously tracks and analyzes the user's emotional and behavioural responses during therapy, in view of offering real-time feedback and interventions, such as stress-relief exercises, speech therapy, and social skill training, to optimize therapeutic outcomes.
[0011] Another object of the present invention is to develop a system that integrates with external resources, such as licensed specialists or professional guidance, to provide caregivers with easy access to expert advice and intervention when persistent symptoms are detected, thereby ensuring long-term care and support for individuals with ASD.
[0012] Yet another object of the present invention is to develop a system that incorporates continuous assessment and real-time feedback to evaluate facial expressions, behaviour, and physiological responses during therapy, thereby allowing itself to make data-driven decisions for appropriate intervention strategies.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD) that is capable of facilitating customized therapeutic support through real-time tracking of the user's physiological and behavioural data, including heart rate, movement, speech patterns, and emotional state, thus enhancing communication abilities and social engagement for individuals with Autism Spectrum Disorder (ASD).
[0015] According to an embodiment of the present invention, an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD) comprises a body configured with a chamber for storing multiple wearable bands associated with the system, the body being configured with multiple motorized wheels for seamless mobility over a ground surface, as per requirement; a primary artificial intelligence-based imaging unit equipped with a facial recognition module, mounted on the body, to assess facial details of a user positioned in close proximity to the body; a microcontroller linked with the primary imaging unit for processing the assessed facial data to compare with pre-stored data from a linked database, for fetching a suitable wearable band for the user, wherein the database stores information about ASD symptoms, intervention strategies, therapeutic exercises, and expert recommendations, and is continuously updated with new research by the microcontroller through a linked server, ensuring that the system offers the latest and most effective interventions for the user; an LED (Light Emitting Diode) installed on each of the bands, to emit a blinking light indicating the correct band for the user, which is accessed by the user to wear on the wrist; a sensing module including a heart rate sensor, a skin conductance sensor, and a motion sensor, mounted on each of the bands for detecting heart rate, moisture, and motion of the user, allowing for detection of physiological changes, stress, and emotional distress in real time; a microphone mounted on the band for capturing voice signals of the user, to detect speech patterns, while the detected parameters and speech patterns are processed by the microcontroller to analyse behavioural and physiological patterns of the user, to diagnose ASD (Autism Spectrum Disorder) and gauge symptom intensity; and a touch interactive display panel mounted on an exterior of the body, wherein the microcontroller
processes the symptoms to determine their intensity and consequently fetches relevant therapy exercises and recommendations from the linked database, which are conveyed to the user via the display panel to suggest that the user perform the displayed exercises and recommendations, which include but are not limited to speech therapy exercises and stress-relief exercises; these therapies are continuously monitored by the primary imaging unit to assess facial expressions in view of checking the interaction of the user with the content being displayed; the speech therapy includes pronunciation techniques, vocal modelling, and speech rhythm correction, which assist in improving the communication skills of the user, especially when stammering or delayed speech is detected through the microphone; and the stress-relief exercises include deep breathing, mindfulness activities, or guided relaxation via a speaker mounted on the body that is configured to work in synchronization with the microphone for interacting with the user.
[0016] According to another embodiment of the present invention, the system further includes the display panel configured to engage the user in structured social activities to improve their social skills, such as recognizing emotions and making appropriate eye contact, helping the user address communication difficulties and promoting better social interaction; a secondary imaging unit arranged on the bands for continuous monitoring of the user’s behaviour patterns, eye contact, and repetitive movements, to suggest appropriate interventions through the speaker or display panel when the user’s therapy is scheduled; an inflatable spherical unit linked with an inflating unit, arranged on the body, which is accessed by the user to apply pressure onto the spherical unit, wherein a plurality of touch sensors are integrated on the spherical unit to detect contact with the user’s palm, based on which the inflating unit inflates/deflates the spherical unit, allowing the user to apply an optimal pressure onto the spherical unit, while the primary imaging unit continuously monitors the user’s facial expressions during the therapy, to assess the hand and eye coordination of the user; a user interface installed in a computing unit wirelessly linked with the microcontroller, wherein the microcontroller generates a detailed report on speech development, behaviour patterns, and therapy outcomes, which is transmitted to the user interface through a communication module, to allow a caregiver to assess the report for tracking progress, and, in case the detected symptoms persist, the microcontroller fetches the contact of a licensed specialist from the database and prompts on the user interface to seek professional guidance for further consultations, thereby facilitating continuous support and intervention based on real-time data for promoting better social interaction of the user; and a battery associated with the system for powering up electrically and electronically operated components associated with the system.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates a perspective view of an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD); and
Figure 2 illustrates a perspective view of a wearable band associated with the system.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD) that is capable of enabling personalized therapeutic interventions by continuously monitoring the user's physiological and behavioural indicators, such as heart rate, motion, speech patterns, and emotional state, thereby fostering improved communication and social interaction for individuals with Autism Spectrum Disorder (ASD).
[0023] Referring to Figures 1 and 2, a perspective view of an artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD) and a perspective view of a wearable band associated with the system are illustrated, respectively, comprising a body 101 configured with a chamber 102, multiple wearable bands 103 associated with the system, a primary artificial intelligence-based imaging unit 104 mounted on the body 101, an LED (Light Emitting Diode) 201 installed on each of the bands 103, a sensing module including a heart rate sensor 202, a skin conductance sensor 203, and a motion sensor 204 mounted on each of the bands 103, a microphone 205 mounted on the band 103, a touch interactive display panel 105 mounted on an exterior of the body 101, an inflatable spherical unit 106 linked with an inflating unit 107 arranged on the body 101, a speaker 108 mounted on the body 101, a secondary imaging unit 206 arranged on the bands 103, and multiple motorized wheels 109 configured on the body 101.
[0024] The system disclosed herein comprises a body 101 designed with a chamber 102 intended to store multiple wearable bands 103 associated with the system. This chamber 102 is specifically structured to accommodate and organize the bands 103 in a secure and accessible manner. The design of the chamber 102 allows for efficient storage, ensuring that the wearable bands 103 are readily available for use when required. The configuration of the body 101 and chamber 102 facilitates easy access and retrieval of the wearable bands 103. The chamber 102 is optimized to hold the bands 103 in a manner that maintains their integrity and ensures their proper condition for use.
[0025] The body 101 is installed with multiple motorized wheels 109 (preferably 2 to 6 in number) that enable seamless mobility over a ground surface, as required. These motorized wheels 109 allow the body 101 to move smoothly and efficiently across various surfaces, ensuring easy transportation and repositioning of the body 101 when needed. Each motorized wheel 109 is a circular object that revolves on an axle to enable the body 101 to move easily over the ground surface. For maneuvering the body 101, each of the wheels 109 is rotated by a hub motor fitted in the hub of the respective wheel 109, which provides the rotational motion required for maneuvering the body 101 on the ground surface.
[0026] The body 101 is installed with a primary artificial intelligence-based imaging unit 104 that is equipped with a facial recognition module to assess facial details of a user positioned in close proximity to the body 101. The imaging unit 104 disclosed herein comprises an image-capturing arrangement including a set of lenses that captures multiple images of the surroundings, and the captured images are stored within the memory of the imaging unit 104 in the form of optical data.
[0027] The imaging unit 104 also comprises a processor that pre-processes the captured images. This pre-processing involves tasks such as noise reduction, image stabilization, or color correction. The processed data is fed into AI protocols for analysis, which utilize machine learning techniques, such as deep learning neural networks, to extract meaningful information from the visual data; this information is processed by the microcontroller to assess facial details of a user positioned in close proximity to the body 101.
[0028] The microcontroller is linked to the primary imaging unit 104 for processing the facial data captured from the user. The microcontroller compares the assessed facial details with pre-stored data in a linked database, allowing the system to identify the most suitable wearable band 103 for the user. The microcontroller analyzes the facial features to ensure that the wearable band 103 selected is personalized and matches the specific requirements of the user. This process enables accurate and efficient identification of the appropriate band 103, thereby facilitating quick and easy selection of the band 103.
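By way of illustration, the comparison step described above may be sketched as a nearest-match lookup against pre-stored profiles. This is a minimal, hypothetical sketch only: the names (`UserProfile`, `select_band`), the use of embedding vectors, and the match threshold are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch: compare assessed facial details (as a feature vector)
# against pre-stored profiles and return the band assigned to the closest
# match within a tolerance. All names and values are illustrative.
from dataclasses import dataclass
from math import dist

@dataclass
class UserProfile:
    user_id: str
    face_embedding: list  # feature vector from the facial recognition module
    band_id: int          # wearable band assigned to this user

def select_band(assessed_embedding, profiles, threshold=0.6):
    """Return the band_id of the closest stored profile, or None if no
    profile lies within the match threshold."""
    best = min(profiles, key=lambda p: dist(assessed_embedding, p.face_embedding))
    if dist(assessed_embedding, best.face_embedding) <= threshold:
        return best.band_id
    return None

profiles = [
    UserProfile("user_a", [0.1, 0.9, 0.3], band_id=1),
    UserProfile("user_b", [0.8, 0.2, 0.5], band_id=2),
]
print(select_band([0.12, 0.88, 0.31], profiles))  # close to user_a → 1
```

In such a scheme, an unmatched face returns no band, so the system would fall back to prompting the user rather than blinking an arbitrary LED.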
[0029] The database is designed to store comprehensive information regarding Autism Spectrum Disorder (ASD) symptoms, intervention strategies, therapeutic exercises, and expert recommendations. This information is regularly updated by the microcontroller through a linked server, ensuring the system remains current with the latest research and advancements in ASD treatments. The continuous updating of the database allows the system to provide the most up-to-date and effective interventions tailored to the user’s needs. This ensures that the system is always equipped with the latest knowledge, improving the quality of care and support provided to the user.
[0030] Each wearable band 103 is equipped with an LED (Light Emitting Diode) 201, which is controlled by the microcontroller. Upon identifying the appropriate band 103 for the user, the microcontroller activates the LED (Light Emitting Diode) 201 to emit a blinking light, signalling to the user that the correct band 103 has been selected. This visual indication assists the user in easily locating and selecting the designated band 103 from the available options. Once the correct band 103 is identified by the blinking LED (Light Emitting Diode) 201, the user accesses and wears the band 103 on their wrist, ensuring the proper band 103 is used for the intended purpose.
[0031] The LED (Light Emitting Diode) 201 mentioned herein is a two-lead semiconductor light source, also known as a p-n junction diode, which produces light when a voltage is supplied across the diode. When the voltage is applied, electrons recombine with electron holes within the diode, releasing energy in the form of photons; the resulting glow serves to indicate the correct band 103 for the user.
[0032] Also, the wearable band 103 is selected from the chamber 102 based on pre-fed personalization data associated with the authorized user. This data includes specific preferences, physical characteristics, or other relevant factors that have been previously stored in the system. Using this personalized information, the microcontroller identifies and retrieves the most suitable band 103 for the user, ensuring that the band 103 aligns with their individual needs and requirements.
[0033] A sensing module integrated into each wearable band 103 comprises a heart rate sensor 202, a skin conductance sensor 203, and a motion sensor 204. These sensors 202, 203, 204 continuously monitor the physiological parameters of the user, including heart rate, moisture levels (as an indicator of stress), and movement. This data is captured in real-time, enabling the microcontroller to detect changes in the user's physical state, such as elevated stress or emotional distress.
[0034] The heart rate sensor 202 utilizes an optical sensor or electrical detection method to monitor the user's pulse rate. The heart rate sensor 202 detects the rhythmic flow of blood through the skin's surface, either via light reflection (photoplethysmography) or electrodes that measure electrical impulses from the heart. The heart rate sensor 202 continuously measures beats per minute (BPM) and sends this data to the microcontroller for processing. The microcontroller compares the heart rate with predetermined thresholds to assess stress or health conditions.
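The BPM threshold comparison described above may be illustrated as follows. The specific threshold values (60 and 100 BPM) are common resting-heart-rate bounds assumed for the sketch; the disclosure itself does not specify numeric thresholds.

```python
# Illustrative sketch of the microcontroller's BPM threshold check.
# The threshold values are assumptions, not values from the disclosure.
def classify_heart_rate(bpm, resting_low=60, resting_high=100):
    """Map a BPM reading to a coarse state the microcontroller could act on."""
    if bpm < resting_low:
        return "low"
    if bpm > resting_high:
        return "elevated"   # possible stress; may trigger stress-relief exercises
    return "normal"

print(classify_heart_rate(72))   # normal
print(classify_heart_rate(118))  # elevated
```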
[0035] The skin conductance sensor 203 measures the electrical conductance of the skin, which fluctuates based on the moisture levels in the skin's outer layer. When a person experiences stress or emotional arousal, their skin's moisture level increases, leading to higher electrical conductivity. The skin conductance sensor 203 uses two electrodes to pass a small current through the skin and measures the changes in conductance. The data is relayed to the microcontroller for analysis of emotional or physiological responses, helping in stress or emotional distress detection.
[0036] The motion sensor 204 detects physical movements through accelerometers or gyroscopes embedded in the band 103. The motion sensor 204 measures changes in acceleration and orientation, enabling the microcontroller to detect the user’s activity level, whether they are still or in motion. This data is processed in real-time to monitor the user's movement patterns. The motion sensor 204 tracks movement intensity, direction, and frequency, providing insights into the user’s physical activity and behaviour, which help detect symptoms of stress, anxiety, or any disruptions in daily activity.
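The three band sensors described in paragraphs [0034] to [0036] feed a combined assessment of stress or emotional distress. A minimal fusion sketch follows, assuming a simple majority-vote rule; the weights, units, and thresholds are hypothetical and are not specified in the disclosure.

```python
# Hypothetical fusion of the three band sensors into a single distress flag.
# Thresholds and the two-of-three rule are assumptions for illustration.
def detect_distress(bpm, skin_conductance_uS, motion_rms):
    """Flag emotional distress when at least two indicators are elevated."""
    indicators = [
        bpm > 100,                  # heart rate sensor 202
        skin_conductance_uS > 8.0,  # skin conductance sensor 203 (microsiemens)
        motion_rms > 1.5,           # motion sensor 204 (RMS acceleration, g)
    ]
    return sum(indicators) >= 2

print(detect_distress(bpm=112, skin_conductance_uS=9.2, motion_rms=0.4))  # True
print(detect_distress(bpm=78, skin_conductance_uS=5.1, motion_rms=0.2))   # False
```

A rule of this kind makes the system robust to a single noisy sensor reading, since no one sensor alone can trigger an intervention.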
[0037] A microphone 205 is mounted on the wearable band 103 and designed to capture the voice signals of the user. The microphone 205 records the speech patterns of the user, which are subsequently processed by the microcontroller. The data gathered from both the speech and other physiological parameters are analyzed to assess behavioural and physiological patterns indicative of Autism Spectrum Disorder (ASD). The microcontroller compares the captured speech and associated physiological data against predefined thresholds to detect symptoms and gauge their intensity. This process aids in diagnosing ASD and monitoring changes or developments in the user’s condition, enabling tailored interventions.
[0038] The microphone 205 captures sound waves, which are then converted into electrical signals. These signals are transmitted to the microcontroller for processing, where the speech patterns are extracted. The microphone 205 operates by detecting variations in air pressure as sound waves, using a diaphragm to vibrate in response to these pressure changes. The diaphragm's movements are converted into electrical signals, which are amplified and processed by the microcontroller. The resulting data helps assess the user’s speech characteristics such as pitch, tone, rhythm, and fluency, enabling the analysis of speech patterns associated with ASD.
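One crude proxy for the fluency analysis described above is the fraction of a sampled waveform spent in silence, which the microcontroller could compare against a threshold to flag hesitant or delayed speech. This is a simplified, hypothetical sketch; real speech analysis of pitch, tone, and rhythm would require considerably more signal processing, and the silence threshold here is an assumption.

```python
# Hypothetical fluency indicator: fraction of samples below a silence
# threshold in a normalized audio waveform. All values are illustrative.
def silence_ratio(samples, silence_threshold=0.05):
    """Fraction of samples whose absolute amplitude is below the threshold."""
    if not samples:
        return 0.0
    quiet = sum(1 for s in samples if abs(s) < silence_threshold)
    return quiet / len(samples)

# A mostly-silent signal yields a high ratio, which may indicate
# hesitant or delayed speech.
signal = [0.0] * 80 + [0.4, -0.3, 0.5, -0.2] * 5
print(round(silence_ratio(signal), 2))  # 0.8
```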
[0039] A touch interactive display panel 105 is mounted on the exterior of the body 101 and serves as the interface through which the user interacts with the system. The microcontroller processes the detected symptoms and analyzes the intensity of those symptoms, based on which the microcontroller retrieves relevant therapy exercises and recommendations from a linked database. These exercises, which may include speech therapy and stress-relief exercises, are then displayed on the panel 105 for the user. The user interacts with the display, selecting exercises or recommendations, thereby facilitating the performance of the prescribed therapeutic activities to aid in the management of the user’s condition.
[0040] The display panel 105 comprises an LED or LCD screen, a control board, a backlight arrangement, and input connectors. The LED/LCD screen serves as the main visual output, while the control board manages data input and image processing. The backlight arrangement, often made of LEDs, illuminates the screen, ensuring visibility. When information is sent to the display, the control board processes the data and directs the LED/LCD pixels to show specific colors, creating images or text. The backlight adjusts brightness for optimal clarity. This combined functionality enables the panel 105 to accurately display speech therapy and stress-relief exercises for the user.
[0041] The microcontroller herein continuously monitors the user’s engagement with the displayed therapy exercises through the primary imaging unit 104. This primary imaging unit 104 is equipped with facial recognition capabilities to assess the user’s facial expressions and emotional responses while interacting with the displayed content.
[0042] By analyzing changes in facial expressions, such as recognition of emotions or levels of engagement, the microcontroller evaluates the user’s interaction with the therapy exercises. This real-time monitoring allows for the adjustment of therapeutic content, ensuring that the prescribed exercises are effectively engaging the user and promoting the desired behavioural or physiological responses.
[0043] The speech-therapy comprises pronunciation techniques, vocal modelling, and speech rhythm correction to enhance the communication skills of the user, especially in cases of stammering or delayed speech. The microphone 205 detects the user's speech signals, which are then processed by the microcontroller to identify specific speech patterns and irregularities. Based on this analysis, the microcontroller fetches appropriate therapy exercises from the linked database. These exercises are displayed on the display panel 105 for the user to follow, assisting in improving speech clarity, rhythm, and pronunciation.
[0044] Synchronously, the stress-relief exercises comprise deep breathing techniques, mindfulness activities, and guided relaxation, which are delivered through a speaker 108 mounted on the body 101. The speaker 108 is configured to operate in synchronization with the microphone 205 to facilitate interaction with the user. The microcontroller processes the data captured by the microphone 205 and adjusts the audio output from the speaker 108 accordingly to guide the user through the stress-relief exercises. These activities are designed to help the user manage stress, promote relaxation, and improve overall emotional well-being, while ensuring the exercises are personalized based on real-time physiological feedback from the user.
[0045] The speaker 108 receives audio signals from the microcontroller based on processed data from the microphone 205 and sensors. Upon receiving the processed instructions, the speaker 108 emits sound waves to the user, which may include voice commands for deep breathing exercises, guided relaxation, or mindfulness activities. The sound output is designed to interact with the user’s emotional and physiological state, adjusting in real-time to provide the correct stress-relief guidance. The speaker 108 ensures that the audio output aligns with the user’s needs, facilitating effective engagement in the therapeutic exercises.
[0046] Further the display panel 105 is configured to engage the user in structured social activities aimed at enhancing their social skills. These activities are designed to assist the user in recognizing emotions and making appropriate eye contact, which are critical components in effective communication. By providing interactive exercises and visual cues, the display panel 105 helps the user address communication difficulties and improve social interaction. The system adapts these activities to the user’s individual progress and needs, fostering a supportive environment that encourages the development of crucial social skills through visual feedback and interactive learning.
[0047] A secondary imaging unit 206 is positioned on the bands 103 to continuously monitor the user’s behaviour patterns, including eye contact and repetitive movements. This secondary imaging unit 206 works in a manner similar to the primary imaging unit 104; it tracks the user's interactions and identifies patterns in their behaviour, which are then processed by the microcontroller. Based on the data captured, the microcontroller suggests appropriate interventions, such as therapeutic exercises or behavioural modifications, through the speaker 108 or display panel 105. These interventions are aligned with the user’s therapy schedule, ensuring that real-time feedback and guidance are provided to assist in improving the user’s social and behavioural skills.
[0048] An inflatable spherical unit 106 is integrated with an inflating unit 107 and positioned on the body 101 for user interaction. The user accesses the spherical unit 106 to apply pressure, which is adjusted by the inflation or deflation of the spherical unit 106. Prior to actuation of the inflating unit 107, the microcontroller detects contact with the user’s palm via a plurality of touch sensors integrated on the spherical unit 106.
[0049] The touch sensors integrated on the spherical unit 106 detect physical contact with the user’s palm. When the user applies pressure to the spherical unit 106, the touch sensors sense the contact through changes in electrical capacitance or resistance, depending on the sensor type. These sensors continuously monitor the surface of the spherical unit 106 for any interaction with the user's hand. Upon detecting palm contact, the touch sensors send a signal to the microcontroller, which processes the data and triggers the actuation of the inflating unit 107. This ensures that pressure is applied based on the user's interaction with the spherical unit 106.
[0050] Simultaneously, the inflating unit 107 operates by receiving a signal from the microcontroller to initiate the inflation or deflation of the spherical unit 106. Upon activation, the inflating unit 107 draws in air from an external source or expels air from the unit to adjust the pressure inside the spherical unit 106. The inflating unit 107 is controlled precisely to ensure that the correct amount of air is added or removed based on user input and touch sensor feedback. The microcontroller regulates the pressure based on real-time data, providing the optimal resistance required for the user to apply the necessary force for the intended therapy.
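The two paragraphs above describe a gated control loop: touch sensing enables actuation, and the inflating unit 107 nudges the internal pressure toward a therapeutic set-point. A minimal sketch, assuming a hypothetical set-point and tolerance (neither is disclosed in the specification):

```python
# Illustrative control logic for the spherical unit 106. The set-point,
# tolerance, and command strings are assumptions made for this sketch.

TARGET_PRESSURE_KPA = 12.0   # assumed therapeutic set-point
TOLERANCE_KPA = 0.5          # assumed dead-band around the set-point

def inflation_command(palm_detected: bool, pressure_kpa: float) -> str:
    """Return the actuation command the microcontroller would issue to
    the inflating unit 107 given touch-sensor and pressure feedback."""
    if not palm_detected:
        return "idle"        # no actuation before palm contact is sensed
    if pressure_kpa < TARGET_PRESSURE_KPA - TOLERANCE_KPA:
        return "inflate"     # add air to raise resistance
    if pressure_kpa > TARGET_PRESSURE_KPA + TOLERANCE_KPA:
        return "deflate"     # release air to lower resistance
    return "hold"            # pressure within tolerance of set-point
```

The dead-band prevents the unit from rapidly toggling between inflation and deflation when the pressure hovers near the set-point.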
[0051] A user interface, installed in a computing unit and wirelessly linked with the microcontroller, allows for seamless transmission of data between the system and the caregiver. The microcontroller generates detailed reports that encompass speech development, behaviour patterns, and therapy outcomes, which are transmitted to the user interface via a communication module. This enables caregivers to assess the user's progress in real-time. In cases where symptoms persist, the microcontroller retrieves the contact information of a licensed specialist from the database, prompting the caregiver via the user interface to seek professional guidance, ensuring continuous support and intervention for the user’s social interaction improvement.
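The reporting and escalation flow above can be summarised in a short sketch. The report fields, the persistence flag, and the message wording are assumptions; the specification states only that a report is transmitted to the caregiver and that a specialist contact is fetched when symptoms persist.

```python
# Hedged sketch of report generation and specialist escalation. Field
# names and messages are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TherapyReport:
    speech_score: float      # assumed summary metric of speech development
    behaviour_score: float   # assumed summary metric of behaviour patterns
    symptoms_persist: bool   # set by the microcontroller's analysis

def caregiver_message(report: TherapyReport, specialist_contact: str) -> str:
    """Compose the prompt shown on the caregiver's user interface."""
    if report.symptoms_persist:
        # escalation path: fetch and surface the specialist contact
        return f"Symptoms persist; consult specialist: {specialist_contact}"
    return "Progress report transmitted to caregiver."
```

A real implementation would also transmit the full report over the communication module; this sketch shows only the decision that drives the caregiver prompt.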
[0052] Moreover, a battery is associated with the system for powering the electrical and electronically operated components of the system and supplying a voltage to those components. The battery used herein is preferably a rechargeable Lithium-ion unit that requires recharging once drained. The battery stores electric current derived from an external source in the form of chemical energy, from which the electronic components of the system draw the power required for proper functioning.
[0053] The present invention works best in the following manner. The body 101, as disclosed in the invention, is configured with the chamber 102 for storing multiple wearable bands 103 associated with the system. The body 101 is configured with multiple motorized wheels 109 for seamless mobility over the ground surface. The primary artificial intelligence-based imaging unit 104, equipped with the facial recognition module, assesses facial details of the user positioned in close proximity to the body 101. The microcontroller linked with the primary imaging unit 104 then processes the assessed facial data and compares it with the pre-stored data from the linked database to fetch the suitable wearable band 103 for the user. The database stores information about ASD symptoms, intervention strategies, therapeutic exercises, and expert recommendations, and is continuously updated with new research by the microcontroller through the linked server, ensuring that the system offers the latest and most effective interventions for the user. The LED (Light Emitting Diode) 201 emits a blinking light indicating the correct band 103 for the user, which the user then wears on the wrist. Thereafter, the sensing module detects the heart rate, moisture, and motion of the user, allowing for detection of physiological changes, stress, and emotional distress in real time. Synchronously, the microphone 205 captures voice signals of the user to detect speech patterns, while the detected parameters and speech patterns are processed by the microcontroller to analyse behavioural and physiological patterns of the user, to diagnose ASD (Autism Spectrum Disorder) and gauge symptom intensity. The touch interactive display panel 105 displays relevant therapy exercises and recommendations from the linked database, suggesting that the user perform the displayed exercises and recommendations, which include but are not limited to speech therapy exercises and stress-relief exercises. These therapies are continuously monitored by the primary imaging unit 104, which assesses facial expressions to check the user's interaction with the content being displayed.
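The band-fetching step described above (facial assessment compared against pre-stored profiles, then the LED 201 on the matching band blinks) can be sketched as a nearest-profile lookup. The embedding representation, similarity measure, and threshold are illustrative assumptions; the specification does not disclose how the comparison is performed.

```python
# Minimal sketch of band selection by facial matching. Profiles are assumed
# to be pre-normalised embedding vectors keyed by band id; the 0.8 threshold
# is an assumption for illustration.

def select_band(face_embedding, profiles, threshold=0.8):
    """Return the band id whose stored profile best matches the assessed
    face, or None when no profile clears the threshold."""
    def similarity(a, b):
        # dot product == cosine similarity for pre-normalised vectors
        return sum(x * y for x, y in zip(a, b))

    best_id, best_sim = None, threshold
    for band_id, stored in profiles.items():
        sim = similarity(face_embedding, stored)
        if sim > best_sim:
            best_id, best_sim = band_id, sim
    return best_id  # the caller would blink LED 201 on this band
```

Returning `None` for an unrecognised face lets the system fall back to caregiver enrolment rather than lighting a wrong band.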
[0054] In continuation, the speech therapy includes pronunciation techniques, vocal modeling, and speech rhythm correction, which assist in improving the communication skills of the user, especially when stammering or delayed speech is detected through the microphone 205. The stress-relief exercises include deep breathing, mindfulness activities, or guided relaxation via the speaker 108, which works in synchronization with the microphone 205 for interacting with the user. Thereafter, the display panel 105 engages the user in structured social activities to improve their social skills, such as recognizing emotions and making appropriate eye contact, helping the user address communication difficulties and promoting better social interaction. The secondary imaging unit 206 continuously monitors the user’s behaviour patterns, eye contact, and repetitive movements to suggest appropriate interventions through the speaker 108 or display panel 105 when the user’s therapy is scheduled. The inflatable spherical unit 106, linked with the inflating unit 107, is accessed by the user to apply pressure onto the spherical unit 106. A plurality of touch sensors detects contact with the user’s palm, based on which the microcontroller activates the inflating unit 107 to inflate or deflate the spherical unit 106, allowing the user to apply the optimal pressure onto the spherical unit 106. The primary imaging unit 104 continuously monitors the user’s facial expressions during the therapy to assess the hand and eye coordination of the user. The user interface installed in the computing unit is wirelessly linked with the microcontroller. The microcontroller generates the detailed report on speech development, behaviour patterns, and therapy outcomes, which is transmitted to the user interface through the communication module, allowing the caregiver to assess the report for tracking progress. In case the detected symptoms persist, the microcontroller fetches the contact information of the licensed specialist from the database and prompts, via the user interface, to seek professional guidance for further consultations.
[0055] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:
1) An artificial intelligence-based health management system for individuals with Autism Spectrum Disorder (ASD), comprising:
i) a body 101 configured with a chamber 102 for storing multiple wearable bands 103 associated with said system, wherein a primary artificial intelligence-based imaging unit 104 equipped with a facial recognition module, is mounted on said body 101 for capturing and processing multiple images in vicinity of said body 101, to assess facial details of a user positioned in close proximity to said body 101;
ii) a microcontroller linked with said primary imaging unit 104 for processing said assessed facial data to compare with a pre-stored data from a linked database, for fetching a suitable wearable band 103 for said user, wherein a LED (Light Emitting Diode) 201 is installed on each of said bands 103, that is activated by said microcontroller to emit a blinking light for indicating correct band 103 for said user, which is accessed by said user to wear on wrist portion;
iii) a sensing module including a heart rate sensor 202, a skin conductance sensor 203 and a motion sensor 204, mounted on each of said bands 103 for detecting heart rate, moisture and motion of said user, allowing for detection of physiological changes, stress, and emotional distress in real time, wherein a microphone 205 is mounted on said band 103 for capturing voice signals of said user, to detect speech patterns, while said detected parameters and speech patterns are processed by said microcontroller to analyse behavioural and physiological patterns of said user, to diagnose ASD (Autism Spectrum Disorder) and gauge symptoms intensity;
iv) a touch interactive display panel 105 mounted on an exterior of said body 101, wherein said microcontroller processes said symptoms to determine the intensity and consequently fetches relevant therapy exercises and recommendations from a linked database, that are conveyed to said user via said display panel 105 to suggest said user to perform said displayed exercises and recommendations, which includes but not limited to speech therapy exercises and stress-relief exercises, and these therapies are continuously monitored by said primary imaging unit 104 to assess facial expressions in view of checking interaction of said user with said content being displayed;
v) an inflatable spherical unit 106 linked with an inflating unit 107, arranged on said body 101 that is accessed by said user to apply pressure onto said spherical unit 106, wherein a plurality of touch sensors are integrated on said spherical unit 106 to detect contact with said user’s palm, based on which, said microcontroller activates said inflating unit 107 to inflate/deflate said spherical unit 106, for allowing said user to apply an optimal pressure onto said spherical unit 106, while said primary imaging unit 104 continuously monitors said user’s facial expressions during said therapy, to assess hand and eye coordination of said user; and
vi) a user interface installed in a computing unit wirelessly linked with said microcontroller, wherein said microcontroller generates a detailed report on speech development, behaviour patterns, and therapy outcomes, that is transmitted to said user interface through a communication module, to allow a caregiver to assess said report for tracking progress, and in case said detected symptoms persist, said microcontroller fetches contact of a licensed specialist from said database, and prompts on said user interface to seek professional guidance for further consultations, thereby facilitating continuous support and intervention based on real-time data, for promoting better social interaction of said user.
2) The system as claimed in claim 1, wherein said speech therapy includes pronunciation techniques, vocal modeling, and speech rhythm correction, which assist in improving communication skills of said user, especially when stammering or delayed speech is detected through said microphone 205.
3) The system as claimed in claim 1, wherein said stress-relief exercises include deep breathing, mindfulness activities, or guided relaxation via a speaker 108 mounted on said body 101 that is configured to work in synchronization with said microphone 205 for interacting with said user.
4) The system as claimed in claim 1, wherein said display panel 105 is configured to engage said user in structured social activities to improve their social skills, such as recognizing emotions and making appropriate eye contact for helping said user to address communication difficulties and promote better social interaction.
5) The system as claimed in claim 1, wherein a secondary imaging unit 206 is arranged on said bands 103 for continuously monitoring said user’s behaviour patterns, eye contact, and repetitive movements to suggest appropriate interventions through said speaker 108 or display panel 105, when said user’s therapy is scheduled.
6) The system as claimed in claim 1, wherein said body 101 is configured with multiple motorized wheels 109 for seamless mobility over a ground surface, as per requirement.
7) The system as claimed in claim 1, wherein said database stores information about ASD symptoms, intervention strategies, therapeutic exercises, and expert recommendations, which is continuously updated with new research by said microcontroller through a linked server, ensuring that said system offers the latest and most effective interventions for said user.
8) The system as claimed in claim 1, wherein said band 103 is selected from said chamber 102, as per said pre-stored data relating to personalization for said user.
9) The system as claimed in claim 1, wherein a battery is associated with said system for powering up electrical and electronically operated components associated with said system.
| # | Name | Date |
|---|---|---|
| 1 | 202541035255-STATEMENT OF UNDERTAKING (FORM 3) [10-04-2025(online)].pdf | 2025-04-10 |
| 2 | 202541035255-REQUEST FOR EXAMINATION (FORM-18) [10-04-2025(online)].pdf | 2025-04-10 |
| 3 | 202541035255-REQUEST FOR EARLY PUBLICATION(FORM-9) [10-04-2025(online)].pdf | 2025-04-10 |
| 4 | 202541035255-PROOF OF RIGHT [10-04-2025(online)].pdf | 2025-04-10 |
| 5 | 202541035255-POWER OF AUTHORITY [10-04-2025(online)].pdf | 2025-04-10 |
| 6 | 202541035255-FORM-9 [10-04-2025(online)].pdf | 2025-04-10 |
| 7 | 202541035255-FORM FOR SMALL ENTITY(FORM-28) [10-04-2025(online)].pdf | 2025-04-10 |
| 8 | 202541035255-FORM 18 [10-04-2025(online)].pdf | 2025-04-10 |
| 9 | 202541035255-FORM 1 [10-04-2025(online)].pdf | 2025-04-10 |
| 10 | 202541035255-FIGURE OF ABSTRACT [10-04-2025(online)].pdf | 2025-04-10 |
| 11 | 202541035255-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [10-04-2025(online)].pdf | 2025-04-10 |
| 12 | 202541035255-EVIDENCE FOR REGISTRATION UNDER SSI [10-04-2025(online)].pdf | 2025-04-10 |
| 13 | 202541035255-EDUCATIONAL INSTITUTION(S) [10-04-2025(online)].pdf | 2025-04-10 |
| 14 | 202541035255-DRAWINGS [10-04-2025(online)].pdf | 2025-04-10 |
| 15 | 202541035255-DECLARATION OF INVENTORSHIP (FORM 5) [10-04-2025(online)].pdf | 2025-04-10 |
| 16 | 202541035255-COMPLETE SPECIFICATION [10-04-2025(online)].pdf | 2025-04-10 |