
Artificial Intelligence Enabled System And Method For Providing Personalized Cognitive Therapy And Dementia Care

Abstract: Exemplary embodiments of the present disclosure are directed towards a system for providing personalized cognitive therapy and dementia care. The system comprises a first computing device equipped with a cognitive therapy and health monitoring module that enables user registration and input of demographic and health data. The system incorporates a wearable device communicatively coupled to the first computing device to collect real-time health metrics, such as heart rate, movement, and location data. The cognitive therapy and health monitoring module monitors the user’s emotional state using emotion recognition technology and transmits user interaction data and health metrics to a cloud server over a network. The cloud server hosts a cognitive health data processing and analysis module, which processes the received data to assess cognitive health, predict behavioral risks, generate personalized therapy recommendations, and detect safety incidents. Insights and alerts are transmitted to the user’s device, caregivers, and doctors.


Patent Information

Filing Date
29 November 2024
Publication Number
49/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

ANVAYAA KIN CARE PRIVATE LIMITED
202, Plot No.705, Road No.36, Jubilee Hills, Hyderabad 500033, Telangana, India.

Inventors

1. PRASHANTH REDDY SHYAMALA
202, Plot No.705, Road No.36, Jubilee Hills, Hyderabad 500033, Telangana, India.
2. KOVVURI VINAY KUMAR REDDY
202, Plot No.705, Road No.36, Jubilee Hills, Hyderabad 500033, Telangana, India.
3. TRIPTI SINGH BALAJI
19085, Prestige Shantiniketan, ITPL Main Road, Bangalore 560048.

Specification

DESCRIPTION
TECHNICAL FIELD
[001] The present disclosure generally relates to the field of artificial intelligence, healthcare technology, and cognitive therapy. More particularly, the present disclosure relates to a system and method for providing personalized cognitive therapy and dementia care using artificial intelligence, wearable technology, emotion recognition, and real-time health monitoring. Additionally, the present disclosure supports individuals with dementia and their caregivers by enhancing cognitive health, ensuring safety, and delivering actionable insights.

BACKGROUND
[002] Dementia is a progressive neurodegenerative syndrome that significantly impairs an individual’s cognitive and physical abilities, requiring continuous and personalized care. This condition poses overwhelming challenges for both people with dementia (PWD) and their caregivers. Caregivers, in particular, face immense pressure to provide consistent, on-demand assistance, often struggling to manage the complex and evolving needs of the patient. The absence of effective tools to support personalized care further exacerbates the burden.

[003] Existing technological solutions in the market attempt to address certain aspects of dementia care but fail to provide a comprehensive, integrated approach. For example, cognitive applications offer basic stimulation activities, such as memory games, to engage users. However, these solutions lack the ability to customize activities based on the user’s specific health data, emotional states, or cognitive stimulation targets, limiting their effectiveness for personalized therapy.

[004] Similarly, health tracking applications rely on wearable devices to monitor general health metrics like heart rate and movement. While these tools can provide useful insights, they are not specifically designed for dementia care and do not include critical safety features such as real-time fall detection or wandering alerts. The lack of integration with dementia-specific needs undermines their ability to ensure the safety and well-being of PWD.

[005] Caregiver-focused platforms address some organizational aspects, such as reminders and task management, but fail to incorporate advanced artificial intelligence for cognitive assessment or real-time feedback on the patient’s health status. This lack of actionable insights leaves caregivers without the tools to adjust care strategies dynamically, which is crucial in dementia care. Overall, existing solutions lack the level of personalization needed for effective dementia care. They fail to deliver cognitive therapy tailored to individual needs, do not provide AI-driven systems for detecting early signs of dementia, and offer insufficient integration with wearable technologies for real-time safety monitoring. Additionally, there is no emotionally responsive virtual assistant capable of interacting dynamically with PWD to provide support, manage anxiety, or enhance their emotional well-being. These limitations highlight the need for a holistic solution that integrates personalized cognitive therapy, AI-driven dementia detection, real-time health monitoring, and emotional support into a unified system.

[006] In the light of the aforementioned discussion, there exists a need for a system with novel methodologies that would overcome the above-mentioned challenges.

SUMMARY
[007] The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.

[008] Exemplary embodiments of the present disclosure are directed towards an artificial intelligence enabled system and method for providing personalized cognitive therapy and dementia care.

[009] An objective of the present disclosure is directed towards a system that provides a personalized AI-driven solution to assist users (individuals/ patients) with dementia (PWD) and their caregivers through a combination of real-time health monitoring and cognitive therapy.

[0010] Another objective of the present disclosure is directed towards a system that conducts regular health assessments, provides medical reminders, and delivers tailored activities to engage users and promote cognitive stimulation.

[0011] Another objective of the present disclosure is directed towards a system that leverages artificial intelligence to assess the cognitive health of users and analyze their daily habits, and track changes over time for effective care management.

[0012] Another objective of the present disclosure is directed towards a system that incorporates emotion recognition technology to analyze users’ emotional states and dynamically adjust its interactions and responses in real-time, ensuring empathetic and adaptive support.

[0013] Another objective of the present disclosure is directed towards a system that offers caregivers features such as the ability to book essential services, access smart reports, and receive personalized recommendations for enhanced dementia care management.

[0014] Another objective of the present disclosure is directed towards a system that incorporates fall detection using smartwatch technology and utilizes AI-driven dementia diagnosis to enable early detection and timely intervention.

[0015] Another objective of the present disclosure is directed towards a system that provides personalized cognitive therapy through customized activities designed to engage memory, language, and sensory skills. The system includes a memory box feature that incorporates games based on tagged photos to enhance memory recall for users.

[0016] Another objective of the present disclosure is directed towards a system that enables real-time health monitoring by integrating with users’ smartwatches to provide fall detection and continuous health tracking, ensuring the safety of individuals with dementia.

[0017] Another objective of the present disclosure is directed towards a system that facilitates early dementia detection by utilizing artificial intelligence to assess the cognitive health of users through the analysis of interaction data.

[0018] Another objective of the present disclosure is directed towards a system that incorporates an emotional support assistant, which uses emotion recognition technology to understand and interact with users based on their emotional states, providing adaptive and empathetic communication.

[0019] Another objective of the present disclosure is directed towards a system that helps caregivers by offering service booking options, general assistance, and personalized tips and recommendations for effective dementia care.

[0020] Another objective of the present disclosure is directed towards a system that supports the healthcare industry by enabling dementia care in home settings or care facilities through continuous health monitoring, AI-based cognitive assessments, and personalized activity recommendations.

[0021] Another objective of the present disclosure is directed towards a system that provides caregivers with real-time health insights, service booking capabilities, and emotion-adaptive assistance to effectively manage the daily needs of individuals with dementia.

[0022] Another objective of the present disclosure is directed towards a system that integrates with wearable health monitoring devices, offering expandability for broader applications in geriatric care beyond dementia management.

[0023] Another objective of the present disclosure is directed towards a system that utilizes AI-driven detection and emotional response mechanisms, which can be extended to address other neurodegenerative diseases or mental health conditions requiring personalized care.

[0024] Another objective of the present disclosure is directed towards a system that provides personalized care by delivering activities and therapy uniquely tailored to each user’s cognitive needs, emotional state, and health data.

[0025] Another objective of the present disclosure is directed towards a system that enables early dementia detection through AI-driven analysis, allowing for proactive care and distinguishing itself from solutions that only respond to symptoms.

[0026] Another objective of the present disclosure is directed towards a system that offers real-time monitoring by integrating with smartwatches to enhance safety for individuals with dementia, providing immediate alerts for falls and wandering.

[0027] Another objective of the present disclosure is directed towards a system that incorporates an emotion-responsive companion, which adjusts its behavior based on the user’s emotional state to improve engagement and provide comfort.

[0028] Another objective of the present disclosure is directed towards a system that delivers holistic support by serving both users with dementia and their caregivers, offering practical options such as service booking and health reports alongside personalized dementia care.

[0029] Another objective of the present disclosure is directed towards a system that enables collaborative care planning by providing a shared, interactive interface for patients, caregivers, and doctors to collaboratively design and adjust therapy plans in real time, ensuring holistic and personalized care.

[0030] Another objective of the present disclosure is directed towards a system that utilizes predictive AI for behavioral management, including the ability to forecast behavioral patterns such as wandering, and incorporates a guardian AI companion to identify and mitigate risks of wandering or falls.

[0031] Another objective of the present disclosure is directed towards a system that creates a cognitive digital twin, a virtual replica of the patient’s cognitive state, to simulate responses to hypothetical scenarios, enabling proactive adjustments to therapy plans.

[0032] Another objective of the present disclosure is directed towards a system that includes an AI-powered social circle manager, which analyzes social interaction patterns to identify relationships that positively impact mental health, suggests reconnections with beneficial contacts, and provides insights for enhanced social engagement.

[0033] Another objective of the present disclosure is directed towards a system that incorporates a context-aware therapy engine, which adapts therapy based on environmental factors such as time of day, location, and user performance, to maximize effectiveness.

[0034] Another objective of the present disclosure is directed towards a system that enhances interaction through social AI agents designed to engage users in meaningful conversations, promoting emotional well-being and fostering connection.

[0035] Another objective of the present disclosure is directed towards a system that incorporates a dynamic AI ecosystem, a collaborative network of AI modules specializing in cognitive, emotional, and physical therapy, which shares insights across multiple patients to optimize care.

[0036] Another objective of the present disclosure is directed towards a system that includes personality-driven AI companions, offering distinct personas such as friendly, assertive, or nurturing, tailored to individual user preferences to enhance personalized care.

[0037] Another objective of the present disclosure is directed towards a system that incorporates multi-lingual cognitive agents capable of supporting regional dialects and cultural nuances, ensuring better engagement for users in diverse and underrepresented areas.

[0038] Another objective of the present disclosure is directed towards a system that enhances the memory box feature through memory synthesis, utilizing generative AI to recreate immersive scenarios based on tagged photos and past user memories, enriching cognitive and emotional experiences.

[0039] Another objective of the present disclosure is directed towards a system that incorporates future memory augmentation, generating hypothetical memories by asking users imaginative questions and creating associated images or videos, fostering creativity and enhancing user engagement.

BRIEF DESCRIPTION OF THE DRAWINGS
[0040] In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.

[0041] FIG. 1 is a block diagram depicting a schematic representation of a system for providing personalized cognitive therapy and dementia care, in accordance with one or more exemplary embodiments.

[0042] FIG. 2 is a block diagram depicting an embodiment of a cognitive therapy and health monitoring module, in accordance with one or more exemplary embodiments.

[0043] FIG. 3 is a block diagram depicting an embodiment of a cognitive health data processing and analysis module, in accordance with one or more exemplary embodiments.

[0044] FIG. 4A is an example diagram depicting a conceptual representation of the multiple dimensions of care and monitoring for individuals with dementia (PWD), in accordance with one or more exemplary embodiments of the present disclosure.

[0045] FIG. 4B is an example diagram depicting an exemplary user interface screen displaying a summary of stimulation targets and their associated ranges for an individual with dementia, in accordance with one or more exemplary embodiments.

[0046] FIG. 4C illustrates an exemplary user interface screen displaying a medication reminders dashboard for individuals with dementia, in accordance with one or more exemplary embodiments.

[0047] FIG. 4D illustrates an exemplary user interface screen displaying details of a cognitive stimulation activity, in accordance with one or more exemplary embodiments.

[0048] FIG. 4E illustrates an exemplary user interface screen displaying a dashboard for individuals with dementia, in accordance with one or more exemplary embodiments.

[0049] FIG. 4F illustrates an exemplary user interface screen displaying a multilingual questionnaire interface for assessing the cognitive and emotional state of a user, in accordance with one or more exemplary embodiments.

[0050] FIG. 5A is an example diagram depicting an exemplary user interface for creating or updating a medication reminder within the system, in accordance with one or more exemplary embodiments of the present disclosure.

[0051] FIG. 5B is an example diagram depicting a user interface designed for scheduling doctor appointments, in accordance with one or more exemplary embodiments of the present disclosure.

[0052] FIG. 5C is an example diagram depicting a user interface screen for the "Memory Box" feature, in accordance with one or more exemplary embodiments.

[0053] FIG. 5D is an example diagram depicting a user interface screen for selecting user interests and preferences, in accordance with one or more exemplary embodiments.

[0054] FIG. 5E is an example diagram depicting a user interface designed to enable activity management and exploration for individuals with dementia (PWD) and their caregivers, in accordance with one or more exemplary embodiments of the present disclosure.

[0055] FIG. 5F is an example diagram depicting a user interface designed to facilitate access to games and activities that enhance cognitive stimulation and emotional engagement for individuals with dementia (PWD), in accordance with one or more exemplary embodiments of the present disclosure.

[0056] FIG. 5G is an example diagram depicting a user interface screen for activity selection, in accordance with one or more exemplary embodiments.

[0057] FIG. 6A is an example diagram depicting an exemplary user interface of a word search game, in accordance with one or more exemplary embodiments.

[0058] FIG. 6B is an example diagram depicting a user interface of a matching pairs game, in accordance with one or more exemplary embodiments.

[0059] FIG. 6C is an example diagram depicting a user interface screen of a spot-the-differences game, in accordance with one or more exemplary embodiments.

[0060] FIG. 6D is an example diagram depicting a user interface screen of a sliding puzzle game, in accordance with one or more exemplary embodiments.

[0061] FIG. 6E illustrates an exemplary user interface screen for a “Guess the Image” game aimed at improving cognitive recognition and memory recall, in accordance with one or more exemplary embodiments.

[0062] FIG. 7 is a flow diagram depicting a method for providing personalized cognitive therapy and dementia care, in accordance with one or more exemplary embodiments.

[0063] FIG. 8 is a block diagram illustrating the details of a digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0064] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.

[0065] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[0066] Referring to FIG. 1, a block diagram 100 depicts a schematic representation of a system for providing personalized cognitive therapy and dementia care, in accordance with one or more exemplary embodiments. The system 100 includes a first computing device 102, a second computing device 104, a third computing device 106, a cloud server 108, a network 110, a processor 112, a memory 114, a cognitive therapy and health monitoring module 116, a cognitive health data processing and analysis module 118, a database 120, and a wearable device 122.

[0067] The computing devices 102, 104, 106 may include, but are not limited to, a personal digital assistant, a smartphone, a personal computer, a mobile station, a computing tablet, a handheld device, an internet-enabled calling device, internet-enabled calling software, a telephone, a mobile phone, a digital processing system, and so forth. The computing devices 102, 104, 106 may include the processor 112 in communication with the memory 114. The processor 112 may be a central processing unit. The memory 114 may be a combination of flash memory and random-access memory. The processor 112 may execute instructions and process data within the system. The memory 114 may be configured to store program instructions, data, and temporary information needed for system operations. The computing devices 102, 104, 106 may be communicatively connected with the cloud server 108 via the network 110. The network 110 may include, but is not limited to, an Internet of Things (IoT) network, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth Low Energy network, a ZigBee network, a Wi-Fi communication network (e.g., wireless high-speed internet), a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, an RFID module, an NFC module, or wired cables, such as the world-wide-web based Internet. Other types of networks may employ Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses (e.g., network-based MAC addresses, or those provided in a proprietary networking protocol such as Modbus TCP), or may use appropriate data feeds to obtain data from various web services, including retrieving XML data from an HTTP address and then traversing the XML for a particular node, and so forth, without limiting the scope of the present disclosure.

[0068] Although the computing devices 102, 104, 106 are shown in FIG. 1, an embodiment of the system 100 may support any number of computing devices. Each of the computing devices 102, 104, 106 supported by the system 100 is realized as a computer-implemented or computer-based device having the hardware or firmware, software, and/or processing logic needed to carry out the computer-implemented methodologies described in more detail herein. The cognitive therapy and health monitoring module 116 may be any suitable application downloaded from GOOGLE PLAY® (for Google Android devices), Apple Inc.'s APP STORE® (for Apple devices), or any other suitable database. The cognitive therapy and health monitoring module 116 may also be a desktop application which runs on Windows, Linux, or any other operating system and may be downloaded from a webpage, a CD/USB stick, etc. In some embodiments, the cognitive therapy and health monitoring module 116 may be software, firmware, or hardware that is integrated into the computing devices 102, 104, and 106. The computing devices 102, 104, and 106 may present a web page to the user by way of a browser, wherein the web page comprises a hyperlink that may direct the user to a uniform resource locator (URL).

[0069] The first computing device 102 may be configured to operate as a device associated with the user (patient). The first computing device 102 includes a processor 112 and a memory 114. The memory 114 may be configured to store a cognitive therapy and health monitoring module 116, which enables personalized cognitive therapy, emotional analysis, and health tracking for the user. The first computing device 102 may also communicate with a wearable device 122 to receive health metrics such as heart rate, movement data, and fall detection alerts.

[0070] The second computing device 104 may be configured to operate as a device used by caregivers. The second computing device 104 includes a processor 112 and a memory 114, which stores the cognitive therapy and health monitoring module 116. The cognitive therapy and health monitoring module 116 may be configured to provide caregivers with access to real-time health data, therapy progress reports, and personalized care recommendations for the user. Caregivers may also use the second computing device 104 to schedule appointments, book services, and monitor safety alerts related to the user.

[0071] The third computing device 106 may be configured to operate as a device used by doctors. The third computing device 106 includes a processor 112 and a memory 114, which stores the cognitive therapy and health monitoring module 116. The cognitive therapy and health monitoring module 116 may be configured to enable doctors to analyze the cognitive health of users, adjust therapy plans, and provide medical recommendations. The third computing device 106 may also receive and process reports generated by the cloud server 108 to support decision-making.

[0072] The cloud server 108 may be configured to act as the central processing and storage unit for the system. The cloud server 108 includes a cognitive health data processing and analysis module 118, which may be configured to process data received from all computing devices and wearable devices. The cognitive health data processing and analysis module 118 may analyze user interactions, health metrics, and cognitive patterns to generate insights, such as early detection of dementia, therapy recommendations, and behavioral predictions. The cloud server 108 also includes a database 120, which may be configured to securely store user data, therapy plans, health reports, and system-generated insights. The system 100 further comprises the wearable device 122, which may be configured to track real-time health metrics for the user. The wearable device 122 may measure parameters such as heart rate, movement patterns, and location data. The wearable device 122 may further include features for detecting falls or wandering and sending real-time alerts to the first computing device 102, second computing device 104, and third computing device 106 via the network 110. The network 110 may be configured to facilitate communication and data exchange among the first computing device 102, second computing device 104, third computing device 106, wearable device 122, and cloud server 108. The network 110 may include wired or wireless communication protocols, such as Wi-Fi, Bluetooth, or cellular networks, ensuring seamless real-time connectivity across all system components. The cognitive therapy and health monitoring module 116 in each computing device may be configured to adapt therapy plans dynamically based on user interactions and feedback. The cognitive health data processing and analysis module 118 in the cloud server 108 may further enhance the system's capabilities by identifying early cognitive decline, predicting behavioral trends, and generating actionable insights for caregivers and doctors.
The wearable device 122 provides continuous safety monitoring, integrating seamlessly with the other system components.
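By way of a non-limiting illustration, the alert flow described above may be sketched as follows. All names, thresholds, and device identifiers in this sketch are hypothetical assumptions for illustration only and do not form part of the disclosed system; in practice, delivery would occur over the network 110 rather than through an in-memory list.

```python
from dataclasses import dataclass, field

@dataclass
class WearableReading:
    heart_rate: int          # beats per minute
    fall_detected: bool
    outside_geofence: bool   # simple proxy for wandering

@dataclass
class AlertRouter:
    # Device identifiers standing in for the patient, caregiver, and
    # doctor devices (102, 104, 106); purely illustrative.
    recipients: list = field(default_factory=list)
    sent: list = field(default_factory=list)

    def evaluate(self, reading: WearableReading) -> list:
        """Derive alerts from one reading and fan them out to all devices."""
        alerts = []
        if reading.fall_detected:
            alerts.append("FALL")
        if reading.outside_geofence:
            alerts.append("WANDERING")
        if reading.heart_rate < 40 or reading.heart_rate > 140:
            alerts.append("HEART_RATE")
        for alert in alerts:
            for device in self.recipients:
                self.sent.append((device, alert))
        return alerts

router = AlertRouter(recipients=["patient-102", "caregiver-104", "doctor-106"])
alerts = router.evaluate(
    WearableReading(heart_rate=150, fall_detected=True, outside_geofence=False)
)
```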

[0073] Referring to FIG. 2, a block diagram 200 depicts an embodiment of a cognitive therapy and health monitoring module, in accordance with one or more exemplary embodiments. The cognitive therapy and health monitoring module includes a bus 201, a user registration module 202, a user authentication module 204, a user interface module 206, a health monitoring module 208, an emotion recognition module 210, a wearables integrating module 212, a companion monitoring module 214, and a photo tagging and categorization module 216.

[0074] The user registration module 202 may be configured to facilitate the registration of users (patients), caregivers, and doctors within the system. This module may collect user-specific data during onboarding, such as demographic information, medical history, cognitive health status, and therapy preferences. The collected data may serve as the foundation for personalizing therapy and recommendations.

[0075] The user authentication module 204 may be configured to authenticate users accessing the system. The user authentication module 204 may implement secure login mechanisms, such as username-password combinations, two-factor authentication, or biometric verification, to ensure that only authorized users can access the system’s features and data.

[0076] The user interface module 206 may be configured to provide an intuitive and user-friendly interface for all users, including patients, caregivers, and doctors. The user interface module 206 may present therapy activities, health metrics, emotional feedback, and system recommendations in an accessible format. The user interface module 206 may also allow users to interact with the system, input feedback, and navigate through various features.

[0077] The health monitoring module 208 may be configured to track and monitor the user’s health data in real time. The health monitoring module 208 may receive inputs from wearable devices, such as heart rate, movement, and location data, to ensure safety and monitor the user’s physical well-being. The health monitoring module 208 may also be configured to trigger alerts in response to anomalies, such as falls or wandering.
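As a non-limiting sketch of one way the health monitoring module 208 might flag a fall from wearable accelerometer samples: a near-free-fall sample followed shortly by a high-g impact. The thresholds (in units of g) and the heuristic itself are illustrative assumptions, not the disclosed detection method.

```python
import math

def detect_fall(samples, free_fall_g=0.35, impact_g=2.5, window=10):
    """Flag a fall when a near-free-fall accelerometer magnitude is
    followed within `window` samples by a high-g impact."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    for i, m in enumerate(mags):
        if m < free_fall_g and any(v > impact_g for v in mags[i + 1:i + 1 + window]):
            return True
    return False

# Synthetic data: steady walking at ~1 g, versus a brief free-fall
# followed by an impact spike.
walking = [(0.1, 0.2, 0.98)] * 20
fall = walking[:5] + [(0.05, 0.05, 0.05), (1.0, 2.0, 2.5)] + walking[:5]
```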

[0078] The emotion recognition module 210 may be configured to analyze the user’s emotional state by processing data from online activities, games, facial expressions, and voice tone. The emotion recognition module 210 may dynamically adjust therapy activities and communication based on the detected emotional state, promoting mental well-being and reducing anxiety. For example, if a user exhibits signs of frustration, the module may suggest calming activities or adapt the companion’s tone.
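The adjustment described in the preceding paragraph may be sketched, in a simplified and non-limiting form, as a mapping from a detected emotional state to a next activity. The emotion labels and activity names are assumptions for illustration, not part of the disclosed module.

```python
def choose_activity(emotion: str, default: str = "memory_game") -> str:
    """Map a detected emotional state to a suggested therapy activity."""
    calming = {"frustrated", "anxious", "agitated"}   # de-escalate
    uplifting = {"sad", "bored"}                      # re-engage
    if emotion in calming:
        return "breathing_exercise"
    if emotion in uplifting:
        return "music_session"
    return default
```

A production module would combine many signals (facial expression, voice tone, interaction history) with confidence scores rather than a single label.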

[0079] The wearables integrating module 212 may be configured to integrate data from wearable devices, such as smartwatches or fitness bands, with the system. The wearables integrating module 212 may collect and transmit health metrics, such as step count, sleep patterns, heart rate, and location, to the health monitoring module 208 and cognitive health data processing and analysis module 118 for further processing.

[0080] The companion monitoring module 214 may be configured to monitor the user's engagement, emotional state, and behavior during interactions with the system. Functioning as a virtual assistant, the companion monitoring module 214 analyzes cues such as facial expressions, tone of voice, reaction times, and interaction patterns to assess the user's mood and engagement level. Based on these insights, the companion monitoring module 214 dynamically adjusts its communication style, offering real-time emotional support by responding in a friendly, calming, or encouraging manner to manage mood swings, encourage engagement, and reduce anxiety. The companion monitoring module 214 may also recommend or modify therapy activities to align with the user's emotional state, ensuring a personalized and effective therapeutic experience, and may engage the user in meaningful conversations through natural language processing (NLP) to foster emotional well-being and reduce feelings of isolation or anxiety. By providing reminders for therapy activities, medications, or caregiver interactions, the companion monitoring module 214 ensures consistent engagement and support for the user. Its integration with the emotion recognition module 210 enhances its ability to deliver real-time, tailored assistance, creating a holistic and interactive experience that promotes both cognitive and emotional health.

[0081] The photo tagging and categorization module 216 may be configured to allow users or caregivers to upload, tag, and categorize photos for use in memory-based games. The photo tagging and categorization module 216 may organize photos into categories (e.g., family, travel, events) and ensure that they are appropriately tagged for easy retrieval. The tagged photos are then transmitted to the server for generating personalized memory games.

[0082] Referring to FIG. 3, a block diagram 300 depicts an embodiment of a cognitive health data processing and analysis module, in accordance with one or more exemplary embodiments. The cognitive health data processing and analysis module includes a cognitive health data analysis module 302, a behavior prediction module 304, a personalized therapy module 306, a wearable data processing module 308, a reports generating module 310, an activities generating module 312, a games generating module 314, and a memory games generating module 316.

[0083] The cognitive health data analysis module 302 may be configured to process user data collected from interactions, health monitoring devices, and therapy activities. The cognitive health data analysis module 302 may analyze cognitive patterns, emotional responses, and user performance to assess the user’s cognitive health. The cognitive health data analysis module 302 may also identify early signs of cognitive decline, such as memory lapses, difficulty in task execution, or irregular health metrics, enabling timely intervention and therapy adjustments.
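The detection of early signs of cognitive decline from performance trends can be sketched as a comparison of recent and earlier task scores. This is a minimal sketch under assumed parameters (window size, drop threshold, 0..1 normalized scores), not the disclosed analysis method:

```python
# Illustrative sketch of flagging a sustained decline in task-performance
# scores, one way the cognitive health data analysis module 302 might spot
# early signs of cognitive decline. Window size and threshold are assumptions.

def shows_decline(scores, window=3, drop_threshold=0.15):
    """Flag a decline when the mean of the most recent `window` scores
    falls below the mean of the preceding `window` scores by more than
    `drop_threshold` (scores normalized to 0..1)."""
    if len(scores) < 2 * window:
        return False  # not enough history to compare trends
    recent = sum(scores[-window:]) / window
    earlier = sum(scores[-2 * window:-window]) / window
    return (earlier - recent) > drop_threshold
```

A positive result would, in the described system, trigger timely intervention and therapy adjustments rather than a diagnosis.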

[0084] The behavior prediction module 304 may be configured to predict user behavior patterns based on historical and real-time data. The behavior prediction module 304 may use advanced AI algorithms to forecast potential risks, such as wandering, agitation, or falls, and generate preemptive alerts to caregivers or doctors. By analyzing trends in user activities, the behavior prediction module 304 may also recommend adjustments to therapy plans or safety protocols to mitigate behavioral risks.
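The risk-forecasting step can be sketched with simple rules as a stand-in for the AI algorithms the disclosure contemplates. The feature names and thresholds here are illustrative assumptions only:

```python
# Hedged sketch: a rule-based stand-in for the behavior prediction module
# 304's risk forecasting. Real embodiments are described as using AI
# algorithms; feature names and thresholds here are assumptions.

def predict_risks(features: dict) -> list:
    """Return preemptive alert labels derived from simple trend features."""
    alerts = []
    if features.get("night_exits_per_week", 0) >= 3:
        alerts.append("wandering_risk")
    if features.get("agitation_events_per_day", 0) >= 2:
        alerts.append("agitation_risk")
    if features.get("gait_instability_score", 0.0) > 0.7:
        alerts.append("fall_risk")
    return alerts
```

In the described system, any returned alert would be forwarded to caregivers or doctors as a preemptive notification.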

[0085] The personalized therapy module 306 may be configured to generate and adapt therapy plans tailored to the user’s cognitive health status, emotional state, and performance in previous activities. The personalized therapy module 306 may integrate data from the cognitive health data analysis module 302 and behavior prediction module 304 to create dynamic, personalized therapy strategies. These therapy plans may include cognitive stimulation tasks, physical exercises, and relaxation activities to improve the user’s overall well-being.

[0086] The wearable data processing module 308 may be configured to process data collected from wearable devices, such as smartwatches or fitness bands. The wearable data processing module 308 may analyze metrics such as heart rate, physical activity, sleep patterns, and location data to monitor the user’s health in real-time. The processed data may be used to detect anomalies, such as falls or irregular heart rates, and inform the cognitive health data analysis module 302 for further action.
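The anomaly-detection step over wearable metrics can be sketched as range checks, assuming a resting heart-rate band and an accelerometer magnitude spike as a fall proxy. The thresholds are illustrative assumptions, not clinical values from the disclosure:

```python
# Minimal sketch of anomaly screening in the wearable data processing
# module 308. Thresholds and the fall proxy are illustrative assumptions.

def detect_anomalies(heart_rate_bpm, accel_magnitude_g):
    """Screen a wearable sample for anomalies worth escalating."""
    anomalies = []
    if not 50 <= heart_rate_bpm <= 110:
        anomalies.append("irregular_heart_rate")
    if accel_magnitude_g > 2.5:  # sudden impact consistent with a fall
        anomalies.append("possible_fall")
    return anomalies
```

Detected anomalies would then be passed to the cognitive health data analysis module 302 for further action, as described above.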

[0087] The reports generating module 310 may be configured to generate health and therapy progress reports for caregivers and doctors. These reports may include visual representations, such as graphs or wheels, that illustrate the user’s cognitive and physical health progress across various stimulation categories. The reports may also provide actionable insights and recommendations to refine care strategies and therapy plans.

[0088] The activities generating module 312 may be configured to create a variety of cognitive and physical activities personalized to the user’s needs. These activities may target specific cognitive stimulation categories, such as memory, language, or sensory skills, and are adjusted dynamically based on user performance and feedback. The activities generating module 312 ensures that the activities are engaging, effective, and aligned with the therapy goals.

[0089] The games generating module 314 may be configured to design interactive games aimed at enhancing cognitive abilities, emotional well-being, and user engagement. These games may include puzzles, matching games, and interactive scenarios that challenge the user’s cognitive skills while providing an enjoyable experience. The games generating module 314 may adapt game difficulty levels and themes based on user preferences and performance.

[0090] The memory games generating module 316 may be configured to create personalized memory-based games using tagged and categorized photos uploaded by the user or caregivers. These games may involve recalling faces, objects, or events and are specifically designed to improve the user’s memory recall abilities. The memory games generating module 316 may integrate emotional and cognitive feedback to refine the games and ensure they are both effective and enjoyable for the user.
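The construction of a face-recall question from tagged photos can be sketched as follows. The photo record shape, distractor logic, and function name are assumptions for illustration, not the disclosed game engine:

```python
# Illustrative sketch: building one face-recall question from tagged
# photos, in the spirit of the memory games generating module 316.
import random

def make_recall_question(photos, category="People", n_choices=3, rng=None):
    """Pick one tagged photo and offer its tag among shuffled distractors."""
    rng = rng or random.Random(0)  # seeded here only for reproducibility
    pool = [p for p in photos if p["category"] == category]
    target = rng.choice(pool)
    distractors = [p["tag"] for p in pool if p["tag"] != target["tag"]]
    choices = rng.sample(distractors, min(n_choices - 1, len(distractors)))
    choices.append(target["tag"])
    rng.shuffle(choices)
    return {"photo": target["file"], "choices": choices, "answer": target["tag"]}
```

Emotional and cognitive feedback on each answer could then be fed back, per the description, to refine which photos are reused.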

[0091] Referring to FIG. 4A, an example diagram 400a depicts a conceptual representation of the multiple dimensions of care and monitoring for individuals with dementia (PWD), in accordance with one or more exemplary embodiments of the present disclosure.
Cognitive Dimension: This dimension focuses on mental functions such as memory, communication, attention, problem-solving, and language skills. Therapy modules and assessments under this dimension aim to stimulate and improve cognitive functions.
Physical Dimension: This dimension includes aspects related to physical health and mobility, such as strength, endurance, vision, hearing, and chronic health conditions like diabetes or heart disease. Wearable devices track physical metrics to support this domain.
Emotional Dimension: This dimension addresses emotional well-being, including mood, resilience, empathy, and the ability to identify and express appropriate emotions. Emotion recognition technology dynamically adjusts therapy activities to cater to this dimension.
Psychological Dimension: The psychological domain includes stress management, adaptability, coping mechanisms, self-esteem, and endurance. Therapy modules target these aspects to improve the psychological stability of PWDs.
Social Dimension: Social engagement is crucial for mental health. This domain focuses on interactions with family, friends, community involvement, communication skills, and cultural activities. AI-powered social tools in the system promote and track social connections.
Daily Activity Dimension: This dimension encompasses essential daily living activities, such as eating, bathing, dressing, toileting, and transportation. Personalized therapy recommendations ensure that users maintain independence in these areas wherever possible.
The interconnected representation in FIG. 4A emphasizes that effective dementia care requires addressing all these dimensions cohesively.
The system integrates data and feedback from these domains to personalize therapy, assess progress, and generate recommendations for caregivers and healthcare providers. In operation, the system continuously monitors and adapts therapy programs across these dimensions based on real-time inputs from wearable devices, emotion recognition modules, user interactions, and feedback from caregivers and doctors. The holistic approach ensures comprehensive care for PWDs, enhancing their quality of life and reducing caregiver burden.

[0092] Referring to FIG. 4B, an example diagram 400b depicts an exemplary user interface screen displaying a summary of stimulation targets and their associated ranges for an individual with dementia, in accordance with one or more exemplary embodiments. This interface is designed to visually represent and summarize the individual’s progress across key therapeutic areas, assisting caregivers, doctors, and users in understanding and tailoring their therapy plans. The central spider chart 402b provides a visual representation of the individual's stimulation target scores. The chart includes five dimensions: Cognitive Function: Represents the individual’s mental abilities, such as memory, problem-solving, and attention. Communication: Represents the individual’s ability to engage in conversations and comprehend language. Behavioral and Psychological Well-Being: Reflects the individual's emotional and mental health, including stress management and emotional adaptability. Functional Abilities: Tracks the individual's ability to perform daily living tasks, such as dressing, eating, and bathing. Physical Health: Represents metrics such as mobility, endurance, and physical activity levels. The filled area within the spider chart provides a quick snapshot of the individual’s current performance or progress in these categories, enabling a visual comparison between dimensions. Below the spider chart, a tabular representation is displayed. The table includes: Column 1 (404b): Lists the specific stimulation targets, including cognitive function (408b), communication (410b), behavioral and psychological well-being (412b), functional abilities (414b), and physical health (416b). Column 2 (406b): Indicates the associated range or status for each stimulation target. The ranges are categorized as “High,” “Medium (Med),” or “Low,” providing detailed insights into the strengths and areas needing improvement for each target.
This user interface allows caregivers, doctors, and users to: Quickly assess progress in different stimulation categories. Identify areas that require additional focus or adjustments in therapy plans. Monitor trends and improvements over time. In operation, the data displayed in FIG. 4B is generated by the cognitive health data processing and analysis module hosted on the cloud server. This module processes user data, such as interaction metrics, health metrics from wearable devices, and emotional responses, to calculate the scores for each stimulation target. The user interface dynamically updates these scores and their associated ranges based on real-time and historical data. This interface enhances engagement by providing a clear and accessible representation of the therapy outcomes, facilitating better decision-making for caregivers and healthcare providers while encouraging users to remain actively involved in their therapy process.
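The bucketing of each calculated stimulation score into the “High,” “Med,” or “Low” ranges shown in FIG. 4B can be sketched as follows; the 0..1 normalization and the cut-off values are assumptions for illustration, not values stated in the disclosure:

```python
# Hedged sketch of mapping a stimulation-target score to the display
# ranges of FIG. 4B. Cut-offs are illustrative assumptions.

def score_to_range(score: float) -> str:
    """Map a normalized 0..1 stimulation score to a display range."""
    if score >= 0.7:
        return "High"
    if score >= 0.4:
        return "Med"
    return "Low"
```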

[0093] Referring to FIG. 4C, an exemplary user interface screen 400c is illustrated, displaying a medication reminders dashboard for individuals with dementia, in accordance with one or more exemplary embodiments. The Add New button allows users or caregivers to create new medication reminders by inputting details such as the medication name, dosage, and schedule. Below the title bar, the interface displays two tabs—Today (404c) and Scheduled (406c). The Today tab shows all medications due on the current day, while the Scheduled tab provides an overview of future reminders. The main body of the interface lists individual medication reminders, each displaying: Medication Name (e.g., "Dolo" and "Paracetamol") (408c, 410c), Dosage (e.g., "1.00 Capsules" and "1.00 Tablets"), and Scheduled Times (e.g., "Before Meal Breakfast," "Before Meal Lunch," or "Before Meal Dinner"). Each medication entry also includes an Edit option, enabling caregivers or users to update the medication details as needed. The bottom section of the interface features a navigation bar with five icons (412c–420c) to provide easy access to other system functionalities: Home (412c): Redirects the user to the system's home screen. Task Management (414c): Provides access to task-related features, such as therapy assignments and scheduling. Reminders (416c): Highlights the active screen for managing medication and activity reminders. Reports (418c): Displays cognitive and physical health progress reports for the user. Settings (420c): Opens the settings menu for configuring system preferences, such as notification settings or user profile updates. In operation, the Reminders interface dynamically updates based on real-time inputs from the user or caregiver. The system ensures that all reminders are triggered at the scheduled times, with notifications sent to the user's device.
These notifications may include audible, visual, or vibration alerts, providing additional support for individuals with dementia who may have memory difficulties. The data displayed in FIG. 4C is managed by the cognitive therapy and health monitoring module, which integrates caregiver inputs, user preferences, and therapy schedules to ensure personalized and effective care. This user interface simplifies medication management and reduces the risk of missed doses, contributing to improved health outcomes for individuals with dementia.

[0094] Referring to FIG. 4D, an exemplary user interface screen 400d is illustrated, displaying details of a cognitive stimulation activity, in accordance with one or more exemplary embodiments. This interface is designed to guide individuals with dementia (PWD) and their caregivers in engaging with specific activities aimed at promoting cognitive, emotional, and psychological well-being. At the top of the interface, the title bar includes the activity title "Activity Details" and a back arrow (400d), enabling users to navigate back to the previous screen. Below the title bar, the interface displays a visual representation of the activity, such as a painting or drawing session, followed by an activity description (e.g., "Painting and Drawing Fun"). The description provides a brief overview of the activity's purpose, stating that it is designed to help individuals with dementia express their creativity and emotions. Users or caregivers can select the Read More link to view additional details about the activity. The Start Activity button (402d) is prominently displayed below the activity description. Selecting this button initiates the activity, allowing users or caregivers to begin the task. The system may guide the user step-by-step through the activity once it is started. Further below, the Materials Needed section (404d) lists the required items for the activity. In the example shown, items include paintbrushes, paints (watercolor, acrylic), drawing paper, pencils, erasers, palettes, and a water cup. These materials are displayed as easily readable tags, simplifying the preparation process for users and caregivers. At the bottom of the screen, a navigation bar provides access to additional system features. The navigation icons include: Home (406d): Returns the user to the system's main dashboard.
Task Management (408d): Provides access to activity schedules and related tasks. Reminders (410d): Displays reminders for medications, therapy sessions, and scheduled activities. Reports (414d): Shows progress reports related to cognitive and physical health. Settings (416d): Opens the system's settings for customization and user preferences. The data displayed in FIG. 4D is dynamically populated by the personalized therapy module within the cognitive health data processing and analysis module hosted on the cloud server. The activity details, materials, and instructions are tailored to the user's cognitive abilities, emotional state, and therapy goals. In operation, this interface streamlines user engagement by providing clear instructions and materials for each activity. It also fosters a sense of accomplishment and creativity for individuals with dementia, while caregivers are supported with simple, actionable guidance. Through this functionality, the system ensures that cognitive therapy activities are accessible, effective, and enjoyable for both users and caregivers.

[0095] FIG. 4E illustrates an exemplary user interface (400e) designed as a central dashboard for individuals with dementia and their caregivers. This dashboard provides an overview of key features, personalized suggestions, and activity schedules, enhancing user engagement and therapy management. At the top of the interface, a greeting message ("Hello Shivani") is displayed, welcoming the user. Next to it, a notification icon (404e) provides alerts and reminders about therapy schedules, appointments, or system updates. Below the greeting, the Profile Completion Bar (406e) visually indicates the percentage of profile completion for the user. It highlights how completing the profile will improve the personalization of activities and therapy recommendations. The Suggestions section (408e) displays tailored recommendations based on the user’s cognitive health data, behavior, or feedback. For example, in the provided interface, a suggestion is displayed to improve flexibility and gross motor skills, along with a Book Now button (410e) for scheduling related activities or services. Further below, the Activities Today section highlights the scheduled activity for the day (e.g., "Gardening"), accompanied by an illustration to encourage engagement. Selecting this section provides additional details about the activity, including instructions, materials, and benefits. The Services Booked section (412e) lists upcoming and completed services booked by the user or caregiver. In the provided example, a painting service is booked, with details such as the status (e.g., "Booked"), the name of the caregiver ("Harika Sri"), and the booking date ("2024-09-27"). This section ensures transparency and ease of managing services for caregivers and users. At the bottom of the interface, a navigation bar provides quick access to other system features. The icons include: Home (414e): Redirects the user to the main dashboard. 
Task Management (416e): Displays assigned tasks or therapy activities. Reminders (418e): Manages reminders for medications, therapy sessions, and other schedules. Reports (420e): Provides insights into cognitive and physical progress. Settings (422e): Allows customization of user preferences and system settings. The dashboard dynamically updates based on real-time data received from the cognitive therapy and health monitoring module and the cognitive health data processing and analysis module. These updates ensure the information displayed is personalized and relevant to the user’s current cognitive state, physical health, and therapy needs. The interface simplifies therapy and activity management by consolidating key information into an accessible dashboard. By integrating user-specific data, real-time notifications, and actionable suggestions, the system enhances the quality of care and support for individuals with dementia and their caregivers.

[0096] FIG. 4F illustrates an exemplary user interface 400f displaying a multilingual questionnaire for assessing the cognitive and emotional state of a user, in accordance with one or more exemplary embodiments. This interface facilitates user or caregiver interaction to gather feedback, contributing to personalized therapy adjustments. At the top of the screen, a progress bar visually represents the completion status of the questionnaire (e.g., "5/20 steps completed"). This feature provides users and caregivers with a clear indication of how many questions remain, helping maintain engagement throughout the assessment process. Below the progress bar, a question is displayed in the user's preferred language. The question in FIG. 4F translates to: "Do you ever feel unsure of where you are or what time of day it is?" The multilingual capability ensures accessibility for diverse users, including those from underrepresented linguistic groups.

[0097] Referring to FIG. 5A, an example diagram 500a depicts an exemplary user interface for creating or updating a medication reminder within the system, in accordance with one or more exemplary embodiments of the present disclosure. This screen is designed to assist individuals with dementia (PWD) and their caregivers in managing medications efficiently by providing clear and structured input fields. At the top of the screen, the title "Medicine Reminder" is prominently displayed, providing context for the screen's purpose. Below the title, visual icons reinforce the theme of medication management. The interface includes the following labeled input fields and options. The medicine name field 502a may be configured to allow users or caregivers to input the name of the medication. For example, in the provided interface, the medicine name "Levothyroxine" has been entered. This field ensures accuracy in tracking specific medications. The dosage field 504a may be configured to accept numerical inputs representing the quantity of the medication to be administered. A corresponding unit selection dropdown menu may be configured to allow users to specify the unit of measurement, such as "Mg" or "ml." In the provided example, the dosage is entered as "0.25 Mg." The "When to Take" field 506a may be configured to provide a dropdown menu, enabling users to select options such as "Before Meal" or "After Meal" to define when the medication should be taken. The time of day options 508a may be configured to allow users to specify whether the medication should be taken during "Breakfast," "Lunch," or "Dinner." These selections ensure that reminders are aligned with the user’s routine. The start date field 510a may be configured to accept user inputs specifying the beginning of the medication schedule, while the end date field 512a may be configured to define the conclusion of the schedule. In the example, the start date is "2024-11-07," and the end date is "2024-11-30."
At the bottom of the interface, the Update button 514a may be configured to allow users to save the entered medication details or modify an existing reminder. Upon selecting this button, the system stores the updated information and integrates it into the user’s personalized therapy plan. The data entered in this interface is transmitted to the cognitive therapy and health monitoring module, which is configured to generate reminders at the specified times. Notifications are delivered to the user’s device or caregiver via visual, audible, or vibratory alerts, ensuring timely adherence to the medication schedule.
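The reminder record entered through FIG. 5A, and a simple due-today check over its date range, can be sketched as follows. The field names and the `is_due` rule are assumptions mirroring the interface fields, not the disclosed data model:

```python
# Illustrative sketch of a medication reminder record matching the fields
# of FIG. 5A, plus a due-date check. Names and logic are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class MedicationReminder:
    name: str
    dosage: str          # e.g. "0.25 Mg"
    when_to_take: str    # e.g. "Before Meal"
    time_of_day: str     # "Breakfast" | "Lunch" | "Dinner"
    start: date
    end: date

    def is_due(self, on: date) -> bool:
        """True when the given day falls within the reminder's schedule."""
        return self.start <= on <= self.end

# The example values shown in the FIG. 5A description:
reminder = MedicationReminder(
    "Levothyroxine", "0.25 Mg", "Before Meal", "Breakfast",
    date(2024, 11, 7), date(2024, 11, 30),
)
```

A scheduler in the cognitive therapy and health monitoring module would then fire the visual, audible, or vibratory alert at the configured meal time on each due day.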

[0098] FIG. 5B is an example diagram depicting a user interface designed for scheduling doctor appointments, in accordance with one or more exemplary embodiments of the present disclosure. This screen facilitates seamless booking by providing relevant doctor information, input options, and a streamlined workflow for users or caregivers managing appointments for individuals with dementia. At the top of the interface, the title "Doctor Details" indicates the purpose of the screen. Below this, detailed information about the selected doctor is displayed, including: Doctor’s Name and Photo: Shown prominently to confirm the selected healthcare professional. Availability: The next available appointment slot is displayed (e.g., "Next Available: 10:00 AM Tomorrow"). Consultation Fee: Indicates the fee for the consultation (e.g., "400"). Additional Attributes: Includes data such as the number of patients treated ("Patients: 50+"), years of experience ("Experience: 5+ Years"), and a rating ("Rating: 9.0"). A Call Option button allows users to contact the doctor directly for further inquiries. This feature enhances flexibility and communication between patients or caregivers and healthcare professionals. Below the doctor details, the interface includes: Reason for Visit: A text input field where users or caregivers can specify the purpose of the consultation. This information helps doctors prepare for the session effectively. The select date field 502b may be configured to allow users to choose a preferred appointment date using a calendar-like interface. The available dates are displayed in a horizontal scrollable format, highlighting the selected date (e.g., "Tue 26"). The select time field 504b may be configured to allow users to select a time slot for the appointment. Available time slots are displayed as buttons (e.g., "14:00," "15:00," etc.), enabling users to pick a convenient time.
At the bottom of the screen, the book appointment button 506b may be configured to confirm the booking. Once the user selects a date and time and clicks the button, the system saves the appointment details and sends confirmation notifications to the user and the doctor. The appointment details entered in this interface are transmitted to the cognitive therapy and health monitoring module on the user's device and the cloud server. The system synchronizes the booking with the user’s therapy and care plan, ensuring that the schedule aligns with ongoing activities and requirements. This interface simplifies appointment management for individuals with dementia and their caregivers by presenting relevant information in an organized and intuitive manner. The ability to select a doctor, specify a reason for the visit, and confirm a date and time ensures a seamless user experience while facilitating better healthcare coordination.

[0099] Referring to FIG. 5C, an example diagram 500c depicts a user interface screen for the "Memory Box" feature, in accordance with one or more exemplary embodiments. The interface allows users or caregivers to upload and categorize photos into predefined categories such as "Past Memory," "People," and "Other," enabling personalized memory-based activities. At the top of the screen, the title "Categories" is displayed to indicate that the interface is used for organizing uploaded photos into categories. Below the title, the "Memory Box" description provides guidance to the user or caregiver, stating: "Upload pictures of loved ones and events to create a keepsake and help personalize activities." Past Memory 502c, this category may be configured to allow users or caregivers to upload and store photos related to significant past events, such as holidays, celebrations, or childhood memories. These images are used to create personalized memory-based games or activities. People 504c, this category may be configured to organize photos of loved ones, such as family members, friends, or caregivers. The system may associate these images with emotion recognition modules to enhance therapy activities that involve recognizing and recalling faces. Other 506c, this category may be configured to store miscellaneous images that do not fit into the predefined categories. These could include nature scenes, travel photos, or hobbies that hold significance for the user. The user interface may be configured to allow the uploading of photos directly from the user’s device or external sources, such as cloud storage or social media. Each category icon may act as a button, which, when selected, navigates the user to a detailed screen for viewing, uploading, or editing images within that category. Once uploaded, the photos may be transmitted to the cognitive health data processing and analysis module hosted on the cloud server.
The system may be configured to use these images in memory-based activities, such as identifying people, objects, or events, thereby supporting the user’s cognitive and emotional therapy. The "Memory Box" feature, as depicted in FIG. 5C, may be configured to dynamically integrate with the emotion recognition module to evaluate user responses to specific images. For example, if a particular image elicits a positive emotional reaction, the system may prioritize that image in future therapy activities to promote engagement and well-being. This user interface simplifies the process of uploading and organizing photos, making it accessible for both users with dementia and their caregivers. By categorizing images into intuitive groups, the system ensures that therapy activities are tailored to the user’s preferences and cognitive needs.

[00100] Referring to FIG. 5D, an example diagram 500d depicts a user interface screen for selecting user interests and preferences, in accordance with one or more exemplary embodiments. The interface allows users or caregivers to specify activities the individual enjoys or dislikes, enabling the system to provide personalized activity recommendations. This screen facilitates the customization of therapy recommendations by collecting input on activities the user likes and dislikes. At the top of the interface, the text reads: "Tell us interests of Suresh to receive personalized activity recommendations." This prompts the user or caregiver to provide relevant preferences for the individual receiving care. The interface is divided into two sections: Activities Suresh Likes: This section may be configured to allow users or caregivers to select activities that the individual enjoys. Each activity is displayed as a selectable button (e.g., "Art and Craft," "Puzzles and Brain Games," "Dance and Music," etc.). Users can toggle activities on or off to indicate preferences, as shown in the example where "Meditation" is selected while others are deselected. Activities Suresh Does Not Like: This section may be configured to capture activities that the individual dislikes or finds unengaging. Similar to the "Likes" section, it allows toggling of options, enabling caregivers to highlight areas to avoid during therapy sessions. At the bottom of the interface, the Submit button 502d may be configured to save the user’s selections. When clicked, the selected preferences are transmitted to the cognitive health data processing and analysis module hosted on the cloud server. The system may then utilize this data to: Generate personalized activity recommendations tailored to the user’s interests. Avoid suggesting activities from the "Dislikes" category, ensuring higher engagement and satisfaction.
This preference data may also be integrated into other modules, such as the activities generating module, to dynamically create or adjust therapy activities. For example: If "Meditation" is marked as a preferred activity, the system may schedule regular meditation sessions or suggest calming activities. If "Puzzles and Brain Games" is marked as a disliked activity, such tasks are deprioritized or excluded from the therapy plan. The interface ensures simplicity and accessibility, using large, clearly labeled buttons that are easy to toggle. This design supports both individuals with dementia and their caregivers in quickly and accurately selecting preferences. By integrating this functionality, the system creates a more personalized therapy experience that aligns with the user’s unique interests and needs, improving the effectiveness of the care plan and fostering a positive therapeutic environment.
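The filtering and prioritization behavior described above, where disliked activities are excluded and liked activities are promoted, can be sketched as follows. The function name and ranking rule are assumptions for illustration:

```python
# Minimal sketch, under assumed names, of applying the likes/dislikes
# preferences from FIG. 5D to a list of candidate activities: disliked
# activities are excluded and liked ones are ranked first.

def recommend(candidates, likes, dislikes):
    """Filter out disliked activities and rank liked ones first."""
    allowed = [a for a in candidates if a not in dislikes]
    # Python's sort is stable: liked items (key False) come before others.
    return sorted(allowed, key=lambda a: a not in likes)
```

For example, with "Meditation" liked and "Puzzles and Brain Games" disliked, a candidate list would surface "Meditation" first and omit the puzzle games entirely.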

[00101] Referring to FIG. 5E, an example diagram 500e depicts a user interface designed to enable activity management and exploration for individuals with dementia (PWD) and their caregivers. This interface categorizes activities into various states and provides additional options for game-based therapy. At the top of the interface, the title "Activities" is prominently displayed to indicate the screen's purpose. Adjacent to the title, a View All button may be configured to allow users to access the complete list of activities, regardless of their categorization. The recommended tab 502e may be configured to display activities tailored to the user’s preferences and cognitive profile. These recommendations are generated dynamically by the system based on user interaction data, health metrics, and feedback. The scheduled tab 504e may be configured to list activities that are planned for future sessions. This helps caregivers and users manage daily or weekly schedules effectively. The completed tab 506e may be configured to show activities that the user has already engaged in. This helps track progress and provides a record of past therapeutic interactions. The Recommended Activities section displays activity cards for activities such as "Map Exploration," "Meditation Technique 1," "Vintage Fashion Exploration," "Gardening," and more. Each activity card may be configured to allow users to view details, start the activity, or provide feedback. A View More option is provided to access additional recommended activities. Below the activity cards, the Games Profile section showcases cognitive games such as "Match Left and Right," "Sequence," and "Matching Pairs." These games may be configured to target specific cognitive skills such as memory, attention, and problem-solving. Each game icon may be configured to navigate users to detailed instructions or directly start the game.
At the bottom of the interface, a navigation bar provides quick access to other system features: The home button (508e) may be configured to redirect users to the system's main dashboard. The activities button (510e) may be configured to keep users on the current screen for managing activities and exploring games. The reminders button (512e) may be configured to display a list of medication schedules, therapy reminders, and other notifications. The reports button (514e) may be configured to provide visual summaries and progress tracking for cognitive and physical health. The settings button (516e) may be configured to open user preferences, such as language, notification settings, and accessibility options. The activities and games displayed in FIG. 5E are dynamically updated based on real-time data processed by the cognitive health data processing and analysis module. This ensures that all recommendations and schedules are tailored to the user’s unique cognitive and emotional state, enhancing engagement and therapeutic outcomes. By categorizing activities and providing a games section, this interface simplifies therapy management while ensuring a diverse and engaging experience for users. The combination of personalized recommendations and intuitive navigation ensures that therapy goals are met effectively.
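The three-tab categorization described above (recommended, scheduled, completed) may be sketched as a simple grouping step; the record field names are assumptions made for illustration:

```python
from collections import defaultdict

# Illustrative sketch: grouping activity records into the three interface
# tabs. The "status" and "name" fields are assumed, not from the disclosure.

def group_by_tab(activity_records):
    """Bucket each activity record under its tab by status."""
    tabs = defaultdict(list)
    for record in activity_records:
        tabs[record["status"]].append(record["name"])
    return dict(tabs)

records = [
    {"name": "Map Exploration", "status": "recommended"},
    {"name": "Gardening", "status": "scheduled"},
    {"name": "Meditation Technique 1", "status": "completed"},
]
tabs = group_by_tab(records)
```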

[00102] Referring to FIG. 5F, an example diagram 500f depicts a user interface designed to facilitate access to games and activities that enhance cognitive stimulation and emotional engagement for individuals with dementia (PWD). At the top of the screen, the title "Activities" indicates the purpose of the interface. Adjacent to the title, a View All button may be configured to allow users to view all available games, regardless of their categorization. The recommended tab 502f may be configured to display games specifically tailored to the user’s cognitive profile, emotional state, and preferences. These recommendations are dynamically generated based on the user’s historical interactions and therapy goals. The scheduled tab 504f may be configured to list games that have been planned for upcoming therapy sessions, helping caregivers and users organize their schedules effectively. The completed tab 506f may be configured to show games that the user has previously engaged in, providing a record of past interactions and achievements. The Games Profile section displays a variety of predesigned games aimed at improving cognitive abilities, including: Match Left and Right: A game that may be configured to improve visual-spatial awareness. Sequence: A game that may be configured to enhance memory and logical thinking. Matching Pairs: A game that may be configured to promote pattern recognition. Word Matching: A game that may be configured to enhance language skills. Image Guess: A game that may be configured to stimulate recall and recognition. Spot the Difference: A game that may be configured to improve attention to detail. Puzzle Game: A game that may be configured to challenge problem-solving skills. Each game icon may be configured to allow users to view details, select difficulty levels, or start the game.
Below the predefined games, the Personalized Games section highlights games tailored to the user’s preferences and experiences: Memory Match: May be configured to use tagged photos from the "Memory Box" to enhance memory recall. Relationship Recall: May be configured to use images of loved ones and significant events to strengthen emotional connections and memory retention. At the bottom of the interface, a navigation bar provides quick access to other system features: The home button (508f) may be configured to redirect users to the system’s main dashboard. The activities button (510f) may be configured to keep users on the current screen for managing games and exploring activities. The reminders button (512f) may be configured to display scheduled therapy sessions, medication reminders, and other notifications. The reports button (514f) may be configured to provide visual summaries of the user’s cognitive and emotional progress over time. The settings button (516f) may be configured to open user preferences for notifications, language options, and system customization. The games and activities shown in FIG. 5F are dynamically updated based on real-time data processed by the cognitive health data processing and analysis module. This ensures that the user’s experience is continually personalized to align with therapy objectives.
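Building a personalized "Memory Match" deck from tagged Memory Box photos, as described above, may be sketched as follows; the photo identifiers and the pair count are illustrative assumptions:

```python
import random

# Hypothetical sketch: build a Memory Match deck from tagged photos so the
# grid contains one matching pair per chosen photo. The tag strings are
# placeholders for uploaded Memory Box entries.

def build_memory_match_deck(tagged_photos, pairs=4, seed=None):
    """Pick photos and duplicate each so the grid contains matching pairs."""
    rng = random.Random(seed)
    chosen = rng.sample(tagged_photos, min(pairs, len(tagged_photos)))
    deck = chosen * 2          # each photo appears exactly twice
    rng.shuffle(deck)
    return deck

photos = ["grandson_birthday", "wedding_1974", "family_picnic", "old_home"]
deck = build_memory_match_deck(photos, pairs=4, seed=7)
# deck holds 8 cards: two of each chosen photo, in shuffled order.
```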

[00103] Referring to FIG. 5G, an example diagram 500g depicts a user interface screen for activity selection, in accordance with one or more exemplary embodiments. The interface provides various categories of activities, including board games, outdoor activities, and exercises, enabling users or caregivers to select activities tailored to the user's preferences and therapy needs. At the top of the screen, the title "Activity Selection" is prominently displayed, indicating the purpose of the interface. Below the title, activities are organized into categories such as board games, outdoor activities, and exercises. Each activity is represented by a visually engaging card, enhancing user interaction. The interface includes the following categories and activities: Board Games: This section features games that may be configured to improve cognitive abilities, strategic thinking, and social interaction. Examples include: Snakes and Ladders: May be configured to enhance basic arithmetic and decision-making. Play Tambola/Bingo: May be configured to promote number recognition and social engagement. Chess: May be configured to stimulate problem-solving and strategic planning. A Game of Tic-Tac-Toe: May be configured to encourage logical reasoning and quick decision-making. Playing Connect Four: May be configured to develop pattern recognition and tactical skills. Outdoor Activities: This section features activities aimed at physical well-being and social engagement. Examples include: Outdoor Walks: May be configured to encourage mobility and relaxation. Bubble Popping: May be configured to promote hand-eye coordination and sensory engagement. Ring Toss: May be configured to improve motor skills and accuracy. Kite Flying: May be configured to enhance focus and coordination. Seven Stones: May be configured to encourage teamwork and physical activity. Exercising: This section provides exercises tailored to different body parts and fitness levels.
Examples include: Upper Body: May be configured to strengthen arm and shoulder muscles. Lower Body: May be configured to improve leg strength and balance. Back and Spine: May be configured to promote posture and flexibility. Each category includes a View All button, which may be configured to display the full list of activities available within that category. This ensures that users or caregivers can explore a comprehensive range of options to suit the user’s preferences and therapeutic goals. Activity cards may be configured to allow users to view additional details about the activity, such as instructions, benefits, and required materials. Users can select an activity by tapping on the corresponding card, which triggers the system to display further options for scheduling or starting the activity immediately. The selected activities are transmitted to the cognitive health data processing and analysis module, which may be configured to: Incorporate the activity into the user’s personalized therapy plan. Track the user’s engagement and performance during the activity. Provide feedback and recommendations based on the user’s interaction data. By categorizing activities into intuitive groups, the interface enhances usability for individuals with dementia and their caregivers.
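The categorized catalog and the View All behavior described above may be sketched as a simple lookup; the category and activity names follow the description, while the structure itself is an illustrative assumption:

```python
# Minimal sketch of the categorized activity catalog on the selection screen.
# Category and activity names are taken from the description above.

ACTIVITY_CATALOG = {
    "Board Games": ["Snakes and Ladders", "Play Tambola/Bingo", "Chess",
                    "A Game of Tic-Tac-Toe", "Playing Connect Four"],
    "Outdoor Activities": ["Outdoor Walks", "Bubble Popping", "Ring Toss",
                           "Kite Flying", "Seven Stones"],
    "Exercising": ["Upper Body", "Lower Body", "Back and Spine"],
}

def view_all(category):
    """Return the full activity list for a category, as View All would."""
    return list(ACTIVITY_CATALOG.get(category, []))
```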

[00104] Referring to FIG. 6A, an example diagram depicts an exemplary user interface of a word search game, in accordance with one or more exemplary embodiments. At the top of the interface, the title reads "Find the Hidden Words in the Grid!" This is followed by a brief instruction: "Select the letters to form a word from the list below. Good luck!" This text provides users with clear guidance on how to interact with the game. The central feature of the interface is the letter grid, which consists of an arrangement of letters designed to hide specific words. Each letter in the grid may be configured to respond to user interactions, such as taps or swipes. For example: When a user selects a letter that is part of a hidden word, the system may visually highlight the word (e.g., "ELEPHANT" and "CHEETAH" are marked in FIG. 6A). Below the grid, the Words to Find section displays a list of target words that the user needs to identify. This section may be configured to: Visually update the status of each word when it is found, such as striking through the completed words (e.g., "ELEPHANT" and "CHEETAH"). Adjust dynamically based on game difficulty or user preferences. The game mechanics may be configured to: Provide hints, such as highlighting the first letter of a word, for users who may need assistance. Adapt the difficulty level by increasing or decreasing the grid size or the complexity of the words, based on the user’s performance and cognitive abilities. Once the user identifies all the words in the grid, the system may provide visual or auditory feedback, such as displaying a congratulatory message or sound, to reinforce positive engagement.
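The word-finding check described above may be sketched as follows; the function name and the found-word set are illustrative assumptions, not the disclosed game engine:

```python
# Hypothetical sketch: check a user's letter selection against the hidden
# word list and mark words found (struck through) as they are discovered.

def check_selection(selected_letters, words_to_find, found):
    """Mark a word found when the selected letters spell it out."""
    guess = "".join(selected_letters).upper()
    if guess in words_to_find and guess not in found:
        found.add(guess)
    return found

found = set()
check_selection(list("ELEPHANT"), {"ELEPHANT", "CHEETAH"}, found)
check_selection(list("CHEETAH"), {"ELEPHANT", "CHEETAH"}, found)
all_found = found == {"ELEPHANT", "CHEETAH"}   # triggers the win feedback
```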

[00105] Referring to FIG. 6B, an example diagram depicts a user interface of a matching pairs game, in accordance with one or more exemplary embodiments. At the top of the interface, the title reads "Match all the pairs to win the game!" with a brief instruction: "Flip the cards and find the matching images. Good luck!" This text provides clear and concise guidance to users or caregivers. The central feature of the interface is a grid of cards, where each card is initially displayed face-down. These cards may be configured to: Respond to user interactions, such as taps, by flipping to reveal the image on the card. Flip back to their original position if a matching pair is not found. The game mechanics may be configured to: Provide immediate feedback when a matching pair is found by visually highlighting the matched cards and removing them from the grid. Track the number of moves or attempts made by the user to encourage focus and engagement. Each image on the cards may be selected to align with the user’s preferences or therapeutic goals. For example: Images of animals, objects, or familiar scenes may be used to stimulate recognition and memory recall. The game may incorporate custom images uploaded by caregivers or family members to personalize the experience further. The difficulty level of the game may be configured dynamically by: Adjusting the grid size or the number of cards. Increasing or decreasing the complexity of the images based on the user’s cognitive abilities and performance. Upon completing the game by matching all pairs, the system may provide visual or auditory feedback, such as displaying a congratulatory message or sound effect, to reinforce a sense of accomplishment. The data collected during the game, such as completion time and accuracy, may be transmitted to the cognitive health data processing and analysis module. This module may be configured to: Analyze the user’s cognitive performance over time.
Adjust future game settings to maintain an optimal level of challenge and engagement. Generate progress reports for caregivers or healthcare providers, offering insights into the user’s cognitive health.
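The flip-and-match mechanic described above may be sketched as follows; the card values and the move counter are illustrative assumptions:

```python
# Hypothetical sketch of the matching-pairs loop: two face-down cards are
# revealed; they stay matched if equal, otherwise they flip back face-down.

def flip_pair(deck, matched, i, j):
    """Return True and record the pair when the two revealed cards match."""
    if i != j and deck[i] == deck[j]:
        matched.update({i, j})   # matched cards are removed from the grid
        return True
    return False                 # non-matching cards flip back face-down

deck = ["dog", "cat", "dog", "cat"]
matched = set()
moves = 0
flip_pair(deck, matched, 0, 1); moves += 1   # no match
flip_pair(deck, matched, 0, 2); moves += 1   # "dog" pair found
flip_pair(deck, matched, 1, 3); moves += 1   # "cat" pair found
game_won = len(matched) == len(deck)         # all pairs matched
```

The move counter (`moves`) stands in for the tracked attempts that the description says are transmitted for cognitive-performance analysis.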

[00106] Referring to FIG. 6C, an example diagram depicts a user interface screen of a spot-the-differences game, in accordance with one or more exemplary embodiments. This game involves comparing two similar images and identifying subtle differences between them. At the top of the interface, the title reads "Spot the Differences!" This is followed by instructions: "Find and click all the differences between the two images below." These instructions provide clarity to users or caregivers about the objective and gameplay. The central feature of the interface includes two images displayed side by side. These images are nearly identical, with a set number of differences embedded within them. The images may be configured to: Highlight differences when a user successfully identifies and clicks on them. Provide immediate feedback, such as visual or auditory cues, to acknowledge correct selections. A counter labeled "Found Differences: 0 / 3" is displayed below the images. This counter may be configured to: Update dynamically as users identify and select differences. Reflect the total number of differences required to complete the game, motivating users to progress. A Use Hint button is prominently placed below the counter. This button may be configured to: Highlight an unspotted difference for users who are struggling to progress. Restrict the number of available hints to maintain a level of challenge and engagement. The gameplay may be configured to: Adjust the difficulty by increasing the number of differences or the complexity of the images, based on the user’s cognitive abilities and performance. Provide additional time for users to complete the game if required. Upon successfully identifying all differences, the system may display a congratulatory message or sound effect, reinforcing a sense of accomplishment and encouraging continued engagement.
The game data, such as the number of hints used, time taken to complete, and accuracy, may be transmitted to the cognitive health data processing and analysis module. This module may be configured to: Analyze the user’s visual-spatial abilities and attention to detail over time. Adjust the difficulty of future games to ensure they remain engaging and appropriate for the user’s cognitive level. Generate insights and progress reports for caregivers or healthcare providers.
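The counter and limited-hint mechanics described above may be sketched as follows; the difference identifiers and the hint limit are illustrative assumptions:

```python
# Hypothetical sketch of the "Found Differences: n / total" counter and the
# restricted Use Hint button from the spot-the-differences screen.

class SpotTheDifference:
    def __init__(self, differences, max_hints=1):
        self.remaining = set(differences)   # unspotted difference spots
        self.found = set()
        self.hints_left = max_hints         # hints are deliberately limited

    def click(self, spot):
        """Record a correct click and return the updated counter label."""
        if spot in self.remaining:
            self.remaining.discard(spot)
            self.found.add(spot)
        total = len(self.found) + len(self.remaining)
        return f"Found Differences: {len(self.found)} / {total}"

    def use_hint(self):
        """Reveal one unspotted difference while hints remain."""
        if self.hints_left and self.remaining:
            self.hints_left -= 1
            return next(iter(self.remaining))
        return None

game = SpotTheDifference({"hat", "cloud", "shadow"})
game.click("hat")             # counter becomes "Found Differences: 1 / 3"
hint = game.use_hint()        # reveals one remaining spot; no hints left
game.click(hint)
```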

[00107] Referring to FIG. 6D, an example diagram depicts a user interface screen of a sliding puzzle game, in accordance with one or more exemplary embodiments. At the top of the screen, the title reads "Solve the Puzzle!" followed by a brief instruction: "Drag and drop the tiles to arrange them in the correct order." These elements provide users with clear guidance on how to interact with the game. The central section of the interface features: A grid of tiles, where each tile represents a part of the overall image. The tiles may be configured to: Respond to user interactions, such as dragging and dropping, allowing users to rearrange them within the grid. Provide visual feedback, such as snapping into place, when a tile is correctly positioned. One or more tiles are intentionally missing from the grid to create space for movement. This empty space may be configured to: Allow adjacent tiles to be moved into it, enabling users to rearrange the remaining tiles progressively. Introduce a layer of strategy as users must consider the sequence of movements required to solve the puzzle. The gameplay mechanics may be configured to: Offer varying levels of difficulty by adjusting the number of tiles or the complexity of the image. Include a hint feature that displays the completed image for reference, helping users who may find the puzzle challenging. Upon completing the puzzle by arranging all tiles in their correct positions, the system may provide visual or auditory feedback, such as a congratulatory message or sound, to reward the user’s effort and encourage continued participation. The sliding puzzle game interface is designed to: Enhance cognitive and motor skills by requiring users to analyze, plan, and execute sequences of movements. Promote engagement through an interactive and visually stimulating activity. Provide a sense of accomplishment upon successful completion.
The system may be configured to collect gameplay data, such as the number of moves and time taken to solve the puzzle. This data may be transmitted to the cognitive health data processing and analysis module, which may be configured to: Analyze the user’s problem-solving abilities and track progress over time. Adjust future game settings to align with the user’s cognitive abilities and preferences. Generate insights for caregivers or healthcare providers to support personalized therapy planning. By integrating this sliding puzzle game into the system, FIG. 6D demonstrates how gamification can be effectively utilized to support therapeutic goals, making cognitive exercises enjoyable and accessible for individuals with dementia.
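The adjacency rule for the empty space described above may be sketched as follows; the 3x3 grid size and numeric tile labels are illustrative assumptions:

```python
# Hypothetical sketch of sliding-puzzle mechanics: a tile may move only if
# it is adjacent to the empty space (None), as the description requires.

def legal_moves(grid, size=3):
    """Indices of tiles adjacent to the empty slot (None)."""
    empty = grid.index(None)
    row, col = divmod(empty, size)
    moves = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        r, c = row + dr, col + dc
        if 0 <= r < size and 0 <= c < size:
            moves.append(r * size + c)
    return moves

def slide(grid, tile_index):
    """Swap a legal tile into the empty space; return the new grid."""
    if tile_index not in legal_moves(grid):
        return grid  # illegal move: grid unchanged
    new = list(grid)
    empty = new.index(None)
    new[empty], new[tile_index] = new[tile_index], None
    return new

grid = [1, 2, 3, 4, 5, 6, 7, None, 8]          # one move from solved
solved = slide(grid, 8) == [1, 2, 3, 4, 5, 6, 7, 8, None]
```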

[00108] Referring to FIG. 6E, an example diagram illustrates an exemplary user interface screen for a “Guess the Image” game aimed at improving cognitive recognition and memory recall, in accordance with one or more exemplary embodiments. At the top of the screen, the title reads "Guess the Image!" followed by instructions: "Find the hidden picture. Select the letters below to guess the name!" These provide clear guidance on the gameplay objective and steps. The central section features: A partially revealed image divided into tiles. As the game progresses, additional tiles may be revealed to provide more visual clues. The tiles may be configured to: Automatically reveal over time or as the user guesses letters correctly. Enhance user engagement by gradually unveiling the image to maintain interest. Below the image is a set of letter slots where the user forms their guess. The letter slots may be configured to: Dynamically update as the user selects letters. Provide immediate feedback by highlighting correctly or incorrectly selected letters. A letter selection grid is displayed at the bottom, containing a pool of letters that the user can choose from. The letter grid may be configured to: Allow users to select letters to populate the slots for guessing the word. Provide a reset option (e.g., a "clear" button) for users to modify their guesses. Gameplay mechanics may include: A scoring system to reward correct guesses and deduct points for incorrect ones. A hint option to reveal a letter or additional parts of the image for users who are struggling. Upon correctly guessing the name of the image, the system may display a congratulatory message or sound, reinforcing a sense of accomplishment and encouraging further participation. The gameplay data, such as the number of attempts, time taken, and use of hints, may be transmitted to the cognitive health data processing and analysis module.
This module may be configured to: Evaluate the user’s recognition and recall abilities based on performance metrics. Adjust future game settings, such as the difficulty of images or word complexity, to align with the user’s cognitive level. Generate reports for caregivers or healthcare providers to monitor progress and tailor therapy plans.
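The slot-filling and tile-reveal behavior described above may be sketched as follows; the answer word and the tile representation are illustrative assumptions:

```python
# Hypothetical sketch: correct letter picks fill the matching slots and
# progressively reveal image tiles; incorrect picks change nothing.

def guess_letter(answer, slots, revealed_tiles, letter):
    """Fill every slot matching the letter; reveal one more tile if correct."""
    hit = False
    for i, ch in enumerate(answer):
        if ch == letter:
            slots[i] = letter
            hit = True
    if hit:
        revealed_tiles.append(letter)  # stand-in for unveiling an image tile
    return hit

answer = "CAT"
slots = [None] * len(answer)
tiles = []
guess_letter(answer, slots, tiles, "A")   # correct: middle slot fills
guess_letter(answer, slots, tiles, "Z")   # incorrect: nothing changes
guess_letter(answer, slots, tiles, "C")
guess_letter(answer, slots, tiles, "T")
won = "".join(slots) == answer            # triggers the win feedback
```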

[00109] Referring to FIG. 7, a flow diagram depicts a method for providing personalized cognitive therapy and dementia care, in accordance with one or more exemplary embodiments. The exemplary method 700 commences at step 702, enabling a user to register and input demographic and health information by a cognitive therapy and health monitoring module on a first computing device. Thereafter at step 704, collecting real-time health metrics, including heart rate, movement patterns, and location data, by a wearable device. Thereafter at step 706, collecting the real-time health metrics from the wearable device by the cognitive therapy and health monitoring module. Thereafter at step 708, monitoring the emotional state of the user by the cognitive therapy and health monitoring module using emotion recognition technology. Thereafter at step 710, transmitting the user interaction data and health metrics from the first computing device to a cloud server over the network by the cognitive therapy and health monitoring module. Thereafter at step 712, receiving the user interaction data and health metrics by a cognitive health data processing and analysis module enabled in the cloud server. Thereafter at step 714, analyzing the user interaction data and health metrics by the cognitive health data processing and analysis module to assess cognitive health, including identifying cognitive patterns and emotional responses of the user. Thereafter at step 716, predicting behavioral risks, including wandering and agitation, by the cognitive health data processing and analysis module, using historical and real-time data received from the first computing device and wearable device. Thereafter at step 718, generating personalized therapy recommendations, including tailored activities and memory-based games, based on the user’s health data and cognitive profile, by the cognitive health data processing and analysis module.
Thereafter at step 720, detecting safety incidents, including falls and wandering, by the cognitive health data processing and analysis module, and generating immediate alerts for transmission. Thereafter at step 722, transmitting therapy recommendations, insights, and safety incident alerts from the cloud server to the first computing device, a second computing device, and a third computing device over the network.
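The end-to-end flow of method 700 may be sketched as follows; all thresholds, rules, and recommendation choices in this sketch are placeholder assumptions, not the disclosed analysis algorithms:

```python
# Illustrative pipeline sketch of FIG. 7: device-side collection (steps
# 704-710) feeding cloud-side analysis (steps 712-718) and fan-out of
# alerts and recommendations to three recipient devices (steps 720-722).

def collect_metrics():
    """Stand-in for wearable readings (heart rate, movement, location)."""
    return {"heart_rate": 112, "movement": "pacing", "location": "garden"}

def analyze(metrics):
    """Cloud-side analysis: flag risks and pick a therapy recommendation."""
    alerts = []
    if metrics["heart_rate"] > 100:       # placeholder threshold
        alerts.append("elevated heart rate")
    if metrics["movement"] == "pacing":   # placeholder agitation rule
        alerts.append("possible agitation")
    recommendation = "Meditation Technique 1" if alerts else "Memory Match"
    return {"alerts": alerts, "recommendation": recommendation}

def notify(result, recipients=("user", "caregiver", "doctor")):
    """Fan out insights and alerts to the three computing devices."""
    return {r: result for r in recipients}

outcome = notify(analyze(collect_metrics()))
```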

[00110] Referring to FIG. 8, a block diagram illustrates the details of a digital processing system 800 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. The digital processing system 800 may correspond to the computing devices (or any other system in which the various features disclosed above can be implemented).

[00111] Digital processing system 800 may contain one or more processors such as a central processing unit (CPU) 810, random access memory (RAM) 820, secondary memory 830, graphics controller 860, display unit 870, network interface 880, and input interface 890. All the components except display unit 870 may communicate with each other over communication path 850, which may contain several buses as is well known in the relevant arts. The components of FIG. 8 are described below in further detail.

[00112] CPU 810 may execute instructions stored in RAM 820 to provide several features of the present disclosure. CPU 810 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 810 may contain only a single general-purpose processing unit.

[00113] RAM 820 may receive instructions from secondary memory 830 using communication path 850. RAM 820 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 825 and/or user programs 826. Shared environment 825 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 826.

[00114] Graphics controller 860 generates display signals (e.g., in RGB format) to display unit 870 based on data/instructions received from CPU 810. Display unit 870 contains a display screen to display the images defined by the display signals. Input interface 890 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 880 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems connected to the network.

[00115] Secondary memory 830 may contain hard drive 835, flash memory 836, and removable storage drive 837. Secondary memory 830 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 800 to provide several features in accordance with the present disclosure.

[00116] Some or all of the data and instructions may be provided on removable storage unit 840, and the data and instructions may be read and provided by removable storage drive 837 to CPU 810. A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, or removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 837.

[00117] Removable storage unit 840 may be implemented using medium and storage format compatible with removable storage drive 837 such that removable storage drive 837 can read the data and instructions. Thus, removable storage unit 840 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).

[00118] In this document, the term "computer program product" is used to generally refer to removable storage unit 840 or hard disk installed in hard drive 835. These computer program products are means for providing software to digital processing system 800. CPU 810 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[00119] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 830. Volatile media includes dynamic memory, such as RAM 820. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, or any other memory chip or cartridge.

[00120] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fibre optics, including the wires that comprise bus (communication path) 850. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[00121] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

[00122] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.

[00123] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.
Claims:
1. A system for providing personalized cognitive therapy and dementia care, comprising:

a first computing device comprising a processor for executing instructions from a cognitive therapy and health monitoring module located within the first computing device, wherein the cognitive therapy and health monitoring module is configured to enable a user to register and input demographic and health information through a user interface;

a wearable device communicatively coupled to the first computing device over the network, whereby the wearable device is configured to collect real-time health metrics of the user, including heart rate, movement patterns, and location data, thereby the cognitive therapy and health monitoring module collects the real-time health metrics, including heart rate and movement patterns data, from the wearable device and monitors the emotional state of the user through emotion recognition technology, thereby the cognitive therapy and health monitoring module transmits the user interaction data and health metrics to a cloud server over the network; and

the cloud server, communicatively coupled to the first computing device and the wearable device over the network, wherein the cloud server comprises a cognitive health data processing and analysis module, the cognitive health data processing and analysis module configured to process user interaction data and health metrics received from the first computing device and analyze cognitive patterns and emotional responses of the user to assess cognitive health, the cognitive health data processing and analysis module predicts behavioral risks, including wandering and agitation, based on historical and real-time data, the cognitive health data processing and analysis module generates personalized therapy recommendations and insights, and detects safety incidents, including falls and wandering, and transmits them to the first computing device, a second computing device, and a third computing device.

2. The system as claimed in claim 1, wherein the processor executes instructions from the cognitive therapy and health monitoring module, the cognitive therapy and health monitoring module comprises an emotion recognition module configured to analyze the user’s emotional state by processing data from online activities, games, facial expressions, and voice tone, and dynamically adjust therapy activities and communication based on the detected emotional state.

3. The system as claimed in claim 1, wherein the processor executes instructions from the cognitive therapy and health monitoring module, the cognitive therapy and health monitoring module comprises a companion monitoring module configured to monitor the user’s engagement, emotional state, and behavior during interactions with the system, provide empathetic and adaptive communication, and recommend or modify therapy activities based on the user’s emotional state.

4. The system as claimed in claim 1, wherein the processor executes instructions from the cognitive therapy and health monitoring module, the cognitive therapy and health monitoring module comprises a photo tagging and categorization module configured to enable users or caregivers to upload, tag, and categorize photos for use in memory-based games, and transmit the tagged photos to the server for generating personalized memory-based games.

5. The system as claimed in claim 1, wherein the server executes instructions from the cognitive health data processing and analysis module, the cognitive health data processing and analysis module comprises a cognitive health data analysis module configured to process user data collected from interactions, wearable devices, and therapy activities to analyze cognitive patterns, emotional responses, and user performance for assessing cognitive health.

6. The system as claimed in claim 1, wherein the server executes instructions from the cognitive health data processing and analysis module, the cognitive health data processing and analysis module comprises a behavior prediction module configured to predict user behavior patterns based on historical and real-time data, forecast potential risks such as wandering and agitation, and generate preemptive alerts for caregivers or doctors.

7. The system as claimed in claim 1, wherein the server executes instructions from the cognitive health data processing and analysis module, the cognitive health data processing and analysis module comprises a personalized therapy module configured to generate and adapt therapy plans tailored to the user’s cognitive health status, emotional state, and performance in previous activities.

8. The system as claimed in claim 1, wherein the server executes instructions from the cognitive health data processing and analysis module, the cognitive health data processing and analysis module comprises a wearable data processing module configured to process data collected from wearable devices, including heart rate, physical activity, sleep patterns, and location data, to monitor the user’s health in real-time and detect anomalies such as falls or irregular heart rates.

9. The system as claimed in claim 1, wherein the server executes instructions from the cognitive health data processing and analysis module, and wherein the cognitive health data processing and analysis module comprises a memory games generating module configured to create personalized memory-based games using tagged and categorized photos uploaded by users or caregivers, to enhance the user’s memory recall abilities and engagement.
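Claim 9's game generation could, for instance, turn each tagged photo into a multiple-choice recall question whose distractors are drawn from the user's other tags. The data shapes and parameter names here are assumptions for illustration; the claim does not prescribe a game format.

```python
import random

def build_memory_game(tagged_photos, n_choices=3, seed=0):
    """tagged_photos: mapping of photo filename -> person/place tag.

    Returns one multiple-choice question per photo, with the correct
    tag hidden among distractor tags from the same collection.
    """
    rng = random.Random(seed)  # seeded for reproducible sessions
    tags = sorted(set(tagged_photos.values()))
    questions = []
    for photo, answer in tagged_photos.items():
        distractors = [t for t in tags if t != answer]
        rng.shuffle(distractors)
        choices = distractors[:n_choices - 1] + [answer]
        rng.shuffle(choices)
        questions.append({"photo": photo, "choices": choices, "answer": answer})
    return questions
```

Because the distractors come from the user's own photo collection (familiar faces and places), difficulty scales naturally with how much the caregiver has tagged.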

10. A method for providing personalized cognitive therapy and dementia care, comprising:

enabling a user to register and input demographic and health information by a cognitive therapy and health monitoring module on a first computing device;

collecting real-time health metrics, including heart rate, movement patterns, and location data, by a wearable device;

collecting the real-time health metrics from the wearable device by the cognitive therapy and health monitoring module;

monitoring the emotional state of the user by the cognitive therapy and health monitoring module using emotion recognition technology;

transmitting user interaction data and the real-time health metrics from the first computing device to a cloud server over a network by the cognitive therapy and health monitoring module;

receiving the user interaction data and health metrics by a cognitive health data processing and analysis module enabled in the cloud server;

analyzing the user interaction data and health metrics by the cognitive health data processing and analysis module to assess cognitive health, including identifying cognitive patterns and emotional responses of the user;

predicting behavioral risks, including wandering and agitation, by the cognitive health data processing and analysis module, using historical and real-time data received from the first computing device and wearable device;

generating personalized therapy recommendations, including tailored activities and memory-based games, based on the user’s health data and cognitive profile, by the cognitive health data processing and analysis module;

detecting safety incidents, including falls and wandering, by the cognitive health data processing and analysis module, and generating immediate alerts for transmission; and

transmitting therapy recommendations, insights, and safety incident alerts from the cloud server to the first computing device, a second computing device, and a third computing device over the network.
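The final transmitting step of claim 10 fans results out to three devices (the user's, a caregiver's, and a doctor's). A minimal sketch of that routing is below; the device names, the `insights` record shape, and the rule that only safety incidents reach all three parties are assumptions for illustration.

```python
def route_alerts(insights,
                 recipients=("user_device", "caregiver_device", "doctor_device")):
    """Fan out server-generated items to registered devices.

    insights: list of dicts with "type" ("safety_alert" or
    "recommendation") and "message". Safety incidents go to every
    recipient; routine therapy recommendations only to the user.
    """
    outbox = {r: [] for r in recipients}
    for item in insights:
        targets = recipients if item["type"] == "safety_alert" else recipients[:1]
        for r in targets:
            outbox[r].append(item["message"])
    return outbox
```

In a deployed system this dispatcher would sit behind push-notification infrastructure; the point of the sketch is only the one-to-many delivery pattern the claim recites.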

Documents

Application Documents

# Name Date
1 202441093418-STATEMENT OF UNDERTAKING (FORM 3) [29-11-2024(online)].pdf 2024-11-29
2 202441093418-REQUEST FOR EARLY PUBLICATION(FORM-9) [29-11-2024(online)].pdf 2024-11-29
3 202441093418-POWER OF AUTHORITY [29-11-2024(online)].pdf 2024-11-29
4 202441093418-FORM-9 [29-11-2024(online)].pdf 2024-11-29
5 202441093418-FORM FOR STARTUP [29-11-2024(online)].pdf 2024-11-29
6 202441093418-FORM FOR SMALL ENTITY(FORM-28) [29-11-2024(online)].pdf 2024-11-29
7 202441093418-FORM 1 [29-11-2024(online)].pdf 2024-11-29
8 202441093418-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-11-2024(online)].pdf 2024-11-29
9 202441093418-EVIDENCE FOR REGISTRATION UNDER SSI [29-11-2024(online)].pdf 2024-11-29
10 202441093418-DRAWINGS [29-11-2024(online)].pdf 2024-11-29
11 202441093418-DECLARATION OF INVENTORSHIP (FORM 5) [29-11-2024(online)].pdf 2024-11-29
12 202441093418-COMPLETE SPECIFICATION [29-11-2024(online)].pdf 2024-11-29
13 202441093418-FORM-26 [13-03-2025(online)].pdf 2025-03-13
14 202441093418-STARTUP [20-03-2025(online)].pdf 2025-03-20
15 202441093418-FORM28 [20-03-2025(online)].pdf 2025-03-20
16 202441093418-FORM 18A [20-03-2025(online)].pdf 2025-03-20
17 202441093418-FER.pdf 2025-05-14
18 202441093418-RELEVANT DOCUMENTS [22-10-2025(online)].pdf 2025-10-22
19 202441093418-PETITION UNDER RULE 137 [22-10-2025(online)].pdf 2025-10-22
20 202441093418-Form-4 u-r 12(5) [22-10-2025(online)].pdf 2025-10-22
21 202441093418-FORM 3 [22-10-2025(online)].pdf 2025-10-22
22 202441093418-FER_SER_REPLY [22-10-2025(online)].pdf 2025-10-22
23 202441093418-CORRESPONDENCE [22-10-2025(online)].pdf 2025-10-22
24 202441093418-COMPLETE SPECIFICATION [22-10-2025(online)].pdf 2025-10-22
25 202441093418-FORM-26 [23-10-2025(online)].pdf 2025-10-23

Search Strategy

1 202441093418_SearchStrategyNew_E_SearchHistoryE_22-04-2025.pdf