Abstract: In the system and method of the present disclosure, a personal virtual assistant is generated that is configured to enhance user self-awareness and overall well-being. The personal virtual assistant is fed with data captured from the user's conscious and subliminal states by various sensors disposed on a head mounted display, a plurality of body wearables and brain imaging devices. The system, enabled by an artificial intelligence engine, processes and analyses the data collected from the user's conscious, preconscious and subliminal states to derive a correlation therebetween that helps determine the user's mental impressions, state of mind and emotional levels while responding to external events. Based on the correlated information, the personal virtual assistant recommends and instructs the user towards corrective action in a manner, tone and style most compatible with the user, like that of a friend or companion.
FIELD OF THE INVENTION
Embodiments of the present invention relate to a system and method for generating a personalized virtual assistant, and more particularly to a system and method for generating a personalized virtual assistant that can provide real-time guidance and assist the user in mindful decision making, thereby contributing to the user's enhanced well-being.
BACKGROUND OF THE INVENTION
Existing “virtual assistant” applications generally respond to a user stimulus, for example a spoken or typed question, by retrieving information from a knowledge base and presenting that information through a visual indicator and/or an auditory response. However, these responses are often mechanical, lack any real semblance of a personality, and do not respond effectively because the user's profile, viewpoint, sentiments, core feelings and emotional states are generally not taken into account or given in-depth consideration. This creates a need to develop a customizable solution that provides real-time guidance and self-awareness to the user as a true companion or guide, in a manner consistent with the user's profile and tailored to the user's sentiments, speech and behavioural patterns.
Some work has been done in this field to capture user actions, speech, behavior and the like. This primarily involves tracking the physical and physiological state of the user, as well as user emotion and cognition, for monitoring health and fitness activity or guiding the user in routine decision making. However, a significant part of human existence is associated with a deep level of psychology called the “unconscious”, so named because it is not accessible to conscious thought, being driven primarily by emotions and instinct. As is well established, these two mind states collectively determine user action.
However, in this respect, except for a few outliers, hardly any consideration has been given to the universal impulses contributing to the combined “conscious-unconscious”, which is equally vital in understanding user behavior and response patterns. The power of assessing a user's combined conscious and unconscious state, and of drawing interpretations from the correlated data, has been largely ignored by neuroscientific researchers and by stakeholders interested in human psychology and in mining cognitive behavioural patterns.
While an individual can conveniently manipulate his behavior and responses to external events in social settings, it is extremely difficult for one to analyse and understand one's deep state of mind, innate emotions, cognition and primal reflexes at both conscious and unconscious levels. If left unaddressed, such emotions, stress and mental states may trigger unsettled feelings within the user and may prove defeating for him in the long run, as his distress levels elevate, his mind and body stay disharmonious and unattuned, his performance falls and eventually life loses its essence and meaning for him.
According to the World Health Organization, more than 500 million people around the world have depression, anxiety disorders or other emotional turbulence, and it is projected that 18% of the adult population will experience depression at some point in their lifetime. Since the Covid-19 pandemic, serious damage has been done to people's mental health. With increasing psychological distress and suicidal tendencies, there is a strong incentive to build psychotherapeutic methods, systems and processes that can alter the human cognitive state and help a person transition easily from a depressive to a meditative and tranquil state.
The existing artificial bots are too mechanical and superficial in understanding human emotions and in addressing them in a contextual and meaningful way. The answers to generated queries are typically so incomprehensible, feigned, fictitious and unrelatable that they can barely assist the user in overcoming his depressed state. A true guide and companion who understands core human feelings, suppressed emotions, mind-body balance and other cognitive states remains, as of now, a matter of science fiction.
In this vein, the present disclosure sets forth a system and method for creating a virtual assistant for the user, embodying advantageous alternatives and improvements to existing mechanical human assistants, robots or bots, which may address one or more of the challenges or needs mentioned herein, as well as provide other benefits and advantages.
OBJECT OF THE INVENTION
An object of the present invention is to provide a virtual assistant that can enhance user awareness and overall well-being.
Another object of the present invention is to provide an AI-enabled virtual assistant that can track and comprehend user actions, behavior, state of mind, cognitive capability and emotional levels to provide meaningful suggestions to the user.
Yet another object of the present invention is to provide a virtual assistant that is capable of exploring the subliminal perception of the user and correlating it with the user's conscious behavior to effectively guide the user in transitioning from a distressed to a tranquil state.
Yet another object of the present invention is to provide an NLP-enabled virtual assistant that can provide recommendations and instructions to the user in a manner, tone, phrasing and voice most relatable to the user.
Yet another object of the present invention is to provide a virtual assistant that can monitor all-day activities (conscious state) and map them with the user's preconscious or subliminal state to determine the user's mindfulness and decision-making capability.
Yet another object of the present invention is to provide a virtual assistant that acts as a true companion, guide, friend and comrade to the user and enables the user to become a self-aware and rational being.
SUMMARY OF THE INVENTION
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Accordingly, in accordance with a first aspect of the present disclosure, a method for providing personalized assistance to a user via a personal virtual assistant is provided. The method comprises: generating a user profile from user personally identifiable information, user routine physical activity and a user routine behavior corresponding to the user routine physical activity. This is followed by recording the real-time user physical activity and real-time user behavior corresponding to the real-time user physical activity, along with user physiological data and brain wave patterns. Then, in an event a deviation of the real-time user behavior from the routine user behavior is found when mapped with the generated user profile, the processing unit determines whether the deviated real-time user behavior is exhibited within a time instance less than, equal to or higher than a predetermined threshold. Next, the processing unit correlates the real-time physical activity, physiological data and brain wave patterns and communicates the correlated information to the personal virtual assistant. Finally, the personal virtual assistant validates the correlated information and recommends a corrective response to the user for enhanced self-awareness.
In another aspect of the present disclosure, a system for providing personalized assistance to a user via a personal virtual assistant is proposed, which is configured to communicate with an artificial intelligence engine that is executed upon a processing unit of the system. Accordingly, the system comprises a head mounted device to record real-time user physical activity and real-time user behavior corresponding to the real-time user physical activity; one or more body wearables to capture user physiological data; and one or more brain imaging devices configured to record brain wave patterns. The processing unit of the system generates a user profile from user personally identifiable information, user routine physical activity and a user routine behavior corresponding to the user routine physical activity. In an event a deviation of the real-time user behavior from the routine user behavior is found when mapped with the generated user profile, the processing unit is configured to: determine whether the deviated real-time user behavior is exhibited within a time instance less than, equal to or higher than a predetermined threshold; correlate the real-time physical activity, physiological data and brain wave patterns; and communicate the correlated information to the personal virtual assistant. Finally, the personal virtual assistant validates the correlated information and recommends a corrective response to the user for enhanced self-awareness.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Fig. 1 illustrates an exemplary environment of user interacting with virtual assistant, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, which are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims.
As used throughout this description, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps.
Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing", and the term "personalized" is considered synonymous with "personal" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that the same composition, element or group of elements is also contemplated with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawings correspond to like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
In accordance with one general embodiment of the present disclosure, the present system and method are directed to generating a virtual assistant for the user that enables him to enhance his overall well-being and self-awareness. Further, the virtual assistant helps the user transition from any emotionally turbulent state to a peaceful state without having to undergo any therapeutic treatment or the pain of explaining one's mental state to a therapist.
The significant role played by the subconscious mind while performing various volitional day-to-day activities cannot be understated. It is now sufficiently established that only the activity of certain parts of the brain comes to consciousness, keeping the rest of the brain's activity under the veil of the subconscious mind. For example, while reading a book, the words are processed at our visual cortex, while the meaning and interpretation of those words engages our frontal lobe. However, all these roles remain hidden, and the only activity seen is reading and enjoying a book.
However, the simple and routine activities that one performs are not merely an outcome of conscious behavior alone. Rather, the highly integrated subconscious space involves various regions of the brain to effortlessly perform some basic procedural activities. Nonetheless, the role of the subconscious is not merely restricted to the performance of mundane routine tasks; rather, the user often takes significant decisions of his life guided by unconsciously settled memories, thought patterns, emotions and the like.
The present disclosure explores the role and significance of combining conscious and unconscious states of mind to understand user behavior, motivation levels, capabilities and potential, and eventually to enhance user performance levels. It will be appreciated by those skilled in the art that actions of the unconscious mind precede the arrival of the conscious mind; that is, action precedes reflection. Even intense training sessions, prolonged practice hours and committed efforts may not fully capitalize on user potential unless an optimal mind-body balance is assured before the critical event.
Often the user may succumb to increased stress levels, audience pressure or self-expectations and falter if his mind and body are not in harmony with each other. Thus, he needs some insight into his mental state and able guidance that can help him overcome temporal distraction and bring his focus to mind-body coordination. Against this background, the present disclosure attempts to track and capture the user's conscious and measurable subconscious states of mind to draw a causal link therebetween and infer meaningful results therefrom, such that an optimal brain-body balance can be achieved whenever desired by the user.
Accordingly, the present disclosure assists the user, in critical situations and otherwise, to attain a balanced state of mind and body, which contributes to enhanced user performance levels in all spheres of life. In a first embodiment, the system and method of the present disclosure are configured to generate a personal virtual assistant (PVA) based on a user profile generated from user biometric information, personality profile, user routine physical activity, user routine behavior or a combination thereof, which functions as a benchmark in assessing, most pertinently, the user's conscious and subconscious state.
In one exemplary embodiment, the personal virtual assistant, also commonly referred to as a "user avatar" or "holographic avatar" for the purposes of the present disclosure, is represented in visual or audial media, including digitally rendered virtual avatars, holograms, "chat bots", or any other method known to convey digital representations of personalities. The personal virtual assistant of the present disclosure assists the user in achieving an optimal and balanced body-mind state for enhanced user performance based on the user profile, sentimental/behavioural characteristics, and the user's conscious and subconscious state, as explained in later sections.
In one preferred embodiment, the personal virtual assistant is configured to generate a user profile from recorded user routine physical activity and user routine behavior corresponding to the recorded user physical activity. The recorded information is reflective of the user's conscious, subconscious or unconscious state, and a correlation therebetween is vital in helping the user to develop a more awakened conscious experience not just at the physical or physiological level but, most importantly, at the neurological and atomic levels.
Referring to Fig. 1, in order to provide an insight into one's conscious, subconscious or unconscious state, the present disclosure presents a system 1000 that equips the user 100 with an advanced and sophisticated head mounted display (HMD) 200, which enables the user 100 to interact with his personal virtual assistant 500. Additionally, the user 100 is equipped with one or more sensor-enriched body wearables, such as a smartwatch 300a, a smart ring 300b or smart jewellery 300c (collectively referred to by numeral "300"), as shown in Fig. 1, in order to track user physical and physiological activity information, user interaction, external media inputs and user state. All such information from the HMD 200 and body wearables 300 is processed by the processing unit 700 of the system 1000.
In one exemplary embodiment, the user profile may be developed through user input of personally identifiable information, including personal information such as a name, birthdate, address, occupation, height, weight, health history, and/or family information. The personally identifiable information is supplemented with mapped user action-behavior information to generate an all-inclusive user profile.
In yet another embodiment, the user response to a given stimulus may be derived from the user profile, which may additionally be generated based on one or more existing user online profiles or data, including user-created and approved social media profiles, social media activity, website browsing history, purchase data, user responses to prompts from various media channels and the like. All these responses are recorded by the system to generate an all-inclusive user profile that is maintained in a dedicated repository 600.
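Purely as an illustrative sketch (the field names and structure below are hypothetical and not prescribed by the present disclosure), such an all-inclusive profile record might be organised along the following lines:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserProfile:
    """Hypothetical profile record combining personally identifiable
    information with mapped action-behavior observations."""
    name: str
    birthdate: str
    occupation: str
    height_cm: float
    weight_kg: float
    health_history: List[str] = field(default_factory=list)
    social_media_activity: List[str] = field(default_factory=list)
    # Routine action mapped to the typically observed behavior; used later
    # as the benchmark against which real-time behavior is compared.
    routine_action_behavior: Dict[str, str] = field(default_factory=dict)
```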
In a next embodiment, the system 1000 records and monitors the real-time user response to a given external/internal stimulus (e.g., a question being asked, a movement, a touch, or a sound). The response to a stimulus may be seen in user actions or behaviors that are generally derived from the user's personality traits or a counter-response from the user 100. Accordingly, the present disclosure attempts to monitor and record real-time user physical activity and the corresponding real-time user behavior.
Accordingly, the user 100 physical and physiological information related to real-time user activity and user behavior may be obtained from a multi-spectrum camera, image processing unit, imaging unit or one or more motion sensors disposed on the head mounted display 200 or body wearables 300. This information may include tracked physiological or biometric information about the user, motion data regarding movement of the user and image data corresponding to two or three dimensional image captures of the user. Other parameters of the user may be captured via different kinds of sensors embedded on the head mounted display or other body wearables: pressure sensors, electromagnetic sensors, microphones, cameras, odour sensors and/or chemical sensors. Sensors include any electronic medium through which the device is operable to receive and record an input.
In a next significant embodiment, the system 1000 further records the user's brainwaves in order to capture the user's brain state at different levels of consciousness using any of the brain imaging techniques such as EEG neuroimaging, which measures neuronal activity, functional magnetic resonance imaging (fMRI) and others (collectively referred to by numeral "400"). These help in recording how brain activity changes with a change in a given stimulus. For example, when the user is fully aware, alert and conscious, his brain will predominantly operate in beta brainwave patterns, out of the alpha, beta, delta and theta brainwave patterns known of human consciousness.
Briefly, Table 1 below summarizes the general classification of brain waves: Delta, Theta, Alpha, Beta and Gamma.

| Brain Wave Type | Frequency Range | Specification |
|---|---|---|
| Delta | 0.5-3 Hz | Slowest of all and associated with sleep; in the waking state, helps to provide access to subconscious activity, encouraging the flow into conscious thought |
| Theta | 3-8 Hz | Deeply relaxed and meditative state of mind; helps in memory recollection and improvement; increased creativity and learning |
| Alpha | 8-12 Hz | Creativity, relaxation, reflection; problem solving and visualization that helps in creativity |
| Beta | 12-27 Hz | Increased concentration and alertness; better analysis and work productivity |
| Gamma | >27 Hz | Deep sleep as in anaesthesia; regional learning, memory and language processing |

Table 1
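As a minimal illustrative sketch, the band boundaries of Table 1 could be looked up for a measured dominant frequency as follows (the function name and example value are assumptions for illustration only):

```python
def classify_brainwave_band(peak_frequency_hz: float) -> str:
    """Map a dominant EEG peak frequency to the bands listed in Table 1."""
    if peak_frequency_hz < 0.5:
        return "below delta / possible artifact"
    if peak_frequency_hz <= 3:
        return "delta"   # slowest; associated with sleep
    if peak_frequency_hz <= 8:
        return "theta"   # deeply relaxed, meditative state
    if peak_frequency_hz <= 12:
        return "alpha"   # creativity, relaxation, reflection
    if peak_frequency_hz <= 27:
        return "beta"    # concentration and alertness
    return "gamma"       # >27 Hz

print(classify_brainwave_band(15.0))  # -> "beta"
```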
In one significant aspect of the present disclosure, the user's subconscious or, rather, subliminal state is tracked while his consciousness is interrupted, for example while the user is in deep sleep or in an unconscious state for any reason such as general anaesthesia, a vegetative state or a minimally conscious state, using the user body wearables 300. Here the "unconscious state" includes all experiences or responses to subliminal or preconscious stimuli. Precisely, a subliminal stimulus is one to which responses are often undetectable, even with focused attention.
On the other hand, there may be situations where the user response appears to deviate from the routine user behavior when mapped with the action-behavior information maintained in the user profile. For example, if the user has responded too quickly or spontaneously to an external event/stimulus due to temporary distraction or inattention, this is clearly not a rightful depiction of the user's capacity, capability or logical brain functioning. The system 1000 utilizes the capabilities of the HMD 200, body wearables 300 and neuroimaging devices 400 to record the timestamp of the user's physical activity.
The system 1000 gathers such information from the user's visual, auditory or other sensory responses to the external world using the HMD 200 and body wearables 300, and processes it along with the user brain wave patterns on the processing unit 700 to determine whether the decision making of the user is attributed to his conscious, subconscious or unconscious behavior. This determination is predominantly based on the time within which the user 100 responds. For example, if the user responds at a time instance shorter than the predetermined threshold, say a few milliseconds, the response is inferred as definitely not a conscious activity, but rather an impulsive one.
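A minimal sketch of this threshold test, assuming a hypothetical threshold value and timestamps measured by the system 1000, might look as follows:

```python
RESPONSE_TIME_THRESHOLD_S = 0.2  # hypothetical predetermined threshold (seconds)

def classify_response(stimulus_time_s: float, response_time_s: float) -> str:
    """Label a response as impulsive or as requiring further validation,
    based on how quickly it followed the stimulus."""
    latency = response_time_s - stimulus_time_s
    if latency < RESPONSE_TIME_THRESHOLD_S:
        return "impulsive"         # treated as definitely not a conscious activity
    return "needs-validation"      # validate against physiological data and EEG

print(classify_response(10.00, 10.05))  # -> "impulsive"
```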
Instantly, the system 1000 triggers the processing unit 700 to draw input from the HMD 200, the plurality of body wearables 300 and the brain wave patterns, and to correlate the information to infer the user's physiological state along with the emotional state that validates the user's pre-emptive response. For such instantaneous, unthoughtful reactions the system 1000 guides the user in allowing the thought flow from his subconscious to his conscious for better decision making. This is achieved by a personal virtual assistant 500 designed for the user, as discussed in detail later.
However, in an event the user behavior is exhibited within a time instance which is either equal to or higher than the predetermined threshold, it is important to understand that although the user has apparently not depicted impulsive behavior, his behavior still needs validation from his mental impressions and emotional levels. Thus, the system 1000 directs the interpretation of the physiological data along with the brain wave patterns in order to understand his true cognitive and mental behavior.
In such an event, the unusual or deviated user behavior in response to the given external stimulus is validated by the physiological data. For example, if the physiological signals in the body depict a variable breathing pattern, fluctuating pulse rate, varying blood pressure, body temperature or skin perspiration, and the brain imaging data concurs, the user response/behavior is recorded as a subconscious/subliminal response. In accordance with one exemplary embodiment, signals triggered in the thalamus, basal ganglia or upper brainstem are captured and interpreted, as these are the structures most commonly associated with consciousness levels.
The processing unit 700 then captures the brain wave pattern and measures the brain activity level. If predominantly high beta waves are recorded, especially in the posterior area of the brain, this indicates a type of anxiety linked to issues of worry. The processing unit 700 then maps the above recorded physiological information with the brain wave pattern to determine that the anxiousness observed from the physiological signals stems not from excitement or positive emotional behavior, but rather from worry-related emotional levels.
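Purely for illustration, the worry-related inference described above might be sketched as a simple rule; the thresholds, channel grouping and signal names below are assumptions, not values prescribed by the disclosure:

```python
def infer_worry_related_anxiety(posterior_beta_power: float,
                                mean_beta_power: float,
                                heart_rate_bpm: float,
                                skin_conductance_us: float) -> bool:
    """Flag worry-linked anxiety when high beta activity dominates the
    posterior channels and physiological arousal is elevated.
    All cut-off values are illustrative assumptions."""
    beta_dominant_posterior = posterior_beta_power > 1.5 * mean_beta_power
    physiologically_aroused = heart_rate_bpm > 100 or skin_conductance_us > 10
    return beta_dominant_posterior and physiologically_aroused

print(infer_worry_related_anxiety(12.0, 6.0, 110, 8.5))  # -> True
```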
There is detailed scientific research on the reading of brain wave patterns, which has been excluded from the description for the sake of brevity. However, once the reason for this atypical behavior is concluded, it is recorded and updated as a user response in the user profile. This kind of subliminal and preconscious information existing at subconscious levels can bias user motor responses and is significant in understanding the user's behavioural pattern and cognitive levels for mindful and conscious living.
Since emotions are strongly indicative of the human mind state, it is pertinent to observe and record activities within the limbic system of the brain. Emotions are lower-level responses occurring in the subcortical regions of the brain (e.g. the amygdala, which is part of the limbic system). Additionally, the neocortex region of the brain, which deals with conscious thoughts, reasoning and decision-making, is also monitored. Further, information from the eye tracking cameras of the HMD 200 can record pupil dilation and other subtle facial expressions, along with measuring physiological changes such as skin conductance, heart rate and brain activity.
It is noteworthy that both the conscious and the subconscious may be guided by emotional drives, called primary drives, and this information may be captured by any invasive or non-invasive brain mapping technique. For example, brain waves monitored by brain computer interfaces (BCIs) can be captured non-invasively, as conductive electrodes can be placed on the scalp of the user to detect microvolt-scale electrical potentials created by many simultaneously active neurons in the cortex. Similarly, consumer-grade EEG devices, for example, can deliver high-resolution temporal information that can be adequate to detect event-related evoked potentials of the user.
For the purposes of the present disclosure, five basic emotion levels are considered for the objective determination of user mood and mental state. These include:
a) Happiness: state of pleasure and joy, contentment, self-satisfaction, pride and excitement
b) Sadness: failure, depression, sorrow, self-pity, loneliness, despair, melancholy
c) Fear: elevated blood pressure, heart rate, sweat, perspiration, tremors
d) Anger: resentment, irritability, hostility, stress, hatred, violence, disgust
e) Surprise: temporary halt in action, facial expression, pupil dilation
The objective classification of user emotion helps in instantly determining the reason for deviated user behavior in a given situation. The instant identification of the valid reason for an atypical user behavior is important for redirecting the user's thought patterns from subconscious to more conscious levels, thereby readily correcting the user response.
In a next working embodiment, an exemplary eye tracking sensor disposed on the head mounted display 200 can help gain insight into the user's attention and focus levels while responding to an external event. The user's gaze is tracked for its micro movements between fixations and saccades while looking at a stimulus, and attention levels are mapped. While the user's gaze may be fixated at a specific instance, his attention may nevertheless be deviated, and his response to such an instance may be guided entirely unconsciously. Measuring attention levels is an important parameter in assessing the user's mind state while performing any physical activity, motor response or decision making, and also in understanding elements of surprise in user emotions.
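A simplified sketch of estimating an attention level from gaze samples by separating fixation-like steps from saccade-like jumps is given below; the dispersion threshold and scoring scheme are illustrative assumptions:

```python
import math

def attention_score(gaze_points, dispersion_threshold_deg: float = 1.0) -> float:
    """Estimate attention as the fraction of consecutive gaze samples whose
    movement stays below a dispersion threshold (i.e. within a fixation);
    larger jumps between samples are counted as saccades."""
    if len(gaze_points) < 2:
        return 0.0
    fixation_steps = 0
    for (x0, y0), (x1, y1) in zip(gaze_points, gaze_points[1:]):
        if math.hypot(x1 - x0, y1 - y0) < dispersion_threshold_deg:
            fixation_steps += 1
    return fixation_steps / (len(gaze_points) - 1)

# Example: mostly small movements -> high attention score
print(attention_score([(0.0, 0.0), (0.2, 0.1), (0.3, 0.1), (5.0, 4.0), (5.1, 4.1)]))
```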
In a significant aspect of the present disclosure, subliminal and preconscious user data is captured and measured via one or more body wearables, along with brain wave patterns, which helps in exploring deep-seated facets of user personality, user behavior, mindfulness, logical thinking, awareness, unanticipated tendencies and the like. In one preferred embodiment, the tracked subliminal information persisting at the subconscious level is mapped and correlated with user brain activity and the user response to any stimulus in the awakened or conscious state to determine the cognitive perception, affective levels, task performance and resilience of the user.
In one exemplary embodiment, the sensor data is obtained from various sensors disposed on the head mounted display 200, along with subliminal data from the one or more body wearables 300 and brain wave patterns from brain imaging, neuroimaging and neurophysiological sources 400. The data captured in the low-dimensional conscious space and the unconscious space of the user is processed and analysed to identify a correlation therebetween by an artificial intelligence (AI) engine 750 embedded within the processing unit 700 of the system 1000.
In accordance with one working embodiment of the present disclosure, the user emotional state is classified from the collected sensor information (physical activity, physiological data, brain wave patterns and attention levels), which is correlated by the AI engine 750 that makes use of either of the following, as illustrated in the sketch after this list:
a) linear classification methods (such as linear regression, linear discriminant analysis, K-nearest neighbour, linear support vector machine, single layer perceptron network, and simple Bayesian classifier), with feature extraction and dimensionality reduction of the collective information obtained from various sensors (HMD, body wearables, EEG); or
b) non-linear classification methods (k-nearest neighbour, support vector machine (SVM), decision tree, artificial neural network).
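A minimal sketch, assuming scikit-learn is available and that feature vectors have already been extracted from the HMD, body-wearable and EEG signals, of fitting one linear and one non-linear classifier of the kinds listed above (features and labels below are placeholders):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

# Placeholder feature matrix: each row is one observation window with
# features extracted from HMD, body-wearable and EEG signals.
X = np.random.rand(200, 16)
y = np.random.randint(0, 5, size=200)  # 5 hypothetical emotion classes

linear_clf = LinearDiscriminantAnalysis().fit(X, y)   # linear method
nonlinear_clf = SVC(kernel="rbf", C=1.0).fit(X, y)    # non-linear method

print(linear_clf.predict(X[:3]), nonlinear_clf.predict(X[:3]))
```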
In a following working embodiment, the AI engine 750 is configured to classify the emotional and mental levels of the user based on extracted features of the collective information received from the various sensors. The various signals received from sensors onboarded on the HMD, eye tracking sensors, body wearables or neuroimaging devices are pre-processed and relevant features are extracted. This is followed by optimization to prevent underfitting or overfitting of the model.
In accordance with one example embodiment, a combination of three machine learning algorithms, namely support vector machine (SVM), K-nearest neighbour (KNN) and decision tree (DT), is opted for, as these can aptly handle both classification and regression problems, have few hyperparameters and make more efficient use of computational resources. Thus, each signal is independently analyzed, focusing on the unique information contained in each signal. This allows for an understanding of the individual contributions of these signals in emotional level recognition. Alternatively, multiple signals may be integrated with the SVM classifier to create a more comprehensive representation of the user emotional state.
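One plausible way to combine the three named algorithms is probability-averaged (soft) voting; the sketch below assumes scikit-learn and placeholder training data, and is not the only possible integration:

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X = np.random.rand(300, 16)            # extracted multi-sensor features (placeholder)
y = np.random.randint(0, 5, size=300)  # emotion classes: happy, sad, fear, anger, surprise

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="rbf", probability=True)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("dt", DecisionTreeClassifier(max_depth=6)),
    ],
    voting="soft",  # average predicted probabilities across the three models
)
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```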
The next embodiment provides an example of the correlation between physiological signals, attention levels, brain wave signals and the corresponding user behavior as influenced by the aforementioned factors and thus invoked from the user's conscious/subconscious/unconscious state. For example, the user provides an instant response within a few milliseconds, with focused attention levels, glaring eyes, furrowed brows, tense jaw and lips, and flared nostrils, as observed from signals from the HMD 200. Besides, physiological signals such as rapid heart beat, high blood pressure, perspiration and elevated tone, together with super-activated beta waves, depict user anxiousness associated with anger and surprise. Consequently, the data correlated by the AI engine 750 infers the user response as being more aggressive rather than assertive. This information is utilized by the personal virtual assistant 500 to recommend a more conscious course of action/user response, as will be discussed later.
In accordance with another significant embodiment, a personal virtual assistant 500 is generated having an impression close to the user profile, such that the user sees the personal virtual assistant 500 as his own manifestation or as someone he is familiar with, and is thus comfortable sharing his core feelings and mental impressions. In accordance with one exemplary embodiment, an AI-based holographic personal virtual assistant 500 may be generated using any 3D holographic generation technology.
Now, the personal virtual assistant 500 interacts with the AI engine 750 to monitor user activity, personal psyche, and the associated behavioural and mind patterns in all high- and low-dimensional user states, and to assess the classified emotional state. The user 100 is then guided by this personal virtual assistant 500, which is aware of the user's true mental health and emotional state. For example, in the situation discussed above, the personal virtual assistant 500, having understood the user response correlated with his physiological signals, thought pattern and mood levels, can suggest a corrective action which is governed more by his consciousness than by his subconscious levels, thus helping the user feel motivated, mindful and healed.
In one example embodiment, the system 1000 captures archetypes in the user's subconscious states that manifest symbolically as archetypal images in dreams, art or other cultural forms, or in the form of facial expressions: brain waves, eye movements, micro facial expressions, vital sign data (heart rate, blood pressure, respiration rate, etc.), muscle movement data, capillary dilation data, skin conductivity data and the like. This knowledge is extracted and analysed by utilizing algorithms including clustering or regression analyses, low-dimensional embeddings (e.g., Principal Component Analysis (PCA)) and/or incremental and robust PCA generalizations. These summary statistics provide different insights into the raw data.
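For instance, a low-dimensional embedding of the multi-channel raw signals could be obtained with PCA as in the following sketch; the channel count and number of components are arbitrary choices for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = time windows, columns = raw channels (EEG bands, heart rate,
# skin conductivity, pupil dilation, muscle movement, and so on).
raw_signals = np.random.rand(1000, 24)

pca = PCA(n_components=3)
embedding = pca.fit_transform(raw_signals)   # low-dimensional summary of the raw data
print(embedding.shape, pca.explained_variance_ratio_)
```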
In one further aspect of the present disclosure, the personal virtual assistant 500, having obtained the user's mental and emotional information, interacts with the user 100 to validate whether the emotional state information obtained from the AI engine 750 is a valid depiction of the user's real mental state. This is achieved by the personal virtual assistant 500 engaging in a quasi-human interaction with the user and analysing the user responses to validate the emotional levels obtained from the AI engine 750 against the user feedback. In one exemplary embodiment, the virtual assistant 500 stores this validated user behavior, in response to the user action, in the memory and database for future reference.
In accordance with one exemplary embodiment, the personal virtual assistant 500 engages the user 100 in a quasi-human interaction, whereby the user speech input is recorded through a microphone of the HMD 200 and processed by the AI engine 750 of the processing unit 700. The personal virtual assistant 500, having access to the processing unit 700, directs the processing unit 700 to interpret the user's spoken words.
In accordance with one exemplary embodiment, the processing unit 700 achieves such interpretation using Natural Language Processing (NLP) techniques to convert the user's spoken words through Speech-To-Text (STT) and extract words and phrases for storage in its memory and database. In this way the personal virtual assistant 500 also keeps a personal record of common user actions and associated user behavior against the background of the varying user emotional state.
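A sketch of how a transcribed utterance might be reduced to words and phrases for storage is shown below; the stop-word list is arbitrary, and the transcript is assumed to come from whatever STT step the processing unit 700 applies to the HMD 200 microphone audio:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "i", "is", "it", "about"}

def extract_keywords(transcript: str, top_n: int = 5):
    """Tokenise an STT transcript and return the most frequent content words,
    which can then be stored against the current user emotional state."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    content = [t for t in tokens if t not in STOP_WORDS]
    return Counter(content).most_common(top_n)

# The transcript string would be produced by the Speech-To-Text step.
print(extract_keywords("I feel really anxious about the presentation and the deadline"))
```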
Here, the virtual assistant 500 is operable to have all-time access to the user profile stored in the repository 600, which is updated in real time with mapped user action-behavior information. Physical actions and user behavior or movements are processed at the processing unit 700, wherein meaning is derived from the actions by correlating the actions or movements with specific meanings and concepts and with prior correlation behavior maintained in the repository 600. The meanings derived from such correlations are communicated by the virtual assistant 500 to the user 100 in a friendly, soft tone to obtain user validation, approval or feedback on such interpretation.
Based on the above analysis of unmasked and masked user expressions, behavior and actions associated with the conscious or unconscious state, along with the user feedback, the virtual assistant 500 derives the user's true mental state. Accordingly, the virtual assistant 500 directs the processing unit to map and correlate said masked and unmasked information with the user feedback received from the quasi-human interaction to obtain a value indicating the relevance, to the user's mental state, of the specific meaning of words, the user's attitude, or his conceptual perception of the stimulus.
In accordance with an embodiment, the system comprises a processing unit and a memory unit configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the processing unit is operably connected with the memory unit configured to host the user profile stored in the repository. In various embodiments, the processing unit is one of, but not limited to, a general-purpose processor, an application specific integrated circuit (ASIC) and a field-programmable gate array (FPGA).
In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer executable instructions residing on a suitable computer readable medium. Suitable computer readable media may include volatile (e.g., RAM) and/or non-volatile (e.g., ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention.
CLAIMS:
We Claim:
1) A method for providing personalized assistance to a user (100) via a personal virtual assistant (500), comprising:
generating a user profile from user personally identifiable information, user routine physical activity and a user routine behavior corresponding to the user routine physical activity;
recording real-time user physical activity and real-time user behavior corresponding to the real-time user physical activity along with user physiological data and brain wave patterns;
in an event of deviation of the real-time user behavior from the routine user behavior when mapped with the generated user profile,
determining, by a processing unit (700), if deviated real-time user behavior is exhibited within a time instance less than, equal to or higher than a predetermined threshold;
correlating, by the processing unit (700), the real-time physical activity, physiological data and brain wave patterns; and
communicating the correlated information from the processing unit (700) to the personal virtual assistant (500);
wherein the personal virtual assistant (500) is configured to validate the correlated information and recommend to the user (100) a corrective response with enhanced self-awareness.
2) The method, as claimed in claim 1, wherein the user personally identifiable information comprises a user name, birthdate, address, occupation, height, weight, health history, family information, user social media profile, web browsing history, user purchase data or user response to media prompts.
3) The method as claimed in claim 1, wherein the real-time user physical activity and real-time user behavior is captured from a plurality of sensors disposed on a head mounted display (200) worn by the user (100).
4) The method as claimed in claim 1, wherein the user physiological data is captured from a plurality of body wearables (300), such as a smart watch, smart ring or any smart jewellery enriched with a plurality of sensors.
5) The method as claimed in claim 1, wherein the user brain wave patterns are recorded using brain imaging devices (400) such as an electroencephalogram, functional magnetic resonance imaging (fMRI) and other neuroimaging techniques.
6) The method as claimed in claim 1, wherein if the deviated user behavior is exhibited within a time instance less than a predetermined threshold, the real-time user behavior is marked as impulsive.
7) The method, as claimed in claim 6, wherein the user (100) is recommended by the personal virtual assistant (500) to re-observe the user behavior such that the user's thought flows from subconscious levels to conscious levels.
8) The method, as claimed in claim 1, wherein if the deviated user behavior is exhibited within a time instance equal to or higher than a predetermined threshold, correlating the user physiological data and the brain wave pattern to measure and classify a user emotional state.
9) The method, as claimed in claim 8, wherein the emotional state of the user is classified into one of a happy, sad, fear, anger or surprise group.
10) The method, as claimed in claim 1, further comprising measuring attention levels in an event of the deviation of the real-time user behavior, wherein the attention levels are measured using one or more eye tracking sensors disposed on head mounted display (200).
11) The method, as claimed in claim 9, wherein the correlation between the real-time physical activity, physiological data and brain wave patterns is achieved using an artificial intelligence (AI) engine (750), wherein the AI engine (750) utilizes linear classification method or a non-linear classification method for the classification of user emotional state.
12) The method, as claimed in claim 11, wherein the AI engine (750) selects a combination of support vector machine (SVM), K-nearest neighbour (KNN) and decision tree (DT) approaches for classifying the user emotional state based on the correlated real-time physical activity, physiological data and brain wave patterns.
13) The method, as claimed in claim 10, wherein the correlation between the real-time physical activity, physiological data, brain wave patterns and the attention levels is achieved using an artificial intelligence (AI) engine (750), wherein the AI engine (750) utilizes linear classification method or a non-linear classification method for the classification of user emotional state.
14) The method, as claimed in claim 1, wherein the personal virtual assistant (500) engages in a quasi-human interaction with the user (100) to obtain user feedback and validate the correlated information.
15) The method, as claimed in claim 14, wherein the quasi human interaction between the personal virtual assistant (500) and the user (100) is processed at the processing unit (700) using natural language processing technique.
16) A system (1000) for providing personalized assistance to a user (100) via a personal virtual assistant (500), which is configured to communicate with an artificial intelligence engine (750) executed upon a processing unit (700) of the system (1000), the system (1000) comprising:
a head mounted device (200) configured to record real-time user physical activity and real-time user behavior corresponding to the real-time user physical activity;
one or more body wearables (300) configured to capture user physiological data; and
one or more brain imaging devices (400) configured to record brain wave patterns;
the processing unit (700) configured to generate a user profile from user personally identifiable information, user routine physical activity and a user routine behavior corresponding to the user routine physical activity,
wherein in an event of deviation of the real-time user behavior from the routine user behavior when mapped with the generated user profile, the processing unit is configured to:
determine, if deviated real-time user behavior is exhibited within a time instance less than, equal to or higher than a predetermined threshold;
correlate the real-time physical activity, physiological data and brain wave patterns; and
communicate the correlated information to a personal virtual assistant (500); and
the personal virtual assistant (500) is configured to validate the correlated information and recommend to the user (100) a corrective response with enhanced self-awareness.
17) The system (1000), as claimed in claim 16, wherein the head mounted device (200) is configured with a plurality of sensors, an imaging unit, eye tracking sensors and motion sensors to capture the real-time user physical activity and real-time user behavior.
18) The system (1000), as claimed in claim 16, wherein the plurality of body wearables (300) is selected from a smart watch, smart ring or any smart jewellery enriched with a plurality of sensors.
19) The system (1000), as claimed in claim 16, wherein the brain imaging devices (400) are selected from, though not limited to, an electroencephalogram, functional magnetic resonance imaging (fMRI) and other neuroimaging techniques.
20) The system (1000), as claimed in claim 16, wherein if the deviated user behavior is exhibited within a time instance less than a predetermined threshold, the personal virtual assistant (500) is configured to recommend the user (100) to re-observe the user behavior such that user thought flows from subconscious levels to conscious levels.
21) The system (1000), as claimed in claim 16, wherein if the deviated user behavior is exhibited within a time instance equal to or higher than a predetermined threshold, the artificial intelligence engine (750) is configured to correlate the user physiological data and the brain wave pattern to measure and classify a user emotional state.
22) The system (1000), as claimed in claim 17, wherein the eye tracking sensors are configured to measure user attention levels based on user gaze tracked between eye fixations and saccades.
23) The system (1000), as claimed in claim 16, wherein the artificial intelligence engine (750) is configured to utilize linear classification method or a non-linear classification method for classification of user emotional state.
24) The system (1000), as claimed in claim 23, wherein the artificial intelligence engine (750) selects a combination of support vector machine (SVM), K- nearest neighbour (KNN) and decision tree (DT) approaches for classifying the user emotional state based on the correlated real-time physical activity, physiological data and brain wave patterns.
25) The system (1000), as claimed in claim 16, wherein the personal virtual assistant (500) is configured to engage in a quasi-human interaction with the user (100) to obtain user feedback and validate the correlated information.
26) The system (1000), as claimed in claim 25, wherein the quasi human interaction between the personal virtual assistant (500) and the user (100) is processed at the processing unit (700) using natural language processing technique.