
Improved Personal Virtual Assistance For Enhanced User Well Being

Abstract: In the system and method of the present disclosure, an improved personal virtual assistant is generated that is configured to enhance user self-awareness and overall well-being. The personal virtual assistant is fed with data captured from the user's conscious and subliminal states by various sensors disposed on a head mounted display, a plurality of body wearables and brain imaging devices. The system, enabled by an artificial intelligence engine, processes and analyses the data collected from the user's conscious, preconscious and subliminal states to derive a correlation therebetween that helps determine the user's mental impressions, state of mind and emotional levels while responding to external events. Based on the correlated information, the personalized virtual assistant recommends and instructs the user towards corrective action in a manner, tone and style most compatible with the user, like that of a friend or companion.


Patent Information

Application #:
Filing Date: 20 February 2024
Publication Number: 12/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application

Applicants

Dimension NXG Pvt. Ltd.
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604

Inventors

1. Purwa Rathi
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604
2. Abhishek Tomar
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604
3. Abhijit Patil
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604
4. Pankaj Raut
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604
5. Yukti Suri
Office 527 & 528, 5th floor, Lodha Supremus Road No. 22, Near New Passport Office Wagle Estate, Thane West, Maharashtra -400604

Specification

Description: CROSS-REFERENCE TO RELATED APPLICATIONS
The present patent application is a patent of addition to the main Indian Patent Application No. 202321012399, filed February 23, 2023. The present application comprises an improvement in, or a modification of, the invention claimed in the specification of the main patent applied for in Indian Patent Application No. 202321012399.
FIELD OF THE INVENTION
Embodiments of the present invention relate to a system and method for generating a personalized virtual assistant, and more particularly to an improved system and method for generating a personalized virtual assistant that can provide real-time guidance and assist the user in mindful decision making, thereby contributing to his enhanced well-being.
BACKGROUND OF THE INVENTION
Existing “virtual assistant” applications generally respond to a user stimulus, for example a spoken or typed question, by retrieving information from a knowledge base and presenting it through a visual indicator and/or an auditory response. However, these responses are often mechanical, lack any real semblance of a personality, and do not react effectively because the user's profile, viewpoint, sentiments, core feelings and emotional states are generally not taken into in-depth consideration. This creates a need for a customizable solution that provides real-time guidance and self-awareness to the user as his true companion or guide, in a manner consistent with the user's profile and tailored to the user's sentiments, speech and behavioural patterns.
Some work has been done in this field to capture user actions, speech, behavior and the like. This primarily involves tracking the physical and physiological state, emotion and cognition of the user for monitoring health and fitness activity or guiding routine decision making. However, a significant part of human existence is associated with a deep level of psychology called the “unconscious”, because it is not accessible to conscious thought and is primarily driven by emotions and instinct. It is well established that these two mind states collectively determine user action.
However, in this respect, except for a few outliers, hardly any consideration has been given to the universal impulses contributing to the combined “conscious-unconscious”, which is vital and equally significant in understanding user behavior and response patterns. The power of assessing a user's combined conscious and unconscious state, and drawing interpretations from the correlated data, has been largely ignored by neuroscientific researchers and stakeholders interested in human psychology and in mining cognitive behavioural patterns.
While an individual can conveniently manipulate his behavior and responses to external events in social settings, it is extremely difficult for one to analyse and understand one's deep state of mind, innate emotions, cognition and primal reflexes at both conscious and unconscious levels. If left unaddressed, such emotions, stress and mental states may trigger unsettled feelings within the user and may prove defeating in the long run: his distress levels elevate, mind and body stay disharmonious and unattuned, performance falls, and eventually life loses its essence and meaning for him.
According to the World Health Organization, more than 500 million people around the world have depression, anxiety disorders or other emotional turbulence, and it is projected that 18% of the adult population will experience depression at some point in their lifetime. Since Covid-19, serious damage has been done to people's mental health. With increasing psychological distress and suicidal tendencies, there is a strong incentive to build psychotherapeutic methods, systems and processes that can alter the human cognitive state and help the user transition easily from a depressive to a meditative and tranquil state.
Existing artificial bots are too mechanical and superficial in understanding human emotions and addressing them in a contextual and meaningful way. The answers to generated queries are typically so incomprehensible, feigned, fictitious and unrelatable that they can barely assist the user in overcoming his depressed state. A true guide and companion who understands core human feelings, suppressed emotions, mind-body balance and other cognitive states remains a matter of science fiction as of now.
In this vein, the present disclosure sets forth a system and method for creating a virtual assistant for the user, embodying advantageous alternatives and improvements to existing mechanical human assistants, robots or bots, that may address one or more of the challenges or needs mentioned herein, as well as provide other benefits and advantages.
OBJECT OF THE INVENTION
An object of the present invention is to provide a virtual assistant that can enhance user awareness and overall well-being.
Another object of the present invention is to provide an AI enabled virtual assistant that can track and comprehend user actions, behavior, state of mind, cognitive capability and emotional levels to provide meaningful suggestions to user.
Yet another object of the present invention is to provide a virtual assistant that is capable of exploring subliminal perception of user and correlating with user conscious behavior to effectively guide user in transitioning from distressed to tranquil state.
Yet another object of the present invention is to provide an NLP enabled virtual assistant that can provide recommendations and instructions to the user in a manner, tone, phrasing and voice most relatable to the user.
Yet another object of the present invention is to provide a virtual assistant that can monitor all day activities (conscious state) and map it with user preconscious or subliminal state to determine user mindfulness and decision making capability.
Yet another object of the present invention is to provide a virtual assistant that acts as a true companion, guide, friend and comrade to the user, enabling the user to become a self-aware and rational being.
SUMMARY OF THE INVENTION
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.

In accordance with a first aspect of the disclosure, a method for providing assistance to a user via a personal virtual assistant is disclosed, wherein the method comprises correlating real-time user physical activity with user physiological data and brain wave patterns to obtain a user emotional state. This is followed by validating the emotional state based on an analysis of user behavior, user semblance and user feedback. The user emotional state is classified into at least one quadrant, and the user is assisted in transforming the emotional state in the event that the classified user emotional state falls within a negative quadrant.
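The correlate, validate, classify, assist flow of this first aspect can be sketched in a few lines. The following is a purely illustrative toy, not the claimed implementation; the normalized-score convention, the averaging fusion and the sign-agreement check are all our own assumptions:

```python
# Illustrative toy of the first-aspect flow: correlate sensor streams,
# validate the inferred emotional state, then assist if it is negative.
# Scores are assumed normalized to [-1, 1]; fusion by plain averaging and
# validation by sign agreement are placeholder assumptions.

def correlate(activity: float, physiology: float, brainwave: float) -> float:
    """Fuse normalized sensor readings into one emotion score (stub)."""
    return (activity + physiology + brainwave) / 3.0

def validate(score: float, behavior: float, feedback: float) -> bool:
    """Accept the inferred state only when user behavior and feedback
    agree with it in sign (a crude stand-in for the validation step)."""
    return (score >= 0) == (behavior >= 0) == (feedback >= 0)

def assist(activity, physiology, brainwave, behavior, feedback) -> str:
    score = correlate(activity, physiology, brainwave)
    if not validate(score, behavior, feedback):
        return "re-measure"
    return "assist: corrective action" if score < 0 else "no action"
```

A negative fused score that behavior and feedback both confirm would trigger the corrective-action branch; disagreement sends the pipeline back to measurement.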

BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits and advantages of the present invention will become apparent by reference to the following text figure, with like reference numbers referring to like structures across the views, wherein:
Fig. 1 illustrates an exemplary environment of user interacting with virtual assistant, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, and the drawings are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims.
As used throughout this description, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Further, the words "a" or "an" mean "at least one", and the word "plurality" means "one or more", unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps.
Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" and term “personalized” is considered synonymous with “personal” for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles, and the like are included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase “comprising”, it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases “consisting of”, “consisting”, “selected from the group consisting of”, “including”, or “is” preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
In accordance with one general embodiment of the present disclosure, the present system and method are directed to generating a virtual assistant that enables the user to enhance his overall well-being and self-awareness. Further, the virtual assistant helps the user transition from any emotionally turbulent state to a peaceful state without having to undergo therapeutic treatment or the pain of explaining one's mental state to a therapist.
The significant role played by the subconscious mind while performing various volitional day-to-day activities cannot be overlooked. It has now been sufficiently established that the activities of certain parts of the brain come to consciousness while the rest of the brain's activities remain under the veil of the subconscious mind. For example, while reading a book, the words are processed in the visual cortex, while the meaning and interpretation of those words engages the frontal lobe. However, all these roles remain hidden, and the only activity seen is reading and enjoying a book.
However, the simple and routine activities that one performs are not merely an outcome of conscious behavior alone. Rather, the highly integrated subconscious space involves various regions of the brain to effortlessly perform basic procedural activities. Nonetheless, the role of the subconscious is not restricted to the performance of mundane routine tasks; rather, the user often takes significant life decisions guided by unconsciously settled memories, thought patterns, emotions and the like.
The present disclosure explores the role and significance of combining the conscious and unconscious states of mind to understand user behavior, motivation levels, capabilities and potential, to eventually enhance user performance levels. It will be appreciated by those skilled in the art that the actions of an unconscious mind precede the arrival of a conscious mind: action precedes reflection. Even intense training sessions, prolonged practice hours and committed efforts may not fully capitalize on user potential unless an optimal mind-body balance is assured before the critical event.
Often the user may succumb to increased stress levels, audience pressure and self-expectations, and falter if his mind and body are not in harmony with each other. Thus, he needs some insight into his mental state and able guidance that can help him overcome temporal distraction and bring his focus to mind-body coordination. Against this background, the present disclosure attempts to track and capture the user's conscious and measurable subconscious states of mind, to draw a causal link therebetween and infer meaningful results therefrom, such that an optimal brain-body balance can be achieved whenever desired by the user.
Accordingly, the present disclosure assists the user, in critical situations and otherwise, to attain a balanced state of mind and body, which contributes to enhanced user performance levels in all spheres of life. In a first embodiment, the system and method of the present disclosure are configured to generate a personalized virtual assistant (PVA) based on a user profile generated from user biometric information, personality profile, routine physical activity, routine behavior or a combination thereof, which functions as a benchmark in assessing the user's conscious and subconscious states.
In one exemplary embodiment, the personal virtual assistant, also commonly referred to as a “user avatar” or “holographic avatar” for the purposes of the present disclosure, is represented in visual or audial media, including digitally rendered virtual avatars, holograms, “chat bots”, or any other method known to convey digital representations of personalities. The personal virtual assistant of the present disclosure assists the user in achieving an optimal and balanced body-mind state for enhanced performance, based on the user's profile, sentimental/behavioural characteristics, and conscious and subconscious states, as explained in later sections.
In one preferred embodiment, the personal virtual assistant is configured to generate the user profile from recorded user routine physical activity and user routine behavior corresponding to the recorded physical activity. The recorded information is reflective of the user's conscious, subconscious or unconscious state, and a correlation therebetween is vital in helping the user develop a more awakened conscious experience, not just at the physical or physiological level but, most importantly, at the neurological and atomic levels.
Referring to Fig. 1, in order to give the user an insight into his conscious, subconscious or unconscious state, the present disclosure presents a system 1000 that equips the user 100 with an advanced and sophisticated head mounted display (HMD) 200, which enables the user 100 to interact with his personal virtual assistant 500. Additionally, the user 100 is configured with one or more sensor-enriched body wearables, such as a smartwatch 300a, smart ring 300b, or smart jewellery 300c (collectively referred to by numeral “300”), as shown in Fig. 1, in order to track the user's physical and physiological activity information, user interaction, external media inputs and user state. All such information from the HMD 200 and body wearables 300 is processed by the processing unit 700 of the system 1000.
In one exemplary embodiment, the user profile may be developed through user input of personally identifiable information, including personal information such as a name, birthdate, address, occupation, height, weight, health history, and/or family information. The personally identifiable information is supplemented with mapped user action-behavior information to generate an all-inclusive user profile.
In one example embodiment, the system 1000 records and monitors the user's response to a given external/internal stimulus (e.g., a question being asked, a movement, a touch, or a sound). The response to a stimulus may be seen in user actions that are generally derived from a user personality trait or a counter response from the user. In yet another embodiment, the user's response to a given stimulus may be derived from the user profile, which may additionally be generated based on one or more existing user online profiles or data, including user-created and approved social media profiles, social media activity, website browsing history, purchase data, user responses to prompts from various media channels, and the like. All these responses are recorded by the system to generate an all-inclusive user profile that is maintained in a dedicated repository 600.
In the next embodiment, the system 1000 records and monitors the real-time user response to a given external/internal stimulus (e.g., a question being asked, a movement, a touch, or a sound). The response to a stimulus may be seen in user actions or behaviors that are generally derived from a user personality trait or a counter response from the user 100. The present disclosure, however, attempts to monitor and record real-time user physical activity and the corresponding real-time user behavior.
In one working embodiment of the present disclosure, the physical and physiological information related to real-time user activity and user behavior may be obtained from a multi-spectrum camera, image processing unit, imaging unit, or one or more motion sensors disposed on the head mounted display 200 or the body wearables 300. This information may include tracked physiological or biometric information about the user, motion data regarding movement of the user, and image data corresponding to two- or three-dimensional image captures of the user. Other parameters of the user may be captured via different kinds of sensors embedded on the head mounted display or other body wearables. Here, sensors include any electronic medium through which the device is operable to receive and record an input, including pressure sensors, electromagnetic sensors, microphones, cameras, odour sensors and/or chemical sensors.
In the next significant embodiment, the system 1000 further records user brainwaves to capture the user's brain state at different levels of consciousness, using any brain imaging technique such as EEG neuroimaging, which measures neuronal activity, functional magnetic resonance imaging (fMRI), and others (collectively referred to by numeral “400”). These help in recording how brain activity changes with a change in a given stimulus. For example, when the user is fully aware, alert and conscious, his brain will predominantly operate in beta brainwave patterns, out of the alpha, beta, delta and theta brainwave patterns known of human consciousness.
Briefly, Table 1 below summarizes the general classification of the brain waves Delta, Theta, Alpha, Beta and Gamma.

Brain Wave Type | Frequency Range | Specification
Delta | 0.5-3 Hz | Slowest of all; associated with sleep. In the waking state, helps provide access to subconscious activity, encouraging its flow into conscious thought.
Theta | 3-8 Hz | Deeply relaxed, meditative state of mind; helps in memory recollection and improvement; increased creativity and learning.
Alpha | 8-12 Hz | Creativity, relaxation, reflection; problem solving and visualization that aids creativity.
Beta | 12-27 Hz | Increased concentration and alertness; better analysis and work productivity.
Gamma | >27 Hz | Heightened perception and concentration; regional learning, memory and language processing.
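The band boundaries above translate directly into a lookup; a minimal sketch (the function name and the handling of boundary values are our own choices):

```python
# Map a dominant EEG frequency (Hz) to its brain wave band per Table 1.
# Boundary handling (half-open intervals) is our own assumption.
def brainwave_band(freq_hz: float) -> str:
    if freq_hz < 0.5:
        return "sub-delta"   # below the table's range
    if freq_hz < 3:
        return "delta"
    if freq_hz < 8:
        return "theta"
    if freq_hz < 12:
        return "alpha"
    if freq_hz <= 27:
        return "beta"
    return "gamma"
```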
In one significant aspect of the present disclosure, the user's subconscious, or rather subliminal, state is tracked while his consciousness is broken, for example when the user is in deep sleep or in any unconscious state for any reason, such as general anaesthesia, a vegetative state or a minimally conscious state, using the body wearables 300. Here, “unconscious state” includes experiences or responses to subliminal or preconscious stimuli. Precisely, a subliminal stimulus is one to which responses are often undetectable, even with focused attention.
On the other hand, there may be situations where the user response appears to be deviated from routine user behavior when mapped with the action-behavior information maintained in user profile. For example, if the user has responded too quickly or spontaneously to any external event/stimuli due to temporary distraction or inattention, it is clearly not a rightful depiction of user capacity, capability or logical brain functioning. The system 1000 utilizes the capabilities of HMD 200, body wearables 300 and neuroimaging devices 400 to record timestamp of user physical activity.
The system 1000 gathers such information from user visual, auditory or other sensor response to external world using the HMD 200, body wearables 300, and processes it along with user brain wave patterns on the processing unit 700 to determine if the decision making of the user is attributed to his conscious, subconscious or unconscious behavior. This is predominantly based on time within which the user 100 responds. For example, if the user responds at a time instance shorter than the predetermined threshold, say for example few milliseconds, it is inferred as “definite-not” a conscious activity, and rather an impulsive one.
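The latency test described above can be sketched as follows; the 200 ms cutoff is a hypothetical placeholder, since the disclosure only specifies "a predetermined threshold" of, say, a few milliseconds:

```python
# Classify a response by its latency against a predetermined threshold.
# The 200 ms value is a placeholder assumption, not from the disclosure.
IMPULSE_THRESHOLD_MS = 200.0

def classify_response(latency_ms: float) -> str:
    if latency_ms < IMPULSE_THRESHOLD_MS:
        return "impulsive"        # "definite-not" a conscious activity
    return "needs validation"     # apparently conscious; validate further
```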
Instantly, the system 1000 triggers the processing unit 700 to draw input from the HMD 200, the plurality of body wearables 300 and the brain wave patterns, and to correlate the information to infer the user's physiological state along with the emotional state that validates the user's pre-emptive response. For such instantaneous, unconsidered reactions, the system 1000 guides the user in allowing the thought flow from his subconscious to his conscious for better decision making. This is achieved by the personal virtual assistant 500 designed for the user, as discussed in detail later.
However, in the event the user behavior is exhibited within a time instance equal to or greater than the predetermined threshold, it is important to understand that though the user has “apparently” not depicted impulsive behavior, his behavior still needs validation against his mental impressions and emotional levels. Thus, the system 1000 directs the interpretation of the physiological data along with the brain wave patterns in order to understand his true cognitive and mental behavior.
In such an event, the unusual or deviated user behavior in response to a given external stimulus is validated against physiological data. For example, if the physiological signals in the body depict, e.g., a variable breath pattern, fluctuating pulse rate, varying blood pressure, changing body temperature or skin perspiration, and the brain imaging data corroborates this, the user response/behavior is recorded as a subconscious/subliminal response. In accordance with one exemplary embodiment, signals triggered in the thalamus, basal ganglia or upper brainstem are captured and interpreted, as these are the structures most commonly associated with consciousness levels.
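The validation step just described might be sketched as a deviation count over physiological channels combined with a brain-imaging flag; the tolerance, the channel names and the two-channel rule are all our assumptions, not values from the disclosure:

```python
# Tag a response as subconscious/subliminal when several physiological
# channels deviate from the user's baseline AND brain imaging flags
# activity in consciousness-related structures (thalamus, basal ganglia,
# upper brainstem). Tolerance and channel count are assumptions.
def is_subliminal(readings: dict, baseline: dict,
                  brain_flag: bool, tol: float = 0.15) -> bool:
    deviated = sum(
        abs(readings[ch] - baseline[ch]) > tol * abs(baseline[ch])
        for ch in baseline
    )
    return brain_flag and deviated >= 2
```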
Now, the processing unit 700 captures the brain wave pattern and measures the brain activity level. If predominantly high beta waves are recorded, especially in the posterior area of the brain, it indicates a type of anxiety linked to issues of worry. The processing unit 700 then maps the above recorded physiological information with the brain wave pattern to determine that the anxiousness observed from the physiological signals stems not from excitement or positive emotional behavior, but rather from worried emotional levels.
There has been detailed scientific research on the reading of brain wave patterns, which has been excluded from the description for the sake of brevity. However, once the reason for this atypical behavior is determined, it is recorded and updated as a user response in the user profile. This kind of subliminal and preconscious information existing at subconscious levels can bias user motor responses and is significant in understanding the user's behavioural patterns and cognitive levels for mindful and conscious living.
Likewise, in the event the user behavior is exhibited within a time instance equal to or greater than the predetermined threshold, it is important to understand that though the user has “apparently” depicted conscious behavior, his mental impressions and emotional levels have to be measured along with the physiological data and brain wave patterns in order to understand his true cognitive and mental behavior.
Since emotions are strongly indicative of the human mind state, it is pertinent to observe and record activities within the limbic system of the brain. Emotions are lower-level responses occurring in the subcortical regions of the brain (e.g., the amygdala, which is part of the limbic system). Additionally, the neocortex region of the brain, which deals with conscious thoughts, reasoning and decision-making, is also monitored. Further, information from the eye tracking cameras of the HMD 200 can record pupil dilation and other subtle facial expressions, alongside measured physiological changes such as skin conductance, heart rate and brain activity.
It is noteworthy that both the conscious and the subconscious may be guided by emotional drives, called primary drives, and this information may be captured by any invasive or non-invasive brain mapping technique. For example, the brain computer interfaces (BCIs) that monitor brain waves can be non-invasive, as conductive electrodes can be placed on the scalp of the user to detect microvolt-scale electrical potentials created by many simultaneously active neurons in the cortex. Consumer-grade EEG devices, for example, can deliver high-resolution temporal information adequate to detect event-related evoked potentials of the user.
For the purposes of present disclosure, five basic emotion levels are considered for objective determination of user mood and mental state. These include:
a) Happiness: State of pleasure and joy, contentment, self-satisfaction, pride and excitement
b) Sadness: failure, depression, sorrow, self-pity, loneliness, despair, melancholy
c) Fear: elevated blood pressure, heart rate, sweat, perspiration, tremors
d) Anger: resentment, irritability, hostility, stress, hatred, violence, disgust
e) Surprise: temporary halt in action, facial expression, pupil dilation
Further, the five basic emotions are divided among four quadrants, with arousal states defining the vertical axis and positive/negative valence defining the horizontal axis. Here, valence refers to the pleasantness or unpleasantness of an emotional stimulus. The model below depicts the four quadrants of the classified emotional states. The first quadrant represents emotions with high valence and high arousal, like “happiness.” The second quadrant indicates low-valence, high-arousal emotions like “anger.” The third and fourth quadrants represent low-valence, low-arousal emotions like “sadness” and high-valence, low-arousal emotions like “neutral,” respectively.
                    High Arousal
    Fear, Anger        |        Surprise, Happiness
  Negative Valence ----+---- Positive Valence
    Sad, Disgust       |        Neutral
                    Low Arousal
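The quadrant model can be encoded as a simple lookup; a sketch using our own labels for the four quadrants described above:

```python
# The four-quadrant valence/arousal model as a lookup table (our encoding
# of the quadrants described in the text; labels are illustrative).
QUADRANTS = {
    ("positive", "high"): "Q1: happiness, surprise",
    ("negative", "high"): "Q2: anger, fear",
    ("negative", "low"):  "Q3: sadness, disgust",
    ("positive", "low"):  "Q4: neutral",
}

NEGATIVE = {"Q2: anger, fear", "Q3: sadness, disgust"}

def classify(valence: str, arousal: str) -> str:
    return QUADRANTS[(valence, arousal)]

def needs_assistance(valence: str, arousal: str) -> bool:
    """True when the classified state falls in a negative quadrant."""
    return classify(valence, arousal) in NEGATIVE
```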
The objective classification of user emotion helps in instantly determining the reason for deviated user behavior in a given situation. The instant identification of the valid reason for an atypical user behavior is important for reversing the user's thought patterns from subconscious to more conscious levels, thereby readily correcting the user response.
In the next working embodiment, an exemplary eye tracking sensor disposed on the head mounted display 200 can help gain insight into the user's attention and focus levels while responding to an external event. The user's gaze is tracked for its micro movements between fixations and saccades while looking at a stimulus, and attention levels are mapped. While the user's gaze may be fixated at a specific instance, his attention may nevertheless be deviated, and his response to such an instance may be guided absolutely unconsciously. Measuring attention levels is an important parameter in assessing the user's mind state while performing any physical activity, motor response or decision making, and also in understanding elements of surprise in user emotions.
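Fixation/saccade labelling of tracked gaze is commonly done by thresholding angular velocity (the I-VT scheme); the sketch below uses a typical 30 deg/s threshold from the eye-tracking literature, not a value from the disclosure, and the fixation-fraction "attention proxy" is our own simplification:

```python
# Label gaze samples as fixation or saccade by angular velocity, then use
# the fixation fraction as a crude attention proxy. The 30 deg/s cutoff
# is a common literature value, assumed here.
def label_gaze(velocities_deg_s, threshold=30.0):
    return ["saccade" if v > threshold else "fixation"
            for v in velocities_deg_s]

def attention_score(labels):
    """Fraction of samples spent in fixation."""
    return labels.count("fixation") / len(labels)
```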
In a significant aspect of the present disclosure, subliminal and preconscious user data is captured and measured via one or more body wearables along with brain wave patterns, which helps in exploring deeper facets of user personality, user behavior, mindfulness, logical thinking, awareness, unanticipated tendencies and the like. In one preferred embodiment, the tracked subliminal information persisting at the subconscious level is mapped and correlated with user brain activity and user response to any stimuli in the awakened or conscious state to determine the cognitive perception, affective levels, task performance and resilience of the user.
In one exemplary embodiment, the sensor data is obtained from various sensors disposed on the head mounted display 200, along with subliminal data from one or more other body wearables 300 and brain wave patterns from brain imaging, neuroimaging and neurophysiological sources 400. The data captured in the low-dimensional conscious space and unconscious space of the user is processed and analysed to identify the correlation therebetween by an artificial intelligence (AI) engine 750 embedded within the processing unit 700 of the system 1000.
In accordance with one working embodiment of the present disclosure, the user emotional state is classified from the collected sensor information (physical activity, physiological data, brain wave patterns and attention levels) that is correlated by an AI engine 750, which makes use of:
a) linear classification methods (such as linear regression, linear discriminant analysis, K-nearest neighbour, linear support vector machine, single-layer perceptron network, and simple Bayesian classifier) for feature extraction and dimensionality reduction of the collective information obtained from various sensors (HMD, body wearables, EEG); or
b) non-linear classification methods (k-nearest neighbour, support vector machine (SVM), decision tree, artificial neural network).
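As a minimal sketch of one of the named methods, a k-nearest-neighbour classifier over labelled feature vectors can be written in a few lines. The feature values and labels below are hypothetical stand-ins, not data from the specification.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """k-nearest-neighbour majority vote.
    `train` is a list of (feature_vector, label) pairs."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy (heart rate, skin conductance) features, purely illustrative
train = [((60, 0.2), "calm"), ((62, 0.25), "calm"),
         ((95, 0.8), "stressed"), ((100, 0.9), "stressed")]
print(knn_classify(train, (98, 0.85)))  # -> stressed
```

A production pipeline would normalise each sensor channel before computing distances, since heart rate and skin conductance live on very different scales.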
In the following working embodiment, the AI engine 750 is configured to classify the emotional and mental levels of the user based on extracted features of the collective information received from various sensors. The various signals received from sensors onboard the HMD, eye tracking sensors, body wearables or neuroimaging devices are pre-processed and relevant features are extracted. This is followed by optimization to prevent underfitting or overfitting of the model.
In accordance with one example embodiment, a combination of three machine learning algorithms, namely support vector machine (SVM), K-nearest neighbour (KNN) and decision tree (DT), is selected, as these can aptly handle both classification and regression problems, have few hyperparameters, and make more efficient use of computational resources. Each signal is independently analyzed, focusing on the unique information contained in that signal. This allows for an understanding of the individual contributions of these signals to emotional level recognition. Alternatively, multiple signals may be integrated with the SVM classifier to create a more comprehensive representation of the user emotional state.
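One simple way to combine the per-signal classifier outputs is a majority vote. The sketch below is an assumption about how the three outputs might be fused; the specification does not prescribe this fusion rule, and the labels are hypothetical.

```python
from collections import Counter

def fuse_predictions(*labels):
    """Majority vote across per-classifier emotion labels
    (e.g. one each from SVM, KNN and DT). Ties are broken by
    whichever label was seen first."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical per-signal classifier outputs
svm_out, knn_out, dt_out = "anger", "anger", "surprise"
print(fuse_predictions(svm_out, knn_out, dt_out))  # -> anger
```

Weighted voting (weighting each classifier by its validation accuracy) is a common refinement of this scheme.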
The next embodiment provides an example of the correlation between physiological signals, attention levels, brain wave signals and the corresponding user behavior as influenced by the aforementioned factors and invoked from the user's conscious/subconscious/unconscious state. For example, the user may provide an instant response within a few milliseconds, with focused attention levels, glaring eyes, furrowed brows, tense jaw and lips, and flared nostrils, as observed from the signals from the HMD 200. Besides, physiological signals such as rapid heartbeat, high blood pressure, perspiration and an elevated tone with super-activated beta waves depict user anxiousness associated with anger and surprise. Consequently, the data correlated by the AI engine 750 infers the user response as being more aggressive than assertive. This information is utilized by the personal virtual assistant 500 to recommend a more conscious course of action/user response, as will be discussed later.
In accordance with another significant embodiment, a personal virtual assistant 500 is generated having an impression close to the user profile, such that the user sees the personal virtual assistant 500 as his own manifestation or as someone he is familiar with, and is thus comfortable sharing his core feelings and mental impressions. In accordance with one exemplary embodiment, an AI-based holographic personal virtual assistant 500 may be generated using any 3D holographic generation technology.
Thus, the personal virtual assistant 500 interacts with the AI engine 750 to devise a life-logging tool for the user that can monitor user activity, personal psyche, and associated behavioural and mind patterns across all high- and low-dimensional user states, and assess the classified emotional state. The user 100 is now guided by the personal virtual assistant 500, which is aware of the user's true mental health and emotional state. For example, as in the situation discussed above, the personal virtual assistant 500, having understood the user response correlated with his physiological signals, thought pattern and mood levels, can suggest a corrective action governed more by his consciousness than by subconscious levels, thus helping the user feel motivated, mindful and healed.
In one example embodiment, the system 1000 captures archetypes in user subconscious states that manifest symbolically as archetypal images in dreams, art or other cultural forms, or in the form of facial expressions, brain waves, eye movements, micro facial expressions, vital sign data (heart rate, blood pressure, respiration rate, etc.), muscle movement data, capillary dilation data, skin conductivity data, and the like. This knowledge is extracted and analysed by utilizing algorithms including clustering or regression analyses, low-dimensional embeddings (e.g., Principal Component Analysis (PCA)) and/or incremental and robust PCA generalizations. These summary statistics provide different insights into the raw data.
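As a sketch of the PCA-style low-dimensional embedding mentioned above, the leading principal direction of a set of sensor readings can be found by power iteration on the (implicit) covariance matrix. This is an illustrative stdlib-only implementation under assumed toy data, not the specification's algorithm.

```python
def first_principal_component(data, iters=200):
    """Leading PCA direction of `data` (a list of equal-length rows),
    found by power iteration without materialising the covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]  # center
    v = [1.0] * d
    for _ in range(iters):
        # One covariance application: w = X^T (X v)
        Xv = [sum(X[i][j] * v[j] for j in range(d)) for i in range(n)]
        w = [sum(X[i][j] * Xv[i] for i in range(n)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]  # renormalise each step
    return v

# Toy readings spread along the diagonal; the component is ~(0.71, 0.71)
samples = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2), (4.0, 3.9)]
print(first_principal_component(samples))
```

In practice a library routine (e.g. an SVD) would be used; power iteration is shown only because it makes the idea of "summarising raw data along its dominant direction" explicit.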
In one further aspect of the present disclosure, the personal virtual assistant 500, having obtained user mental and emotional information, interacts with the user 100 to validate whether the emotional state information obtained from the AI engine 750 is a valid depiction of the user's real mental state. This is achieved by the personal virtual assistant 500 engaging in a quasi-human interaction with the user and analysing the user responses against the emotional levels obtained from the AI engine 750 and the user feedback. In accordance with one exemplary embodiment, the personal virtual assistant 500 stores this validated user behavior in response to user action in the memory and database for future reference.
In accordance with one exemplary embodiment, the personal virtual assistant 500 engages the user 100 in a quasi-human interaction, whereby the user speech input is recorded through a microphone of the HMD 200 and processed by the AI engine 750 of the processing unit 700. The personal virtual assistant 500, having access to the processing unit 700, directs the processing unit 700 to interpret the user's spoken words.
In accordance with one exemplary embodiment, the processing unit 700 achieves such interpretation using Natural Language Processing (NLP) techniques, converting the spoken words through Speech-To-Text (STT) and extracting words and phrases for storage in its memory and database. This way, the personal virtual assistant 500 also keeps a personal record of common user actions and associated user behavior against the background of varying user emotional states.
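The extraction of words and phrases from an STT transcript can be sketched as simple tokenisation plus stopword filtering. The stopword list and example sentence below are illustrative assumptions; a real NLP pipeline would use lemmatisation and a full stopword inventory.

```python
import re
from collections import Counter

# Minimal hypothetical stopword set for the sketch
STOPWORDS = {"i", "a", "the", "to", "and", "of", "is", "it", "really"}

def extract_keywords(transcript, top_n=3):
    """Tokenise an STT transcript and return the most frequent
    non-stopword tokens for storage in the memory and database."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    content = [t for t in tokens if t not in STOPWORDS]
    return [word for word, _ in Counter(content).most_common(top_n)]

print(extract_keywords("I feel great today, really great and happy"))
```

The repeated word "great" surfaces first, which is the kind of salient token the assistant would log against the current emotional state.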
Here, the virtual assistant 500 is operable to have all-time access to the user profile stored in the repository 600, which is updated in real time with mapped user action-behavior information. Physical actions and user behavior or movements are processed at the processing unit 700, wherein meaning is derived from the actions by correlating the actions or movements with specific meanings and concepts and with prior correlation behavior maintained in the repository 600. The meanings derived from such correlations are communicated by the virtual assistant 500 to the user 100 in a friendly, soft tone to obtain user validation, approval or feedback on the interpretation.
Based on the above analysis of masked and unmasked user expressions, behavior and actions associated with the conscious or unconscious state, along with user feedback, the personal virtual assistant 500 derives the user's true mental state. Accordingly, the personal virtual assistant 500 directs the processing unit 700 to map and correlate said masked and unmasked information with the user feedback received from the quasi-human interaction to obtain a value indicating the relevance of the specific meaning of words, the user's attitude, or his conceptual perception of stimuli to the user's mental state.
Based on the values generated, the personal virtual assistant 500 stores the highest-correlated category or categories in the user profile to understand similarity and dissimilarity in user behavioural or emotional patterns across the conscious and unconscious states. For example, if the phrase “I feel great today” is communicated by the user among his friends or in any social gathering, the personal virtual assistant 500 determines whether the user's state of mind is truly joyous or whether it is an intentionally faked gesture.
This is achieved by the personal virtual assistant 500 as it gains correlated information about user semblance, including user interactions, emotional levels, voice, pitch, tone, behavior, overall conduct, archetypal images flashing in his thoughts and the like, from the processing unit 700, and gets it validated by the user feedback. It then computes the correlation value on a scale from 0 to 1 between the pieces of gathered information, wherein “0” implies no correlation and “1” indicates a definite, positive correlation.
In the next embodiment, the personal virtual assistant 500 now has the complete capability to understand user behavior correlated with the user's semblance, his actions, and his mental or emotional state. Thus, the personal virtual assistant 500 can derive a correlation value by mapping the semantics of the words spoken by the user to his corresponding behavior at conscious and unconscious levels, without expressly asking for his feedback.
In one embodiment, when the phrase “I feel good today” is compared to other words, phrases, and ideas in an external database, the external server returns a correlated category and correlation value (e.g., “happiness, 0.7” or “contentment, 0.6”). In another example, if the phrase “I feel great today” is correlated against the database, the word “great” in context returns the response “happiness, 0.9; contentment, 0.9.” Thus, the personal virtual assistant 500 analyses user behavior and correlates it with user semblance, such as elevated pitch, happy tone, cheerful facial expressions, confident gait and the like, to understand and validate the user emotional state.
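The database lookup and highest-correlation selection described above can be sketched as follows. The phrase table is a hypothetical stand-in for the external server's response; scores lie on the 0-to-1 correlation scale defined earlier.

```python
# Hypothetical category/score table mirroring the worked example above
PHRASE_DB = {
    "i feel good today": [("happiness", 0.7), ("contentment", 0.6)],
    "i feel great today": [("happiness", 0.9), ("contentment", 0.9)],
}

def best_category(phrase):
    """Return the highest-correlated (category, value) pair for a phrase,
    or None when the phrase is absent from the database."""
    matches = PHRASE_DB.get(phrase.lower().strip())
    return max(matches, key=lambda m: m[1]) if matches else None

print(best_category("I feel good today"))  # -> ('happiness', 0.7)
```

A real implementation would query the external server with semantic similarity rather than exact phrase matching; exact lookup is used here only to keep the selection logic visible.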
This is followed by classification of the validated emotional state. As can be drawn from the example above, the positive correlation depicts an uplifted emotional state, which is classified into one of the four emotional quadrants: high arousal-positive valence, low arousal-positive valence, negative valence-low arousal, and negative valence-high arousal (discussed above). Evidently, the emotional state in the above example is identified with high arousal and positive valence (Quadrant 1). The personal virtual assistant 500 records the user behavior along with the external circumstances, such as a record of people with whom he feels pleasant, or whether he is engaging in any of his favourite hobbies or activities, listening to his favourite music, reading his favourite author and the like, or any other stimuli.
The recorded user behavior and external situational factors are utilized by the personal virtual assistant 500 in situations where the user emotional levels fall in adverse emotional quadrants, such as negative valence-low arousal and/or negative valence-high arousal (as will be discussed in later sections). Now, for example, if the user kick-starts a not-so-favourable day, has a bad argument with someone, gets stuck in heavy traffic, and consequently is reprimanded by his senior, it is quite likely that the user experiences emotional turbulence and stress, which may not be accounted for if he suppresses his emotions in formal social settings. In such situations, the personal virtual assistant 500 does not make any objective assessment based only on his spoken words, such as “Hope it turns out to be a great day”.
The personal virtual assistant 500 determines the user emotional state (based on correlated physical, physiological and brain wave signals) and matches it with the user's semblance and spoken words (taken as user feedback). Since the personal virtual assistant 500 identifies a negative correlation between the user emotional state, semblance and feedback, it analyses the semantics of the user's spoken words and the user behavior corresponding to his semblance, and classifies his emotional state. The user may feel demotivated, lost, uninterested and uncared for if his inner feelings remain unaddressed.
In such situations, the personal virtual assistant 500, analysing the user emotions falling into an unfavourable or negative quadrant, attempts to make the user self-aware of his inner emotional level. In one alternate embodiment, the personal virtual assistant 500 motivates the user with motivational words in the voice and tone of his favourite personality or best confidant. Additionally, the personal virtual assistant 500 can present to the user 100, over his head mounted display 200, recordings from previous learnings when the user felt happy and elated and his emotions fell in the high arousal-positive valence quadrant.
For example, a scene may be presented to the user 100 by the personal virtual assistant 500 that illustrates the user's scheduled get-together with his friends, a meeting with close ones, or any pleasant activity that delights the user and is known to the personal virtual assistant 500 from the database and memory collated about the user 100 (as above).
Next, if even after the aforementioned efforts of the personal virtual assistant 500 the user 100 continues to be in an agitated state or feels stressed, the personal virtual assistant 500 may input the reaction as feedback to the AI engine 750 and explore ways to make the user feel contented and satisfied with the response. For example, if the user, after having an unfavourable day, chooses to play his favourite game and performs below his expected levels, the virtual assistant 500 may probe the physical, physiological and brain wave data processed by the AI engine 750 and correlate the information with the user's action on the field.
The personal virtual assistant 500 may then highlight to the user his unbalanced state of mind, and suggest ways to keep himself focused and attentive during his playing sessions. Thus, the personal virtual assistant 500 is able to understand true hidden meanings, patterns, covered expressions and suppressed emotions, and enables the user to restore his mind-body balance along with his overall well-being.
In one alternate working embodiment, the personal virtual assistant 500 is backed by a smart AI engine 550 that enables the personal virtual assistant 500 to learn a user's semblance, voice pattern, tone, attitude and associated user behavior over time. For example, in one embodiment, the personal virtual assistant 500 uses machine learning technology, including artificial neural networks or fuzzy logic, to develop a model of a user's voice. Speech patterns are developed and stored for identification of the user 100 and for more accurate interpretation of spoken words and phrases. Voice models are stored with a user profile. Upon receiving vocal input from the user, the input is compared to the stored voice models and a corresponding user profile is identified. Alternatively, models are not limited to voice models but are also built from other user interactions and biometrics.
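The "compare input to stored voice models" step can be sketched as a nearest-match search over feature vectors with a similarity threshold. The three-dimensional "voice features" and the threshold value below are purely illustrative assumptions; real voice models would use richer acoustic embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify_user(voice_models, input_features, threshold=0.9):
    """Match an input voice-feature vector against stored per-user
    models; return the best user id above `threshold`, else None."""
    best_user, best_sim = None, threshold
    for user_id, model in voice_models.items():
        sim = cosine_similarity(model, input_features)
        if sim > best_sim:
            best_user, best_sim = user_id, sim
    return best_user

# Toy 3-dim "voice feature" vectors (e.g. pitch, energy, tempo)
models = {"user_100": (0.8, 0.3, 0.5), "guest": (0.1, 0.9, 0.4)}
print(identify_user(models, (0.82, 0.28, 0.52)))  # -> user_100
```

Returning None when no model clears the threshold lets the assistant fall back to asking the speaker to identify themselves instead of guessing.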
Interaction data includes speech, touch, movement, or any other action that is capable of being captured by the sensors and recorded in the memory and database of the personal virtual assistant 500. In one embodiment, the interaction data includes a user talking to the personal virtual assistant 500, such as recording a diary or journal entry, wherein the speech is recorded, stored, analysed for keywords, phrases, and ideas, and is stored in the memory. In another embodiment, the interaction data is stored in the memory and database of the personal virtual assistant 500 that is configured with an always-on listening module to capture time, date, length of interaction, type of interaction, and other quantitative or qualitative indicators necessary for analysis and recall by the personal virtual assistant 500.
As the user 100 continues to interact with the personal virtual assistant 500 and more raw data is collected, the personal virtual assistant 500 continues to develop and update the user profile. As the profile is developed and updated, the personal virtual assistant 500 is operable to analyse and record how the user acts in response to audio and visual stimuli from the outward environment in conversations, responses, or questions. The interaction development provides for increased conversational ability, as the words, phrases, and ideas that the user 100 communicates with are collected, learned, and eventually used by the personal virtual assistant 500 to converse and interact with the user 100 in a manner that he or she finds most understandable and relatable. The personal virtual assistant 500 is thereby operable to develop its own profile based on the user personality profile and respond with a tone, manner, or phrasing dictated by the user personality profile.
In one preferred embodiment, the personal virtual assistant 500 interacts in a free style with the user 100, and not necessarily in a contemporary question-and-answer manner. The personal virtual assistant 500 builds user data over years, draws relevant concepts from previously conversed interactions, and reiterates these elements to the user 100. Further, the personal virtual assistant 500 upgrades and updates the user interaction database based on daily observations such as his changing likes, dislikes, preferences, choices and opinions, and provides recommendations or suggestions to the user 100 based on the altered user profile and preferences. The personal virtual assistant 500, equipped with ML-based analytics and NLP, interacts with the user in a quasi-human manner and acts like his life coach, therapist, friend, and/or entertainer.
The tone of a user's voice and the context within which a word, phrase, or idea is communicated affect the meaning of the communication. Thus, the personal virtual assistant 500, configured with natural language processing (NLP), can analyze tone and context to understand the user's inner feelings and to develop and customize the user profiles from these variables. The user profile is updated according to how the user 100 responds to a situation, by way of his tone, speech, usage of words, and vocabulary in a given context. This contributes to the understanding of the user's state of mind and is used by the personal virtual assistant 500 while evaluating and deriving a correlation value.
Later, the user 100 can interact with the personal virtual assistant 500 to obtain the correlated value for introspective analysis, an improved life experience, and better health or mental status. None of the existing prior art discloses developing an interactive personal virtual assistant 500 through both interaction and an analysis of information gathered in both the conscious and unconscious states, in combination with NLP, recommendations, user profiles, and user sentimental and behavioural patterns. While the prior art has generally focused on developing simple mechanical question-answer systems, no platforms to date have provided the customized and personalized guidance features of the personal virtual assistant 500 enabled by the present invention.
In accordance with an embodiment, the system 1000 comprises a processing unit 700 and a memory unit configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the processing unit includes a processor operably connected with the memory unit. In various embodiments, the processing unit is one of, but not limited to, a general-purpose processor, an application specific integrated circuit (ASIC) and a field-programmable gate array (FPGA).
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer-executable instructions residing on a suitable computer-readable medium. Suitable computer-readable media may include volatile (e.g., RAM) and/or non-volatile (e.g., ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention.
Claims:
We Claim:

1) A method for providing assistance to a user (100) via a personal virtual assistant (500), comprising:
correlating real-time user physical activity with user physiological data and brain wave patterns to obtain user emotional state;
validating the emotional state based on analysis of user behavior, user semblance and user feedback;
classifying the user emotional state into at least one quadrant; and
assisting the user in transforming the emotional state in an event the classified user emotional state falls within a negative quadrant.

2) The method as claimed in claim 1, wherein the real-time user physical activity and real-time user behavior is captured from a plurality of sensors disposed on a head mounted display (200) worn by the user (100).

3) The method as claimed in claim 1, wherein the user physiological data is captured from a plurality of body wearables (300), such as a smart watch, smart ring or any smart jewellery enriched with a plurality of sensors.

4) The method as claimed in claim 1, wherein the user brain wave patterns are recorded using brain imaging devices (400) such as electroencephalogram, functional magnetic resonance imaging (fMRI) and other neuroimaging techniques.

5) The method, as claimed in claim 1, wherein the user emotional state is classified as happy, sad, fear, anger, surprise or neutral.

6) The method, as claimed in claim 1, wherein the correlation between the real-time physical activity, physiological data and brain wave patterns is achieved using an artificial intelligence (AI) engine (750), wherein the AI engine (750) utilizes linear classification method or a non-linear classification method for the classification of user emotional state.

7) The method, as claimed in claim 6, wherein the AI engine (750) selects a combination of support vector machine (SVM), K-nearest neighbour (KNN) and decision tree (DT) approaches for classifying the user emotional state based on the correlated real-time physical activity, physiological data and brain wave patterns.

8) The method, as claimed in claim 1, wherein the personal virtual assistant (500) engages into a quasi- human interaction with the user (100) to obtain the user feedback and validate the correlated information.

9) The method, as claimed in claim 8, wherein the quasi human interaction between the personal virtual assistant (500) and the user (100) is processed at the processing unit (700) using natural language processing technique.

10) The method, as claimed in claim 1, wherein the quadrants comprise high arousal-positive valence, low arousal-positive valence, negative valence-low arousal, and negative valence-high arousal.

11) The method, as claimed in claim 1, wherein the user semblance comprises user interactions, emotional levels, voice, pitch, tone, behavior, overall conduct, and archetypal images flashing in user thoughts.
12) The method, as claimed in claim 1, wherein, in an event of a positive correlation between the user emotional state, user semblance and the user feedback, the user behavior and associated user semblance are recorded with the user emotional state corresponding to the real-time user physical activity.

13) The method, as claimed in claim 1, wherein in an event of negative correlation between the user emotional state and the user feedback, the personal virtual assistant (500) is configured to analyse semantics of spoken words along with user behavior to classify user emotional state.

14) The method, as claimed in claim 1, wherein, for the user emotional state falling within the negative quadrant, the personal virtual assistant (500) is configured to transform the user emotional state by apprising the user of his inner emotional state, motivating the user in a tone and manner of the user's favourite personality, or presenting, over the user-worn head mounted device (200), depictions from previous situations wherein the user emotional state was classified under the high arousal-positive valence quadrant.

15) The method, as claimed in claim 11, wherein the personal virtual assistant (500) utilizes machine learning technology, including artificial neural networks or fuzzy logic or natural language processing technique to develop a model of a user's voice, analyze user tone, speech, vocabulary and words usage in a given context.

Documents

Application Documents

# Name Date
1 202423011819-FORM FOR STARTUP [20-02-2024(online)].pdf 2024-02-20
2 202423011819-FORM FOR SMALL ENTITY(FORM-28) [20-02-2024(online)].pdf 2024-02-20
3 202423011819-FORM 1 [20-02-2024(online)].pdf 2024-02-20
4 202423011819-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-02-2024(online)].pdf 2024-02-20
5 202423011819-DRAWINGS [20-02-2024(online)].pdf 2024-02-20
6 202423011819-COMPLETE SPECIFICATION [20-02-2024(online)].pdf 2024-02-20
7 202423011819-FORM-9 [26-02-2024(online)].pdf 2024-02-26
8 202423011819-ENDORSEMENT BY INVENTORS [26-02-2024(online)].pdf 2024-02-26
9 202423011819-STARTUP [01-03-2024(online)].pdf 2024-03-01
10 202423011819-FORM28 [01-03-2024(online)].pdf 2024-03-01
11 202423011819-FORM 18A [01-03-2024(online)].pdf 2024-03-01
12 Abstact.jpg 2024-03-14
13 202423011819-FER.pdf 2024-05-20
14 202423011819-FER_SER_REPLY [22-07-2024(online)].pdf 2024-07-22
15 202423011819-CLAIMS [22-07-2024(online)].pdf 2024-07-22
16 202423011819-PatentCertificate06-11-2025.pdf 2025-11-06
17 202423011819-IntimationOfGrant06-11-2025.pdf 2025-11-06

Search Strategy

1 search_strategyE_12-04-2024.pdf