Abstract: A method and system for automatically responding to a specified situation. Emotional states are received and compared with stored sets of emotional states to identify the set having the highest overlap percentage (HOP) and the highest cluster value (HCV). If the HOP is not 100, the received states are clustered and corresponding HCVs are computed; otherwise a second state of mind (SOM) value is set to zero. The cluster having the HCV is identified. The difference between 100 and the HOP is determined as a conscious inexperience value (CIV). The second SOM value is the product of the CIV and the HCV of the identified cluster. A first SOM value is zero if the HOP is zero; otherwise the HOP is determined as a subconscious experience value (SEV), and the first SOM value is the product of the SEV and the HCV of the identified set. A final SOM value is computed based on the first and second SOM values and compared with a threshold value, while the SEV is compared with the CIV. An associated control system is configured to respond based on the comparisons.
Claims:
1. A method for identifying and automatically responding to a specified situation, the method comprising:
receiving one or more emotional states of a subject from an emotion recognition subsystem (104);
determining the highest overlap percentage between the received emotional states and one or more sets of stored emotional states using a processing subsystem (128), wherein each of the sets has a corresponding predefined highest cluster value;
clustering the received emotional states into one or more clusters and computing corresponding highest cluster values when the highest overlap percentage is not equal to one hundred, and alternatively setting a second state of mind value as zero when the highest overlap percentage is equal to one hundred;
identifying a cluster from the one or more clusters having the determined highest cluster value;
determining a conscious inexperience value as the difference between one hundred and the highest overlap percentage;
computing the second state of mind value as a product of the conscious inexperience value and the determined highest cluster value of the identified cluster;
setting a first state of mind value as zero when the highest overlap percentage is equal to zero, and alternatively identifying a set from the one or more sets of stored emotional states that has the highest overlap with the received emotional states when the highest overlap percentage is non-zero;
determining a subconscious experience value as the highest overlap percentage;
computing the first state of mind value as a product of the subconscious experience value and the highest cluster value associated with the identified set;
computing a final state of mind value based on the first state of mind value and the second state of mind value;
comparing the final state of mind value with at least one predetermined threshold value corresponding to the specified situation, and comparing the subconscious experience value with the conscious inexperience value; and
automatically configuring an associated control system (136) to execute a response based on the comparison of the final state of mind value with the predetermined threshold value and of the subconscious experience value with the conscious inexperience value.
2. The method as claimed in claim 1, further comprising determining the one or more emotional states based on one or more parameters measured by one or more sensors (106) associated with the subject by the emotion recognition subsystem (104), wherein the one or more sensors (106) measure one or more of a biometric parameter, a contextual parameter, and an environmental setting corresponding to the subject.
3. The method as claimed in claim 1, further comprising:
organizing, by the processing subsystem (128), one or more of the stored emotional states that have the highest probability of occurrence in the specified situation into a single set; and
providing an indication that the subject is experiencing an occurrence of the specified situation by a system (100) configured to identify and respond to the specified situation upon determining that the received emotional states have the highest overlap with the single set.
4. The method as claimed in claim 1, wherein a number and a type of the received emotional states clustered in each cluster is based on a number and type of emotional states that are determined to have the highest probability of occurrence in a specified situation that is to be monitored.
5. The method as claimed in claim 1, wherein computing corresponding cluster values comprises computing a cluster value for each cluster in the one or more clusters based on a sum of individual numerical values associated with each of the received emotional states clustered in the cluster, and wherein the individual numerical values are assigned to each of the received emotional states based on a determined probability of their occurrence in a specified situation that is to be monitored.
6. The method as claimed in claim 1, wherein computing corresponding cluster values comprises computing a cluster value for only those clusters in the one or more clusters which comprise the one or more received emotional states that are determined to have the highest probability of occurrence in the specified situation.
7. The method as claimed in claim 1, wherein automatically configuring the associated control system (136) to execute a response based on the comparison comprises:
automatically executing one or more responses predetermined for the specified situation upon determining that the final state of mind value exceeds the at least one predetermined threshold value and the subconscious experience value exceeds the conscious inexperience value, wherein the predetermined responses are selected from successful responses during a past occurrence of the specified situation; and
automatically executing one or more of a set of preprogrammed responses predefined for the specified situation upon determining that the final state of mind value exceeds the at least one predetermined threshold value and the subconscious experience value is less than the conscious inexperience value.
8. The method as claimed in claim 7, further comprising:
storing one or more of the identified set, the highest cluster value associated with the identified set, and the automatically executed response in an associated memory device (130); and
providing the stored information as feedback to a system (100) configured to identify and respond to the specified situation to train the system (100) to provide a faster response during a future occurrence of the specified situation when compared to a past occurrence of the specified situation.
9. The method as claimed in claim 1, wherein the stored emotional states and the received emotional states comprise one or more of a sad emotional state, an anger emotional state, a fear emotional state, a neutral emotional state, a disgust emotional state, a surprise emotional state, and a happy emotional state, and wherein the specified situation comprises one or more of an emergency situation, a medical illness, an accident, an electric shock, an equipment malfunction, a drug overdose, a fire situation, and a situation affecting law and order of a place.
10. A system (100) for identifying and automatically responding to a specified situation, the system (100) comprising:
a processing subsystem (128) configured to:
receive one or more emotional states of a subject from an emotion recognition subsystem (104);
determine the highest overlap percentage between the received emotional states and one or more sets of stored emotional states, wherein each of the sets has a corresponding predefined highest cluster value;
cluster the received emotional states into one or more clusters and compute corresponding highest cluster values when the highest overlap percentage is not equal to one hundred, and alternatively set a second state of mind value as zero when the highest overlap percentage is equal to one hundred;
identify a cluster from the one or more clusters having the determined highest cluster value;
determine a conscious inexperience value as the difference between one hundred and the highest overlap percentage;
compute the second state of mind value as a product of the conscious inexperience value and the determined highest cluster value of the identified cluster;
set a first state of mind value as zero when the highest overlap percentage is equal to zero, and alternatively identify a set from the one or more sets of stored emotional states that has the highest overlap with the received emotional states when the highest overlap percentage is non-zero;
determine a subconscious experience value as the highest overlap percentage;
compute the first state of mind value as a product of the subconscious experience value and the highest cluster value associated with the identified set;
compute a final state of mind value based on the first state of mind value and the second state of mind value;
compare the final state of mind value with at least one predetermined threshold value corresponding to the specified situation, and compare the subconscious experience value with the conscious inexperience value; and
automatically configure an associated control system (136) to execute a response based on the comparison of the final state of mind value with the predetermined threshold value and of the subconscious experience value with the conscious inexperience value.
11. The system (100) as claimed in claim 10, further comprising an emotion recognition subsystem (104) configured to receive one or more measurements from a plurality of sensors (106) associated with the subject (102), and process the measurements to detect one or more emotional states of the subject (102).
12. The system (100) as claimed in claim 11, wherein the plurality of sensors (106) comprise one or more of an image sensor (108) configured to detect facial expressions, a microphone sensor (110) configured to detect vocal expressions, a gait sensor (112) configured to detect body movements and gestures, an Electrocardiogram (ECG) sensor (114) configured to detect heart activity expressions, an electroencephalogram (EEG) sensor (116) configured to detect brain activity expressions, a thermistor sensor (118) configured to detect fingertip temperature expressions, an electromyography (EMG) sensor (120) configured to detect skeletal muscle activity expressions, an electro-dermal activity (EDA) sensor (122) configured to detect electro-dermal activity expressions, and an EQ-radio sensor (124) configured to wirelessly detect human emotion using radio frequency (RF) signals.
13. The system (100) as claimed in claim 10, further comprising:
a memory (130) configured to store one or more of the identified set, the highest cluster value associated with the identified set, and the automatically executed response as feedback information; and
a machine learning system (134) that is configured to receive the feedback information from the memory (130) to train the system (100) to provide faster identification of the specified situation, selection of the most suitable response, and execution of the most suitable response during a future occurrence of the specified situation as compared to a past occurrence of the specified situation.
14. The system (100) as claimed in claim 10, wherein the system (100) comprises one or more of a driver assistance system in a vehicle, a safety system for operators using heavy machinery, an emergency response system, and a guided learning system, and the associated control system (136) comprises one or more of a vehicle electronic control unit, a motor control unit for heavy machinery, a communications unit, and a control unit that regulates operation of a functional unit in a device.
15. The system (100) as claimed in claim 14, further configured to:
identify one or more of inclement weather conditions, drowsiness of a driver of a vehicle, a medical emergency corresponding to the driver or a passenger in the vehicle, an accident involving the vehicle, and an equipment malfunction as the specified situation based on the comparison; and
automatically configure the associated control system (136) to execute one or more of activating fog lamps or hazard lights, initiating adaptive braking, controlling driver assistance functions in a vehicle, calling a medical facility, calling a designated emergency contact, dialing a designated emergency assistance number, navigating the vehicle to the medical facility, providing one or more of audio, visual, and tactile alerts, safely shutting down malfunctioning equipment, stopping the vehicle at a safe location, and safely controlling the operation of the vehicle or the equipment upon identification of the corresponding specified situation.
Description:
RELATED ART
[0001] The present disclosure generally relates to automatically responding to a situation. More specifically, the present disclosure relates to a system and a method for identifying and responding to an emergency situation based on a perceived state of mind of a subject at a given instant of time.
[0002] Health monitoring systems monitor the health condition of a subject and provide a warning in case of an emergency situation. Currently available health monitoring systems utilize sensors and transducers to gather physical data of the subject to continually monitor the health condition of the subject. Such health monitoring systems generally compare certain baseline physical parameters with the collected physical data. However, a physical response to a stress-inducing task or situation may vary considerably among individuals. Therefore, a baseline-based comparison may not provide accurate identification of an emergency condition.
[0003] Humans display physiological traits that reflect their current emotional state. It can be advantageous to monitor and analyze these physiological traits in addition to the physical parameters to determine a state of mind of the subject more accurately. The state of mind information, in turn, may be utilized to determine or predict an ill-health scenario of the subject.
[0004] US Patent Publication No. 20160358085A1, for example, describes an adaptive system and method for modeling the behavioral and non-behavioral state of individual human subjects according to a set of past observations through sensed signals via a multi-modal pattern recognition algorithm. The system may take into consideration both subjective parameters that are learnt from the user over time and contextual factors that are provided to the system to achieve the model development. In one embodiment, the disclosed system appears to be used to propose a better alternative, distract the person toward a healthier choice, discourage bad habits, and promote good habits. However, the disclosed system does not provide any method to account for the variation in human emotions in response to the same stimulus received during a specific situation, for example, an emergency situation experienced by a subject in real time.
[0005] Therefore, there is a need for an intelligent health monitoring system that can use information about the current emotional state of a person to identify whether the person is in an emergency situation, and that can further provide artificial immunity that learns from past ill-health experiences and expedites the response to the emergency situation.
SUMMARY
[0006] According to an exemplary aspect of the present disclosure, a method for identifying and automatically responding to a specified situation is presented. The method includes receiving one or more emotional states of a subject from an emotion recognition subsystem. The method further includes determining the highest overlap percentage between the received emotional states and one or more sets of stored emotional states using a processing subsystem, wherein each of the sets has a corresponding predefined highest cluster value. The method also includes clustering the received emotional states into one or more clusters and computing corresponding highest cluster values when the highest overlap percentage is not equal to one hundred. Alternatively, the method includes setting a second state of mind value as zero when the highest overlap percentage is equal to one hundred. The method further includes identifying a cluster from the one or more clusters having the determined highest cluster value. Additionally, the method includes determining a conscious inexperience value as the difference between one hundred and the highest overlap percentage. The method includes computing the second state of mind value as a product of the conscious inexperience value and the determined highest cluster value of the identified cluster. The method further includes setting a first state of mind value as zero when the highest overlap percentage is equal to zero. Alternatively, the method includes identifying a set from the one or more sets of stored emotional states that has the highest overlap with the received emotional states when the highest overlap percentage is non-zero. The method also includes determining a subconscious experience value as the highest overlap percentage and computing the first state of mind value as a product of the subconscious experience value and the highest cluster value associated with the identified set. The method includes computing a final state of mind value based on the first state of mind value and the second state of mind value. The method further includes comparing the final state of mind value with at least one predetermined threshold value corresponding to the specified situation, and comparing the subconscious experience value with the conscious inexperience value. The method also includes automatically configuring an associated control system to execute a response based on the comparison of the final state of mind value with the predetermined threshold value and of the subconscious experience value with the conscious inexperience value.
[0007] According to an exemplary aspect of the present disclosure, a system for identifying and automatically responding to a specified situation is presented. The system includes a processing subsystem configured to receive one or more emotional states of a subject from an emotion recognition subsystem. The processing subsystem is further configured to determine the highest overlap percentage between the received emotional states and one or more sets of stored emotional states, wherein each of the sets has a corresponding predefined highest cluster value. The processing subsystem is also configured to cluster the received emotional states into one or more clusters and compute corresponding highest cluster values when the highest overlap percentage is not equal to one hundred. Alternatively, the processing subsystem is configured to set a second state of mind value as zero when the highest overlap percentage is equal to one hundred. Additionally, the processing subsystem is configured to identify a cluster from the one or more clusters having the determined highest cluster value. The processing subsystem is further configured to determine a conscious inexperience value as the difference between one hundred and the highest overlap percentage. The processing subsystem is also configured to compute the second state of mind value as a product of the conscious inexperience value and the determined highest cluster value of the identified cluster. The processing subsystem is configured to set a first state of mind value as zero when the highest overlap percentage is equal to zero. Alternatively, the processing subsystem is configured to identify a set from the one or more sets of stored emotional states that has the highest overlap with the received emotional states when the highest overlap percentage is non-zero. The processing subsystem is also configured to determine a subconscious experience value as the highest overlap percentage. The processing subsystem is configured to compute the first state of mind value as a product of the subconscious experience value and the highest cluster value associated with the identified set. The processing subsystem is further configured to compute a final state of mind value based on the first state of mind value and the second state of mind value. The processing subsystem is configured to compare the final state of mind value with at least one predetermined threshold value corresponding to the specified situation, and to compare the subconscious experience value with the conscious inexperience value. The processing subsystem is further configured to automatically configure an associated control system (136) to execute a response based on the comparison of the final state of mind value with the predetermined threshold value and of the subconscious experience value with the conscious inexperience value.
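For orientation, the claimed computation can be summarized as a short Python sketch. This is a minimal illustration, not the disclosed implementation: it assumes that received and stored emotional states are compared sensor-wise (stored E1 against received E1, and so on) and that a cluster value is a count-weighted sum of the numerical values assigned to the clustered states, both assumptions drawn from the worked example in paragraphs [0040]-[0042]; percentages are applied as fractions, consistent with that example, and all names are illustrative.

```python
from itertools import combinations

def overlap_pct(received, stored):
    """Per-sensor overlap: percentage of positions where received Ei equals stored Ei."""
    matches = sum(1 for r, s in zip(received, stored) if r == s)
    return 100.0 * matches / len(received)

def highest_cluster_value(received, values, size=3):
    """Cluster the distinct received states and return the highest cluster value,
    taken here as a count-weighted sum of the states' numerical values (assumption)."""
    distinct = sorted(set(received))
    return max((sum(values[s] * received.count(s) for s in cluster)
                for cluster in combinations(distinct, size)), default=0.0)

def final_som(received, stored_sets, set_hcvs, values):
    """Final state of mind (SOM) value per the claimed method (sketch)."""
    overlaps = [overlap_pct(received, s) for s in stored_sets]
    hop = max(overlaps)                          # highest overlap percentage

    if hop == 100.0:                             # fully experienced situation
        second_som = 0.0
    else:
        civ = 100.0 - hop                        # conscious inexperience value
        second_som = civ / 100.0 * highest_cluster_value(received, values)

    if hop == 0.0:                               # entirely new situation
        first_som = 0.0
    else:
        sev = hop                                # subconscious experience value
        best = overlaps.index(hop)               # stored set with highest overlap
        first_som = sev / 100.0 * set_hcvs[best]

    return first_som + second_som
```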
[0008] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described earlier, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE FIGURES
[0009] The accompanying drawings, which are incorporated herein and constitute a part of this disclosure, illustrate exemplary embodiments, and together with the description, serve to explain the disclosed principles. The same numbers are used throughout the figures to reference like features and components, wherein:
[0010] FIG. 1 illustrates an exemplary representation of a system for identifying and responding to a specified situation by determining a state of mind of a subject, in accordance with an embodiment of the present disclosure;
[0011] FIGs. 2A and 2B illustrate a flowchart depicting a method for identifying and responding to a specified situation by determining a state of mind of a subject, in accordance with an exemplary embodiment of the present disclosure;
[0012] FIGs. 3-8 illustrate tables listing exemplary parameters and computations of first, second, and final state of mind values based on these parameters, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0013] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that these specific details are only exemplary and not intended to be limiting. Additionally, it may be noted that the systems and/or methods are shown in block diagram form only in order to avoid obscuring the present disclosure. It is to be understood that various omissions and substitutions of equivalents may be made as circumstances may suggest or render expedient to cover various applications or implementations without departing from the spirit or the scope of the present disclosure. Further, it is to be understood that the phraseology and terminology employed herein are for the purpose of clarity of the description and should not be regarded as limiting.
[0014] Furthermore, in the present description, references to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearance of the phrase “in one embodiment” in various places in the specification is not necessarily referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the terms “a” and “an” used herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described, which may be requirements for some embodiments but not for other embodiments.
[0015] The following description presents exemplary systems and methods for monitoring a subject and responding to a specified situation involving the subject. For clarity, the present embodiments are described herein with reference to an exemplary system and method for identifying and automatically responding to an emergency situation based on a perceived state of mind of the subject at a given time, the state of mind being derived from the probability of occurrence of different emotions in the emergency situation. For example, the present system may be incorporated into a vehicle, or into a wearable such as a helmet worn by a bike rider, for identifying and automatically responding to an emergency situation based on a perceived state of mind of the subject. However, by suitably defining the state of mind and associated emotions, the present disclosure can be used to identify and respond to any specified situation.
[0016] For example, the present systems and methods may be used to identify inclement weather conditions based on a perceived state of mind of a driver of a vehicle and automatically implement responsive actions suitable to address the inclement weather conditions. Specifically, upon identifying an emotional state that indicates an expected response to inclement weather, the present systems and methods may configure an associated control system such as an electronic control unit (ECU) within the vehicle to automatically activate fog lamps, hazard lights, and/or initiate adaptive braking.
[0017] In another example, the present systems and methods may be used to identify drowsiness of a heavy machinery operator and initiate operator assistance functions such as providing audio, visual, and/or tactile alerts, and/or initiate a safe shutdown of the heavy machinery. In yet another example, the present systems and methods may be used as a guided learning system, for example, that monitors actions of a novice user of a device or application, identifies erroneous actions, and executes correct actions to train the user. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in detail with reference to FIGs. 1-2.
[0018] FIG. 1 illustrates an exemplary representation of a system 100 that is configured to determine a state of mind (SOM) of a subject and initiate an automatic response to the determined SOM that is indicative of a specified situation. As used herein, the term “SOM” may refer to the cognitive and/or the physical state of the subject. For example, the human cognitive state refers to the state of a person's cognitive processes that defines his/her SOM. This may include, but is not limited to, emotions, mood, and interest, or any other psychological quantities such as fatigue, stress, hunger, appreciation, alertness, frustration, anxiety, excitation, depression, arousal, and drowsiness.
[0019] In one embodiment, the system 100 is configured to identify the SOM of the subject based on the predominant emotional state of the subject. To that end, the system 100 is coupled to an independent emotion recognition subsystem 104 configured to measure and analyze various physiological traits of the subject 102. In an alternative embodiment, however, the emotion recognition subsystem 104 may be an integral part of the system 100, configured to measure and analyze data corresponding to various physiological traits that are indicative of the emotional state of the subject 102. In particular, the emotion recognition subsystem 104 may include a plurality of sensors 106, which may be attached to the body of the subject 102, worn by the subject 102, and/or disposed at one or more locations in the surroundings of the subject 102. The emotion recognition subsystem 104 may further include a processing unit (not shown), such as a Digital Signal Processor (DSP), communicatively coupled to the plurality of sensors 106, which may act as a hub to collect and analyze measured physiological data from the plurality of sensors 106 for further processing and/or transmission.
[0020] Accordingly, in one embodiment, the emotion recognition subsystem 104 may be implemented in a belt worn by the subject 102 (as shown in FIG. 1), a wristwatch worn by the subject 102, a pocket device carried by the subject 102 in his/her clothing, or the like. Alternatively, the emotion recognition subsystem 104 may be implemented in the form of one or more contactless sensors 106, such as a contactless heart rate sensor that may be disposed in the vicinity of the subject 102 without direct contact with the body of the subject 102. In some examples, the emotion recognition subsystem 104, including the associated plurality of sensors 106 and the processing unit, may be battery powered for the purpose of portability and mobility of the subject 102.
[0021] In certain embodiments, the emotion recognition subsystem 104 may utilize a Body Area Network (BAN) for communication and transmission between the sensors 106 and the processing unit. Wireless BANs provide many advantages in health care applications, including communication efficiency and cost-effectiveness. Indeed, using a BAN, physiological signals obtained by body sensors can be effectively processed to obtain reliable and accurate physiological estimations. At the same time, the ultra-low power consumption of the BAN makes the batteries of the sensors 106 last longer. Another important benefit of BANs is their scalability and integration with other network infrastructure; e.g., BANs may interface with Wireless Sensor Networks (WSNs), radio frequency identification (RFID) tags, Bluetooth, Bluetooth Low Energy (BLE), video surveillance systems, wireless personal area networks (WPANs), wireless local area networks (WLANs), the Internet, and/or cellular networks.
[0022] In one exemplary embodiment, as schematically illustrated in FIG. 1, the plurality of sensors 106 may include an image sensor 108, such as a camera, for detecting facial expressions that may be used to determine emotion ‘E1.’ The sensors 106 may further include a microphone sensor 110 for detecting vocal expressions that may be used to determine emotion ‘E2,’ and a gait sensor 112 for detecting body movements and gestures that may be used to determine emotion ‘E3.’ Additionally, the sensors 106 may include an Electrocardiogram (ECG) sensor 114 for detecting heart activity that may be used to determine emotion ‘E4,’ and an electroencephalogram (EEG) sensor 116 for detecting brain activity expressions as emotion ‘E5.’ Further, the sensors 106 may include a thermistor sensor 118 for detecting fingertip temperature that may be used to determine emotion ‘E6,’ and an electromyography (EMG) sensor 120 for detecting skeletal muscle activity that may be used to determine emotion ‘E7.’ The sensors 106 may also include an electro-dermal activity (EDA) sensor 122 for detecting electro-dermal activity that may be used to determine emotion ‘E8,’ and an EQ-radio sensor 124 (as developed by the Massachusetts Institute of Technology) for wirelessly detecting human emotion using radio frequency (RF) signals as emotion ‘E9.’ It may be understood that the number and types of sensors 106 listed here are only exemplary, and the number and type of the sensors 106 may vary based on the type of application and the environmental setting, among various other factors. Further, the placement of the plurality of sensors 106 with respect to the subject 102 for collecting corresponding physiological data, as shown in FIG. 1, is for illustration purposes only and shall not be construed as limiting the present disclosure.
[0023] As previously noted, the emotion recognition subsystem 104 may collect the measured physiological data from the plurality of sensors 106 using wired and/or wireless means, without any limitations. In some examples, the emotion recognition subsystem 104 is configured to extract significant features from the collected physiological data using a metaheuristic optimization algorithm, such as, but not limited to, an ant colony optimization algorithm or artificial bee colony optimization. The metaheuristic optimization algorithm may allow selection of the significant features from the collected physiological data by selecting suitable features and/or removing redundant and unwanted features. For instance, the image sensor 108 may capture multiple images of the face of the subject 102, but among those multiple images, frontal-face images may be the most relevant for determining facial emotions, as a frontal view includes all facial features. Thus, the emotion recognition subsystem 104 may use the metaheuristic optimization algorithm to discard images including, for example, half faces and tilted faces, thereby optimizing the data collected from the plurality of sensors 106. It may be understood that the emotion recognition subsystem 104 may analyze the collected physiological data and determine the emotional states of the subject 102 independently with respect to each sensor from the plurality of sensors 106 at any given instant of time. That is, the emotion recognition subsystem 104 is configured to receive the one or more measurements from the plurality of sensors 106 and process the measurements to detect one or more emotional states of the subject. The techniques for determining an emotional state from the corresponding sensed physiological data are already known in the art and, thus, are not described herein for brevity of the present disclosure.
[0024] In some examples, the emotion recognition subsystem 104 may generate a set of emotional states by using the measured physiological data collected by the plurality of sensors 106 and analyzing the measured physiological data using the processing unit. In the present exemplary embodiment, the different types of emotional states include one or more of a sad state, an angry state, a fearful state, a neutral state, a disgusted state, a surprised state, and a happy state. As noted earlier, the plurality of sensors 106 is configured to independently detect distinct emotional states of the subject 102. It may be understood that the number and types of emotional states that could be detected by the plurality of sensors 106 are not limited to the ones listed herein, and may vary depending on the application and requirements addressed by the present system 100. Additionally, in a presently contemplated embodiment, the emotional states may be broadly classified as external emotions and internal emotions. For example, emotions identified using features determined from the face, voice, and body gestures, which can be sensed externally using the sense organs, such as by viewing the face, listening to the voice, and observing the body poses, respectively, are classified as external emotions. Further, the emotions identified using measured physiological signals, such as ECG, EEG, fingertip temperature, EMG, EDA, and EQ-radio sensor signals, are termed internal emotions.
[0025] In certain embodiments, the system 100 may also be configured to determine contextual information about the subject 102, such as the location of the subject 102. For this purpose, the system 100 may include a location sensor 126 disposed in communication with the emotion recognition subsystem 104. The location sensor 126 may use one or more of the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the Galileo system, the Indian Regional Navigation Satellite System (IRNSS), the Beidou system, triangulation or trilateration of cellular or Wi-Fi networks, and other similar methods for determining current location information about the subject 102 at the given instant of time. In some examples, the location sensor 126 may not be a dedicated sensor, but may be implemented using existing location detection means, for example, a GPS sensor available in a personal cell phone of the subject 102, without any limitations.
[0026] According to certain aspects of the present disclosure, the system 100 employs a psychology-inspired SOM determination to accurately determine the mental state of a particular person. In one example, the system 100 functions as an Artificial Immune System (AIS), which is a computationally intelligent, rule-based machine learning system, inspired by the principles and processes of the vertebrate immune system, that leverages historical data to provide faster responses to identified situations. In such a case, the system 100 may implement suitable algorithms that are modeled after the immune system's characteristics of learning and memory. The SOM determination utilizes the concept of conscious and subconscious minds combined with artificial intelligence to provide an analytical approach for identifying an emotional state indicative of the health of the subject. Typically, when the human mind senses events in the surroundings, the sensed data may be analyzed by the subconscious part of the brain to identify any previous occurrences of similar events and/or associations with similar stimuli. If a similar event has already occurred in the past, the subconscious mind aids the body in implementing a suitable response based on the learning from the previous occurrences. However, in the case of a new event, the conscious part of the brain analyzes the event information and selects an appropriate response. Specifically, the conscious part of the brain simplifies the sensed data via a sequence of events to respond to the event happening in the surroundings. In a similar manner, the system 100 analyzes the determined emotional states to deduce a SOM of the subject 102, and may further predict whether the subject 102 is in an emergency situation based on the deduced SOM. The system 100 may further choose to execute, or configure an associated control system 136 to execute, a predetermined response based on the predicted emergency situation.
[0027] Generally, the decision-making capability of a human being depends on the amount of information available to him or her before taking the decision. The conscious and subconscious parts of the brain play a crucial role in taking any decision. Cognitive neuroscience suggests that all human decisions are based on circumstances. However, a decision for the same situation varies from person to person, as the decision-making capability depends on the individuality and personality of a person. Hence, determining a mathematical model that emulates the complex decision-making capability of the human brain is very challenging. The conscious part of the brain simplifies the data, which is a sequence of events, to take a decision regarding what is happening in the surroundings. Based on past experiences, the human brain simplifies the situation in its own way and tries to take the best possible decision upon encountering a new situation. The SOM determination, as described in the present disclosure, is adapted to emulate the steps executed by the human brain to identify a specified situation experienced by the subject 102. The system 100 determines the SOM of the subject 102 using a generalized mental approach to decision making, without accounting for certain personality and individuality components of the subject 102 that may not be altruistic in nature.
[0028] The present system 100 utilizes external emotions determined from facial, vocal, and body movement and gesture measurements, as well as internal emotions determined from signals such as ECG signals, EEG signals, fingertip temperature, EMG, EDA, and EQ-radio sensor signals, to determine the SOM of the subject 102. The system 100 receives the emotions of the subject 102 as inputs, and processes and simplifies these emotions to determine the current SOM of the subject 102. The system 100 may further generate an appropriate response based on the identified SOM of the subject 102. Certain examples of identifying the SOM and generating an appropriate response will be described with reference to FIGs. 2A and 2B. According to an embodiment of the present disclosure, the system 100 also employs machine learning. A first level of machine learning stores the datasets of previously unknown or different emotions in a repository, and a second level of machine learning provides artificial immunity by storing the best responses and actions taken during past events and situations. For instance, during the first occurrence of an emergency situation, the system 100 will be completely inexperienced, the extent of inexperience being indicated by a calculated conscious inexperience value. In this instance, the system 100 may be configured to contact multiple emergency numbers of friends, relations, and nearby hospitals. Assume that, among all those contacted, one was successful. The system 100 learns from this first experience, such that during the second occurrence of a similar emergency situation, the system 100 will choose the best emergency action, namely the one that proved successful during the previous occurrence. This learning may be indicated using a subconscious experience value, and use of the learning may help in providing a faster and more suitable response to the emergency situation. Furthermore, in certain embodiments, if the subconscious experience value is determined to be greater than the conscious inexperience value, the best response stored as part of the subconscious experience will be executed; otherwise, the system 100 executes the multiple responses.
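A minimal sketch of this selection rule, assuming the subconscious experience value (SEV) and conscious inexperience value (CIV) have already been computed; the function and argument names are illustrative only:

```python
def select_responses(sev, civ, learned_best, preprogrammed):
    """Pick the stored best response when experience dominates; otherwise
    fall back to the full set of preprogrammed responses."""
    if sev > civ:
        return [learned_best]       # replay the response that succeeded before
    return list(preprogrammed)      # inexperienced: attempt every preset response
```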
[0029] In one embodiment, the system 100 employs the proposed SOM determination to learn from past experiences to identify suitable responses to different situations. In particular, the SOM determination entails calculating the SOM value using the detected emotional states, where each emotional state is assigned a numerical value for each identified situation that may be experienced by the subject 102. In one embodiment, the numerical value of an emotional state is defined based at least in part on a relativistic probability of an occurrence of the emotional state in a specific emergency situation, for example, in the event of a vehicle collision. That is, values for each of the emotions are assigned based on their significance in identifying the mental state. Thus, the emotions that may typically be exhibited by the subject 102 during the specific emergency situation are given the highest values.
[0030] For instance, during an ill-health scenario the probability of being happy is low, while the probability of being sad is high, and the probability of being neutral is intermediate. Therefore, the happy state may be assigned a numerical value of ‘0’, the sad state may be assigned a numerical value of ‘10’, and the neutral emotional state may be assigned a numerical value of ‘5’ (i.e., the average of the numerical values of the happy state and the sad state). Since, in one example, the disgusted state and the surprised state may fall between the happy state and the neutral state, these emotional states may be assigned a numerical value of ‘3’. Similarly, since the angry state and the fearful state may fall between the sad state and the neutral state, these emotional states may be assigned a numerical value of ‘9’. FIG. 3 depicts ‘Table 1’, which includes a list of certain emotional states, corresponding symbols, and exemplary numerical values assigned to the emotional states when monitoring the subject 102 for a specified situation, for example, a heart attack.
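The exemplary assignment of this paragraph can be captured as a simple lookup table; the dictionary form and symbol keys are ours, following the symbols used in the tables of FIGs. 3-8:

```python
# Exemplary numerical values for an ill-health scenario (paragraph [0030]).
EMOTION_VALUES = {
    "H": 0,    # happy
    "D": 3,    # disgusted
    "SP": 3,   # surprised
    "N": 5,    # neutral
    "A": 9,    # angry
    "F": 9,    # fearful
    "S": 10,   # sad
}
```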
[0031] In one embodiment, the system 100 includes a processing subsystem 128 for identifying the SOM of the subject 102 from the determined emotional states and executing an appropriate response based on the identified SOM. To that end, the processing subsystem 128 may generally be implemented as a combination of a processor (not shown) and a memory 130 operatively coupled with each other. Herein, the memory may be capable of storing machine executable instructions, and the processor may be capable of executing the stored machine executable instructions for performing tasks such as parsing sets of emotional states, and other functions associated with the SOM determination and executing a corresponding response. Examples of the memory 130 include, but are not limited to, volatile memory devices (e.g., registers, cache, RAM) and/or non-volatile memory devices (e.g., ROM, EEPROM, flash memory, etc.). The processor may be embodied as one or more of various processing devices, such as a multi-core processor, a single core processor, a coprocessor, a microprocessor, a controller, a DSP, and/or processing circuitry with or without an accompanying DSP. Furthermore, the processor may be embodied as various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. Moreover, the processor may be a distributed or a unified system, without any limitations.
[0032] As illustrated in FIG. 1, the processing subsystem 128 is disposed in signal communication with the emotion recognition subsystem 104 in order to receive the set of emotional states (represented using the symbol ‘D’ in FIG. 1) as well as the contextual information (represented using the symbol ‘C’ in FIG. 1), including the location information, about the subject 102 based on the measurements collected by the plurality of sensors 106. In one example, the emotion recognition subsystem 104 and the processing subsystem 128 may be integrated on the same circuit board. In other examples, the processing subsystem 128 may be a local unit which may communicate with the emotion recognition subsystem 104 using short-range wireless communication means, such as, but not limited to, Bluetooth, Infrared, and ZigBee. In some other examples, the processing subsystem 128 may be a cloud-based platform, and in such scenarios, the emotion recognition subsystem 104 may be adapted to transmit the set of identified emotional states to the processing subsystem 128 via a communications link (as represented in FIG. 1). The communications link, for example, may be made available using a mobile phone of the subject 102 utilizing GSM, EDGE, 3G, 4G, 5G or any other suitable communication standards for accessing Internet, or any other gateway device capable of providing Internet access.
[0033] As schematically illustrated in FIG. 1, the processing subsystem 128 may receive the set of emotional states ‘D’ from the emotion recognition subsystem 104. The processing subsystem 128 may verify whether the received set of emotional states ‘D’ matches a previously stored set of emotional states that is representative of emotional states and corresponding SOM and actions determined during past experiences. If the set of received emotional states fails to match an emotional set representative of a past experience, the processing subsystem 128 routes the set ‘D’ to a conscious mind processing unit 132 for further processing. Specifically, the conscious mind processing unit 132 may be configured to act as the “conscious mind” of a human being, determining a SOM of the subject 102 by analyzing the received set of emotional states without the aid of any learning from past experience. To that end, the conscious mind processing unit 132 may further include suitable processing circuitry, storage circuitry, one or more computer programs, and/or a set of stored instructions. In one embodiment, the conscious mind processing unit 132 is communicatively coupled to the processing subsystem 128 to analyze the new set of received emotional states. In an alternative embodiment, however, the conscious mind processing unit 132 may be an integral part of the processing subsystem 128.
[0034] As noted earlier, the emotion recognition subsystem 104 may identify a set of emotional states corresponding to the subject 102 by analyzing the physiological data measured by the plurality of sensors 106. The processing subsystem 128 receives and processes the set of emotional states to determine first and second SOM values, which in turn are used to determine a final SOM value. Specifically, the processing subsystem 128 determines an overlap percentage and a difference percentage based on a match and a difference between the received emotional states and one or more sets of stored emotional states, respectively. Each of these sets of stored emotional states may have a predetermined highest cluster value associated with it. In one embodiment, the processing subsystem 128 computes the final SOM value as a linear summation of the first SOM value and the second SOM value. The first SOM value may correspond to a first product of the overlap percentage and the highest cluster value associated with the stored set of emotional states having the highest overlap with the received emotional states. The second SOM value may correspond to a second product of the difference percentage (100 − highest overlap percentage) and the highest cluster value computed for a cluster of received emotions that have not been fully encountered previously and are consequently clustered using the conscious mind processing unit 132.
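Expressed compactly (notation ours, with percentages applied as fractions, consistent with the worked example in paragraph [0042]):

\[
\mathrm{SOM}_{\mathrm{final}} \;=\; \underbrace{\tfrac{\mathrm{SEV}}{100}\,\mathrm{HCV}_{\mathrm{set}}}_{\text{first SOM}} \;+\; \underbrace{\tfrac{\mathrm{CIV}}{100}\,\mathrm{HCV}_{\mathrm{cluster}}}_{\text{second SOM}}, \qquad \mathrm{CIV} = 100 - \mathrm{SEV},
\]

where SEV is the highest overlap percentage, HCV_set is the highest cluster value of the best-matching stored set, and HCV_cluster is the highest cluster value computed over the clustered received states.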
[0035] In a first scenario, where each of the received emotional states is present in at least one of the sets of stored emotional states, the processing subsystem 128 configures the subconscious mind processing unit 134 to calculate a percentage overlap of the received emotional states with each of the sets of stored emotional states. Additionally, the subconscious mind processing unit 134 is configured to select the set having the highest overlap percentage and identify the highest cluster value associated with the selected set. The subconscious mind processing unit 134 is further configured to compute the first SOM value as a first product of the highest overlap percentage and the highest cluster value associated with the selected set. When all received emotional states are found in a single set of stored emotional states, the overlap percentage is determined to be 100 percent, or 1, whereas the difference percentage (100 − highest overlap percentage) is determined to be zero. In such a scenario, the first SOM value that corresponds to the first product will be equal to the highest cluster value associated with the selected single set. Furthermore, in such a scenario, the second SOM value corresponds to zero.
[0036] In a second scenario, where none of the received emotional states is present in any of the sets of stored emotional states, the processing subsystem 128 configures the conscious mind processing unit 132 to cluster the newly encountered emotional states into one or more clusters, each cluster having a predetermined number of emotional states. According to aspects of the present disclosure, the system 100 employs the clustering of emotional states in order to circumvent any ambiguity derived from the detected set of emotional states, for example, when an emotional state identified from a facial expression does not match an emotional state identified from voice input or from ECG measurements. The conscious mind processing unit 132 is configured to calculate cluster values for the one or more clusters based on the numerical values of the emotional states therein. Subsequently, the conscious mind processing unit 132 is configured to identify the highest cluster value. As the overlap percentage will be zero and the difference percentage will be 100 percent, or 1, in the second scenario, the conscious mind processing unit 132 determines the first SOM value as zero and the second SOM value as a product of the highest cluster value and 1. Accordingly, in the second scenario, the final SOM value, which corresponds to the sum of the first and second SOM values, will be equal to the highest cluster value.
[0037] In a third scenario, at least some of the received emotional states may have been previously encountered by the system 100, and hence are present in at least one of the sets of stored emotional states, whereas the remaining emotional states may be unknown or different (for example, when a stored E1 is not the same as a received E1). In such a scenario, the processing subsystem 128 configures the subconscious mind processing unit 134 to compute the first SOM value as described with reference to the first scenario. That is, the subconscious mind processing unit 134 is configured to identify the set of stored emotional states that has the highest overlap with those received emotional states that are present in the sets of stored emotional states, and the highest cluster value associated with the identified set. Subsequently, the subconscious mind processing unit 134 is configured to compute the first SOM value as a first product of the highest overlap percentage and the highest cluster value associated with the identified set.
[0038] Additionally, the processing subsystem 128 configures the conscious mind processing unit 132 to compute the second SOM value as described with reference to the second scenario. That is, the conscious mind processing unit 132 is configured to cluster the received emotional states when these states include previously unencountered emotional states, and to compute corresponding cluster values based on the emotional states included therein. The conscious mind processing unit 132 identifies the cluster with the highest cluster value and computes the second SOM value as a second product of the difference percentage (100 − highest overlap percentage) and the computed highest cluster value. Finally, the processing subsystem 128 is configured to compute the final SOM value as a sum of the first and second SOM values. A sample determination of the first, second, and final SOM values is described using an exemplary scenario in the subsequent paragraphs.
[0039] In an exemplary implementation, the processing subsystem 128 receives the detected emotional states E1 to E9 from the different sensors 108-124. If the received emotional states include at least two distinct emotional states, the processing subsystem 128 configures the conscious mind processing unit 132 to cluster two or more distinct emotional states in each cluster. In particular, the conscious mind processing unit 132 may be configured to cluster two or more emotional states based on the number and type of emotional states defined as being indicative of a specified emergency situation or any other situation of interest.
[0040] When considering nine (9) detected emotional states for identifying an emergency situation that is defined using three distinct emotional states, the emotional states may be clustered into groups of three, providing a maximum of 9C3 combinations if all nine detected emotions were distinct. The value of each cluster depends on the numerical values of the emotional states present in that cluster. Consider a case where the emotion-recognition results for a given instant of time are as listed in ‘Table 2’ of FIG. 4. The number of combinations depends on the number of distinct emotional states. In the example shown in Table 2, ‘S’ appears 2 times, ‘F’ appears 4 times, ‘SP’ appears 1 time, ‘A’ appears 1 time, and ‘H’ appears 1 time. Thus, five distinct emotional states were identified out of the seven significant emotional states listed in Table 1. Hence, the number of clusters will be 5C3, that is, 10, out of a maximum of 7C3, that is, 35, clusters, since Table 1 lists seven distinct emotional states. FIG. 5 depicts Table 3, which shows the 10 possible clusters and the calculation of the corresponding cluster values. As evident from the depictions of FIG. 5, cluster 2 has the highest cluster value. Therefore, when considering the first or second scenario, where either all or none of the received emotional states are previously known, the final SOM value will be equal to the highest cluster value, that is, ‘65.’ Subsequently, the processing subsystem 128 is configured to compare the final SOM value with a predetermined threshold value to identify whether an emergency situation has occurred and to execute a suitable response. The processing subsystem 128 stores the selected set of received emotional states (S, F, A) and the corresponding response in the associated memory 130.
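Under the assumption (consistent with the value ‘65’ recited above) that a cluster value sums the numerical value of each clustered state multiplied by the number of times that state was detected, the Table 2/Table 3 computation can be reproduced as follows; the list of detections is reconstructed from the counts recited above:

```python
from itertools import combinations

VALUES = {"S": 10, "F": 9, "A": 9, "SP": 3, "H": 0}
# Nine detections with the counts recited above: S x2, F x4, SP x1, A x1, H x1.
received = ["S", "S", "F", "F", "F", "F", "SP", "A", "H"]

def cluster_value(cluster):
    return sum(VALUES[s] * received.count(s) for s in cluster)

clusters = list(combinations(sorted(set(received)), 3))   # 5C3 = 10 clusters
best = max(clusters, key=cluster_value)
print(best, cluster_value(best))   # the {S, F, A} cluster: 2*10 + 4*9 + 1*9 = 65
```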
[0041] Another exemplary set of emotional states ‘De’ is listed in ‘Table 4’ of FIG. 6. Upon comparing the set ‘De’ with the set ‘D’ of Table 2, it is evident that the stored set ‘D’ is partially equivalent to the set ‘De.’ This corresponds to the previously noted third scenario, where some emotional states match previously stored states and some do not. There are four distinct emotions present in the set ‘De’: Sad (S), Fear (F), Angry (A), and Happy (H). The combinations, taken three emotional states at a time, are shown as corresponding clusters in Table 5 of FIG. 7. Furthermore, the cluster values calculated based on the numerical values associated with the corresponding emotional states are shown in Table 6 of FIG. 8. As shown in Table 6, cluster 1 has the highest cluster value.
[0042] The processing subsystem 128 may be configured to calculate an overlap percentage, which is the percentage of the incoming received emotions shown in Table 4 that match a set of already stored emotions, such as those shown in Table 2. The highest overlap (stored E1 = received E1 = S and stored E2 = received E2 = F) is between the set De and the stored set D shown in Table 2. Therefore, with two matching emotional states out of a total of nine emotional states, the highest overlap percentage for the emotions listed in Table 4 may be equal to a computed value (2/9 * 100), that is, 22.22 percent. The first SOM value thus corresponds to ‘14.44,’ which is the first product of the overlap percentage of 22.22 percent and the highest cluster value ‘65’ associated with cluster 2 of Table 3, the cluster having the highest number of stored emotional states that match the received emotional states. Here, the difference percentage may be determined to be 77.78 percent. Accordingly, a second SOM value of ‘43.55’ may be computed as a product of the difference percentage of 77.78 percent and the highest cluster value of ‘56’ computed for the received emotional states that are clustered by the conscious mind processing unit 132 and are shown in Table 6 of FIG. 8. Furthermore, a final SOM value may be computed to be ‘57.99’ based on a summation of the first and second SOM values.
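Using the `final_som` sketch introduced earlier, this example reproduces as follows. Note that computing with the exact fraction 2/9 yields a final value of 58.00; the specification's 14.44 + 43.55 = 57.99 reflects truncation of the intermediate percentages, a difference confined to the last decimal place.

```python
hop = 2 / 9 * 100                     # ~22.22: two of nine states match set D
first_som = hop / 100 * 65            # ~14.44: stored-set HCV is 65 (Table 3)
second_som = (100 - hop) / 100 * 56   # ~43.56: new-cluster HCV is 56 (Table 6)
final = first_som + second_som        # 58.00 (57.99 with truncated percentages)
```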
[0043] In certain embodiments, to expedite the computation of the final SOM value, the processing subsystem 128 is configured to ignore certain clusters that do not comprise any of the emotional states that define a designated emergency situation. Instead, the processing subsystem 128 is configured to compute an associated cluster value for a selected cluster only if the set of emotional states therein comprises at least one of the emotional states that define a designated emergency situation that is to be monitored. For example, in an emergency situation such as a heart attack, the emotional states expected to be exhibited by the subject 102 may include a sad emotional state (S), an anger emotional state (A), and a fear emotional state (F). Here, the SOM for such an emergency situation will be defined as a sum of these emotional states, that is, SOM = (F+A+S). Accordingly, the processing subsystem 128 is configured to compute an associated cluster value for a selected cluster only if the set of emotional states therein comprises the sad emotional state, the anger emotional state, and/or the fear emotional state. Further, if all three of the sad, anger, and fear emotional states are detected, with or without other emotional states, the processing subsystem 128 is configured to determine the cluster value only for the cluster having all three of the sad emotional state, the anger emotional state, and the fear emotional state, i.e., the cluster (S, A, F). Accordingly, the SOM value may be determined to correspond to the highest cluster value associated with the cluster (S, A, F).
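This pruning rule may be sketched as follows; the set name and helper function are illustrative assumptions.

```python
EMERGENCY_STATES = {"S", "A", "F"}   # states defining a heart-attack SOM

def relevant_clusters(clusters):
    """clusters: iterable of tuples of emotional-state labels. Return only
    the clusters whose values need to be computed under the pruning rule."""
    full = [c for c in clusters if EMERGENCY_STATES.issubset(c)]
    if full:                 # all of S, A, and F were detected together
        return full          # only the (S, A, F) cluster is scored
    # otherwise, keep clusters containing at least one defining state
    return [c for c in clusters if EMERGENCY_STATES & set(c)]
```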
[0044] In implementations where a heart attack emergency is to be monitored, the sad, angry, and fear emotional states may be assigned the highest numerical values, and thus, by extension, the cluster having these three emotional states would have the highest cluster value. Similarly, for monitoring other types of emergencies or situations of interest, the emotional states expected during the respective situations may be assigned the highest numerical values, thus aiding in identifying the respective emergency or situation of interest by identifying the cluster having the highest cluster value. It may be understood that in such scenarios it becomes unnecessary to calculate the cluster values for the other clusters, as the processing subsystem 128 is only concerned with the cluster exhibiting the highest cluster value. The highest cluster value may then be used to determine the final SOM value, as described previously.
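One possible shape for such per-situation weighting is sketched below; every number here is a hypothetical placeholder, since the actual values are predetermined and stored for each monitored situation.

```python
# States expected during a monitored situation receive the largest values,
# so the cluster containing them naturally carries the highest cluster value.
SITUATION_WEIGHTS = {
    "heart_attack": {"S": 25, "F": 22, "A": 18, "H": 2, "SP": 5},
    "drowsiness":   {"S": 10, "F": 4,  "A": 3,  "H": 2, "SP": 2},
}
```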
[0045] In certain embodiments, the processing subsystem 128 is configured to compare the final SOM value with a predetermined threshold value indicative of a specified situation being monitored. The threshold value acts as a benchmark value for the SOM, which indicates an ill-health scenario, a physical emergency, or any other situation of interest for the subject 102, and is thereby used to avoid possible false triggering. In one example, the threshold value represents an emergency state of mind (SOM-E) value, which is compared with the final SOM value to indicate an emergency situation for the subject 102 when the final SOM value is equal to or greater than the threshold value. In other words, if the final SOM value exceeds the predetermined threshold value, then the state of mind of the concerned subject is considered the emergency state of mind. It may be understood that the threshold value is fixed only for the emergency situations, and if the final SOM value is less than the fixed threshold it may be concluded that the person is not in an emergency situation, and hence no action, or a different action, may be required. In the present embodiment, the threshold value is predetermined using a number of simulations, expert input, and/or real-life data. In some examples, the threshold value that is indicative of a specific situation may further be customized for each individual subject, such as the subject 102, based on real-life data for the subject 102 collected over a period of time.
[0046] If an emergency situation is identified based on the comparison, the processing subsystem 128 may be configured to execute a predefined response (represented using the symbol ‘R’ in FIG. 1). Hereinafter, the terms “response” and “action” are used interchangeably without any limitation. Referring back to the exemplary depictions of FIG. 1, upon determination of the emergency situation, such as a heart failure of the subject 102 or the like, the processing subsystem 128 may be configured to generate an appropriate predefined response. The predefined response may include, but is not limited to, making calls to an emergency ambulance service, a nearby hospital, and/or one or more predefined family contacts. Similarly, in case the subject 102 is travelling in an autonomous or a semi-autonomous vehicle, the processing subsystem 128 may configure an associated control system 136, such as the vehicle ECU, to stop the vehicle or automatically navigate the vehicle to the nearest medical facility as per the corresponding predefined response.
[0047] Furthermore, in certain embodiments, the processing subsystem 128 automatically executes the response by comparing the subconscious experience value with the conscious inexperience value. Exemplary calculations of the subconscious experience value and the conscious inexperience value have been presented in the preceding paragraphs. According to aspects of the present disclosure, the experience of the system 100 in responding to a situation is measured using the subconscious experience value, and the inexperience for a situation is measured using the conscious inexperience value. If the experience value for the situation is more than the inexperience value, the system 100 follows the best response executed by the system 100 during a past occurrence of that experience. Alternatively, when encountering a new situation for which the system 100 has maximum inexperience, the system 100 executes a predefined sequence of responses for responding to the new situation.
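The resulting response-selection logic may be sketched as follows; all names are assumptions introduced for illustration.

```python
def select_response(final_som, threshold, sev, civ,
                    best_past_response, predefined_responses):
    """Declare an emergency when the final SOM meets the threshold, then
    choose between the learned response and the predefined sequence."""
    if final_som < threshold:
        return None                     # non-critical: no emergency response
    if sev > civ:                       # experienced: replay the most
        return [best_past_response]     # successful past response
    return list(predefined_responses)   # inexperienced: predefined sequence
```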
[0048] In one embodiment, the processing subsystem 128 may further take contextual information ‘C’ about the subject 102 into consideration for selecting the appropriate response to the emergency situation. For this purpose, the subconscious mind processing unit 134 may be configured to determine contextual information, such as an environment setting, for example, from the location information of the subject 102. For instance, if a location sensor 126 determines the subject 102 to be moving at a fast pace, it may be concluded that the subject 102 is driving or is located in a moving vehicle. If an emergency situation is identified, the processing subsystem 128 may utilize the contextual information of the subject 102 for selecting an appropriate response. In the present example, the processing subsystem 128 may send a signal to an associated vehicle control unit to stop the vehicle immediately, in addition to executing other predetermined responses, such as calling the emergency numbers.
[0049] It may be understood that the executed response may vary from person to person and from situation to situation. For example, if the system 100 determines the subject 102 to have fallen asleep while driving, the system 100 can alert the subject 102 or notify appropriate people, such as co-passengers, so that the subject 102 can take necessary action. In addition to issuing an alarm, such as a loud beep, to alert the subject 102, the system 100 can issue pre-recorded audio messages for specific health conditions as smart warnings, advice, or reminders. If a serious or dangerous health condition is identified, the system 100 may issue a smart audio warning to the subject 102 and automatically access the subject’s mobile phone to contact a nearby medical center, doctor, or family member through an available communications network. The system 100 may be programmed such that a call to an emergency number, like ‘911,’ is immediately made and the subject's name and medical history are provided therewith. At the same time, the system 100 may also provide the 911 operator with the subject's location by sending the location coordinates as determined by the location sensor 126. In some examples, the system 100 can also receive instructions or other information from the healthcare center, doctor, or family member to monitor and aid the subject 102, while recording the necessary information for reporting, storage, and later retrieval.
[0050] In the present disclosure, the processing subsystem 128 may also provide machine learning capability to the present system 100. In particular, the processing subsystem 128 may implement two levels of machine learning in the present system 100. The first level of machine learning stores the datasets of emotional states and the corresponding highest cluster values. The second level of machine learning adds artificial immunity by storing the most successful previously executed responses for the corresponding situations.
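One possible shape for these two levels is sketched below; the data-structure names are assumptions, and the entries shown are illustrative.

```python
# Level 1: stored sets of emotional states -> predefined highest cluster value.
level1_experience = {
    frozenset({"S", "F", "A"}): 65,
}

# Level 2 ("artificial immunity"): situation -> most successful past response.
level2_immunity = {
    frozenset({"S", "F", "A"}): "call_emergency_number_and_stop_vehicle",
}
```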
[0051] To that end, the subconscious mind processing unit 134 may be implemented as a neural network using one or more processing devices and associated firmware, and/or software, which implements machine learning algorithms to add artificial immunity to the present system 100. The artificial immunity is achieved by feeding the action taken during a past emergency situation back to the processing subsystem 128 to train the neural network to respond more expeditiously and efficiently in the event of a subsequent occurrence of a similar emergency situation. In other words, the neural network trains the present SOM determination methodology to learn about a specific person over a period of time using his/her past experiences, similar to understanding the usual emotional behavior of a person. That is, the neural network applies machine learning to the increasing number of sets of stored emotional states (equivalent to more experience about the mental state of a person) stored over a period of time to train the SOM determination methodology to identify the SOM of the subject 102 more quickly and accurately. An exemplary method for computing the SOM values and responding to a specified situation is described in detail with reference to FIGs. 2A and 2B.
[0052] FIGs. 2A and 2B illustrate an exemplary method for identifying a SOM of a subject and responding to a specified situation indicated by the identified SOM of the subject. In FIGs. 2A and 2B, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represent operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that are performed in the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations. The order in which steps of the exemplary method are described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein. For clarity, the present embodiments are described with reference to identifying and responding to an emergency situation based on an identified SOM of the subject using the system 100 of FIG. 1. However, embodiments of the present method may similarly be used to identify and respond to any situation during which the subject may be assumed to exhibit certain specific emotional states.
[0053] At step 202, the processing subsystem 128 is configured to receive one or more emotional states of the subject that are detected based on one or more measurements by the plurality of sensors 106. At step 204, the processing subsystem 128 is configured to compute a highest overlap percentage of the received emotional states with one or more stored sets of emotional states. Each of the stored sets of emotional states is indicative of a group of emotional states associated with the occurrence of a specified situation, for example, a heart attack, drowsiness, a fire, or an accident. Each of these stored sets of emotional states may have a predetermined cluster value that depends upon the numerical values of the emotional states included therein. Additionally, each of the stored sets of emotional states may be associated with one or more responses executed during a past occurrence of the specified situation that it represents. Therefore, a high overlap percentage may indicate that the system 100 has had previous experience of the specified situation.
[0054] Accordingly, at step 206, it is determined if the highest overlap percentage is equal to hundred. If the highest overlap percentage is determined to be equal to hundred, at step 208, a second SOM value is set to a zero value, and subsequently, the control moves to step 210 (denoted by ‘A’ in FIGs. 2A and 2B). Alternatively, if the highest overlap percentage is determined to be less than hundred, at step 212, the received emotional states are clustered into one or more clusters and a cluster value is computed for each of the clusters based on the numerical values of the emotional states therein. As previously noted, the number of emotional states clustered into a single cluster may depend upon the number of emotional states that are defined for the specific situation that is to be monitored. At step 214, the cluster having the highest cluster value is identified. Further, at step 216, a conscious inexperience value is computed by calculating the difference between hundred and the highest overlap percentage. Subsequently, at step 218, the second SOM value is computed as a product of the conscious inexperience value and the highest cluster value of the identified cluster. At step 220, it is determined if the highest overlap percentage is equal to zero. If the highest overlap percentage is equal to zero, at step 222, a first SOM value is set to zero, and subsequently the control moves to step 228 (denoted by ‘B’ in FIGs. 2A and 2B). However, if the highest overlap percentage is not equal to zero, the control moves to step 210. Specifically, step 210 is executed when the highest overlap percentage is non-zero, that is, greater than zero and up to hundred percent, inclusive.
[0055] At step 210, a stored set of emotional states having the highest overlap percentage is identified. Additionally, at step 224, the highest overlap percentage is designated as the subconscious experience value. Further, at step 226, the first SOM value is computed as a product of the subconscious experience value and the highest cluster value associated with the set identified at step 210. Subsequently, at step 228, the final SOM value is computed as a sum of the first and second SOM values. At step 230, the final SOM value is compared with a designated threshold value to determine if a specified situation, such as an emergency situation, has occurred. In one embodiment, if the final SOM value is less than the threshold value, the system 100 may identify the situation as a non-critical situation and perform no responsive action. Alternatively, the system 100 may perform a default action, such as conducting additional checks or issuing audio and/or visual alerts to the subject 102. However, if the final SOM value is equal to or greater than the threshold value, the specified or emergency situation is identified. Accordingly, at step 232, the system 100 further determines if the subconscious experience value is greater than the conscious inexperience value. If the subconscious experience value is greater than the conscious inexperience value, at step 234, the system 100 executes the most successful response that was executed during a previous occurrence of the emergency situation. However, if the subconscious experience value is less than or equal to the conscious inexperience value, at step 236, the system 100 executes a predefined set of suitable responses for the identified situation.
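Steps 202 through 236 may be combined into a single compact sketch, shown below under the following assumptions: `stored_sets` maps each stored set of emotional states (as a frozenset of labels) to its predefined highest cluster value, `state_values` assigns a numerical value to each state, and all function and parameter names are illustrative.

```python
from itertools import combinations

def respond_to_situation(received, stored_sets, state_values, threshold,
                         best_past_response, predefined_responses,
                         group_size=3):
    """Compact sketch of steps 202-236 of FIGs. 2A and 2B."""
    received = list(received)

    # Steps 202-204: highest overlap percentage against the stored sets.
    def overlap(stored):
        return sum(r in stored for r in received) / len(received) * 100
    hop = max((overlap(s) for s in stored_sets), default=0.0)
    sev, civ = hop, 100 - hop            # steps 224 and 216

    # Steps 206-218: second SOM value from newly clustered received states.
    if hop == 100:
        second_som = 0.0                 # step 208
    else:
        clusters = {c: sum(state_values[s] for s in c)
                    for c in combinations(sorted(set(received)), group_size)}
        second_som = civ / 100 * max(clusters.values(), default=0)

    # Steps 220-226: first SOM value from the best-matching stored set.
    if hop == 0:
        first_som = 0.0                  # step 222
    else:
        best_set = max(stored_sets, key=overlap)          # step 210
        first_som = sev / 100 * stored_sets[best_set]     # step 226

    # Steps 228-236: final SOM, threshold test, and response selection.
    final_som = first_som + second_som
    if final_som < threshold:
        return None                      # non-critical: no emergency response
    return best_past_response if sev > civ else predefined_responses
```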
[0056] Embodiments of the present system 100 can be implemented, for example, in a home, a nursing home, a hospital, a vehicle, an office, or an industrial facility to monitor, identify, and automatically respond to specified situations and conditions experienced by a subject. It may be understood that the system 100 may be applicable for monitoring and responding to emergency health conditions of the subject 102, such as the subject 102 having a fit, an asthma attack, or the like. The system 100 may also be implemented in public places like malls, auditoriums, and movie theaters, where a set of contactless sensors may provide the measurement data corresponding to multiple persons in the public space. For example, a camera may record facial expressions to be fed to the present system 100 for identifying the SOM of the person(s) in view of the camera. In case a person is determined to be experiencing an emergency situation, the system 100 may generate an appropriate response to alleviate the emergency.
[0057] Similarly, the system 100 may be implemented to automatically respond during an accident involving a vehicle of the subject 102. It may be understood that the accident situation may include, but is not limited to, the vehicle being hit by another vehicle, the vehicle falling into a river, or a failure of the vehicle’s braking system. The system 100 may be in communication with the control unit of the vehicle and/or a cloud server, for example, via an associated mobile phone. The plurality of sensors 106 required to capture the inputs may be integrated into the vehicle, e.g., in the vehicle’s dashboard, driver’s seat, steering column, etc. The system 100 identifies an emergency situation experienced by the driver, and further takes a decision to stop the vehicle at that instant and call the emergency numbers, to autonomously drive the vehicle to the nearest hospital, or the like. In some examples, the system 100 may also be configured to display suitable alerts, helpful hints, or simply the sensor data stream on a display device to continuously show the health conditions of the driver.
[0058] The present method, as implemented by the present system 100, operates like a human mind, which consists of conscious and subconscious parts. Typically, sensory information is processed, analyzed, and simplified by both parts of the brain to arrive at a sensible decision. The sensible decision is different for different persons and depends on a person’s personality and individuality. In contrast, the proposed SOM determination method eliminates the dependency on personality components when taking decisions about responding to ill health of a subject. The present system 100, utilizing the SOM information, can alert a patient when medically significant events occur, in order to enable adequate and timely intervention and alerts to seek treatment of disorders and their symptoms.
[0059] In the present system 100, human intelligence is modeled using the proposed state of mind algorithm such that the system 100 is able to provide artificial immunity, which is achieved by using the second level of machine learning. Here, the decisions regarding the best and most successful actions taken during prior emergency situations are fed back so that if the same emergency situation occurs again in the future, the system 100 will not search for the best actions again, but instead uses the best or most successful action taken during a previous instance. The use of machine learning allows the present system 100 to provide faster and more accurate decision-making ability in emergency situations, thus enabling efficient real-time implementation.
[0060] It may be contemplated by a person skilled in the art that the information obtained from the SOM about the health condition of the subject 102 may generally be more reliable, when determining if the subject 102 is in an emergency situation, than simply using physical readings of the subject 102. For example, different users may have baseline resting heart rates that vary considerably. When determining a subject’s health condition based on heart rate, knowledge of the resting heart rate may be crucial. For example, a heart rate of ‘90’ beats per minute for one subject may be indicative of excitement, while for another subject, such a rate may be at or near the baseline resting rate. Thus, relying solely on physical parameters, such as the heart rate in this example, may wrongly depict the condition of the subject 102. However, in the same example, determining the SOM of the subject 102, which takes into consideration a holistic view of the emotions of the subject 102, could help to personalize the system 100 for the concerned human subject.
[0061] Furthermore, the system 100 of the present disclosure uses intelligence for selecting and clustering the emotions in order to determine the state of mind of a person, and further adds artificial immunity by identifying and responding to an emergency situation of the subject based on learning from previous experiences. The proposed system 100 is a continuous system and does not use any global threshold for the emotions, since the threshold varies from person to person, as different people behave differently in the same situation according to their personality. Further, the system 100 trains itself on the behavior of a specific person and customizes the output for that person as he/she uses it over a period of time, which is similar to understanding the behavior of a person in any human relationship.
[0062] Additionally, the system 100 selects the action to be taken depending on the application in which the proposed system is being used. That is, the system 100 allows the response to vary adaptively based on the current environment setting of the person. Environment setting information may include factors that relate to input information associated with people, places, motion, geographical location, etc. For instance, when used in a hospital, the system 100 will alert a doctor with the appropriate skills and/or physical location upon identifying an emergency situation. When the system 100 is being used as a “driver protection system,” the system 100 may configure an associated control system, such as a vehicle electronic control unit, to stop the car and call an emergency number during an emergency. The actions may be predetermined and stored in the system 100 according to the application for which the system 100 is intended. In some examples, the selected response may further be optimized and interpolated for other situations using artificial intelligence. All of these features make the system 100 more accurate and efficient for identifying and responding to situations in real-time across multiple application areas.
[0063] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments presented herein were chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to certain contemplated uses.