ABSTRACT
A METHOD AND A SYSTEM FOR CONTROLLING EMOTIONAL STATE OF A USER
Disclosed herein is a method (300) and a system (107, 400) for controlling an emotional state of a user (101). The system (107, 400) receives one or more physiological parameters of the user (101) from a wearable device (103) and one or more emotional state parameters of the user (101) from a user device (109). Based on the received data, the system (107, 400) determines an emotional state of the user (101). Further, the system (107, 400) transmits the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user (101) to a server (111) over a communication network (113, 409). The system (107, 400) receives, from the server (111), one or more actions to be performed in response to the determined emotional state of the user (101). Based on the received one or more actions, the system (107, 400) operates at least one of the wearable device (103) and the user device (109) for controlling the emotional state of the user (101).
Fig. 1a
FORM 2
THE PATENTS ACT 1970
[39 OF 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10; rule 13]
TITLE: “A METHOD AND A SYSTEM FOR CONTROLLING
EMOTIONAL STATE OF A USER”
Name & Address of the Applicant:
TITAN COMPANY LIMITED, ‘Integrity’, No. 193, Veerasandra, Electronics City P.O., Off Hosur Main Road, Bangalore - 560100
Nationality: India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
The present subject matter is generally related to wearable devices, and more particularly, but not exclusively, to a method and a system for controlling an emotional state of a user.
BACKGROUND
Emotional state significantly affects the physiological and psychological behaviour of a person. As an example, a happy emotional state of the person helps in improving the person’s health and work performance, while a sad emotional state of the person causes health problems and mental disturbances and degrades the work performance of the person. Compared to mood, which is a conscious state of mind or a predominant emotion at a given time, an emotional state often refers to a mental state which arises spontaneously rather than through conscious effort. The emotional state is often accompanied by physical and physiological changes in human organs and tissues such as the brain, heart, skin, blood flow, and muscles, and in observable behaviour such as facial expressions and voice.
Therefore, determining emotional state precisely and timely becomes essential to improve the physiological and psychological behaviour of the person.
Nowadays, a majority of the population depends on electronic gadgets such as mobile phones, tablets, laptops, and the like for purposes such as entertainment, work, and services. Therefore, any kind of information, notifications, and the like is accessed by the person [alternatively referred to as a user] using one or more electronic gadgets via various applications and websites. However, the user may not be aware of his or her own emotional state at any instant of time, and therefore cannot determine which information should be accessed using the electronic gadgets or which activity should be performed to self-improve the emotional state. Further, conventional electronic gadgets do not track the emotional state of the user for providing content that will help the user to improve the emotional state. Although a few applications and websites render personalized content to the user by tracking the user’s browsing activity, the social network of the user on digital platforms, and the like, such applications and websites do not determine the emotional state of the user. Thus, conventional electronic gadgets, applications, and websites fail to provide assistance to the user for improving the emotional state.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
The present disclosure discloses a method of controlling an emotional state of a user. The method comprises receiving, by a system, one or more physiological parameters of the user from a wearable device and one or more emotional state parameters of the user from a user device. Further, the method comprises determining, by the system, an emotional state of the user based on the received one or more physiological parameters and the received one or more emotional state parameters. Upon determining the emotional state of the user, the method comprises transmitting, by the system, the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user to a server over a communication network. Further, the method comprises receiving, by the system, one or more actions to be performed in response to the determined emotional state of the user, from the server. Thereafter, the method comprises operating, by the system, at least one of the wearable device and the user device based on the received one or more actions for controlling the emotional state of the user.
Further, the present disclosure discloses a system for controlling an emotional state of a user. The system comprises a wearable device, a user device, a processor communicatively coupled to the wearable device and the user device, and a memory communicatively coupled to the processor. Here, the processor receives one or more physiological parameters of the user from the wearable device and one or more emotional state parameters of the user from the user device. Based on the received one or more physiological parameters and the received one or more emotional state parameters, the processor determines an emotional state of the user. Further, the processor transmits the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user to a server over a communication network. Further, the processor receives, from the server, one or more actions to be performed in response to the determined emotional state of the user. Thereafter, the processor operates at least one of the wearable device and the user device based on the received one or more actions for controlling the emotional state of the user.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
Fig. 1a shows an exemplary architecture for controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
Fig. 1b shows an exemplary wearable device in accordance with some embodiments of the present disclosure.
Fig. 2 shows a block diagram of a system for controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
Fig. 3 shows a flow chart illustrating a method of controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
Fig. 4 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any flow diagrams and timing diagrams herein represent conceptual views of illustrative devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, “including”, or any other variations thereof are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup, device, or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other or additional elements in the system or apparatus.
The present disclosure relates to a method and a system for controlling an emotional state of a user. Each user may have a unique emotional state in response to certain circumstances, which may be expressed while performing one or more activities on a user device. A positive emotional state may improve the work efficiency and health of the user, while a negative emotional state may cause health problems in the user. Therefore, controlling the emotional state of the user may be necessary to improve the physiological and psychological behaviour of the user. Further, accuracy in the determination of the emotional state may be essential for effectively controlling the emotional state of the user. In the present disclosure, for accurate determination of the emotional state of the user, the system may receive one or more physiological parameters of the user from a wearable device and one or more emotional state parameters of the user from the user device. The one or more emotional state parameters may indicate an emotion of the user, and the one or more physiological parameters of the user may indicate an intensity associated with the emotion of the user. Thus, even in case the user hides the emotion while performing one or more activities on the user device, the emotional state can still be precisely determined by analysing the measured one or more physiological parameters of the user along with the one or more emotional state parameters. The system, therefore, may perform a combined analysis of the one or more emotional state parameters and the one or more physiological parameters for accurate determination of an emotional state of the user, which is utilised by a server to generate one or more actions to be performed by the system for operating at least one of the wearable device and the user device. The objective of operating at least one of the wearable device and the user device based on the received one or more actions may be to encourage the user to interact with other users in a physical space, because social interaction in the physical space may affect the emotional state of the user in a constructive manner. This may help the user recover from anxiety, fight loneliness, make better decisions, adopt healthier lifestyles, and maintain a balance between personal, social, and work life.
Fig. 1a shows an exemplary architecture for controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
As shown in Fig. 1a, the architecture 100 may include a user 101, a wearable device 103, one or more sensors 105 (sensor 1051 to sensor 105n), a system 107, a user device 109 and a server 111. In an embodiment, the system 107 may be associated with the user device 109. As an example, the user device 109 may include, but not limited to, mobile phones, laptops, tablets, and voice assistants. Further, the user device 109 may have integrated transceiver antennas to transmit and receive radio signals. In an embodiment, the system 107 may reside in the user device 109 and/ or the wearable device 103. In another embodiment, the system 107 may be remotely located and be associated with the wearable device 103 and/ or the user device 109.
In an embodiment, the wearable device 103 may be equipped with one or more wearable antennas for wirelessly communicating with other electronic devices and the server 111. As an example, the wearable device 103 may be in the shape of a band which may be worn by the user 101 around the wrist, as shown in Fig. 1b. Alternatively, the wearable device 103 may be in other forms which may include, but not limited to, a pendant, a badge, a belt, and the like, that may be clipped to the user’s body or garment. As an example, the wearable device 103 may be a smart watch integrated with an electronic display 115 as shown in Fig. 1b. Further, the wearable device 103 may be a battery-operated device. In an embodiment, the wearable device 103 may be associated with one or more sensors 105. Further, the one or more sensors 105 may be wearable. In an embodiment, the one or more sensors 105 may be integrated into the wearable device 103. As an example, the one or more sensors 105 may include, but not limited to, a Galvanic Skin Response (GSR) sensor, a temperature sensor, a Photoplethysmogram (PPG) sensor, and an Inertial Measurement Unit (IMU) sensor. In an embodiment, each of the one or more sensors 105 may be configured to measure one or more physiological parameters of the user 101. The one or more physiological parameters comprise sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user 101. As an example, the sweat gland activity of the user 101 may be measured by the GSR sensor, variations in the skin temperature of the user 101 may be measured by the temperature sensor, variations in the heart rate of the user 101 may be measured by the PPG sensor, and the user movement and the physical activity of the user 101 may be measured by the IMU sensor. Each of the one or more physiological parameters measured by the one or more sensors 105 may be indicative of an emotion of the user 101. Each of the one or more sensors 105 may transmit the measured data to the system 107 via a communication network (not shown in Fig. 1a). The communication network may be at least one of a wired communication network and a wireless communication network. In an embodiment, the wearable device 103 and the user device 109 may be in constant communication. For example, the communication network may include, but not limited to, Bluetooth, Wi-Fi, Infrared, and NFC.
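Purely by way of illustration, and not as part of the claimed subject matter, one possible in-memory representation of such a sensor reading is sketched below in Python; the class and field names (e.g., PhysiologicalSample, read_sample) are hypothetical and do not appear in this disclosure.

```python
from dataclasses import dataclass
import time

@dataclass
class PhysiologicalSample:
    """One reading of the physiological parameters measured by the sensors 105."""
    timestamp: float          # epoch seconds when the sample was taken
    gsr_microsiemens: float   # sweat gland activity from the GSR sensor
    skin_temp_celsius: float  # skin temperature from the temperature sensor
    heart_rate_bpm: float     # heart rate derived from the PPG sensor
    movement_index: float     # user movement / physical activity from the IMU

def read_sample(gsr: float, temp: float, hr: float, imu: float) -> PhysiologicalSample:
    """Package raw sensor values into a single timestamped sample."""
    return PhysiologicalSample(time.time(), gsr, temp, hr, imu)
```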
In an exemplary scenario, the user 101 may perform one or more activities on the user device 109. Upon detecting the user’s activity on the user device 109, the one or more sensors 105 configured in the wearable device 103 may measure the one or more physiological parameters of the user 101. Alternatively, the one or more sensors 105 may be configured to continuously measure the one or more physiological parameters. Further, the measured one or more physiological parameters may be received by the system 107 from the one or more sensors 105. Furthermore, the user device 109 may transmit one or more emotional state parameters of the user 101 to the system 107. The one or more emotional state parameters may comprise the user’s response to queries provided by one or more applications running on the user device 109, browsing data of the user 101, and/or data related to verbal and/or textual expressions of the user 101 while performing the one or more activities using the user device 109. Upon receiving the one or more physiological parameters and the one or more emotional state parameters, the system 107 may determine an emotional state of the user 101. Thereafter, the system 107 may encrypt the emotional state of the user 101 along with the corresponding one or more physiological parameters and one or more emotional state parameters to generate encrypted data for transmission to the server 111. The system 107 may perform the encryption to secure the data to be transmitted to the server 111 over a communication network 113 [alternatively referred to as a communication link]. Upon receiving the encrypted data from the system 107, the server 111 may generate one or more actions to be performed by mapping the emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user 101 with prestored emotional states of a plurality of users, respective one or more physiological parameters, and respective one or more emotional state parameters of the plurality of users. Further, the generated one or more actions may be transmitted from the server 111 to the system 107 over the communication network 113. Thereafter, the system 107 may operate the wearable device 103 and/or the user device 109 for controlling the emotional state of the user 101. Here, the system 107 may operate the wearable device 103 and/or the user device 109 based on the received one or more actions. The one or more actions may comprise displaying upcoming events of interest for the user 101 on the wearable device 103 and/or the user device 109. Also, the one or more actions may comprise scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance. Here, each of the one or more users may have an interest score matched with a predetermined interest score of the user 101.
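The exemplary scenario above amounts to a sense-determine-transmit-act loop. A minimal, hedged sketch of that loop is given below; the injected callables (determine, encrypt, send_to_server, operate) are hypothetical stand-ins for the modules described later in this disclosure, not an actual implementation.

```python
def control_loop(physio: dict, emo_params: dict,
                 determine, encrypt, send_to_server, operate) -> None:
    """One pass of the system 107: sense -> determine -> transmit -> act.

    `determine`, `encrypt`, `send_to_server` and `operate` are injected
    callables standing in for the modules described in this disclosure.
    """
    # Combined analysis of wearable and user-device data.
    emotional_state = determine(physio, emo_params)
    # Encrypt before the data leaves the device, as described above.
    token = encrypt({"state": emotional_state,
                     "physio": physio, "emo_params": emo_params})
    # The server 111 maps the payload to one or more actions.
    actions = send_to_server(token)
    # Operate the wearable device 103 and/or the user device 109.
    for action in actions:
        operate(action)
```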
As an example, the user 101 may browse a food delivery application on a mobile phone. While browsing, the user 101 may come across a picture of a pizza on a display screen. Upon viewing the picture of the pizza, the user 101 may express a happy emotion and may want to place an order for the pizza. Here, the emotion of the user 101 may be determined by following the user’s activity on the mobile phone after viewing the picture of the pizza. Upon sensing the user’s activity on the mobile phone, the GSR sensor, the temperature sensor, the PPG sensor, and the IMU sensor may capture data related to variations in the sweat gland activity, the skin temperature, the heart rate, and the user movement and physical activity of the user 101, based on the browsing data accessed from the mobile phone. The measured physiological parameters may be received by the system 107 and may be analysed along with the user’s activity on the mobile phone to determine the emotional state of the user 101. As an example, the system 107 may receive the physiological parameters indicating an increase in the sweat gland activity, no variations in the skin temperature and the heart rate, and an increase in the user movement and physical activity of the user 101, which may be indicative of the happy emotion of the user 101. Further, the system 107 may analyse the received physiological parameters of the user 101 along with the text input by the user 101, the facial expression of the user 101, and the spoken sentences of the user 101, and may determine the emotional state of the user 101 as happy. Here, the variations in the physiological parameters may be necessary to detect the true emotion of the user 101. In some scenarios, the user 101 may hide the true emotion while browsing the mobile phone. In such cases, the emotion may not be accurately detected by analysing the emotional state parameters alone. Hence, the physiological parameters of the user 101 measured by the sensors 105 may be required to be analysed in a combined manner with the emotional state parameters received from the mobile phone for identifying the true emotion of the user 101.
As an example, the user 101 may browse a chat application on the mobile phone. Here, sentences input by the user 101 during the user’s activity on the mobile phone may be analysed to determine the emotion of the user 101 using predefined sentiment analysis techniques. For example, the user 101 may input a smiley in a message box of the chat application, which may indicate that the user 101 is happy. Also, the user 101 may type sentences indicating an argument in the message box of the chat application. Here, the sentences typed by the user 101 may be used to determine the emotion. Alternatively, the user expression may also be determined by capturing an image of the user 101 with a camera of the user device 109 when the user 101 places the order for the pizza. By analysing the image, the emotion of the user 101 while performing the browsing activity on the user device 109 may be identified using predefined facial expression recognition techniques. As an example, the facial expression of the user 101 may be associated with a happy emotion. In an embodiment, micro-expressions can be detected to determine the emotion of the user 101. Additionally, the user expression may also be determined by monitoring voice signals of the user 101 while performing the browsing activity on the user device 109. A microphone of the user device 109 may be used to obtain the voice signals. Also, the wearable device 103 may include a microphone to capture the voice signals of the user 101. The monitored voice signals of the user 101 may be analysed using predefined speech emotion recognition techniques to identify the emotion of the user 101 while performing the browsing activity on the user device 109. As an example, when the user 101 utters the words “it’s my favourite”, the emotion associated with the voice signals may be detected as happy. The results of the analysis may be represented in terms of one or more emotional state parameters and may be transmitted from the user device 109 to the system 107 for further processing.
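By way of a hedged illustration of the textual branch only, a trivial lexicon-based scorer of the kind that could stand in for the "predefined sentiment analysis techniques" is sketched below; the word lists and function name are invented, and a real system would use trained sentiment, facial expression, and speech emotion models instead.

```python
# Hypothetical lexicon; a deployed system would use trained models.
POSITIVE = {"favourite", "love", "great", "happy", ":)"}
NEGATIVE = {"hate", "awful", "angry", "sad", ":("}

def text_emotion_parameter(message: str) -> str:
    """Map typed text (including smileys) to a coarse emotion label."""
    tokens = message.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "happy"
    if score < 0:
        return "sad"
    return "neutral"

print(text_emotion_parameter("it's my favourite :)"))  # -> happy
```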
In another example, the user 101 may be browsing sad songs on the user device 109. Based on the browsing data accessed from the user device 109, the one or more sensors 105 may capture data related to the change in the physiological parameters such as heart rate, skin temperature, sweat gland activity and user movement, and physical activity of the user 101. The measured physiological parameters and the emotional state parameters may be received by the system 107 to determine the emotional state of the user 101. The system 107 may detect a decrease in the sweat gland activity, no variation in the skin temperature, decrease in the heart rate, and decrease in user movement and physical activity of the user 101, which may be indicative of the sad emotion of the user 101. By analysing the physiological parameters along with the emotional state parameters, the system 107 may detect the emotional state of the user 101 as sad. In another example, apart from browsing data, the system 107 may determine the emotion of the user 101 based on data related to expressions of the user 101 during one or more activities such as a telephonic conversation, a video call, chatting, reading e-books, documents, posts and the like. In another example, the system 107 may be associated with virtual assistants configured in the user device 109, which in turn provides the system 107 access to all the data in the user device 109 which the virtual assistant has access to. Therefore, the data retrieved from the user device 109 which may be related to one or more applications configured in the user device 109 may also be used for determining emotion of the user 101.
Further, in some embodiments, the system 107 may provide one or more queries to the user 101 on the user device 109 to obtain a user’s response. As an example, the one or more queries may be displayed on the screen of the user device 109 in the form of pop-ups or notifications. In another example, the one or more queries may be provided via virtual assistants, such as voice assistants, which are configured in the user device 109. In some embodiments, the one or more queries may be related to the browsing data of the user 101, a place visited by the user 101, a food item ordered by the user 101, and the like, which helps the system 107 to understand the emotion of the user 101. As an example, the query may be “Did you like the food that you ordered?”, “Did you like the place you visited?”, and the like. Based on the response received from the user 101, the system 107 may determine the emotional state parameter of the user 101.
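A minimal sketch of how such a query response could be turned into an emotional state parameter is given below; the response vocabulary and the returned structure are assumptions made for illustration only.

```python
def response_to_parameter(query: str, response: str) -> dict:
    """Turn the user's answer to a pop-up query into an emotional state
    parameter that can be combined with the sensor data."""
    answer = response.strip().lower()
    if answer in {"yes", "y"}:
        polarity = "positive"
    elif answer in {"no", "n"}:
        polarity = "negative"
    else:
        polarity = "unknown"
    return {"query": query, "response": response, "polarity": polarity}

print(response_to_parameter("Did you like the food that you ordered?", "Yes"))
# -> {'query': 'Did you like the food that you ordered?',
#     'response': 'Yes', 'polarity': 'positive'}
```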
In some embodiments, the system 107 may generate encrypted data from the determined emotional state of the user 101, the one or more physiological parameters, and the one or more emotional state parameters of the user 101 by utilising predefined encryption techniques to ensure security of the data. Further, the system 107 may transmit the encrypted data to the server 111 associated with the system 107. In some embodiments, the server 111 may receive the encrypted data and decrypt it to retrieve the emotional state of the user 101, the one or more physiological parameters, and the one or more emotional state parameters of the user 101. In alternative embodiments, the system 107 may transmit the one or more physiological parameters and the one or more emotional state parameters of the user 101 to the server 111 during the one or more activities of the user 101 on the user device 109, and the server 111 may determine the emotion of the user 101.
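The disclosure does not name a particular cipher. As one concrete possibility, offered only as an illustration, authenticated symmetric encryption such as Fernet from the Python cryptography package could be used, assuming the key is shared between the system 107 and the server 111 out of band:

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# Assumed to be shared securely between the system 107 and the server 111.
key = Fernet.generate_key()
cipher = Fernet(key)

payload = {
    "emotional_state": "happy",
    "physiological": {"gsr": 4.2, "skin_temp": 33.1, "heart_rate": 76},
    "emotional_params": {"text": "happy", "voice": "happy"},
}

# System side: serialize and encrypt before transmission.
token = cipher.encrypt(json.dumps(payload).encode("utf-8"))

# Server side: decrypt and parse the received token.
recovered = json.loads(cipher.decrypt(token))
assert recovered == payload
```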
Further, the server 111 may compare the received emotional state, the received one or more physiological parameters, and the received one or more emotional state parameters of the user 101 with prestored emotional states of a plurality of users, respective one or more physiological parameters, and respective one or more emotional state parameters of the plurality of users. In some embodiments, the server 111 may perform the mapping using predefined machine learning techniques. The predefined machine learning techniques may include, but not limited to, Linear Regression (LR), Naive Bayes (NB), Support Vector Machines (SVM), Long Short-Term Memory (LSTM), and Convolutional Neural Network (CNN). Based on the mapping, the server 111 may generate one or more actions to be performed by the user 101. The one or more actions may comprise at least one of displaying upcoming events of interest for the user 101 on one or more of the wearable device 103 and the user device 109, and scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance. Each of the one or more users may be provided with an interest score upon measuring the one or more physiological parameters and the one or more emotional state parameters. Further, the interest score may be matched with a predetermined interest score of the user 101. As a first example, the one or more actions may be related to meeting with friends of the user 101 who have the same interests as the user 101. As a second example, the system 107 may determine that the user 101 is interested in musical events based on the one or more physiological parameters of the user 101, the one or more emotional state parameters, and the emotional state of the user 101. In such scenarios, the server 111 may generate an action such as suggesting one or more upcoming musical events. Further, the server 111 may also indicate whether one or more contacts of the user 101 are attending the musical event to motivate the user 101 to attend the musical event. Furthermore, the server 111 may recommend that the one or more contacts of the user 101 attend the musical event.
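One plausible, hedged realisation of this mapping is nearest-neighbour classification over numeric feature vectors built from the physiological parameters, the emotional state parameters, and the emotional state. The feature encoding, the prestored records, and the action labels below are invented for illustration, and scikit-learn is used only as an example library:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier  # pip install scikit-learn

# Prestored records of a plurality of users:
# [gsr, skin_temp, heart_rate, movement, emotion_code]  (illustrative values)
X = np.array([
    [4.0, 33.0, 75.0, 0.8, 1],   # happy, active
    [1.5, 33.2, 60.0, 0.1, 0],   # sad, inactive
    [5.5, 34.5, 95.0, 0.3, 2],   # stressed
])
# Action generated for each prestored record (invented labels).
y = ["suggest_music_event", "schedule_meeting_nearby", "suggest_breathing_break"]

model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# A new user record is mapped to the action of its closest prestored record.
new_user = np.array([[1.7, 33.1, 62.0, 0.2, 0]])
print(model.predict(new_user))  # -> ['schedule_meeting_nearby']
```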
Further, the server 111 may transmit the generated one or more actions to the system 107. Upon receiving the one or more actions from the server 111, the system 107 may operate at least one of the wearable device 103 and the user device 109. In the first example, the system 107 may automatically open an application in the mobile phone or the smart watch showing a list of the nearest friends/family members present within a 1 kilometre distance of the user 101, to schedule meeting events with the nearest friends/family members in a physical space. The list of the nearest friends/family members may be generated in real-time based on the current location of the user 101 and the current locations of the friends and family members using navigation applications. Upon viewing the list on the mobile phone or the smart watch, the user 101 may visit one or more of the nearest friends/family members to meet physically, which may affect the emotional state of the user 101. In the second example, the system 107 may display a list of upcoming musical events on the smart watch and/or the mobile phone. Here, the list of upcoming musical events may be generated based on matching the interest scores of the friends and family members, who prefer to attend the upcoming musical events, with a predetermined interest score of the user 101. Further, upon viewing the list of the upcoming musical events on the smart watch and/or the mobile phone, the user 101 may physically visit one or more places where the musical events are organised, which may affect the emotional state of the user 101. Further, a change in the one or more physiological parameters of the user 101, a change in the one or more emotional state parameters, and a corresponding change in the emotional state of the user 101 may be monitored after operating the wearable device 103 or the user device 109 based on the received one or more actions. The change in the emotional state of the user 101 after interacting in the physical space may be tracked by the system 107 for receiving further one or more actions generated by the server 111 to effectively control the emotional state of the user 101.
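By way of illustration, the "within 1 kilometre" list could be built with a standard haversine distance test combined with the interest score matching described above; the contact data, the matching tolerance, and the function names below are assumptions, not part of this disclosure:

```python
from math import radians, sin, cos, atan2, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return EARTH_RADIUS_KM * 2 * atan2(sqrt(a), sqrt(1 - a))

def nearby_matches(user_loc, user_interest, contacts, radius_km=1.0, tol=0.1):
    """Contacts within `radius_km` whose interest score matches the user's.

    `contacts` is a list of (name, (lat, lon), interest_score) tuples;
    the matching tolerance `tol` is an invented parameter.
    """
    return [
        name for name, loc, score in contacts
        if haversine_km(*user_loc, *loc) <= radius_km
        and abs(score - user_interest) <= tol
    ]

friends = [("Asha", (12.8455, 77.6600), 0.9), ("Ravi", (12.9716, 77.5946), 0.9)]
print(nearby_matches((12.8450, 77.6605), 0.9, friends))  # -> ['Asha']
```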
Fig. 2 shows a block diagram of a system for controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
In some implementations, the system 107 may include an I/O interface 201, a processor 203, and a memory 207. The I/O interface 201 may be configured to receive the one or more physiological parameters from the one or more sensors 105 configured in the wearable device 103, the one or more emotional state parameters of the user 101 from the user device 109, and the one or more actions to be performed from the server 111 in response to the determined emotional state of the user 101. The processor 203 may be configured to receive the one or more physiological parameters, the one or more emotional state parameters, and the one or more actions through the I/O interface 201. Further, the processor 203 may retrieve data from the memory 207 and interact with the modules 211 to process the received data and control the operation of the wearable device 103 and the user device 109. In the system 107, the memory 207 may store the data 209 received through the I/O interface 201 and the modules 211, both of which may be accessed by the processor 203. In one embodiment, the data 209 may include sensor data 2091, user device data 2092, emotional state data 2093, encrypted data 2094, server data 2095, and other data 2096. The other data 2096 may store data, including temporary data and temporary files, generated by the modules 211 for performing the various functions of the system 107.
In some embodiments, the data 209 stored in the memory 207 may be processed by the modules 211 of the system 107. In an example, the modules 211 may be communicatively coupled to the processor 203 configured in the system 107. The modules 211 may be present outside the memory 207, as shown in Fig. 2, and implemented as hardware. As used herein, the term modules may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor 203 (shared, dedicated, or group) and memory 207 that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
In some embodiments, the modules 211 may include, for example, a determination module 213, an encryption module 215, a transceiver module 217, and other modules 219. The other modules 219 may be used to perform various miscellaneous functionalities of the system 107. It will be appreciated that the aforementioned modules may be represented as a single module or a combination of different modules. Furthermore, a person of ordinary skill in the art will appreciate that in an implementation, the one or more modules 211 may be stored in the memory 207, without limiting the scope of the disclosure. The said modules 211, when configured with the functionality defined in the present disclosure, will result in novel hardware.
In an embodiment, the transceiver module 217 may be configured to receive one or more physiological parameters of the user 101 from the wearable device 103, and one or more emotional state parameters of the user 101 from the user device 109. Further, the transceiver module 217 may retrieve the encrypted data 2094 from the memory. Alternatively, the transceiver module 217 may directly receive the encrypted data 2094 from the encryption module 215. The transceiver module 217 may be associated with an integrated patch antenna configured in the wearable device 103. Additionally, the transceiver module 217 may receive one or more actions to be performed from the server 111. The one or more actions may be generated by the server 111 by mapping the emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user 101 with prestored emotional states of a plurality of users, respective one or more physiological parameters, and respective one or more emotional state parameters of the plurality of users. At the server 111, the mapping may be performed using the predefined machine learning techniques. Upon receiving the one or more actions from the server 111, the transceiver module 217 may store the one or more actions in the memory as the server data 2095. Alternatively, the transceiver module 217 may directly provide the one or more actions to the processor for further processing.
In an embodiment, the determination module 213 may be configured to determine an emotional state of the user 101. The determination module 213 may retrieve the one or more physiological parameters of the user 101 from the memory. Particularly, the determination module 213 may retrieve the sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user 101 from the sensor data 2091 stored in the memory. The determination module 213 may interpret the GSR signal as representative of the intensity of the emotion or the emotional arousal of the user 101. As an example, the determination module 213 may map an increase in sweat gland activity to a positive emotion such as a happy or joyful emotion, or a negative emotion such as a nervous or panic emotion. Further, the determination module 213 may interpret the variation in the skin temperature as emotional highs and lows of the user 101. As an example, the determination module 213 may map an increase in skin temperature to an angry or stressful emotion. As an example, the determination module 213 may map an increase in heart rate to a nervous or panic emotion. As an example, the determination module 213 may map an increase in user movement and physical activity to a happy emotion. Further, the determination module 213 may retrieve the one or more emotional state parameters of the user 101 from the memory. Particularly, the determination module 213 may retrieve the user’s response to queries, the browsing data of the user 101, and the data related to verbal and textual expressions of the user 101 from the user device data 2092 stored in the memory. Here, the determination module 213 may utilize predefined sentiment analysis techniques, predefined facial expression recognition techniques, and predefined speech emotion recognition techniques to identify the emotion of the user 101. As an example, the determination module 213 may identify the emotion as one of anger, disgust, fear, happiness, sadness, and surprise. Further, the determination module 213 may perform a combined analysis of the one or more physiological parameters and the one or more emotional state parameters to accurately determine the emotional state of the user 101. As an example, the determination module 213 may determine the emotional state as one of happy, sad, angry, confused, and indifferent. Further, the determination module 213 may store the emotional state of the user 101 in the memory for further processing. Alternatively, the determination module 213 may provide the emotional state of the user 101 to the encryption module 215.
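A toy version of this combined analysis, consistent with the mappings described in this paragraph (GSR as arousal/intensity, the device-derived emotion label as the disambiguating signal), is sketched below; the thresholds and rule set are invented, and a deployed determination module 213 would likely learn them from data:

```python
def fuse(deltas: dict, device_emotion: str) -> str:
    """Toy combined analysis of sensor deltas and the device-derived emotion.

    `deltas` holds signed changes relative to the user's baseline, e.g.
    {"gsr": +1.0, "skin_temp": 0.0, "heart_rate": -5.0, "movement": +0.4}.
    The GSR delta is read as intensity/arousal; its sign alone cannot
    separate positive from negative emotions, so the device-derived
    emotion label disambiguates, as described above.
    """
    aroused = deltas.get("gsr", 0.0) > 0
    if deltas.get("skin_temp", 0.0) > 0 and device_emotion == "anger":
        return "angry"
    if deltas.get("heart_rate", 0.0) > 0 and device_emotion in ("fear", "surprise"):
        return "nervous"
    if deltas.get("movement", 0.0) > 0 and device_emotion == "happiness":
        return "happy"
    if not aroused and device_emotion == "sadness":
        return "sad"
    return "indifferent"

print(fuse({"gsr": 1.0, "movement": 0.4}, "happiness"))  # -> happy
```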
In an embodiment, the encryption module 215 may retrieve the emotional state of the user 101 from the emotional state data 2093 stored in the memory. Alternatively, the encryption module 215 may directly receive the determined emotional state of the user 101 from the determination module 213. The encryption module 215 may also retrieve the one or more physiological parameters of the user 101 and the one or more emotional state parameters of the user 101 from the sensor data 2091 and the user device data 2092, respectively. The encryption module 215 may generate the encrypted data 2094 by applying predefined encryption techniques on the emotional state of the user 101, the one or more physiological parameters, and the one or more emotional state parameters. The encryption module 215 may generate the encrypted data 2094 for secure transmission of the emotional state, the one or more physiological parameters, and the one or more emotional state parameters to the server 111. Further, the encryption module 215 may store the encrypted data 2094 in the memory for further processing. Alternatively, the encryption module 215 may provide the encrypted data 2094 to the transceiver module 217.
In an embodiment, the processor may operate at least one of the wearable device 103 and the user device 109 based on the received one or more actions. The one or more actions may comprise at least one of displaying upcoming events of interest for the user 101 on one or more of the wearable device 103 and the user device 109, and scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance. Here, each of the one or more users may have an interest score matched with a predetermined interest score of the user 101. As an example, the processor may operate the wearable device 103 or the user device 109 to display a notification to the user 101 to schedule an informal meeting with one or more users in the vicinity of the user 101. Here, the one or more users may also use wearable devices 103. In another example, the processor may operate the wearable device 103 or the user device 109 to display a notification related to events which the user 101 may be interested in, on the display 115 of the wearable device 103 or on the display of the user device 109. As an example, the processor may display a notification regarding sports events on the display 115 of the wearable device 103 or on the display of the user device 109, based on the one or more actions received from the server 111. Here, the wearable device 103 and the user device 109 may be operated by the processor for controlling the emotional state of the user 101 by encouraging the user 101 to physically interact in a social space.
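A hedged sketch of how the processor could route the two kinds of actions to the devices is given below; the action schema and the display callbacks are placeholders invented for illustration:

```python
from typing import Callable

def apply_action(action: dict,
                 show_on_watch: Callable[[str], None],
                 show_on_phone: Callable[[str], None]) -> None:
    """Route a server action to the wearable device and/or user device.

    `show_on_watch` / `show_on_phone` are placeholder display callbacks;
    the action schema below is invented for illustration.
    """
    if action["type"] == "display_event":
        msg = f"Upcoming event: {action['title']}"
        show_on_watch(msg)
        show_on_phone(msg)
    elif action["type"] == "schedule_meeting":
        msg = f"{action['contact']} is within {action['distance_km']} km - meet up?"
        show_on_phone(msg)

apply_action({"type": "schedule_meeting", "contact": "Asha", "distance_km": 0.4},
             print, print)
```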
Fig. 3 shows a flow chart illustrating a method of controlling an emotional state of a user in accordance with some embodiments of the present disclosure.
As illustrated in Fig.3, the method 300 includes one or more blocks illustrating a method of controlling an emotional state of a user 101. The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 301, the method may include receiving, by a system 107, one or more physiological parameters of the user 101 from a wearable device 103 and one or more emotional state parameters of the user 101 from a user device 109. As the one or more physiological parameters may be associated with an emotional state of a user 101 wearing the wearable device 103, the one or more sensors 105 may be configured in the wearable device 103 to measure the one or more physiological parameters. The one or more physiological parameters may comprise sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user 101. Further, the one or more emotional state parameters may be received from the user device 109. The one or more emotional state parameters may comprise at least one of the user’s response to queries provided by one or more applications running on the user device 109, browsing data of the user 101, and data related to one of verbal and textual expressions of the user 101 while performing one or more activities using the user device 109.
At block 303, the method may include determining, by the system 107, an emotional state of the user 101 based on the received one or more physiological parameters and the received one or more emotional state parameters. Here, the sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user 101 received from the wearable device 103 may be analysed along with the user’s response to queries, browsing data of the user 101 and data related to verbal and textual expressions of the user 101 received from the user device 109. The combined analysis may be performed to determine the emotional state of the user 101.
At block 305, the method may include transmitting, by the system 107, the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user 101 to a server 111 over the communication network 113. Upon determining the emotional state of the user 101, the received one or more physiological parameters, the received one or more emotional state parameters and the determined emotional state may be encrypted using one or more predefined encryption techniques to generate encrypted data. The encrypted data may be generated to ensure secure transmission from the system 107 to the server 111 over the communication network 113.
At block 307, the method may include receiving, by the system 107, one or more actions to be performed in response to the determined emotional state of the user 101, from the server 111. The one or more actions may be generated by the server 111 upon receiving, from the system 107, the encrypted data 2094 comprising the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user 101. At the server 111, the one or more actions may be generated by mapping the emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user 101 with prestored emotional states, one or more physiological parameters, and one or more emotional state parameters of the plurality of users. The mapping may be performed using predefined machine learning techniques. Thereafter, the generated one or more actions may be transmitted by the server 111 to the system 107.
At block 309, the method may include operating, by the system 107, at least one of the wearable device 103 and the user device 109 based on the received one or more actions for controlling the emotional state of the user 101. The one or more actions may comprise displaying upcoming events of interest for the user 101 on the wearable device 103 and/or the user device 109. The one or more actions may also comprise scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance. Each of the one or more users may have an interest score matched with a predetermined interest score of the user 101.
Computer System
Fig. 4 is a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
In an embodiment, Fig. 4 illustrates a block diagram of an exemplary computer system 400 for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 400 can be the system 107 that is used for providing personalized recommendations based on emotions. The computer system 400 may include a central processing unit (“CPU” or “processor”) 402. The processor 402 may include at least one data processor for executing program components for executing user- or system-generated business processes. A user may include a person, a person using a device such as those included in this invention, or such a device itself. The processor 402 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
The processor 402 may be disposed in communication with one or more input/output (I/O) devices (411 and 412) via I/O interface 401. The I/O interface 401 may employ communication protocols/methods such as, without limitation, audio, analog, digital, stereo, IEEE-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE 802.11 a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System For Mobile Communications (GSM), Long-Term Evolution (LTE), WiMax, or the like), etc. Using the I/O interface 401, computer system 400 may communicate with one or more I/O devices (411 and 412).
In some embodiments, the processor 402 may be disposed in communication with a communication network 409 via a network interface 403. The network interface 403 may communicate with the communication network 409. The network interface 403 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. Using the network interface 403 and the communication network 409, the computer system 400 may communicate with a wearable device 103, one or more sensors 105, a user device 109 and a server 111. The communication network 409 can be implemented as one of the different types of networks, such as intranet or Local Area Network (LAN) and such within the organization. The communication network 409 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 409 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc. In some embodiments, the processor 402 may be disposed in communication with a memory 405 (e.g., RAM, ROM, etc. not shown in Fig. 4) via a storage interface 404. The storage interface 404 may connect to memory 405 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fibre channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
The memory 405 may store a collection of program or database components, including, without limitation, a user interface 406, an operating system 407, a web browser 408 etc. In some embodiments, the computer system 400 may store user/application data, such as the data, variables, records, etc. as described in this invention. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 407 may facilitate resource management and operation of the computer system 400. Examples of operating systems include, without limitation, Apple Macintosh OS X, UNIX, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), International Business Machines (IBM) OS/2, Microsoft Windows (XP, Vista/7/8, etc.), Apple iOS, Google Android, Blackberry Operating System (OS), or the like. The user interface 406 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities. For example, user interfaces may provide computer interaction interface elements on a display system operatively connected to the computer system 400, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, etc. Graphical User Interfaces (GUIs) may be employed, including, without limitation, Apple Macintosh operating systems’ Aqua, IBM OS/2, Microsoft Windows (e.g., Aero, Metro, etc.), Unix X-Windows, web interface libraries (e.g., ActiveX, Java, Javascript, AJAX, HTML, Adobe Flash, etc.), or the like.
In some embodiments, the computer system 400 may implement the web browser 408 stored program components. The web browser 408 may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS) secure sockets layer (SSL), Transport Layer Security (TLS), etc. Web browsers may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, Application Programming Interfaces (APIs), etc. In some embodiments, the computer system 400 may implement a mail server stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as Active Server Pages (ASP), ActiveX, American National Standards Institute (ANSI) C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), Microsoft Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 400 may implement a mail client stored program component. The mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present invention. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
Advantages of the embodiments of the present disclosure are illustrated herein.
In an embodiment, the present disclosure provides a method and a system for controlling an emotional state of a user.
In an embodiment, the present disclosure combines the one or more physiological parameters of the user, measured by the sensors configured in the wearable device, with the one or more emotional state parameters of the user received from a user device. Combined analysis of the sensor data with the user device data associated with the user enables accurate determination of the emotional state of the user, which in turn helps in operating the wearable device and the user device in a more reliable manner for performing the one or more actions. Consequently, the user may be encouraged to interact in a physical space to recover from a negative emotional state.
In an embodiment, the present disclosure provides a method and a system for helping the user to fight loneliness, make better decisions, adopt healthier lifestyles, and maintain a balance between personal, social, and work life.
The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise. The enumerated listing of items does not imply that any or all the items are mutually exclusive, unless expressly specified otherwise.
The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Referral Numerals:
| Reference Number | Description |
|---|---|
| 100 | Architecture |
| 101 | User |
| 103 | Wearable device |
| 105 | One or more sensors |
| 107 | System |
| 109 | User device |
| 111 | Server |
| 113 | Communication link |
| 115 | Display of wearable device |
| 201 | I/O Interface |
| 203 | Processor |
| 207 | Memory |
| 209 | Data |
| 2091 | Sensor data |
| 2092 | User device data |
| 2093 | Emotional state data |
| 2094 | Encrypted data |
| 2095 | Server data |
| 2096 | Other data |
| 211 | Modules |
| 213 | Determination module |
| 215 | Encryption module |
| 217 | Transceiver module |
| 219 | Other modules |
| 400 | Computer system |
| 401 | I/O interface of the computer system |
| 402 | Processor of the computer system |
| 403 | Network interface of the computer system |
| 404 | Storage interface of the computer system |
| 405 | Memory of the computer system |
| 406 | User interface of the computer system |
| 407 | Operating system |
| 408 | Web browser |
| 409 | Communication network |
| 411 | Input devices |
| 412 | Output devices |
CLAIMS:
We claim:
1. A method (300) of controlling an emotional state of a user (101), the method comprising:
receiving (301), by a system (107,400), one or more physiological parameters of the user (101) from a wearable device (103) and one or more emotional state parameters of the user (101) from a user device (109);
determining (303), by the system (107,400), an emotional state of the user (101) based on the received one or more physiological parameters and the received one or more emotional state parameters;
transmitting (305), by the system (107,400), the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user (101) to a server (111) over a communication network (113,409);
receiving (307), by the system (107,400), one or more actions to be performed in response to the determined emotional state of the user (101), from the server (111); and
operating (309), by the system (107,400), at least one of the wearable device (103) and the user device (109) based on the received one or more actions for controlling the emotional state of the user (101).
2. The method (300) as claimed in claim 1, wherein the one or more physiological parameters comprise sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user (101).
3. The method (300) as claimed in claim 1, wherein the one or more emotional state parameters comprise at least one of a user’s response to queries provided by one or more applications running on the user device (109), browsing data of the user (101), and data related to one of verbal and textual expressions of the user (101) while performing one or more activities using the user device (109).
4. The method (300) as claimed in claim 1, further comprising encrypting the emotional state, the one or more physiological parameters and the one or more emotional state parameters of the user (101), using one or more predefined encryption techniques, prior to transmitting them to the server (111).
5. The method (300) as claimed in claim 1, wherein the one or more actions are generated by the server (111) by mapping the emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user (101) with prestored emotional states of a plurality of users, respective one or more physiological parameters, and respective one or more emotional state parameters of the plurality of users, wherein the mapping is performed utilizing predefined machine learning techniques.
6. The method (300) as claimed in claim 5, wherein the one or more actions comprise at least one of displaying upcoming events of interest for the user (101) on one or more of the wearable device (103) and the user device (109), and scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance, wherein each of the one or more users has an interest score matched with a predetermined interest score of the user (101).
7. A system (107,400) for controlling an emotional state of a user (101), the system (107,400) comprising:
a wearable device (103);
a user device (109);
a processor communicatively coupled to the wearable device (103) and the user device (109); and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions which, on execution, cause the processor to:
receive one or more physiological parameters of the user (101) from the wearable device (103) and one or more emotional state parameters of the user (101) from the user device (109);
determine an emotional state of the user (101) based on the received one or more physiological parameters and the received one or more emotional state parameters;
transmit the determined emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user (101) to a server (111) over a communication network (113,409);
receive one or more actions to be performed in response to the determined emotional state of the user (101), from the server (111); and
operate at least one of the wearable device (103) and the user device (109) based on the received one or more actions for controlling the emotional state of the user (101).
8. The system (107,400) as claimed in claim 7, wherein the one or more physiological parameters comprise sweat gland activity, skin temperature, heart rate, user movement, and physical activity of the user (101), measured by the one or more sensors (105) configured in the wearable device (103).
9. The system (107,400) as claimed in claim 8, wherein the one or more sensors (105) comprise at least one Galvanic Skin Response (GSR) sensor for measuring the sweat gland activity of the user (101), at least one temperature sensor for measuring variations in the skin temperature of the user (101), at least one Photoplethysmographic (PPG) sensor for measuring variations in the heart rate of the user (101), and at least one Inertial Measurement Unit (IMU) sensor for measuring the user movement and the physical activity of the user (101).
10. The system (107,400) as claimed in claim 7, wherein the one or more emotional state parameters comprise at least one of a user’s response to queries provided by one or more applications running on the user device (109), browsing data of the user (101), and data related to one of verbal and textual expressions of the user (101) while performing one or more activities using the user device (109).
11. The system (107,400) as claimed in claim 7, wherein the processor is configured to encrypt, using one or more predefined encryption techniques, the emotional state, the one or more physiological parameters and the one or more emotional state parameters of the user (101) prior to transmitting them to the server (111).
12. The system (107,400) as claimed in claim 7, wherein the one or more actions are generated by the server (111) by mapping the emotional state, the one or more physiological parameters, and the one or more emotional state parameters of the user (101) with prestored emotional states of a plurality of users, respective one or more physiological parameters, and respective one or more emotional state parameters of the plurality of users, wherein the mapping is performed utilizing predefined machine learning techniques.
13. The system (107,400) as claimed in claim 12, wherein the one or more actions comprise at least one of displaying upcoming events of interest for the user (101) on one or more of the wearable device (103) and the user device (109), and scheduling one or more meeting events with one or more users of the plurality of users present within a predetermined distance, wherein each of the one or more users has an interest score matched with a predetermined interest score of the user (101).
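For concreteness, the following is a minimal client-side sketch of the claimed method (steps 301 to 309 of claim 1, with the encryption of claim 4). It assumes a JSON payload, the `requests` library for transport, and the Fernet recipe from the `cryptography` package; the endpoint URL, key provisioning, device interfaces, and the stand-in determination routine are all hypothetical.

```python
# Hypothetical client-side sketch of claim 1 plus the encryption of claim 4.
# The endpoint, key handling, payload layout, and device interfaces are
# assumptions for illustration, not taken from the specification.
import json

import requests                          # assumed HTTP transport
from cryptography.fernet import Fernet   # assumed encryption technique

SERVER_URL = "https://server.example/emotional-state"  # placeholder
FERNET = Fernet(Fernet.generate_key())   # in practice, a provisioned key


def determine_state(physiological: dict, emotional: dict) -> str:
    """Stand-in for a determination routine like the one sketched earlier."""
    return "sad"


def control_emotional_state(physiological: dict, emotional: dict,
                            wearable, user_device) -> None:
    # (301) parameters received from the wearable device and the user device
    # (303) determine the emotional state from the combined parameters
    state = determine_state(physiological, emotional)
    payload = {"state": state,
               "physiological": physiological,  # e.g. GSR, PPG, IMU readings
               "emotional": emotional}
    # (305) encrypt prior to transmitting to the server (claim 4)
    token = FERNET.encrypt(json.dumps(payload).encode())
    # (307) receive the one or more actions from the server
    actions = requests.post(SERVER_URL, data=token, timeout=10).json()
    # (309) operate the wearable device and/or the user device
    for action in actions:
        target = wearable if action.get("target") == "wearable" else user_device
        target.show(action.get("content", ""))
```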
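On the server side, claims 5 and 6 name only "predefined machine learning techniques". One plausible reading, sketched below under that assumption, is a nearest-neighbour lookup over prestored records of a plurality of users; the feature layout, the 5 km radius, and the interest-score tolerance are invented for illustration.

```python
# Hypothetical server-side sketch of claims 5 and 6: map the incoming
# parameters onto prestored records with a nearest-neighbour model, then
# derive actions. Feature layout, radius, and tolerance are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Prestored per-user rows: [sweat, skin_temp, heart_rate, query, browsing]
PRESTORED = np.array([
    [6.0, 34.0, 92.0, 0.2, 0.3],   # a user who recovered via a meet-up
    [2.0, 33.0, 68.0, 0.8, 0.7],   # a user who responded to event prompts
])
PRESTORED_ACTIONS = [["schedule_meeting"], ["display_events"]]

MODEL = NearestNeighbors(n_neighbors=1).fit(PRESTORED)


def generate_actions(features, interest_score, nearby_users):
    # Claim 5: map the user's parameters onto the closest prestored record.
    _, idx = MODEL.kneighbors([features])
    matched = PRESTORED_ACTIONS[idx[0][0]]
    if "schedule_meeting" in matched:
        # Claim 6: pick users within a predetermined distance whose interest
        # score matches the user's predetermined interest score.
        peers = [u["id"] for u in nearby_users
                 if u["distance_km"] <= 5.0              # assumed radius
                 and abs(u["interest"] - interest_score) <= 0.1]
        return [{"target": "user_device", "type": "meeting", "with": peers}]
    return [{"target": "wearable", "type": "display_events"}]


print(generate_actions([5.8, 34.2, 90.0, 0.25, 0.3], 0.7,
                       [{"id": "u2", "distance_km": 2.0, "interest": 0.72}]))
```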
| # | Name | Date |
|---|---|---|
| 1 | 202041001962-IntimationOfGrant02-07-2024.pdf | 2024-07-02 |
| 2 | 202041001962-PatentCertificate02-07-2024.pdf | 2024-07-02 |
| 3 | 202041001962-FER_SER_REPLY [24-08-2022(online)].pdf | 2022-08-24 |
| 4 | 202041001962-FER.pdf | 2022-02-24 |
| 5 | 202041001962-FORM 18 [07-06-2021(online)].pdf | 2021-06-07 |
| 6 | 202041001962-COMPLETE SPECIFICATION [17-05-2021(online)].pdf | 2021-05-17 |
| 7 | 202041001962-DRAWING [17-05-2021(online)].pdf | 2021-05-17 |
| 8 | 202041001962-APPLICATIONFORPOSTDATING [09-04-2021(online)].pdf | 2021-04-09 |
| 9 | 202041001962-PostDating-(09-04-2021)-(E-6-92-2021-CHE).pdf | 2021-04-09 |
| 10 | 202041001962-APPLICATIONFORPOSTDATING [12-01-2021(online)].pdf | 2021-01-12 |
| 11 | 202041001962-PostDating-(12-01-2021)-(E-6-5-2021-CHE).pdf | 2021-01-12 |
| 12 | 202041001962-FORM 13 [17-09-2020(online)].pdf | 2020-09-17 |
| 13 | 202041001962-AMENDED DOCUMENTS [17-09-2020(online)].pdf | 2020-09-17 |
| 14 | 202041001962-Proof of Right [21-07-2020(online)].pdf | 2020-07-21 |
| 15 | 202041001962-STATEMENT OF UNDERTAKING (FORM 3) [16-01-2020(online)].pdf | 2020-01-16 |
| 16 | 202041001962-PROVISIONAL SPECIFICATION [16-01-2020(online)].pdf | 2020-01-16 |
| 17 | 202041001962-POWER OF AUTHORITY [16-01-2020(online)].pdf | 2020-01-16 |
| 18 | 202041001962-FORM 1 [16-01-2020(online)].pdf | 2020-01-16 |
| 19 | 202041001962-DRAWINGS [16-01-2020(online)].pdf | 2020-01-16 |
| 20 | 202041001962-DECLARATION OF INVENTORSHIP (FORM 5) [16-01-2020(online)].pdf | 2020-01-16 |
| 21 | 2303E_23-02-2022.pdf | |