Abstract: A wearable device 100 to detect the mental state of a user is disclosed. The wearable device 100 includes an image capturing unit 102 to capture images of the user and sensors 104 to detect physiological parameters of the user. The captured images and the physiological parameters of the user can be processed and analysed to determine the emotion of the user, the detected emotion can be displayed on a display unit 110, and, based on the emotion of the user, mood-boosting activities can be recommended and displayed on the display unit. Also, the emotion of the user at a given moment can be detected and transmitted to associated mobile computing devices 114 so that the mood of the user can be monitored.
TECHNICAL FIELD
[0001] The present disclosure relates to health monitoring. More particularly, the present disclosure relates to a wearable device to recognize emotions of a user and, based on the emotions, recommend mood-boosting activities.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Recently, smart watches have become increasingly popular as people take greater interest in exercise and physical health. Various studies show that within the sports market, in particular amongst those who engage in running, cycling or other physical activities, there is a high use of such devices. There has been a desire to improve smart watches to broaden their use beyond fitness tracking.
[0004] Mental state plays an important role in human life. Everyone experiences emotions such as frustration, boredom, impatience, and anger during different stages of the day, and these may result in certain physiological and behavioural manifestations, such as a change in heartbeat or facial expressions. Although attempts have been made to identify a user's emotions, existing systems may still lack the ability to connect with a user emotionally. Thus, an advanced device is required, in which such changes in human emotional states may be employed in an intelligent manner to provide emotion-based recommendations with enhanced practical usability.
[0005] Various systems and methods have been developed for observing human emotions in psychology, the neurosciences, and machine learning studies. Existing solutions involve different techniques and methods to identify varying human behaviour, but they are less adequate for analysing the mood of a person owing to differences in structural design and the lack of various required sensors and advanced tools. Also, existing systems are bulky and the user cannot always carry them; thus, there is a need to add this capability to daily-use gadgets.
[0006] There is, therefore, a need to overcome the above-mentioned problems by providing a solution that facilitates recognizing the mental state of the user instantly. The solution facilitates making recommendations to the user, based on the recognized mental state, to improve the mental state quickly.
OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0008] It is an object of the present disclosure to provide a wearable device to detect mental state of the user.
[0009] It is an object of the present disclosure to provide a wearable device to recognize emotion of a user at a certain moment.
[0010] It is an object of the present disclosure to provide a wearable device to recommend mood-boosting activities to improve emotions.
[0011] It is an object of the present disclosure to provide a wearable device that automatically notifies a concerned person regarding the mental state of the user.
[0012] It is an object of the present disclosure to provide a wearable device that assists corporates to detect mental states of their employees.
[0013] It is an object of the present disclosure to provide a wearable device that is user-friendly, robust and accurate.
SUMMARY
[0014] The present disclosure relates to health monitoring. More particularly, the present disclosure relates to a wearable device to recognize emotions of a user and, based on the emotions, recommend mood-boosting activities.
[0015] An aspect of the present disclosure pertains to a wearable device to detect emotions of a user. The device may include an image capturing unit configured to capture images of the user, one or more sensors configured to detect physiological parameters of the user and correspondingly generate a first set of signals, and a processing unit.
[0016] In an aspect, the processing unit may be operatively coupled with the image capturing unit and the one or more sensors. The processing unit may include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to: receive the images of the user and the first set of signals; extract facial attributes from the images and values of the physiological parameters from the first set of signals, respectively; match the facial attributes with a set of pre-defined facial attributes; compare the values of the physiological parameters with a dataset including pre-defined threshold values; determine the mental state of the user based on the facial attributes and the values of the physiological parameters; and generate output signals based on the mental state of the user and transmit the generated output signals to a display unit and associated one or more mobile computing devices, where the output signals may pertain to the emotion of the user and activities to be performed by the user.
[0017] In an aspect, the one or more sensors may be selected from a group of infrared sensor, oximeter, heart rate sensor, and heat detector.
[0018] In an aspect, the processing unit may be configured to detect blink rate information of the user from the received images, and correspondingly determine mental state of the user.
[0019] In an aspect, the physiological parameters may include any or a combination of pulse rate, temperature, oxygen level, and heat radiation generated from body of the user.
[0020] In an aspect, the activities may include any or a combination of listening to music, listening to comedy, walking, exercising, sleeping, and taking rest.
[0021] In an aspect, the display unit may be operatively coupled with the processing unit, and the display unit is selected from a group consisting of light emitting diode (LED), liquid crystal display (LCD), organic light emitting diode (OLED), and LED matrix.
[0022] In an aspect, the processing unit may be operatively coupled with an alert unit, and the alert unit may be actuated upon receiving a set of warning signals by the processing unit.
[0023] In an aspect, the alert unit may include any or a combination of light emitting diode, buzzer, and alarm.
[0024] In an aspect, the processing unit may be communicatively coupled with the one or more mobile computing devices through a communication unit, and the communication unit may include any or a combination of a Wireless Fidelity (Wi-Fi) module, Bluetooth, Li-Fi, Wireless Local Area Network (WLAN), and ZigBee.
[0025] In an embodiment, the one or more mobile computing devices may include any or a combination of a mobile terminal, laptop, and tablet.
[0026] In an embodiment, the wearable device may include a power source configured to supply electric power to the image capturing unit, one or more sensors, processing unit, display unit, and alert unit.
[0027] In an aspect, the power source may include any or a combination of rechargeable battery, lithium (Li) ion cell, rechargeable cells, solar cell, solar battery, electrochemical cells, storage battery, and secondary cell.
[0028] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF DRAWINGS
[0029] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0030] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0031] FIG 1 illustrates a block diagram of a wearable device in accordance with an embodiment of the present disclosure.
[0032] FIG. 2A-2B illustrates an exemplary view of wearable device displaying messages, in accordance with an embodiment of the present disclosure.
[0033] FIG. 3 illustrates an exemplary functional components of a processing unit of the proposed device, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0034] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
[0035] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0036] The present disclosure relates to health monitoring. More particularly the present disclosure relates to a wearable device to recognize emotions of a user and based on emotions recommend mood-boosting activities.
[0037] The present disclosure elaborates a wearable device, such as a wrist watch, band, and the like, which can be used to collect physiological parameters and facial attributes of a user; based on the collected information, the mental state of the user can be detected. When the mental state of the user is detected, the emotion, such as sadness or anger, can be analysed, and mood-boosting activities can be correspondingly suggested by the wearable device to the user to change the mood.
[0038] As illustrated in FIG. 1, a wearable device 100 (interchangeably referred to as device 100 hereinafter) can be used to detect the emotion of a user. The device 100 can be a band or a watch that can be worn by the user on the wrist. The device 100 can include an image capturing unit 102 to capture images of the user, and one or more sensors 104 (collectively referred to as sensors 104, and individually referred to as sensor 104, hereinafter) configured to detect physiological parameters of the user. Also, a processing unit 108 can be configured within the device 100; the processing unit 108 can be operatively coupled with the image capturing unit 102 and the sensors 104 to analyse the collected information to determine the emotion of the user in real time.
[0039] In an embodiment, the device 100 can be communicatively coupled with one or more mobile computing devices 114 (collectively referred to as mobile computing devices 114, and individually referred to as mobile computing device 114, hereinafter); the mobile computing device 114 can be configured to receive information from the associated device 100 through applications residing on the mobile computing device 114. The mobile computing device 114 can include, but is not limited to, a mobile terminal, laptop, phone, tablet, and the like. In an implementation, the device 100 can be accessed by applications residing on any operating system, including but not limited to Android™, iOS™, and the like.
[0040] In an embodiment, the image capturing unit 102 can be selected from a group including, but not limited to, a camera, scanner, and face reader. The image capturing unit 102 can be configured to capture images of the user and correspondingly generate signals, which can be transmitted to the processing unit 108. The processing unit 108 can extract facial attributes from the received signals; based on the extracted facial attributes, the mental state of the user at that moment can be detected, and the information can be transmitted to the associated mobile computing devices 114.
[0041] In an embodiment, the processing unit 108 can be communicatively coupled with the one or more mobile computing devices through a communication unit (not shown). The communication unit can include any or a combination of a Wireless Fidelity (Wi-Fi) module, Bluetooth, Li-Fi, Wireless Local Area Network (WLAN), ZigBee, and the like.
[0042] In an exemplary embodiment, the device 100 can be connected with the mobile phone using Bluetooth or Wi-Fi.
[0043] In an exemplary embodiment, when the watch is worn by the user on the wrist, the camera attached to the watch can capture images of the user, facial attributes can be detected to determine the emotion of the user, and the detected emotion can be transmitted to the mobile computing device associated with the user and other concerned persons. For example, the emotion of an employee can be tracked by an employer, and the employer can handle the situation accordingly.
[0044] In an embodiment, the sensors 104 can be configured to detect physiological parameters of the user; the physiological parameters can include, but are not limited to, pulse rate, temperature, oxygen level, and heat radiation generated from the body of the user. In another embodiment, the sensors 104 can include an infrared sensor, oximeter, heart rate sensor, heat detector, and the like.
[0045] In an exemplary embodiment, the watch can include temperature sensors to measure body temperature, and when the body temperature is beyond the normal range, the user may be sad, dizzy, or the like.
[0046] In an embodiment, a display unit 110 can be operatively coupled with the processing unit 108; the display unit 110 can be selected from a group consisting of, but not limited to, a light emitting diode (LED), liquid crystal display (LCD), organic light emitting diode (OLED), and LED matrix. The display unit 110 can be configured to display information such as time, weather, steps walked by the user, detected values of the health parameters, emotions, notifications, and recommendations.
[0047] In an exemplary embodiment, upon detection of a sad emotion, the display unit 110 can provide recommendations to the user to listen to comedy and to talk to friends to change the mood. Also, when the user is not feeling well and the detected state is dizziness, the user can be recommended to take medicines and rest.
[0048] In an exemplary embodiment, the detected emotion and physiological parameters can be transmitted to the user's phone and the phones of other concerned persons, so that they can assist the user accordingly to change the mood.
[0049] In an embodiment, an alert unit 112 can be configured with the device 100; the alert unit 112 can be a light emitting diode, buzzer, alarm, and the like. Upon detection of an emotion such as sadness or anger, the user can be notified to perform activities such as listening to music, listening to comedy, walking, exercising, sleeping, or taking rest.
[0050] In an embodiment, the device 100 can include a power source configured to supply electric power to the image capturing unit 102, sensors 104, processing unit 108, display unit 110, and alert unit 112.
[0051] In an embodiment, the power source can include any or a combination of rechargeable battery, lithium (Li) ion cell, rechargeable cells, solar cell, solar battery, electrochemical cells, storage battery, and secondary cell.
[0052] As illustrated in FIGs. 2A and 2B, the device 100 can be in the form of a wrist band having a display unit 110 operatively coupled with the processing unit 108. The display unit can display the detected emotion and the corresponding recommendation. When the user is sad (as shown in FIG. 2A), the display unit 110 can display a sad emoji and the text "sad"; also, the recommendation "listen jokes" can be displayed on the display unit. Similarly, when the user is angry (as shown in FIG. 2B), the display unit 110 can display an anger emoji and the text "Anger"; also, the recommendation "go for walk" can be displayed on the display unit 110.
[0053] As illustrated in FIG. 3, the processing unit 108 can include one or more processor(s) 302. The one or more processor(s) 302 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 302 can be configured to fetch and execute computer-readable instructions stored in a memory 304 of the processing unit 108. The memory 304 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 304 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0054] In an embodiment, the processing unit 108 can also include an interface(s) 306. The interface(s) 306 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 306 may facilitate communication of the device 100. The interface(s) 306 may also provide a communication pathway for one or more components of the wearable device 100. Examples of such components include, but are not limited to, processing engine(s) 308 and database 322.
[0055] In an embodiment, the processing engine(s) 308 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 308. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 308 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processing engine(s) 308 may include a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 308. In such examples, the processing unit 108 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate from, but accessible to, the processing unit 108 and the processing resource. In other examples, the processing engine(s) 308 may be implemented by electronic circuitry. A database 322 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 308.
[0056] In an embodiment, the processing engine(s) 308 can include an extraction unit 310, a matching unit 312, a comparison unit 314, a classification and training unit 316, a signal generation unit 318, and other unit(s) 320. The other unit(s) 320 can implement functionalities that supplement applications or functions performed by the device 100 or the processing engine(s) 308.
[0057] In an embodiment, the database 322 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 308.
[0058] It would be appreciated that units being described are only exemplary units and any other unit or sub-unit may be included as part of the device 100. These units too may be merged or divided into super- units or sub-units as may be configured.
[0059] In an embodiment, the processing unit 108 can be configured to receive the images captured by the image capturing unit 102 and the first set of signals in electrical form, where the first set of signals pertains to the values of the physiological parameters of the user, and further transmit the images and the first set of signals to the extraction unit 310.
[0060] In an embodiment, the extraction unit 310 can be configured to extract facial attributes from the received images to detect the emotion on the human face at that moment, and the extracted facial attributes can be transmitted to the matching unit 312.
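The disclosure does not specify how the extraction unit derives facial attributes, so the following is an illustrative sketch only. It assumes hypothetical 2D facial landmark points (as produced by a typical face-detection library) are already available; the landmark names and the two derived attributes (`smile_score`, `brow_eye_gap`) are assumptions introduced for demonstration.

```python
def extract_facial_attributes(landmarks):
    """Derive coarse facial attributes from (x, y) landmark points.

    `landmarks` maps hypothetical names such as 'mouth_left',
    'mouth_right', 'mouth_center', 'brow_left', 'brow_right',
    'eye_left', 'eye_right' to (x, y) tuples; larger y means lower
    on the face.
    """
    mouth_l = landmarks["mouth_left"]
    mouth_r = landmarks["mouth_right"]
    mouth_c = landmarks["mouth_center"]

    # Mouth-corner height relative to the mouth centre: corners above
    # the centre suggest a smile, corners below it suggest a frown.
    corner_avg_y = (mouth_l[1] + mouth_r[1]) / 2
    smile_score = mouth_c[1] - corner_avg_y  # positive => corners raised

    # Brow-to-eye distance: lowered brows can indicate anger.
    brow_eye_gap = (
        (landmarks["eye_left"][1] - landmarks["brow_left"][1])
        + (landmarks["eye_right"][1] - landmarks["brow_right"][1])
    ) / 2

    return {"smile_score": smile_score, "brow_eye_gap": brow_eye_gap}
```

In practice the extraction unit would obtain landmarks from the images captured by the image capturing unit 102 before computing such attributes.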
[0061] In an embodiment, the extraction unit 310 can extract the values of the physiological parameters of the user. The physiological parameters can include any or a combination of pulse rate, temperature, oxygen level, heat radiation generated from the body of the user, and the like. The extracted values can be transmitted to the comparison unit 314.
[0062] In an embodiment, the matching unit 312 can be configured to compare the extracted facial attributes with pre-stored facial attribute patterns. For example, when a human is sad, no smile is found on the face and lines appear on the forehead; these features can be matched. Similarly, when the user is angry, the eyebrows come down and together, the eyes glare, and the lip corners can be found narrowed.
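A minimal sketch of such a matching step, assuming extracted attributes are numeric scores matched against pre-defined per-emotion ranges. The attribute names (`smile_score`, `brow_eye_gap`) and all range values are fabricated for illustration and are not taken from the disclosure.

```python
# Hypothetical per-emotion signatures: each attribute is matched against
# an inclusive (low, high) range. Values are illustrative only.
PREDEFINED_SIGNATURES = {
    "sad":   {"smile_score": (-10.0, -1.0), "brow_eye_gap": (8.0, 14.0)},
    "angry": {"smile_score": (-10.0,  0.0), "brow_eye_gap": (0.0,  6.0)},
    "happy": {"smile_score": (  1.0, 10.0), "brow_eye_gap": (8.0, 14.0)},
}

def match_facial_attributes(attributes, signatures=PREDEFINED_SIGNATURES):
    """Return the first emotion whose signature ranges all cover the
    given attributes, or None when no signature matches."""
    for emotion, ranges in signatures.items():
        if all(
            low <= attributes.get(name, float("nan")) <= high
            for name, (low, high) in ranges.items()
        ):
            return emotion
    return None
```

A production device would likely use a trained classifier rather than fixed ranges, but the range-matching form mirrors the "match with pre-defined facial attributes" step described above.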
[0063] In an embodiment, the comparison unit 314 can be configured to compare the received values of the physiological parameters with a dataset including pre-defined threshold values. For example, the average heart rate of a human is about 80 beats per minute, but in anger the pulse rate can accelerate to 180 beats per minute. Similarly, anger can inflate blood pressure from 120/80 to 220/130, increasing the risk of heart attack or stroke.
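The threshold comparison described above can be sketched as follows. The normal ranges below are illustrative defaults chosen for demonstration (loosely consistent with the resting-pulse figure in the example), not clinical guidance or values from the disclosure.

```python
# Hypothetical normal ranges per parameter: (normal_low, normal_high).
DEFAULT_THRESHOLDS = {
    "pulse_rate_bpm": (60, 100),
    "temperature_c": (36.1, 37.2),
    "oxygen_level_pct": (95, 100),
}

def compare_parameters(values, thresholds=DEFAULT_THRESHOLDS):
    """Flag each physiological parameter as 'low', 'normal', or 'high'
    relative to its pre-defined threshold range."""
    flags = {}
    for name, value in values.items():
        low, high = thresholds[name]
        if value < low:
            flags[name] = "low"
        elif value > high:
            flags[name] = "high"
        else:
            flags[name] = "normal"
    return flags
```

For instance, a pulse rate of 180 bpm (the anger figure in the example above) would be flagged "high" against the 60–100 bpm resting range.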
[0064] In an exemplary embodiment, when the user is angry, the body radiates heat at a high rate; for instance, the ears and the area around the eyes can appear red due to heat radiation from the body. The body heat can also be detected and compared with the pre-defined threshold values.
[0065] In an embodiment, based on the facial attributes and the values of the physiological parameters, the mental state of the user can be determined, and the signal generation unit 318 can transmit output signals to the display unit 110 and the associated one or more mobile computing devices 114, wherein the output signals pertain to the emotion of the user and the activities to be performed by the user.
[0066] In an exemplary embodiment, when the pulse rate is high and an anger emotion is detected on the user's face, it can be determined that the user is angry, and based on the emotion, a recommendation such as "go for walk", "drink water", or "keep calm" can be transmitted to the display unit 110.
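The fusion step in this example (a facially detected emotion confirmed by a physiological flag, then mapped to recommendations) could be sketched as below. The exact fusion rule is an assumption for illustration; the recommendation strings are the ones named in the description.

```python
# Recommendations per confirmed emotion, taken from the examples in the
# description ("go for walk", "drink water", "keep calm", "listen jokes").
RECOMMENDATIONS = {
    "angry": ["go for walk", "drink water", "keep calm"],
    "sad": ["listen jokes", "talk to friends"],
}

def determine_and_recommend(facial_emotion, parameter_flags):
    """Confirm the facially detected emotion against physiological
    flags and return (emotion, recommendations).

    Assumed rule: anger is confirmed only when the face shows anger AND
    the pulse rate is flagged 'high', mirroring the example above.
    """
    if facial_emotion == "angry" and parameter_flags.get("pulse_rate_bpm") == "high":
        return "angry", RECOMMENDATIONS["angry"]
    if facial_emotion == "sad":
        return "sad", RECOMMENDATIONS["sad"]
    return "neutral", []
```

The returned pair corresponds to the output signals transmitted to the display unit 110 and the mobile computing devices 114.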
[0067] In an embodiment, the signal generation unit 318 can generate a set of warning signals, which can be transmitted to the alert unit 112, and the alert unit 112 can produce an audible sound to alert the user to perform the recommended activities.
[0068] In an embodiment, the output signals can be transmitted to the associated mobile computing devices 114 to alert a concerned person, such as a family member or employer, to the mood of the user at a particular moment.
[0069] In an embodiment, the classification and training unit 316 can be configured to receive the extracted facial attributes and the values of the physiological parameters from the extraction unit 310 in machine-readable or binary form, and to be updated and trained based on the received facial attributes and values of the physiological parameters. In another embodiment, a deep learning model can be trained based on the received facial attributes, physiological parameters, and analysed information, where the deep learning model can be stored in the database 322. In yet another embodiment, once the dataset is trained correctly, a deep learning algorithm can be configured to perform repetitive and routine tasks within a shorter period of time.
[0070] In an embodiment, the classification and training unit 316 can be configured to store the facial attributes and physiological parameters recorded for many users for trend analysis and more accurate prediction. Also, the classification and training unit 316 can be configured to store a set of training datasets to train a machine learning model for determining the facial attributes and mental state of the user. The training datasets can include historical information related to the mental state of the user, emotions, facial attributes in different moods, and values of physiological parameters in different moods.
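As a toy illustration of the classification-and-training idea, the sketch below implements a minimal 1-nearest-neighbour classifier over combined facial-attribute and physiological feature vectors (the disclosure mentions KNN among other classifier families). The feature ordering, feature values, and labels are fabricated for demonstration; an actual device would use a properly trained model such as a CNN, DNN, or MLP.

```python
import math

class NearestNeighbourClassifier:
    """Minimal 1-NN classifier over labelled feature vectors."""

    def __init__(self):
        self.samples = []  # list of (feature_vector, label)

    def train(self, features, label):
        """Add one labelled feature vector to the training set."""
        self.samples.append((list(features), label))

    def predict(self, features):
        """Return the label of the closest stored sample (Euclidean)."""
        return min(
            self.samples,
            key=lambda sample: math.dist(sample[0], features),
        )[1]

# Assumed feature order: [smile_score, brow_eye_gap, pulse_rate_bpm].
# Training rows are fabricated examples, not data from the disclosure.
clf = NearestNeighbourClassifier()
clf.train([-3.0, 10.0, 75.0], "sad")
clf.train([-2.0, 3.0, 160.0], "angry")
clf.train([4.0, 10.0, 72.0], "happy")
```

In a realistic setting the features would need scaling (pulse rate dominates the raw Euclidean distance here), and the training rows would come from the historical datasets stored in the database 322.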
[0071] In an exemplary embodiment, the processing engine 308 can be further configured in the form of an artificial neural network, such as, but not limited to, a Convolutional Neural Network (CNN) or a Deep Neural Network (DNN). In an exemplary embodiment, the processing engine 308 can include learning-based classifiers, such as KNN classifiers, MLP neural networks, and the like.
[0072] Moreover, in interpreting the specification, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C, …, and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[0073] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0074] The proposed invention provides a wearable device to detect mental state of the user.
[0075] The proposed invention provides a wearable device to recognize emotion of a user at a certain moment.
[0076] The proposed invention provides a wearable device to recommend mood-boosting activities to improve emotions.
[0077] The proposed invention provides a wearable device to notify concerned person automatically regarding the mental state of the user.
[0078] The proposed invention provides a wearable device to assist corporates in detecting the mental states of their employees.
[0079] The proposed invention provides a wearable device which is user-friendly, robust and accurate.
We Claim:
1. A wearable device 100 to detect emotions of a user, the device comprising:
an image capturing unit 102 configured to capture images of the user;
a plurality of sensors 104 configured to detect physiological parameters of the user and correspondingly generate a first set of signals; and
a processing unit 108 operatively coupled with the image capturing unit 102 and the plurality of sensors 104, the processing unit 108 including one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors and configured to:
receive the images of the user and the first set of signals;
extract facial attributes from the images and values of the physiological parameters from the first set of signals respectively;
match the facial attributes with a set of pre-defined facial attributes;
compare the values of the physiological parameters with a dataset including pre-defined threshold values;
determine mental state of the user, based on the facial attributes and the values of the physiological parameters; and
generate output signals based on the mental state of the user, and transmit the generated output signals to a display unit and associated one or more mobile computing devices, wherein the output signals pertain to the emotion of the user and activities to be performed by the user.
2. The wearable device as claimed in claim 1, wherein the plurality of sensors 104 are selected from a group of infrared sensor, oximeter, heart rate sensor, and heat detector.
3. The wearable device as claimed in claim 1, wherein the processing unit 108 is configured to detect blink rate information of the user from the received images, and correspondingly determine mental state of the user.
4. The wearable device as claimed in claim 1, wherein the physiological parameters include any or a combination of pulse rate, temperature, oxygen level, and heat radiation generated from body of the user.
5. The wearable device as claimed in claim 1, wherein the activities include any or a combination of listening to music, listening to comedy, walking, exercising, sleeping, and taking rest.
6. The wearable device as claimed in claim 1, wherein the display unit is operatively coupled with the processing unit, and the display unit is selected from a group consisting of light emitting diode (LED), liquid crystal display (LCD), organic light emitting diode (OLED), and LED matrix.
7. The wearable device as claimed in claim 1, wherein the processing unit is operatively coupled with an alert unit 112, wherein the alert unit 112 is actuated upon receiving a set of warning signals by the processing unit, and wherein the alert unit 112 includes any or a combination of light emitting diode, buzzer, and alarm.
8. The wearable device as claimed in claim 1, wherein the processing unit 108 is communicatively coupled with the one or more mobile computing devices 114 through a communication unit, wherein the communication unit comprises any or a combination of a Wireless Fidelity (Wi-Fi) module, Bluetooth, Li-Fi, Wireless Local Area Network (WLAN), and ZigBee.
9. The wearable device as claimed in claim 1, wherein the one or more mobile computing devices 114 comprises any or a combination of mobile terminal, laptop and tablet.
10. The wearable device as claimed in claim 1, wherein the wearable device 100 comprises a power source configured to supply electric power to the image capturing unit 102, the plurality of sensors 104, the processing unit 108, the display unit 110, and the alert unit 112, wherein the power source comprises any or a combination of rechargeable battery, lithium (Li) ion cell, rechargeable cells, solar cell, solar battery, electrochemical cells, storage battery, and secondary cell.
| # | Name | Date |
|---|---|---|
| 1 | 202111038542-STATEMENT OF UNDERTAKING (FORM 3) [25-08-2021(online)].pdf | 2021-08-25 |
| 2 | 202111038542-POWER OF AUTHORITY [25-08-2021(online)].pdf | 2021-08-25 |
| 3 | 202111038542-FORM FOR STARTUP [25-08-2021(online)].pdf | 2021-08-25 |
| 4 | 202111038542-FORM FOR SMALL ENTITY(FORM-28) [25-08-2021(online)].pdf | 2021-08-25 |
| 5 | 202111038542-FORM 1 [25-08-2021(online)].pdf | 2021-08-25 |
| 6 | 202111038542-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [25-08-2021(online)].pdf | 2021-08-25 |
| 7 | 202111038542-EVIDENCE FOR REGISTRATION UNDER SSI [25-08-2021(online)].pdf | 2021-08-25 |
| 8 | 202111038542-DRAWINGS [25-08-2021(online)].pdf | 2021-08-25 |
| 9 | 202111038542-DECLARATION OF INVENTORSHIP (FORM 5) [25-08-2021(online)].pdf | 2021-08-25 |
| 10 | 202111038542-COMPLETE SPECIFICATION [25-08-2021(online)].pdf | 2021-08-25 |
| 11 | 202111038542-Proof of Right [03-12-2021(online)].pdf | 2021-12-03 |
| 12 | 202111038542-FORM 18 [08-07-2023(online)].pdf | 2023-07-08 |
| 13 | 202111038542-FER.pdf | 2024-05-28 |
| 14 | 202111038542-FORM-5 [28-11-2024(online)].pdf | 2024-11-28 |
| 15 | 202111038542-FER_SER_REPLY [28-11-2024(online)].pdf | 2024-11-28 |
| 16 | 202111038542-CORRESPONDENCE [28-11-2024(online)].pdf | 2024-11-28 |
| 17 | 202111038542SEARCHSTRATEGYE_24-05-2024.pdf | |