Abstract: A system (100) for real-time in-vehicle wellness monitoring of a user (102) of a vehicle (104) is presented. The system includes a non-invasive biological parameter acquisition system configured to acquire measured non-invasive biological parameters (116, 602) of the user (102) via cameras (106), non-contact sensors (108, 404), and touch-based sensors (112, 504). The system includes an in-vehicle wellness prediction system (118) including an acquisition subsystem (120) and a processing subsystem (122) that includes a prediction platform (124) configured to process the measured parameters (116, 602) to generate determined non-invasive biological parameters (218), receive an input of a task (604), retrieve a model (126, 310, 606), and predict an outcome, including chronic inflammation, based on the parameters (116, 602, 218) and the model. Further, the system includes an interface unit (130, 132) configured to provide, in real-time, the outcome to facilitate non-invasive monitoring of short-term and long-term wellbeing of the user.
DESC:BACKGROUND
[0001] Embodiments of the present specification relate generally to wellness of drivers, and more particularly to systems and methods for in-vehicle wellness monitoring of a driver of a vehicle for promoting driver health and wellness.
[0002] Driving a vehicle is a demanding task that requires constant attention and focus, which can be stressful and physically taxing for drivers. Some of the leading causes of road accidents include distracted and drowsy drivers, drunk drivers, rash drivers, impaired drivers, and the like. Also, prolonged driving can lead to physical and/or mental fatigue, which in turn impacts the wellbeing and safety of the drivers, passengers, other vehicles/drivers, pedestrians, and the like.
[0003] Some traditional ways of mitigating the effects of protracted driving include taking breaks and/or staying hydrated. Recently, several driver assistance technologies have been developed to monitor a driver in a vehicle. Also, several driver assistance systems monitor the distraction and/or drowsiness of the driver and issue an alert to the driver. Further, other driver assistance systems are configured to warn a driver at risk of an impending crash, while some others are designed to take corrective action to avoid a crash. However, while the presently available driver assistance/monitoring systems are able to track the current state of the driver, these systems fail to provide the drivers with real-time feedback regarding the short-term and long-term wellness of the drivers.
[0004] Wellness is typically defined as a state or quality of being in good health and an absence of illness. A wellness state may encompass physical, mental, and social health. In recent years, there has been a discernible and urgent need for shifting the focus of the healthcare system from “diagnose and cure” to “predict and prevent,” thereby encouraging pre-emptive action. Moreover, there exists a strong correlation between wellness and physical health outcomes of a driver due to prolonged driving and driving under stressful conditions. Stress is defined as a state of a human body in which a steady state is disturbed as a result of various external and internal stressors, resulting in emotional and/or physical tension. Some common sources of stress include lifestyle, poor dietary habits, lack of sleep, lack of exercise, environmental factors, injury, an infection, or combinations thereof. A commonly occurring manifestation of the body’s reaction to stress is inflammation. Inflammation in turn has been linked to heart disease, stroke, obesity, diabetes, chronic kidney disease, Alzheimer’s disease, cancer, and autoimmune disorders, such as rheumatoid arthritis and lupus. Hence, there is a growing need for measuring wellness of the driver population to proactively galvanize desired preventive actions, thereby enhancing driver safety and wellbeing, and also ensuring the safety of passengers and pedestrians.
BRIEF DESCRIPTION
[0005] In accordance with aspects of the present specification, a system for real-time in-vehicle wellness monitoring of a user of a vehicle is presented. The system includes a non-invasive biological parameter acquisition system disposed in the vehicle and configured to acquire a set of measured non-invasive biological parameters corresponding to the user, where the non-invasive biological parameter acquisition system includes one or more cameras positioned in the vehicle, where the one or more cameras are configured to capture video of the user, one or more non-contact sensors positioned on a seat in the vehicle, in the seat in the vehicle, or a combination thereof, where the one or more non-contact sensors are configured to obtain non-contact sensor information corresponding to the user, one or more touch-based sensors positioned on a steering wheel in the vehicle, in the steering wheel in the vehicle, or a combination thereof, where the one or more touch-based sensors are configured to obtain touch-based sensor information corresponding to the user, where the set of measured non-invasive biological parameters includes an age, a gender, height, weight, an ethnicity, the captured video, the non-contact sensor information, the touch-based sensor information of the user, or combinations thereof. Furthermore, the system includes an in-vehicle wellness prediction system communicatively coupled to an infotainment system in the vehicle, integrated with an infotainment system in the vehicle, or a combination thereof, where the in-vehicle wellness prediction system includes an acquisition subsystem configured to obtain the set of measured non-invasive biological parameters corresponding to the user and a processing subsystem in operative association with the acquisition subsystem and including a prediction platform configured to process the set of measured non-invasive biological parameters to generate a set of determined non-invasive biological parameters corresponding to the user, receive an input corresponding to a selected task, retrieve a model based on the input, predict an outcome based on the set of measured non-invasive biological parameters, the set of determined non-invasive biological parameters, or a combination thereof and the model, where the outcome corresponds to a wellness metric, including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and where the wellness metric is representative of a quantified biological parameter of the user. Additionally, the system includes an interface unit integrated with the infotainment system in the vehicle and configured to provide, in real-time, the outcome to the user, where the in-vehicle wellness prediction system is configured to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user of the vehicle based on the set of measured non-invasive biological parameters and the set of determined non-invasive biological parameters and provide feedback to the user in real-time to promote health and wellbeing of the user.
[0006] In accordance with another aspect of the present specification, a method for real-time in-vehicle wellness monitoring of a user of a vehicle is presented. The method includes receiving a set of measured non-invasive biological parameters corresponding to the user. Furthermore, the method includes processing the set of measured non-invasive biological parameters to generate a set of determined non-invasive biological parameters corresponding to the user. The method also includes receiving an input corresponding to a selected task. Moreover, the method includes retrieving at least one model based on the input corresponding to a selected task. Additionally, the method includes predicting an outcome based on the set of measured non-invasive biological parameters, the set of determined non-invasive biological parameters, or a combination thereof and the model, where the outcome corresponds to a wellness metric, including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and where the wellness metric is representative of a quantified biological parameter of the user. Also, the method includes providing, in real-time, the outcome to the user, where the method is configured to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user of the vehicle based on the set of measured non-invasive biological parameters and the set of determined non-invasive biological parameters and provide feedback to the user in real-time to promote health and wellbeing of the user.
[0007] In accordance with yet another aspect of the present specification, a system for real-time in-vehicle wellness monitoring of a user of a vehicle is presented. The system includes a prediction platform configured to process a set of measured non-invasive biological parameters to generate a set of determined non-invasive biological parameters corresponding to the user, receive an input corresponding to a selected task, retrieve a model based on the input, predict an outcome based on the set of measured non-invasive biological parameters, the set of determined non-invasive biological parameters, or a combination thereof and the model, where the outcome corresponds to a wellness metric, including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and where the wellness metric is representative of a quantified biological parameter of the user, and provide, in real-time, the outcome to the user to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user of the vehicle based on the set of measured non-invasive biological parameters, the set of determined non-invasive biological parameters, or a combination thereof and deliver feedback to the user in real-time to promote health and wellbeing of the user.
DRAWINGS
[0008] These and other features and aspects of embodiments of the present specification will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0009] FIG. 1 is a schematic representation of an exemplary system for in-vehicle wellness monitoring of a driver in the vehicle, in accordance with aspects of the present specification;
[0010] FIG. 2 is a flow chart illustrating a method for in-vehicle wellness monitoring of a driver in the vehicle, in accordance with aspects of the present specification;
[0011] FIG. 3 is a schematic illustration of a method for generating one or more models for use in the method for in-vehicle wellness monitoring of a driver of FIG. 1, in accordance with aspects of the present specification;
[0012] FIG. 4 is a diagrammatical illustration of one embodiment of an arrangement of a set of non-contact sensors for use in the system of FIG. 1, in accordance with aspects of the present specification;
[0013] FIG. 5 is a diagrammatical illustration of one embodiment of an arrangement of touch-based sensors for use in the system of FIG. 1, in accordance with aspects of the present specification;
[0014] FIG. 6 is a schematic illustration of the method for in-vehicle wellness monitoring of a driver of FIG. 2, in accordance with aspects of the present specification;
[0015] FIG. 7 is a diagrammatical illustration of providing a wellness metric to a driver for use in the system of FIG. 1 to facilitate non-invasive monitoring of the short-term and long-term wellbeing of a driver of a vehicle and to provide feedback to the driver in real-time to promote health and wellbeing of the driver, in accordance with aspects of the present specification; and
[0016] FIG. 8 is a schematic representation of one embodiment of a digital processing system implementing a prediction platform for use in the system of FIG. 1, in accordance with aspects of the present specification.
DETAILED DESCRIPTION
[0017] The following description presents exemplary systems and methods for in-vehicle wellness monitoring of a driver and/or passengers traveling in a vehicle. In particular, the systems and methods facilitate prediction of one or more wellness metrics corresponding to a driver and/or passengers in the vehicle using only non-invasive biological parameters corresponding to the driver. Embodiments described hereinafter present exemplary systems and methods to promote driver health and wellness by facilitating enhanced non-invasive monitoring of short-term as well as long-term wellness of the driver, in real-time. In one embodiment, the in-vehicle wellness monitoring system utilizes non-invasive biological parameters corresponding to the driver to predict wellness metrics corresponding to the driver. Further, the wellness metric may be representative of short-term variations associated with the driver’s physical and mental state. In another example, the wellness metric may represent long-term health trends and advice for the driver. In yet another example, the wellness metric may be representative of an overall wellness score of the driver. Another example of the wellness metric may include predicted values of invasive parameters such as a C-reactive protein (CRP) value, a cortisol value, a serum protein electrophoresis (SPEP or SPE) value, and the like. In yet another example, the wellness metrics may also be indicative of an estimate of an inflammatory state of the driver 102, such as chronic inflammation, acute inflammation, stress, and the like, which may in turn be indicative of underlying health conditions and potential risks to the driver’s 102 overall wellness. Use of the present systems and methods presents significant advantages in reliably facilitating the non-invasive monitoring of the short-term and long-term wellbeing of the driver and/or passengers in the vehicle and providing quantitative measurements and/or other outcomes in the form of the wellness metrics, thereby overcoming the drawbacks of currently available methods of monitoring the wellness of the driver. It may be noted that the terms wellness and wellbeing are used interchangeably.
[0018] For ease of understanding, the exemplary embodiments of the present systems and methods are described in the context of an in-vehicle wellness monitoring system configured to provide wellness metrics that correspond to short-term and long-term wellbeing to a driver of a vehicle. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in the following sections with reference to FIG. 1.
[0019] Referring now to the drawings, FIG. 1 illustrates an exemplary in-vehicle wellness monitoring system 100 designed to improve the health and safety of drivers while driving by non-invasively monitoring the short-term and long-term wellbeing of a driver 102 of a vehicle 104. In certain embodiments, the system 100 may be integrated into the vehicle 104, thereby permitting continuous monitoring of the wellbeing of the driver 102. Further, the system 100 is configured to non-invasively measure various biological parameters corresponding to the driver 102 using non-invasive sensors and determine the short-term and long-term wellbeing of the driver 102 based on the measured non-invasive biological parameters. In particular, the system 100 is configured to determine/ predict wellness metrics corresponding to the driver 102 based on the measured non-invasive biological parameters. In one example, the wellness metrics may provide short-term information related to the wellbeing of the driver 102 such as alertness of the driver 102, mood of the driver 102, fatigue of the driver 102, and the like. The predicted wellness metrics may also provide long-term information corresponding to the driver 102 such as trend lines of the computed/determined and/or measured non-invasive biological parameters corresponding to the driver 102, overall wellness score related to the driver 102, estimates of the inflammatory state of the driver 102, and the like. These wellness metrics may be used by the drivers 102 to make lifestyle and/or behavioral changes to improve their overall health and wellbeing.
[0020] As used herein, the term “user” refers to a person using the system 100 for in-vehicle wellness monitoring. In the present example, the user may include a driver of a vehicle such as an automobile, a truck, a two-wheeler, a three-wheeler, an all-terrain vehicle, a boat, and the like. Also, as used herein, the term “invasive parameter” refers to a parameter that is determined using an invasively drawn sample such as, but not limited to, a blood sample, a tissue sample, and the like. Further, as used herein, the term “invasive parameter” also encompasses in vitro parameters determined in a laboratory using other samples such as, but not limited to, a salivary sample, a urine sample, and the like. Some non-limiting examples of the invasive parameter include a C-reactive protein (CRP) value, a cortisol value, a serum protein electrophoresis (SPEP or SPE) value, and the like.
[0021] In a similar fashion, as used herein, the term “non-invasive parameter” refers to a parameter that is measured or determined without use of an invasively drawn sample or laboratory analysis. Consequently, the non-invasive parameter, as defined herein, may be measured and/or determined continuously. Some non-limiting examples of the non-invasive parameter include an age, a gender, height, weight, ethnicity, bioimpedance values, a pulse/heart rate, heart rate variability, respiratory rate, sweat, skin tone characterization, skin conductance information, an image of the face, an image of the eyes, eye color information, eye movement information, hand movement information, posture, and the like. It may be noted that some non-invasive biological parameters may be directly measured from the driver 102, while certain other non-invasive biological parameters may be determined or computed by processing acquired video and other measured non-invasive biological parameters corresponding to the driver 102. In addition, some of the non-invasive biological parameters may be provided by the driver such as age, gender, height, weight, ethnicity, and the like.
[0022] Also, as used herein, the term “outcome” refers to one or more wellness metrics, predicted values of one or more invasive parameters, or both the wellness metrics and the predicted values of the invasive parameters. Further, as used herein, the term “wellness metric” is used to refer to a quantified biological parameter that is scientifically established to be an indicator of an individual’s health and/or lifestyle. In one example, the wellness metric is representative of a long-term wellness of the driver 102 such as an overall wellness score. The long-term wellness score may serve as a comprehensive assessment of the long-term wellness of the driver 102 and a measure of the driver’s 102 general wellbeing.
[0023] The overall wellness score may be a measure of the overall wellbeing of the driver 102, in one example. In other non-limiting examples, the wellness metrics may include an indicator of trend lines of the measured and/or determined non-invasive biological parameters. The wellness metrics may also be indicative of an estimate of an inflammatory state of the driver 102, such as chronic inflammation, acute inflammation, stress, and the like. It may be noted that chronic inflammation is typically indicative of a sustained immune response often associated with prolonged exposure to inflammatory stimuli, which may be indicative of underlying health conditions and potential risks to the driver’s 102 overall wellness. Also, in yet another example, the wellness metric may be representative of short-term information such as driver alertness, driver mood, driver fatigue, and the like.
[0024] It may be noted that the term “model” is used to refer to a neural network that is trained and configured to perform one or more desired tasks. Also, the terms models, task-specific models, neural network models, and artificial intelligence (AI) models, may be used interchangeably. Additionally, in one example, the term “task” is used to refer to a function to be performed by a model. In one non-limiting example, the task may include generating one or more wellness metrics corresponding to the driver 102. The task performed by the model may also include determining predicted values of one or more invasive parameters corresponding to the driver 102. Some non-limiting examples of the task that a model is configured to perform may entail predicting wellness metrics representative of fatigue, alertness, stress, inflammatory state such as chronic inflammation and acute inflammation, overall wellness score, trend lines of measured and/or determined non-invasive biological parameters, predicted values of invasive biological parameters, and the like of the driver 102.
[0025] As noted hereinabove, the system 100 may be integrated into the vehicle 104. In a presently contemplated configuration, the system 100 includes an in-vehicle wellness prediction system 118 and a non-invasive biological parameter acquisition system. In one example, the non-invasive biological parameter acquisition system may include one or more cameras 106, a set of non-contact sensors 108, and a set of touch-based sensors 112. The non-invasive biological parameter acquisition system may be used to acquire the non-invasive biological parameters corresponding to the driver 102. Further, the in-vehicle wellness prediction system 118 may be configured to process the acquired non-invasive biological parameters in conjunction with one or more models to predict one or more wellness metrics that are generally indicative of the short-term and long-term wellbeing of the driver 102.
[0026] The camera 106 may be optimally positioned in the vehicle 104 to capture video of the driver 102. In accordance with aspects of the present specification, the video captured by the camera 106 is configured to facilitate determining information such as heart rate of the driver 102, heart rate variability (HRV) of the driver 102, eye movement of the driver 102, eye color information of the driver 102, posture of the driver 102, skin tone information of the driver 102, skin conductance of the driver 102, hand movements of the driver 102, respiratory rate of the driver 102, and the like. Some non-limiting examples of the camera 106 include an optical camera, an infrared camera, a hyperspectral camera, and the like.
[0027] Additionally, in accordance with aspects of the present specification, the set of non-contact sensors 108 may be positioned on a car seat and/or embedded in a car seat such as a driver’s seat 110. Some non-limiting examples of the non-contact sensors 108 include piezoelectric sensors, optical fiber sensors, ultrasound sensors, capacitive sensors, resistive sensors, and the like. These non-contact sensors 108 are configured to obtain non-contact sensor information corresponding to the driver 102. Other methods that facilitate non-contact monitoring of the driver 102 include infrared thermography, radar-based techniques, and ultrasound-based techniques. In one example, the information obtained from the non-contact sensors 108 is configured to facilitate non-invasive measurement or monitoring of the biological parameters of the driver 102 such as respiratory rate, heart rate, HRV, and the like.
[0028] Further, in one embodiment, the set of touch-based sensors 112 may be positioned in a steering wheel 114 and/or on a steering wheel 114 of the vehicle 104. As will be appreciated, a touch sensor, a tactile sensor, or a touch-based sensor is an electronic sensor used to detect and record physical touch. Traditionally, a touch-based sensor facilitates measurement of a parameter in response to contact, touch, or pressure on the surface of a touch-based sensor. In one embodiment, the touch-based sensors 112 may be positioned at the “10 o’clock” and “2 o’clock” positions on the steering wheel 114. Additionally or alternatively, the touch-based sensors 112 may be positioned at the “9 o’clock” and “3 o’clock” positions on the steering wheel 114. Moreover, in certain other embodiments, the touch-based sensors 112 may be distributed along the circumference of the steering wheel 114. These touch-based sensors 112 may be configured to acquire touch-based information associated with the driver 102. In one example, the information acquired via the touch-based sensors 112 may facilitate the non-invasive determination of the heart rate variability and the bioimpedance of the driver 102.
[0029] It may be noted that for ease of illustration, one example of the non-invasive biological parameter acquisition system is depicted as including one camera 106, a set of six (6) non-contact sensors 108 disposed within and/or on the car seat 110, and a set of two (2) touch-based sensors 112 disposed on and/or in the steering wheel 114 of the vehicle 104. However, other arrangements of the camera 106, non-contact sensors 108, and touch-based sensors 112 are envisaged.
[0030] As noted hereinabove, driving is a demanding task that requires constant attention and focus, which can be stressful and physically taxing for drivers. Further, prolonged driving can lead to physical and mental fatigue, affecting a driver’s wellbeing and safety. In accordance with aspects of the present specification, the system 100 is configured to mitigate the drawbacks of presently available driver monitoring systems by facilitating non-invasive monitoring of the short-term and long-term wellbeing of the driver 102 and providing feedback to the driver 102 in real-time, thereby allowing the drivers 102 to make lifestyle and/or behavioral changes to improve their overall health and wellbeing. In one example, the system 100 is configured to provide non-invasive monitoring of chronic inflammation in real-time. This can have significant long-term benefits, including reducing the risk of chronic diseases and improving quality of life.
[0031] To that end, once the driver 102 is seated in the driver’s seat 110 in the vehicle 104 and ready to commence a trip, the non-invasive biological parameter acquisition system which includes the camera 106, the set of non-contact sensors 108, and the set of touch-based sensors 112 is employed to non-invasively acquire biological parameters corresponding to the driver 102.
[0032] The camera 106 is configured to non-invasively acquire video corresponding to the driver 102. Furthermore, the non-contact sensors 108 embedded in and/or positioned on the car seat 110 are configured to non-invasively acquire data/information related to biological parameters of the driver 102 and the non-contact sensor information may be used to determine the respiratory rate, heart rate, and the like of the driver 102. In a similar fashion, the touch-based sensors 112 positioned in and/or on the steering wheel 114 are configured to non-invasively acquire data related to biological parameters of the driver 102 and the touch-based sensor data/information may be employed to determine the heart rate variability, bioimpedance, and the like. Additionally, other information 115 related to the driver 102 such as age, gender, weight, height, ethnicity, menstrual cycle data, and the like may also be gathered. This information may be manually provided by the driver 102 or may be automatically obtained. Reference numeral 116 may be used to generally represent collective information or a set of measured non-invasive biological parameters related to the driver 102 and includes the video captured by the camera 106, the non-contact sensor information gathered by the non-contact sensors 108, the touch-based sensor information acquired by the touch-based sensors 112, and the other information 115 provided by the driver 102 such as age, gender, weight, height, ethnicity, menstrual cycle data, and the like.
[0033] As depicted in FIG. 1, the system 100 includes an in-vehicle wellness prediction system 118. In accordance with aspects of the present specification, the in-vehicle wellness prediction system 118 is integrated in the vehicle 104, in one embodiment. Furthermore, subsequent to acquisition, the collective driver information/the set of measured non-invasive biological parameters 116, which includes the video information, the non-contact sensor based information, the touch-based sensor information, and the other information 115, may be transmitted or communicated to the in-vehicle wellness prediction system 118. In one example, the set of measured non-invasive biological parameters 116 may be transmitted or communicated to the in-vehicle wellness prediction system 118 via wired means such as a cable. In other examples, the set of measured non-invasive biological parameters 116 may be wirelessly transmitted to the in-vehicle wellness prediction system 118 via use of a network. In yet another example, the set of measured non-invasive biological parameters 116 may be transmitted to a remote location and/or the cloud for storage and the in-vehicle wellness prediction system 118 may be configured to retrieve the set of measured non-invasive biological parameters 116 from the remote storage location and/or cloud. Also, in certain embodiments, the set of measured non-invasive biological parameters 116 may be communicated to the in-vehicle wellness prediction system 118 in real-time. In other embodiments, the set of measured non-invasive biological parameters 116 may be stored and communicated to the in-vehicle wellness prediction system 118 at a later time.
[0034] In a presently contemplated configuration, the in-vehicle wellness prediction system 118 is depicted as including an acquisition subsystem 120 and a processing subsystem 122. The acquisition subsystem 120 is configured to receive the set of measured non-invasive biological parameters 116 corresponding to the driver 102. It may be noted that in one embodiment, the acquisition subsystem 120 may be configured to directly obtain the set of measured non-invasive biological parameters 116 from the non-invasive parameter acquisition system. However, in certain other embodiments, the acquisition subsystem 120 may obtain the set of measured non-invasive biological parameters 116 from a storage such as a data repository 134, an optical data storage article such as a compact disc (CD), a digital versatile disc (DVD), a Blu-ray disc, and the like.
[0035] The set of measured non-invasive biological parameters 116 is subsequently communicated to the processing subsystem 122 for further processing. Moreover, in one embodiment as presented in FIG. 1, the processing subsystem 122 is depicted as including a prediction platform 124 and one or more models 126. It may be noted that the terms models, artificial intelligence (AI) models, task-specific models, and neural network models, may be used interchangeably.
[0036] In a non-limiting example, the processing subsystem 122 may include one or more application-specific processors, digital signal processors, microcomputers, graphics processing units, microcontrollers, Application Specific Integrated Circuits (ASICs), Programmable Logic Arrays (PLAs), Field Programmable Gate Arrays (FPGAs), and/or any other suitable processing devices. In some embodiments, the processing subsystem 122 may be configured to retrieve the set of measured non-invasive biological parameters 116 from the data repository 134. The data repository 134 may include a hard disk drive, a floppy disk drive, a read/write CD, a DVD, a Blu-ray disc, a flash drive, a solid-state storage device, a local database, and the like.
[0037] In addition, the examples, demonstrations, and/or process steps performed by certain components of the system 100 such as the processing subsystem 122 may be implemented by suitable code on a processor-based system, where the processor-based system may include a general-purpose computer or a special-purpose computer. Also, different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently.
[0038] It may be noted that currently available driver monitoring systems are capable of tracking the current state of the driver such as the alertness of the driver. However, these systems fail to provide the drivers with real-time feedback regarding the short-term and long-term wellbeing of the drivers.
[0039] In accordance with aspects of the present specification, the set of measured non-invasive biological parameters 116 which includes the sensor data from the non-contact sensors 108 and touch-based sensors 112, the video, and the other information 115 related to the driver 102 are acquired and processed by the in-vehicle wellness prediction system 118 to generate computed or determined non-invasive biological parameters corresponding to the driver 102.
[0040] Furthermore, in accordance with aspects of the present specification, the prediction platform 124 in conjunction with the models 126 is configured to predict wellness metrics that are representative of short-term and long-term wellbeing of the driver 102 based directly on the determined non-invasive biological parameters, the set of measured non-invasive biological parameters 116, or a combination thereof. To that end, the prediction platform 124 is configured to process the video of the driver 102 acquired via the camera 106 in the vehicle 104 to determine the heart rate and skin tone information of the driver 102. In one example, the prediction platform 124 may be configured to process the acquired video via transdermal optical imaging to determine the heart rate, the heart rate variability (HRV), and the skin tone information of the driver 102. As will be appreciated, transdermal optical imaging utilizes video corresponding to the facial region captured using conventional cameras and machine learning (ML) techniques to extract facial blood flow changes.
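By way of a non-limiting illustration only, the following simplified sketch shows one way a raw facial blood-flow signal might be extracted from the video captured by the camera 106 for downstream heart-rate estimation. The use of OpenCV's bundled frontal-face Haar cascade, the placeholder video source, and the choice of the green channel are assumptions introduced here for clarity and do not form part of the present disclosure.

```python
# Illustrative sketch only: per-frame mean green-channel intensity over a
# detected face region, as a proxy for the transdermal blood-flow signal.
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture("cabin_camera.mp4")  # hypothetical video source

green_signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue
    x, y, w, h = faces[0]
    roi = frame[y:y + h, x:x + w]
    # Subtle blood-volume changes modulate the skin's green-channel intensity.
    green_signal.append(float(np.mean(roi[:, :, 1])))
cap.release()

ppg = np.asarray(green_signal)  # raw signal for later heart-rate estimation
```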
[0041] In some embodiments, the HRV may be determined from the heart rate of the driver 102. In one example, transdermal thermal imaging may be used to determine the HRV of the driver 102. More particularly, transdermal thermal imaging involves capturing thermal data from the skin's surface to analyze physiological processes. Furthermore, to determine HRV from the transdermal thermal imaging, specialized algorithms may be utilized to analyze the fluctuations in skin temperature over time. As will be appreciated, the fluctuations in the skin temperature over time are influenced by changes in blood flow and autonomic nervous system activity associated with the cardiac cycle. Subtle variations in skin temperature patterns, such as the timing of vasodilation and vasoconstriction responses driven by sympathetic and parasympathetic nervous system activity are detected and the HRV may be inferred from these variations in skin temperature patterns. Moreover, advanced signal processing techniques and machine learning algorithms may be employed to accurately extract HRV parameters from thermal imaging data, thereby enabling non-invasive and continuous monitoring of autonomic function.
[0042] In another method, the video captured by the camera 106 may be processed by the prediction platform 124 to measure the photoplethysmography (PPG) signal. Subsequently, the prediction platform 124 may process the PPG signal via conventional signal or image processing methods or use deep learning techniques to determine the heart rate from the PPG signal.
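A corresponding non-limiting sketch of estimating the heart rate from such a PPG-like signal is given below, assuming the per-frame signal has already been extracted at a known frame rate; the filter order, cardiac band limits, and synthetic test signal are illustrative assumptions rather than part of the disclosed method.

```python
# Illustrative sketch only: heart rate from a PPG-like signal via band-pass
# filtering and spectral peak detection.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(ppg: np.ndarray, fs: float) -> float:
    """Return an estimated heart rate in beats per minute."""
    # Restrict to a plausible cardiac band (0.7-3.0 Hz, i.e. 42-180 bpm).
    b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, ppg - np.mean(ppg))

    # Locate the dominant spectral peak within the cardiac band.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return float(peak_freq * 60.0)

# Example: a synthetic 72 bpm signal sampled at 30 frames per second.
fs = 30.0
t = np.arange(0, 30, 1.0 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
print(round(estimate_heart_rate(ppg, fs)))  # ~72
```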
[0043] Furthermore, the acquired video may be processed by the prediction platform 124 to determine eye color information, eye movement information, hand movements, posture, skin tone characterization, skin conductance information, and the like of the driver 102. In one example, the prediction platform 124 may employ an object detection algorithm such as You Only Look Once (YOLO) to identify a region of interest such as the eye regions of the driver 102 from the acquired video. Once the eye regions are identified, eye color information and eye movement information may be determined by the prediction platform 124. In a similar manner, the prediction platform 124 may use an object detection algorithm such as YOLO to identify the hands of the driver 102 from the acquired video. Subsequently, the prediction platform 124 may be configured to track the hand movements of the driver 102. The acquired video may also be processed to determine the posture of the driver 102 in the vehicle 104.
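The following non-limiting sketch illustrates such region-of-interest extraction with an off-the-shelf YOLO implementation (the ultralytics package). The custom-trained weights file and the class labels "eye" and "hand" are hypothetical assumptions introduced here for clarity; any suitably trained detector could be substituted.

```python
# Illustrative sketch only: locating eye and hand regions in cabin video with
# a YOLO detector, assuming a custom model trained on those classes.
import cv2
from ultralytics import YOLO

model = YOLO("driver_roi.pt")            # hypothetical custom-trained weights
cap = cv2.VideoCapture("cabin_camera.mp4")  # hypothetical video source

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    for box, cls_id in zip(results.boxes.xyxy, results.boxes.cls):
        label = results.names[int(cls_id)]
        if label in ("eye", "hand"):
            x1, y1, x2, y2 = map(int, box.tolist())
            roi = frame[y1:y2, x1:x2]
            # Downstream steps would analyze `roi` for eye color, gaze,
            # or hand-movement tracking as described above.
cap.release()
```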
[0044] The prediction platform 124 may also be configured to process the acquired video to determine the respiratory rate of the driver 102. It may be noted that typically the temperature around the nostrils of the driver 102 fluctuates during inspiration and expiration of the respiratory cycle. In one example, to determine the respiratory rate, the prediction platform 124 may be configured to identify a region of interest such as the nose of the driver 102 from the video acquired via the camera 106. Techniques such as segmentation may be used to identify the nose area of the driver 102. Subsequently, the prediction platform 124 may be configured to track the nose area of the driver 102 and extract the breathing rate or respiratory rate of the driver 102. In one example, the prediction platform 124 may employ infrared thermography to monitor the respiratory rate of the driver 102.
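A simplified, non-limiting sketch of deriving the respiratory rate from a tracked nose-region signal is shown below, assuming the mean intensity (or temperature) of the region has already been extracted frame by frame; the breathing band limits and peak-detection settings are illustrative assumptions.

```python
# Illustrative sketch only: respiratory rate from a nose-region time series.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def respiratory_rate_bpm(nose_signal: np.ndarray, fs: float) -> float:
    # Restrict to a typical breathing band (0.1-0.7 Hz, i.e. 6-42 breaths/min).
    b, a = butter(2, [0.1, 0.7], btype="bandpass", fs=fs)
    breathing = filtfilt(b, a, nose_signal - np.mean(nose_signal))
    # Each prominent peak corresponds to one respiratory cycle.
    peaks, _ = find_peaks(breathing, distance=fs / 0.7)
    duration_min = len(nose_signal) / fs / 60.0
    return len(peaks) / duration_min

# Example: a synthetic signal at 15 breaths per minute, 30 frames per second.
fs = 30.0
t = np.arange(0, 60, 1.0 / fs)
demo_signal = np.sin(2 * np.pi * 0.25 * t)
print(round(respiratory_rate_bpm(demo_signal, fs)))  # ~15
```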
[0045] Moreover, the prediction platform 124 is also configured to process the touch-based sensor data or information acquired from the touch-based sensors 112. In one example, the acquired touch-based sensor data may be processed by the prediction platform 124 to determine the heart rate variability (HRV) of the driver 102. It may be noted that the HRV is a measure of the variation in the time interval between consecutive heartbeats and offers a noninvasive way to indicate imbalances in the autonomic nervous system. The variation between consecutive heartbeats tends to be lower when the body is in an excited state, while the variation between consecutive heartbeats may be higher if the body is in a relaxed state. Determining the HRV of the driver 102 aids in identifying any signs of current and/or future health problems of the driver 102. For example, if it is determined that the driver 102 has a high HRV, the driver 102 may have greater cardiovascular fitness and may be more resilient to stress. However, if it is determined that the driver 102 has a lower HRV, it may be indicative of current or future health problems as the driver’s body may be less resilient to stress and changing situations. As previously noted, the HRV may be determined from the heart rate of the driver 102. Also, in some other embodiments, transdermal thermal imaging may be used to determine the HRV of the driver 102.
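By way of a non-limiting illustration, common time-domain HRV metrics may be computed from inter-beat (RR) intervals as sketched below, assuming beat timestamps have already been derived from the touch-based sensor signal; the sample interval values are purely illustrative.

```python
# Illustrative sketch only: time-domain HRV metrics from RR intervals.
import numpy as np

def hrv_metrics(rr_intervals_ms: np.ndarray) -> dict:
    diffs = np.diff(rr_intervals_ms)
    return {
        "mean_hr_bpm": 60000.0 / float(np.mean(rr_intervals_ms)),
        "sdnn_ms": float(np.std(rr_intervals_ms, ddof=1)),  # overall variability
        "rmssd_ms": float(np.sqrt(np.mean(diffs ** 2))),    # beat-to-beat variability
    }

# Example: slightly irregular beats around 70 bpm (~860 ms intervals).
rr = np.array([850, 870, 845, 880, 860, 855, 875], dtype=float)
print(hrv_metrics(rr))
```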
[0046] In addition, the prediction platform 124 may be configured to process the touch-based sensor data to determine the bioimpedance of the driver 102. As will be appreciated, bioimpedance is the response of a living organism to an externally applied electric current and is a measure of the opposition to the flow of that electric current through the tissues. Moreover, bioimpedance measurements allow for the non-invasive characterization of blood flow and body composition and are used to measure the amount of muscle, fat, and total body water in the body. Accordingly, bioimpedance measurements corresponding to the driver 102 may be used for disease prognosis, monitoring of vital body statistics, detection of edema, diagnosis of skin-related diseases, detection of cancerous tissues, monitoring of ischemia during the transplant process, and the like.
[0047] In one example, a hand-to-hand model may be employed for bioimpedance measurement of the driver 102 in the vehicle 104. Further, in the embodiment of FIG. 1, the touch-based sensors 112 disposed on or in the steering wheel 114 may be used to facilitate the measurement of bioimpedance of the driver 102. Once the driver 102 places his/her hands on the steering wheel 114 and makes contact with the touch-based sensors 112, electrical properties of the body of the driver 102 may be assessed. In particular, a small, harmless electrical current may be passed through the body of the driver 102 via the touch-based sensors 112 and the resulting voltage may be measured to determine the bioimpedance of the driver 102. Furthermore, the bioimpedance so determined may be analyzed to determine various physiological parameters such as body composition, hydration levels, and cell integrity of the driver 102. Use of this model offers a convenient and non-invasive method for assessing health metrics. These health metrics in turn may be used in fields like fitness, nutrition, and medical diagnostics of the driver 102, thereby offering insights into overall health and guiding personalized wellness strategies for the driver 102.
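A minimal, non-limiting sketch of the underlying hand-to-hand measurement principle is given below; the excitation current and voltage values are purely illustrative assumptions, and a real measurement would use a calibrated alternating-current excitation within appropriate safety limits.

```python
# Illustrative sketch only: impedance magnitude from a known excitation
# current and the measured voltage across the steering-wheel electrodes.
def bioimpedance_ohms(voltage_rms_v: float, current_rms_a: float) -> float:
    """Impedance magnitude from Ohm's law: |Z| = V / I."""
    return voltage_rms_v / current_rms_a

# A 500 microampere excitation producing a 0.25 V reading implies ~500 ohms.
z = bioimpedance_ohms(0.25, 500e-6)
print(f"{z:.0f} ohms")  # 500 ohms
```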
[0048] Also, the non-contact sensor information may be processed by the prediction platform 124 to obtain the respiratory rate of the driver 102. As will be appreciated, the respiratory rate is a measure of the number of breaths per minute and is a parameter that represents the movement of air in and out of the lungs. It may be noted that the respiratory rate is a fundamental vital sign that is necessary for monitoring other vital signs such as oxygen saturation, temperature, blood pressure, pulse/heart rate, and the like. Furthermore, the respiratory rate is sensitive to different pathological conditions such as adverse cardiac events, pneumonia, stressors such as emotional stress, cognitive load, heat, cold, physical effort, and exercise-induced fatigue. Also, any change in the respiratory rate is typically an indication of deterioration of the body. Hence, respiratory rate information corresponding to the driver 102 may be used for detection of cardiac events, pneumonia, stress, fatigue, and the like. In one example, a non-contact sensor 108 such as an ultrasound sensor embedded in the driver’s seat 110 may be used to obtain the sensor information related to the breathing of the driver 102. Subsequently, the prediction platform 124 may be configured to process the data from the ultrasound sensor to determine the respiratory rate of the driver 102. Additionally, the prediction platform 124 may also be configured to obtain heart rate information using sensor information from the non-contact sensors 108.
[0049] In accordance with exemplary aspects of the present specification, the prediction platform 124 is configured to combine the acquired video, the touch-based sensor information, and the non-contact sensor information and process the set of measured non-invasive biological parameters 116 to provide indicators of the short-term wellbeing and long-term wellbeing of the driver 102. Consequent to the processing of the video, the touch-based sensor information, and the non-contact sensor information by the prediction platform 124, a set of non-invasive biological parameters corresponding to the driver 102 may be determined or computed. These non-invasive biological parameters may generally be referred to as computed or derived or determined non-invasive biological parameters. Some non-limiting examples of the set of determined non-invasive biological parameters corresponding to the driver 102 may include respiratory rate, heart rate, eye color information, eye movement information, skin tone information, skin conductance information, hand movements, posture, heart rate variability, bioimpedance, and the like.
[0050] With continuing reference to FIG. 1, the prediction platform 124 is configured to process the determined non-invasive biological parameters via use of one or more models 126 to predict the wellness metrics corresponding to the driver 102. In certain embodiments, the models 126 may include a neural network that is trained and configured to perform one or more desired tasks. By way of example, one embodiment of the model 126 may be trained to determine short-term information related to the driver 102 such as driver alertness or mood or fatigue. Another model 126 may be trained to ascertain long-term information associated with the driver 102 such as trend lines of the non-invasive biological parameters. As previously noted, the terms task-specific model, neural network, model, AI model, and neural network model may be used interchangeably.
[0051] As will be appreciated, a neural network (NN) is a computational model and includes several layers. Each layer in the neural network model in turn includes several computational nodes. The computational node is configured to perform mathematical operations based on received input to generate an output. Some non-limiting examples of the mathematical operations include summation, passing through a non-linearity, comparing a present state of the node with a previous state, and the like. Moreover, the neural network model also includes weights that are typically associated between each node in a layer and one or more nodes in subsequent layers. The generation of the models will be described in greater detail with reference to FIG. 3.
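By way of a non-limiting illustration, a task-specific model of the kind described above may be sketched as a small feed-forward network mapping a vector of measured and/or determined non-invasive biological parameters to a single bounded score; the layer sizes, feature count, and the use of PyTorch are illustrative assumptions and do not form part of the present disclosure.

```python
# Illustrative sketch only: a small feed-forward wellness-score model.
import torch
import torch.nn as nn

class WellnessScoreModel(nn.Module):
    def __init__(self, num_features: int = 12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 32),  # weighted sums across input nodes
            nn.ReLU(),                    # non-linearity applied at each node
            nn.Linear(32, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
            nn.Sigmoid(),                 # score bounded to (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = WellnessScoreModel()
features = torch.rand(1, 12)   # e.g. heart rate, HRV, respiratory rate, ...
print(model(features).item())  # untrained placeholder score
```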
[0052] The models 126 are trained to generate specific desired outputs. In particular, the models 126, when deployed, aid the prediction platform 124 in performing a given task to provide a desired wellness metric. By way of example, the prediction platform 124 may be configured to use a model 126 to process the set of measured non-invasive biological parameters 116 and/or the set of determined non-invasive biological parameters to predict a wellness metric in the form of an overall wellness score of the driver 102. In another example, a model 126 may be employed to process the set of measured non-invasive biological parameters 116 and/or the set of determined non-invasive biological parameters to predict a wellness metric in the form of the alertness or mood of the driver 102. In yet another example, the prediction platform 124 may utilize a model 126 to predict a wellness metric in the form of trend lines of the determined and/or measured non-invasive biological parameters.
[0053] As noted hereinabove, the prediction platform 124 is configured to receive the set of measured non-invasive biological parameters 116, which in turn includes the other information 115 such as age, gender, height, weight, ethnicity, and the like, the sensor data from the non-contact sensors 108 and the touch-based sensors 112, and the captured video related to the driver 102. Moreover, the prediction platform 124 is further configured to process the set of measured non-invasive biological parameters 116 to generate one or more determined non-invasive biological parameters such as heart rate, heart rate variability, eye color information, eye movement information, hand movements, posture, respiratory rate, bioimpedance, skin tone information, skin conductance information, and the like.
[0054] Additionally, the prediction platform 124 is also configured to receive an input that is representative of one or more selected tasks to be performed. In one embodiment, the selected task may be provided by the driver 102, for example. However, in other embodiments, the selected task may be automatically chosen by the system 100. Some non-limiting examples of the selected tasks include predicting a current alertness state of the driver 102, predicting a current mood of the driver 102, predicting an overall wellness score of the driver 102, predicting an inflammatory state estimate of the driver 102, predicting trend lines of the measured and/or determined non-invasive biological parameters, and the like.
[0055] Subsequent to receipt of the set of measured non-invasive biological parameters 116 and/or the computation of the set of determined non-invasive biological parameters and the task selected by the driver 102 or the system 100, the prediction platform 124 is configured to retrieve a model 126 corresponding to the selected task. In one embodiment, the prediction platform 124 is configured to query the data repository 134 to identify a corresponding model 126 based on the selected task. Additionally, the prediction platform 124 may be configured to retrieve the identified model 126. By way of a non-limiting example, if the selected task entails predicting the overall wellness score of the driver 102, then a model 126 configured to perform the prediction of the overall wellness score is retrieved.
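A non-limiting sketch of one possible task-to-model lookup is shown below, assuming the data repository 134 exposes a simple mapping from task identifiers to stored model artifacts; the task names and file paths are hypothetical placeholders.

```python
# Illustrative sketch only: retrieving a task-specific model artifact.
MODEL_REGISTRY = {
    "overall_wellness_score": "models/wellness_score.pt",
    "alertness": "models/alertness.pt",
    "chronic_inflammation": "models/inflammation.pt",
    "trend_lines": "models/trends.pt",
}

def retrieve_model_path(selected_task: str) -> str:
    if selected_task not in MODEL_REGISTRY:
        raise KeyError(f"No model registered for task '{selected_task}'")
    return MODEL_REGISTRY[selected_task]

print(retrieve_model_path("overall_wellness_score"))  # models/wellness_score.pt
```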
[0056] Once the model 126 is identified and retrieved, the prediction platform 124 is configured to perform the selected task using the model 126. In particular, the prediction platform 124 is configured to process the received, measured, and/or determined non-invasive biological parameters in conjunction with the model 126 to generate a desired wellness metric.
[0057] With continuing reference to the selected task of predicting the overall wellness score of the driver 102, subsequent to processing of the sets of measured and/or determined non-invasive biological parameters by the prediction platform 124 via use of the model 126, an outcome in the form of a predicted wellness metric, including chronic inflammation, representative of the overall wellness score of the driver 102 is generated. In one non-limiting example of the overall wellness score, a measured age of the driver 102 may be determined from the respiratory rate information. The measured age of the driver 102 may be compared with the real age provided by the driver 102. Consequent to the comparison, if it is determined that the measured age is greater than the real age of the driver 102, it may be deduced that the overall wellness score indicates that the driver 102 is unwell. However, if it is determined that the measured age is less than the real age of the driver 102, it may be deduced that the overall wellness score indicates that the driver 102 is healthy.
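Purely for illustration, the age-comparison logic described above may be sketched as follows; the notion of a "measured age" derived from respiratory rate information is taken from the example above, and the numeric values are assumptions.

```python
# Illustrative sketch only: comparing a measured (physiological) age against
# the driver-reported real age to qualify the overall wellness score.
def wellness_from_age(measured_age: float, real_age: float) -> str:
    if measured_age > real_age:
        return "below expected wellness"  # body appears older than reported age
    return "healthy"                      # body appears at or below reported age

print(wellness_from_age(measured_age=52.0, real_age=45.0))  # below expected wellness
```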
[0058] In another example, the selected task may entail predicting a wellness metric that is indicative of an estimate of an inflammatory state of the driver 102, such as acute inflammation, chronic inflammation, stress, and the like. As will be appreciated, chronic inflammation is a crucial biomarker for long-term health and wellbeing. In particular, chronic inflammation is generally indicative of a sustained immune response often associated with prolonged exposure to inflammatory stimuli, which may be indicative of underlying health conditions and potential risks to the driver’s 102 overall wellness. Also, chronic inflammation is linked to several chronic diseases, such as diabetes, heart disease, and cancer. In this example, subsequent to processing of the sets of measured and/or determined non-invasive biological parameters by the prediction platform 124 via use of the model 126, a predicted wellness metric representative of the inflammatory state of the driver 102 such as chronic inflammation is generated. This processing of the sets of measured and/or determined non-invasive biological parameters via use of the model 126 to predict a wellness metric representative of chronic inflammation of the driver 102 advantageously circumvents the shortcomings of the traditional methods that use invasive blood tests or body-fluid swabs, which in turn are inconvenient and unsuitable for real-time monitoring.
[0059] Additionally, in certain embodiments, an indicator representative of the wellness metric may be generated in the form of text, a graphic, a chart, or any other form of audio, and/or visual representation. These wellness metrics and/or the corresponding indicators may then be provided to the driver 102 or other systems to facilitate further actions, behavioral changes, lifestyle recommendations, analysis, treatment planning, follow-up, and the like. In one example, the indicators and/or wellness metrics may be visualized on an interface unit such as a display 130 of an infotainment system 128 in the vehicle 104 or otherwise communicated to the driver 102.
[0060] Further, in one embodiment, the prediction platform 124 is configured to maintain a model. As used herein, “maintain a model” may entail generating the model and hosting the model. The model 126 may be hosted in a data repository 134. The model 126 may additionally or alternatively be hosted in a local repository, a remote repository, the cloud, and the like. Moreover, in one example, the model may include one or more task-specific artificial intelligence models 126. Furthermore, the model 126 is configured to receive as input a set of non-invasive biological parameters and provide as output a wellness metric. In one example, a set of non-invasive biological parameters such as the measured and/or determined non-invasive biological parameters corresponding to the driver 102 may be received. In addition, the sets of the measured and/or determined non-invasive biological parameters may be provided as a set of parameters to the model to cause the model to generate a wellness metric corresponding to the driver 102. The aspect of maintaining the model will be described in greater detail with reference to FIG. 3.
[0061] With continuing reference to FIG. 1, in certain embodiments, the in-vehicle wellness prediction system 118 may be integrated with an infotainment system 128 of the vehicle 104. However, in other embodiments, the in-vehicle wellness prediction system 118 may be a standalone unit and may be communicatively coupled to the infotainment system 128. In a presently contemplated configuration depicted in FIG. 1, the in-vehicle wellness prediction system 118 is depicted as being integrated with the infotainment system 128 of the vehicle 104.
[0062] In accordance with exemplary aspects of the present specification, the system 100 and the in-vehicle wellness prediction system 118 in particular is configured to perform automated in-vehicle wellness monitoring of the driver 102, in real-time, on an edge device, where the edge device is in the vehicle 104, integrated in the vehicle 104, or a combination thereof. In one example, the edge device may include the infotainment system 128 of the vehicle 104. Other examples of the edge device may include a cellular phone or a tablet that is positioned in the vehicle 104.
[0063] The in-vehicle wellness prediction system 118 may also include a display 130 and a user interface 132. It may be noted that in some embodiments the display 130 and the user interface 132 may be representative of the infotainment system 128 in the vehicle 104.
[0064] The display 130 and the user interface 132 may overlap in some embodiments such as a touch screen. Further, in some embodiments, the display 130 and the user interface 132 may include a common area. The display 130 may be configured to visualize or present the predicted wellness metrics. In addition, the indicators of the wellness metrics may also be displayed on the display 130.
[0065] The user interface 132 of the in-vehicle wellness prediction system 118 may include a human interface device (not shown) that is configured to aid the driver 102 in providing inputs or manipulating the outcomes and/or indicators visualized on the display 130. By way of example, the driver 102 may enter personal details such as age, gender, height, weight, ethnicity, and the like using the user interface 132. Additionally, the driver 102 may enter a selected task to be performed via the user interface 132. In certain embodiments, the human interface device may include a trackball, a joystick, a stylus, a mouse, or a touch screen. It may be noted that the user interface 132 may be configured to aid the driver 102 and/or a passenger in navigating through the inputs and/or wellness metrics/indicators generated by the in-vehicle wellness prediction system 118.
[0066] The system 100 presented hereinabove is described with reference to the in-vehicle non-invasive monitoring of the short-term and long-term wellbeing of the driver 102. In accordance with aspects of the present specification, the system 100 can additionally and/or alternatively be used to monitor the short-term and long-term wellbeing of one or more passengers traveling in the vehicle 104. Accordingly, the camera 106 may be positioned such that video corresponding to the passenger may also be recorded. However, in certain embodiments, one or more additional cameras 106 may be provided to acquire video corresponding to one or more passengers in the vehicle 104. Similarly, one or more non-contact sensors 108 may be embedded in the passenger seats to obtain non-contact sensor information related to the passengers. Moreover, to obtain touch-based information corresponding to the passengers one or more touch-based sensors 112 may be disposed on the dashboard and infotainment systems that are located on the back of the front seats in the vehicle 104. In this example, the passengers may be prompted to touch the touch-based sensors 112 to provide the touch-based sensor information. The prediction platform 124 in conjunction with the models 126 may also be configured to generate wellness metrics corresponding to the passengers in the vehicle 104.
[0067] Implementing the in-vehicle wellness prediction system 118 that includes the prediction platform 124 as described hereinabove aids in promoting driver health and wellbeing. More particularly, the in-vehicle wellness prediction system 118 is integrated in the vehicle 104 and is designed to improve the wellbeing and safety of the drivers 102 while driving by predicting and/or generating in real-time clinically relevant wellness metrics using only non-invasive biological parameters associated with the driver 102. Once the predicted wellness metrics are generated, the system 100 may also be configured to appropriately communicate these wellness metrics to the driver 102 and/or passengers in real-time, thereby allowing the driver 102 and/or passengers to make lifestyle and behavioral changes to improve their overall health and wellbeing. This can have significant long-term benefits, including reducing the risk of chronic diseases and improving quality of life.
[0068] Additionally, the system 100 is designed to be user-friendly, non-intrusive, and customizable. In particular, the system 100 allows the drivers 102 to select tasks and tailor the output to their preferences. Furthermore, the multimodal nature of the system 100 ensures accurate measurement of driver wellness under diverse lighting and driving conditions. Moreover, the system 100 is designed to operate seamlessly with the vehicle’s data systems, thereby providing an integrated experience to the driver 102.
[0069] Moreover, the system 100 may also be configured to appropriately facilitate partnering the driver 102 and/or passengers with fitness centers, clinical centers, hospitals, nutraceuticals, and the like, to enhance the wellness of driver 102 and/or passengers. The working of the system 100 may be better understood with reference to FIGs. 2-7.
[0070] Embodiments of the exemplary method of FIG. 2 may be described in a general context of computer executable instructions on computing systems or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0071] Moreover, the embodiments of the exemplary methods may be practiced in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0072] In addition, in FIG. 2 the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, firmware, or combinations thereof. It may be noted that the various operations are depicted in the blocks to illustrate the functions that are performed. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0073] Moreover, the order in which the exemplary methods are described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary methods disclosed herein, or equivalent alternative methods. Further, certain blocks may be deleted from the exemplary methods or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein.
[0074] Turning now to FIG. 2, a flow chart 200 of an exemplary method for in-vehicle wellness monitoring, in accordance with aspects of the present specification, is presented. In particular, the method 200 entails predicting one or more wellness metrics corresponding to a user such as a driver of a vehicle. The method 200 of FIG. 2 is described with reference to the components of FIG. 1. Moreover, in certain embodiments, the method 200 may be performed by the prediction platform 124 in conjunction with the model(s) 126.
[0075] The method starts at step 202, where a model such as the model 126 is maintained. As used herein, “maintain a model” may entail generating the model and hosting the model. The model 126 is configured to receive as input a set of parameters corresponding to a user such as the driver 102 of the vehicle 104. Further, the model 126 is configured to provide as output a wellness metric corresponding to the driver 102. As previously noted, in certain embodiments, the prediction platform 124 is configured to maintain the model. More particularly, the prediction platform 124 is configured to generate the model 126 and host the model 126. The model 126 may be employed to generate the wellness metric corresponding to the driver 102 based on the input. Also, the model 126 may be hosted in a data repository 134. Furthermore, the model 126 may also be hosted in a local repository, a remote repository, the cloud, and the like.
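By way of an illustrative sketch only, maintaining a model in the sense described above, that is, generating it and hosting it in a repository from which it can later be retrieved, could be implemented along the following lines. The WellnessModel class, repository path, and task names are hypothetical placeholders introduced here for illustration and are not defined by the present specification.

```python
import os

import torch
import torch.nn as nn


class WellnessModel(nn.Module):
    """Hypothetical stand-in for a task-specific model such as the model 126."""

    def __init__(self, n_inputs: int = 12):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_inputs, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, x):
        return self.net(x)


REPO_DIR = "/data/wellness_models"  # assumed location standing in for the data repository 134


def host_model(model: nn.Module, task: str) -> str:
    """Persist ("host") a generated model under a task-specific file name."""
    os.makedirs(REPO_DIR, exist_ok=True)
    path = os.path.join(REPO_DIR, f"{task}.pt")
    torch.save(model.state_dict(), path)
    return path


def load_model(task: str, n_inputs: int = 12) -> nn.Module:
    """Retrieve a previously hosted model for a selected task."""
    model = WellnessModel(n_inputs)
    model.load_state_dict(torch.load(os.path.join(REPO_DIR, f"{task}.pt")))
    model.eval()
    return model
```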
[0076] Subsequently, a set of non-invasive biological parameters corresponding to the driver 102 may be received, measured, and/or determined based on measured non-invasive biological parameters. In one example, the non-invasive biological parameter acquisition system may be used to measure the non-invasive biological parameters. The non-invasive biological parameter acquisition system may include the camera 106, the set of non-contact sensors 108, and the set of touch-based sensors 112.
[0077] Accordingly, at step 204, other information 115 related to the driver 102 such as age, gender, weight, height, ethnicity, menstrual cycle data, and the like may be gathered. This information may be manually provided by the driver 102 via use of the interface 132 or may be automatically obtained by the system 100.
[0078] Moreover, as indicated by step 206, video of the driver 102 seated in the car seat 110 may be captured via use of the camera 106. In one embodiment, a video corresponding to the face of the driver 102 may be captured. In other embodiments, a video of the body of the driver 102 including the face may be captured. The video captured by the camera 106 is configured to aid in determining or computing information such as heart rate of the driver 102, eye movement of the driver 102, eye color information of the driver 102, posture of the driver 102, skin tone information of the driver 102, skin conductance information of the driver 102, face characterization of the driver 102, respiratory rate of the driver 102, hand movements of the driver 102, and the like.
[0079] Additionally, at step 208, non-contact sensor information or data corresponding to the driver 102 may be acquired from the non-contact sensors 108 that are embedded in the car seat 110. Other methods to facilitate non-contact measurement of non-invasive biological parameters of the driver 102 include infrared thermography, radar-based techniques, and ultrasound-based techniques. In one example, the information obtained from the non-contact sensors 108 is configured to facilitate non-invasive determination and/or monitoring of the biological parameters of the driver 102 such as respiratory rate, heart rate, and the like.
[0080] Moreover, touch-based information associated with the driver 102 may be acquired via use of the touch-based sensors 112 disposed on the steering wheel 114, as indicated by step 210. In one example, the information acquired via the touch-based sensors 112 may facilitate the determination/computation of the heart rate variability and the bioimpedance of the driver 102.
[0081] As previously noted, the acquired video information, the non-contact sensor based information, the touch-based sensor information, and the other information 115 may be generally referred to as a set of measured non-invasive biological parameters 116 or collective information. The set of measured non-invasive biological parameters 116 is transmitted or communicated to the in-vehicle wellness prediction system 118 for processing to predict one or more wellness metrics that are representative of the short-term and long-term wellbeing of the driver 102. More particularly, the in-vehicle wellness prediction system 118 is configured to predict, in real-time, the one or more wellness metrics that are representative of the short-term and long-term wellbeing of the driver 102 based directly on the measured non-invasive biological parameters and/or the non-invasive biological parameters that are determined from the set of measured non-invasive biological parameters 116.
[0082] Accordingly, as depicted by step 212, the video of the driver 102 acquired via the camera 106 in the vehicle 104 may be processed by the prediction platform 124 to determine the heart rate of the driver 102. In one example the acquired video may be processed via transdermal optical imaging to determine the heart rate, the HRV, the skin tone information, and the skin conductance information of the driver 102. Also, in another example, the video captured by the camera 106 may be processed by the prediction platform 124 to measure the photoplethysmography (PPG) signal. The PPG signal may then be processed via conventional signal or image processing methods or via deep learning techniques to determine the heart rate from the PPG signal.
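As a non-limiting illustration of the kind of processing mentioned above, a heart rate can be estimated from a PPG-like trace extracted from face video by averaging the green channel over the face region, band-pass filtering to the plausible cardiac band, and locating the dominant spectral peak. The sketch below assumes face-cropped RGB frames and the camera frame rate are already available; it is one conventional approach, not the specific algorithm of the present specification.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def heart_rate_from_face_frames(frames: np.ndarray, fps: float) -> float:
    """Estimate heart rate (bpm) from face-cropped video frames.

    frames: array of shape (n_frames, height, width, 3) in RGB order (assumed).
    """
    # Spatially average the green channel per frame to form a raw PPG-like trace.
    ppg = frames[:, :, :, 1].mean(axis=(1, 2))
    ppg = ppg - ppg.mean()

    # Band-pass to the plausible cardiac band (0.7-4.0 Hz, i.e. ~42-240 bpm).
    b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
    ppg = filtfilt(b, a, ppg)

    # The dominant spectral peak within the pass band gives the heart rate.
    spectrum = np.abs(np.fft.rfft(ppg))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return float(peak_hz * 60.0)
```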
[0083] Moreover, eye color information, eye movement information, hand movements, posture, and the like of the driver 102 may also be determined by processing the acquired video. In one example, the prediction platform 124 may employ an object detection algorithm such as You Only Look Once (YOLO) to identify the regions of interest of the driver 102 from the acquired video. Subsequently, the identified regions of interest may be processed to determine the eye color information, eye movement information, hand movements, posture, and the like.
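For illustration, region-of-interest detection with a YOLO-style detector might be wired up as in the following sketch using the open-source Ultralytics package; the weights file (incabin_yolo.pt) and the assumed in-cabin classes such as face, eyes, and hands are hypothetical, since the specification does not prescribe a particular detector or training set.

```python
import cv2
from ultralytics import YOLO

# Hypothetical weights assumed to be fine-tuned for in-cabin classes (face, eyes, hands).
detector = YOLO("incabin_yolo.pt")


def detect_regions_of_interest(frame):
    """Return {class_name: (x1, y1, x2, y2)} bounding boxes for one video frame."""
    results = detector(frame, verbose=False)
    regions = {}
    for box in results[0].boxes:
        x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
        name = detector.names[int(box.cls[0])]
        regions[name] = (x1, y1, x2, y2)
    return regions


# Example usage on a single frame grabbed from the in-vehicle camera (placeholder file).
frame = cv2.imread("driver_frame.jpg")
rois = detect_regions_of_interest(frame)
```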
[0084] Additionally, the acquired video may be processed to determine the respiratory rate of the driver 102. In one example, to determine the respiratory rate, the prediction platform 124 may be configured to identify a region of interest such as the nose of the driver 102 from the video acquired via the camera 106. Techniques such as segmentation may be used to identify the nose area of the driver 102. Subsequently, the nose area of the driver 102 may be tracked and the breathing rate or respiratory rate may be extracted. By way of example, the prediction platform 124 may employ infrared thermography to monitor the respiratory rate of the driver 102.
[0085] Also, as indicated by step 214, the non-contact sensor information may be processed by the prediction platform 124 to obtain the respiratory rate of the driver 102. As will be appreciated, the respiratory rate is a measure of the number of breaths per minute and represents the movement of air in and out of the lungs. Any change in the respiratory rate is typically an indication of deterioration of the body due to cardiac events, pneumonia, stress, fatigue, and the like. For example, an ultrasound non-contact sensor 108 embedded in the driver’s seat 110 may be used to obtain the sensor information related to the breathing of the driver 102. This data from the ultrasound sensor may be processed to determine the respiratory rate of the driver 102. Also, at step 214, the prediction platform 124 may be configured to similarly obtain heart rate information and HRV information using sensor information from the non-contact sensors 108.
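One illustrative way to derive a respiratory rate from a one-dimensional seat-sensor trace such as the ultrasound signal mentioned above is to band-pass the signal to the typical breathing band and count the resulting peaks, as sketched below under the assumption that raw samples and the sampling rate are available.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks


def respiratory_rate_from_seat_sensor(signal: np.ndarray, fs: float) -> float:
    """Estimate breaths per minute from a non-contact seat sensor trace."""
    signal = signal - np.mean(signal)

    # Keep only the typical breathing band (~0.1-0.7 Hz, i.e. 6-42 breaths/min).
    b, a = butter(2, [0.1, 0.7], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)

    # Each prominent peak corresponds to one breath; enforce a minimum spacing of 1.5 s.
    peaks, _ = find_peaks(filtered, distance=int(fs * 1.5))
    duration_min = len(signal) / fs / 60.0
    return len(peaks) / duration_min
```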
[0086] Furthermore, at step 216, the touch-based sensor data or information acquired from the touch-based sensors 112 may be processed to determine the heart rate variability and bioimpedance of the driver 102. As previously noted, the HRV is a measure of the variation in the time interval between consecutive heartbeats, and determining the HRV of the driver 102 aids in identifying any signs of current and/or future health problems of the driver 102. As previously noted, in some embodiments, the HRV information may be determined from the heart rate information of the driver 102, transdermal thermal imaging, and the like. Additionally, the touch-based sensor data may also be processed to determine the bioimpedance of the driver 102. As previously noted, bioimpedance is the response of a living organism to an externally applied electric current. The bioimpedance measurements corresponding to the driver 102 may be used for detection of edema, diagnosis of skin-related diseases, detection of cancerous tissues, monitoring of ischemia during the transplant process, disease prognosis, monitoring of vital body statistics, and the like. As previously noted, in some embodiments, the bioimpedance information may be determined via processing of the touch-based sensor information via use of a hand-to-hand model.
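Assuming beat-to-beat (RR) intervals have already been extracted from the touch-based sensor waveform, common time-domain HRV statistics such as SDNN and RMSSD follow directly, as in this illustrative sketch; the example interval values are placeholders for demonstration only.

```python
import numpy as np


def hrv_metrics(rr_intervals_ms: np.ndarray) -> dict:
    """Time-domain HRV statistics from successive RR intervals in milliseconds."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_hr_bpm": 60000.0 / rr.mean(),        # average heart rate
        "sdnn_ms": rr.std(ddof=1),                 # overall variability
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),  # short-term variability
    }


# Example: RR intervals from a short steering-wheel contact window (placeholder values).
print(hrv_metrics(np.array([812, 798, 830, 805, 790, 821])))
```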
[0087] Consequent to the processing of the set of measured non-invasive biological parameters 116 which includes the acquired video, the touch-based sensor information, and the non-contact sensor information of steps 204-216, non-invasive biological parameters corresponding to the driver 102 such as, but not limited to, respiratory rate, heart rate, eye color information, eye movement information, skin tone information, skin conductance information, hand movements, heart rate variability, bioimpedance, posture, and the like are determined. These non-invasive biological parameters may generally be referred to as computed or derived or determined non-invasive biological parameters. Reference numeral 218 is generally used to refer to a set of determined non-invasive biological parameters that is generated subsequent to the processing of steps 204-216.
[0088] Subsequently, the set of determined non-invasive biological parameters 218 may be processed via use of one or more models 126 to generate an outcome in the form of one or more wellness metrics that are generally indicative of the short-term wellbeing and long-term wellbeing of the driver 102. In particular, the set of determined non-invasive biological parameters 218 may be provided as input to a model 126 to cause the model 126 to generate the one or more wellness metrics corresponding to the driver 102. Accordingly, at step 220, an input with reference to a selected task may be received. The selected tasks may be obtained automatically or as input from the driver 102 via the user interface 132. Other means of obtaining input corresponding to the selected tasks are also contemplated. In one example, the input corresponding to the selected tasks may be received by the prediction platform 124. As previously noted, one example of a task to be performed may include generating a wellness metric representative of the overall wellness score of the driver 102.
[0089] Subsequently, at step 222, one or more models 126 may be retrieved based on the input received at step 220. In particular, at step 222, the model(s) 126 corresponding to the selected task(s) may be retrieved. In one example, the prediction platform 124 may retrieve the models 126 from the data repository 134 or from a remote repository, the cloud, and the like. As previously noted, the models 126 may be generated offline and stored in the data repository 134. Also, each model 126 may be configured to perform a single task or a combination of tasks.
[0090] Moreover, at step 224, the selected task(s) may be performed to predict one or more desired outcomes. In particular, the set of determined non-invasive biological parameters 218 may be provided as input to the retrieved model 126 to cause the model 126 to generate one or more wellness metrics. Specifically, the set of determined non-invasive biological parameters 218 may be processed by the corresponding model 126 to predict a desired outcome in the form of a wellness metric. It may be noted that at step 224, one or more predicted wellness metrics may be generated corresponding to the selected tasks by processing the set of determined non-invasive biological parameters 218 via a corresponding model 126. In the example depicted in FIG. 2, the predicted outcomes may include wellness metrics 226. As previously noted, the wellness metrics 226 may include short-term wellness indicators such as alertness and mood scores of the driver 102, long-term wellness indicators such as overall wellness scores, blood-test based wellness scores, and the like. It may be noted that the prediction of the wellness metrics 226 enables the capturing of the associated circadian rhythms and other long-term rhythms. Also, in one example, indicators representative of the wellness metrics 226 may be generated. The indicators may be visual indicators, audio indicators, charts, graphs, text, videos, and the like. In addition, the wellness metrics 226 may be provided to the driver 102 for example to facilitate further analysis, lifestyle recommendations, behavioral changes, and/or treatment planning, as indicated by step 228.
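Steps 220-224 can be pictured, purely as a sketch, as the small routine below: look up the model hosted for the selected task, assemble the determined non-invasive biological parameters into a feature vector, and run a forward pass to obtain the wellness metric. The feature ordering, task name, and the load_model helper (from the earlier hosting sketch) are assumptions for illustration, not requirements of the method.

```python
import torch

FEATURE_ORDER = [
    "heart_rate", "respiratory_rate", "hrv_rmssd", "bioimpedance",
    "skin_conductance", "eye_movement", "posture_score",
]  # assumed ordering used when the model was trained


def predict_wellness_metric(task: str, determined_params: dict) -> float:
    """Run the task-specific model on the determined non-invasive parameters."""
    model = load_model(task, n_inputs=len(FEATURE_ORDER))  # from the earlier hosting sketch
    features = torch.tensor([[determined_params[k] for k in FEATURE_ORDER]],
                            dtype=torch.float32)
    with torch.no_grad():
        return float(model(features).item())


# Example usage for a hypothetical "overall wellness score" task.
score = predict_wellness_metric("overall_wellness_score", {
    "heart_rate": 72.0, "respiratory_rate": 14.0, "hrv_rmssd": 38.0,
    "bioimpedance": 480.0, "skin_conductance": 2.1,
    "eye_movement": 0.4, "posture_score": 0.8,
})
```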
[0091] As previously noted, the prediction platform 124 is configured to maintain a model 126. In particular, the prediction platform 124 is configured to generate and host one or more models 126.
[0092] FIG. 3 is a schematic illustration 300 of one example of a method for generating an artificial intelligence model such as the model 126 of FIG. 1. The method 300 of FIG. 3 is described with reference to the components of FIGs. 1-2. In one embodiment, the method 300 may be performed by the prediction platform 124.
[0093] The method 300 for generating a model entails “training” a neural network 302 to “learn” a desired task. Accordingly, the neural network 302 is trained with appropriate inputs corresponding to the desired task. In certain embodiments, the prediction platform 124 may include the neural network 302 and may be configured to generate the models 126.
[0094] As depicted in FIG. 3, inputs are provided to the neural network 302. In one example, providing an input to the neural network 302 entails task selection. More particularly, the task selection may entail selecting one or more tasks that a model is designed to perform or predict. These desired/selected tasks 306 may be provided as input to the neural network 302. As noted hereinabove, some non-limiting examples of the selected tasks include predicting an overall wellness score of the driver, predicting driver alertness and mood, predicting trend lines of the measured non-invasive biological parameters of the driver, predicting stress, predicting invasive parameters such as inflammation markers, predicting fatigue, and the like.
[0095] Moreover, providing an input to the neural network 302 may entail data collection. In one non-limiting example, a plurality of sets of measured non-invasive biological parameters 116 corresponding to a plurality of users such as drivers and/or passengers is acquired and provided to the neural network 302 as input. It may be noted that each set of the plurality of sets of measured non-invasive biological parameters 116 corresponds to each user of the plurality of users. Furthermore, it may also be noted that the sample set of the plurality of users is selected to be sufficient to capture diversity of age, gender, weight, height, ethnicity, and the like.
[0096] Additionally, data collection may also entail collecting a plurality of sets of determined non-invasive biological parameters 218 corresponding to a plurality of users such as drivers and/or passengers and providing these sets to the neural network 302 as input. It may be noted that each set of the plurality of sets of determined non-invasive biological parameters 218 corresponds to each user of the plurality of users. Moreover, the sample set of the plurality of users is selected to be sufficient to capture diversity of age, gender, weight, height, ethnicity, and the like.
[0097] In certain embodiments, each set of measured and/or determined non-invasive biological parameters 116, 218 may include video, two-dimensional (2D) data such as images, time-series data such as ultrasound signals and optical signals, and bioimpedance values measured across tissue at different locations on the user’s body and at different frequencies. By way of a non-limiting example, the 2D data may include images of the user’s face, user’s body, images of the user’s tongue, images of the user’s nail(s), images of the user’s skin, images of the user’s eyes, images of the posture of the user, and the like. Some examples of the measured non-invasive biological parameters 116 include an age, gender, height, weight, ethnicity, captured video, non-contact sensor information, touch-based sensor information corresponding to the plurality of users. Similarly, some examples of the determined non-invasive biological parameters 218 include a heart rate, pulse rate, respiratory rate, heart rate variability, skin tone characterization, skin conductance information, eye characterization, eye movement information, eye color information, menstrual cycle data, hand movements, posture, bioimpedance values and other such non-invasive biological parameters corresponding to the plurality of users.
[0098] Additionally, in some embodiments, the data collection step may also entail collecting relevant data such as sensor data and camera images/videos corresponding to each task 306. This data may be pre-processed, and features may be extracted to prepare the data for analysis. Subsequently, the pre-processed data is analyzed using a combination of a neural network 302 and statistical analysis. It may be noted that the plurality of sets of non-invasive biological parameters and the task-specific data may be generally represented by reference numeral 304. The plurality of sets of non-invasive biological parameters 304 and the task-specific data may be provided as input to train the neural network 302.
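As a minimal sketch of the pre-processing and feature-extraction step mentioned above, the collected parameter sets can be stacked into a feature matrix and standardized so that differently scaled quantities (for example, heart rate versus bioimpedance) contribute comparably during training; the helper names below are illustrative assumptions.

```python
import numpy as np


def build_feature_matrix(param_sets: list, feature_order: list) -> np.ndarray:
    """Stack per-user parameter dictionaries into an (n_users, n_features) matrix."""
    return np.array([[p[k] for k in feature_order] for p in param_sets], dtype=float)


def standardize(X: np.ndarray):
    """Z-score each feature column; return the scaled data and the scaling statistics."""
    mean, std = X.mean(axis=0), X.std(axis=0) + 1e-8
    return (X - mean) / std, mean, std
```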
[0099] Moreover, during the training phase, one or more desired outputs 308 are also provided to the neural network 302. The desired outputs 308 may include a plurality of sets of predicted values of wellness metrics corresponding to the plurality of users. In one example, the desired output 308 may include a plurality of sets of overall wellness scores corresponding to a plurality of users. These wellness scores may include survey-based wellness scores corresponding to the plurality of users. Some non-limiting examples of standardized surveys that can be used to obtain the survey-based wellness scores include Wheel of Wellness (WoW), Life Assessment Questionnaire (LAQ), Wellness Inventory (WI), Life Coping Inventory (LCI), Wellness Evaluation of Lifestyle (WEL), Five-Factor WEL (5F-Wel) and Four-Factor WEL (4F-WEL), Predictive Wellness Survey (PWS), The Optimal Living Profile (OLP), Web-based Health Risk Assessment (HRA), The Body-Mind-Spirit Wellness Behavior and Characteristic Inventory (BMS-WBCI), The Satisfaction with Life Survey (SWLs) and the Wellness Behavior Survey (WBS), Ryff’s Psychological Wellbeing (PWB) Scales, and the like.
[0100] Additionally, in another example, the desired outputs 308 may include a plurality of sets of alertness and mood scores corresponding to a plurality of users. These alertness and mood scores may include survey-based alertness and mood scores corresponding to the plurality of users. In one example, an Alertness and Mood survey (AMS) may be used to obtain the alertness and mood scores corresponding to the plurality of users. Some non-limiting examples of items that may be considered in the survey for the computation of the alertness and mood scores include quality of sleep, the present day’s workload, a current state of feeling such as sleepy, happy, sick, energetic, physically exhausted, mentally fatigued, stressed, tired, depressed, bored, lonely, and the like.
[0101] Furthermore, the desired outputs 308 may also include a plurality of sets of blood test-based wellness scores corresponding to a plurality of users. Accordingly, a plurality of sets of invasive parameters corresponding to the plurality of users may be acquired and provided to the neural network 302. In one example, the invasive parameters may include C-reactive protein (CRP) values, cortisol values, serum protein electrophoresis (SPEP or SPE) values, and the like. Also, these invasive parameters may be acquired through invasive blood tests or body-fluid swabs.
[0102] In certain embodiments, the invasive parameters may be acquired via clinical trials/tests performed on the plurality of users. It may be noted that each set of the plurality of sets of invasive parameters corresponds to a respective user of the plurality of users. Although the method 300 entails use of a plurality of sets of non-invasive biological parameters 304, a plurality of sets of blood test-based wellness scores, a plurality of sets of survey-based wellness scores, and a plurality of sets of survey-based alertness and mood scores, for ease of illustration only one set of inputs 304 and one set of outputs 308 are depicted in FIG. 3.
[0103] Once the inputs such as the sets of non-invasive biological parameters and task-specific data 304, the selected task(s) 306, and the desired outputs 308 such as the sets of invasive parameters, the sets of survey-based wellness scores, the sets of survey-based alertness and mood scores, and the sets of blood test-based wellness scores are provided to the neural network 302, the neural network 302 may be trained to perform a selected task. In particular, the neural network 302 may be trained to provide an output in the form of a wellness metric such as an overall wellness score, an alertness and mood score, a blood test-based wellness score that may include predicted CRP, SPE, and cortisol values, and the like. Additionally or alternatively, the neural network 302 may be trained to predict one or more wellness metrics. Moreover, the neural network 302 may also be trained to perform one or more tasks 306.
[0104] It may be noted that during the training or learning phase of the neural network 302, one or more model parameters in the form of weights of the neural network 302 for predicting desired outcomes may be optimized. In particular, the model parameters may be optimized such that loss between the predicted outcomes and the desired outputs 308 is minimized to ensure that the predicted outcomes closely match with the values of desired outputs 308.
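The weight optimization described above, in which a loss between the predicted outcomes and the desired outputs 308 is minimized, might look like the following PyTorch sketch for a regression-style wellness score; the network size, optimizer settings, and number of epochs are illustrative assumptions rather than parameters of the disclosed method.

```python
import torch
import torch.nn as nn


def train_wellness_model(X: torch.Tensor, y: torch.Tensor, epochs: int = 200) -> nn.Module:
    """Fit a small fully connected network mapping non-invasive parameters to a wellness score.

    X: (n_users, n_features) standardized parameter sets.
    y: (n_users, 1) desired outputs, e.g. survey- or blood-test-based scores.
    """
    model = nn.Sequential(
        nn.Linear(X.shape[1], 32), nn.ReLU(),
        nn.Linear(32, 16), nn.ReLU(),
        nn.Linear(16, 1),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)  # gap between predicted outcomes and desired outputs
        loss.backward()              # back-propagate the loss
        optimizer.step()             # update the model weights
    return model
```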
[0105] Consequent to the training phase, a task-specific model 310 is generated. The task-specific model 310 may be configured to perform one or more selected tasks 306. By way of example, if the selected task 306 is prediction of an alertness and mood score of the driver 102, then the corresponding task-specific model 310 is configured to facilitate prediction of an alertness and mood score of the driver 102 directly based on the set of measured non-invasive biological parameters 116 and/or the set of determined non-invasive biological parameters 218 corresponding to a given user. It may be noted that the model 310 may be configured to perform a single task or a plurality of tasks. Also, the models 310 may be generated offline. Moreover, in one example, these models 310 may be stored in the data repository 134. In other embodiments, the models 310 may be transmitted for storage in a remote facility.
[0106] Furthermore, in certain embodiments, statistical analysis may be used to validate the accuracy of the model 310 and identify areas for improvement. Also, the model 310 may be validated by comparing its predictions with invasive measurements obtained from a cohort. Once the models 310 are validated, the models 310 may be deployed in the in-vehicle wellness prediction system 118 in the vehicle 104.
[0107] It may be noted that in accordance with aspects of the present specification, once the neural network 302 is trained to generate a model 310, the neural network 302 will not require any invasive measurements. The neural network 302 will be configured to predict the blood test-based wellness scores based on the set of measured non-invasive biological parameters 116 and/or the set of determined non-invasive biological parameters 218 of the driver 102 measured and/or determined during the drive. The predictions may be used to provide trend lines for the wellness metrics, thereby communicating real-time feedback to the drivers 102 on their wellbeing.
[0108] For example, the non-contact sensors 108 and the touch-based sensors 112 measure biological parameters such as heart rate, respiratory rate, heart rate variability, and bioimpedance, while the camera 106 captures images of the face, eyes, posture, and hand movements. This data is fed to a task-specific model 310. This model 310, when deployed, may be configured to predict wellness metrics such as chronic inflammation, for example. In particular, the model 310 is trained to predict the wellness metric based on the set of measured non-invasive biological parameters 116 and/or the set of determined non-invasive biological parameters 218 and the task-specific data. The output in the form of a wellness metric may be displayed on the vehicle's dashboard display 130, providing real-time feedback to the driver 102. In one example, this display 130 may visualize trend lines for chronic inflammation or other wellness metrics and may also be color-coded to indicate severity.
[0109] FIG. 4 is a schematic representation 400 of one example arrangement of non-contact sensors 404 in or on a car seat 402 in a vehicle, in accordance with aspects of the present specification. It may be noted that the example depicted in FIG. 4 may be representative of one arrangement of the non-contact sensors 108 disposed on or in the car seat 110 in the vehicle 104 of FIG. 1. The one or more non-contact sensors 404 may be arranged on or embedded within the car seat 402. Other arrangements are also envisaged. These non-contact sensors 404 facilitate the acquisition of non-contact sensor data from the driver 102.
[0110] Referring now to FIG. 5, a schematic representation 500 of one example arrangement of touch-based sensors 504 in or on a steering wheel 502 in a vehicle, in accordance with aspects of the present specification, is presented. It may be noted that the example depicted in FIG. 5 may be representative of one arrangement of the touch-based sensors 112 disposed on or in the steering wheel 114 in the vehicle 104 of FIG. 1. The one or more touch-based sensors 504 may be arranged on or embedded within the steering wheel 502. In the embodiment depicted in FIG. 5, the touch-based sensors 504 are positioned at the “10 o’clock” and “2 o’clock” positions on the steering wheel 502. Other arrangements are also envisaged. These touch-based sensors 504 facilitate the acquisition of touch-based sensor data from the driver 102.
[0111] FIG. 6 is a schematic illustration 600 of the method 200 for in-vehicle wellness monitoring of a driver of FIG. 2. The method 600 is described with reference to the components of FIGs. 1-5. The in-vehicle wellness prediction system 118 and the prediction platform 124 in particular is configured to receive as input a set of measured non-invasive biological parameters 602 and one or more selected tasks 604. Additionally, the prediction platform 124 is also configured to compute one or more determined non-invasive biological parameters 218 based on the set of measured non-invasive biological parameters 602. As previously noted, the set of measured non-invasive biological parameters 602 may include an age, gender, height, weight, an ethnicity of the driver 102, captured video of the driver 102, non-contact sensor information corresponding to the driver 102, touch-based sensor information corresponding to the driver 102, or combinations thereof. Also, the set of determined non-invasive biological parameters 218 may include respiratory rate, heart rate, eye color information, eye movement information, skin tone information, skin conductance information, eye characterization, hand movements, heart rate variability, bioimpedance, posture, and the like of the driver 102. Additionally, based on the selected task 604, the prediction platform 124 is configured to retrieve a corresponding task-specific model 606 from the data repository 134.
[0112] Subsequently, the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218 are provided as input to the model 606. Further, the prediction platform 124 is configured to process the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218 via use of the model 606. Consequent to processing of the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218 using the model 606, one or more wellness metrics 608 are generated. In the example of FIG. 6, if the selected task 604 entails prediction of the alertness and mood score of the driver 102, the prediction platform 124 in conjunction with the model 606 is configured to process the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218 to generate a wellness metric 608 representative of a predicted alertness and mood score value that is determined based only on the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218.
[0113] Furthermore, in certain embodiments, one or more indicators that are representative of the wellness metrics 608 may be generated by the prediction platform 124. The indicator may be an audio indicator, a video indicator, a chart, a text box that includes written text and/or a numeric value indicative of the wellness metric, and the like. In another example, the indicator may be a graphical representation of the wellness metric 608. By way of example, the graphical representation of the wellness metric 608 may depict a trend of a corresponding wellness metric 608.
[0114] In yet another example, the indicator may be an icon or an emoticon that is representative of the wellness metric 608. For example, a shape and/or color of the icon/emoticon may be used to represent a “state” or “quality” of the wellness metric 608. In particular, a green color emoticon may be representative of a “no disease state” or a “low-grade disease state,” while a red color emoticon may be representative of an “acute disease state.” Some other examples of the indicator include quality metrics in the form of text, numerical values, quality bars or other shapes, where the quality metrics are generally representative of a “state” or “quality” of the wellness metric 608. The quality bars may have a horizontal orientation or a vertical orientation. Also, these bars may be color quality bars, where one or more colors may be used in the quality bars to represent the “quality” or “state” of the wellness metrics 608. By way of example, a green color bar may represent a “healthy state,” while a red color bar may represent an “acute disease state.” In yet another embodiment, one or more of these indicators may be convolved to generate a composite indicator or quality metric.
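A simple way to turn a numeric wellness metric into the color-coded indicator described above is a thresholded mapping such as the sketch below; the thresholds, labels, and emoticon strings are placeholders chosen for illustration and are not values defined in the specification.

```python
def wellness_indicator(metric_value: float) -> dict:
    """Map a normalized wellness metric (0 = worst, 1 = best) to display hints."""
    if metric_value >= 0.7:        # assumed threshold for a healthy state
        return {"state": "healthy", "color": "green", "emoticon": ":)"}
    if metric_value >= 0.4:        # assumed threshold for a low-grade state
        return {"state": "low-grade concern", "color": "amber", "emoticon": ":|"}
    return {"state": "acute concern", "color": "red", "emoticon": ":("}


# Example: drive the dashboard indicator from a predicted overall wellness score.
print(wellness_indicator(0.82))   # {'state': 'healthy', 'color': 'green', 'emoticon': ':)'}
```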
[0115] The wellness metrics 608 so generated are generally representative of the short-term and/or long-term wellbeing of the driver 102. These wellness metrics 608 may be communicated as feedback to the driver 102 in real-time to promote health and wellness. For example, the wellness metrics 608 may be communicated to the driver 102, a passenger, a clinician, or a device for further analysis, lifestyle recommendations, behavioral changes, treatment planning, triaging, and/or storage. In certain embodiments, the wellness metrics 608 may be communicated for visualization on a display such as the display 130 of the infotainment system 128 in the vehicle 104 or on a mobile device, while the driver 102 is driving the vehicle 104. Reference numeral 610 is generally representative of an indicator of a wellness metric 608. Moreover, in some embodiments, the indicator(s) 610 may be positioned at convenient locations on the display unit 130 and/or the mobile device such as a cellular phone. The wellness metrics 608 may be communicated via the cloud, wired means, wireless means, and the like.
[0116] Turning now to FIG. 7, a diagrammatical representation 700 of one example of the performance of the system for in-vehicle wellness monitoring 100 of FIG. 1 and the method for in-vehicle wellness monitoring 200 of FIG. 2 is presented. In particular, the example of FIG. 7 is a diagrammatical illustration 700 of providing a wellness metric to the driver 102 in real-time to facilitate analysis and/or lifestyle recommendations. Also, FIG. 7 is described with reference to the components of FIGs. 1-6.
[0117] In the example depicted in FIG. 7, a schematic illustration 700 of a visualization of the wellness metrics 608 is presented. As previously noted, the in-vehicle wellness prediction system 118 and the prediction platform 124 in particular is configured to generate and communicate the wellness metrics 608 in real-time to the driver 102 via visualization on the display 130 of the infotainment system 128 in the vehicle 104 and/or for storage to facilitate further analysis/lifestyle recommendations. It may be noted that in the example embodiment of FIG. 7, the wellness metrics 608 are visualized on a display 702 such as the display 130 of the infotainment system 128 of the vehicle 104. It may be noted that the name of the wellness metrics 608, a “quality” or “state” of the wellness metrics 608, a severity of the wellness metrics 608, trend lines associated with the wellness metrics 608, and the like may be visualized in real-time on the display 130 in the vehicle 104. In one example, the wellness metrics 608 may be color-coded to indicate severity. In other embodiments, the wellness metrics 608 may be visualized on a display of a handheld device such as a cell phone. However, other means of visualization are also anticipated.
[0118] It may be noted that in FIG. 7 an example of the wellness metric 608 in the form of an “overall wellness score” is presented. Additionally, a “quality” or “state” of the overall wellness score is indicated as “high.” These outputs are represented in the form of a text box 704. Furthermore, predicted trends of the one or more of the set of measured non-invasive biological parameters 602 and/or the set of determined non-invasive biological parameters 218 are presented as a graphical representation 706. Moreover, the “overall wellness score-high” state may also be represented in the form of an emoticon 708. In this example, the emoticon 708 may be color-coded to have a green color to indicate the “overall wellness score-high” state. Also, an audio indicator 710 of the “overall wellness score-high” state may be presented. Reference numeral 712 is used to refer to other controls on the infotainment system 128 in the vehicle 104.
[0119] The visual representations of the wellness metrics 608 provided to the driver 102 in real-time as depicted in FIG. 7 present a convenient snapshot of the “wellness state” to the driver 102, a passenger, or a clinician, thereby promoting driver health and wellness and enhancing a clinical workflow. It may be noted that the example of the visual representation presented in FIG. 7 is for illustrative purposes. Other designs are also anticipated.
[0120] Referring now to FIG. 8, a schematic representation 800 of one embodiment 802 of a digital processing system implementing the prediction platform 124 (see FIG. 1), in accordance with aspects of the present specification, is depicted. Also, FIG. 8 is described with reference to the components of FIGs. 1-7.
[0121] It may be noted that while the prediction platform 124 is shown as being a part of the in-vehicle wellness prediction system 118, in certain embodiments, the prediction platform 124 may also be integrated into end user systems such as, but not limited to, the infotainment system 128 in the vehicle 104 (see FIG. 1). Moreover, the example of the digital processing system 802 presented in FIG. 8 is for illustrative purposes. Other designs are also anticipated.
[0122] The digital processing system 802 may contain one or more processors such as a central processing unit (CPU) 804, a random access memory (RAM) 806, a secondary memory 808, a graphics controller 810, a display unit 812, a network interface 814, and an input interface 816. It may be noted that the components of the digital processing system 802 except the display unit 812 may communicate with each other over a communication path 818. In certain embodiments, the communication path 818 may include several buses, as is well known in the relevant arts.
[0123] The CPU 804 may execute instructions stored in the RAM 806 to provide several features of the present specification. Moreover, the CPU 804 may include multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, the CPU 804 may include only a single general-purpose processing unit.
[0124] Furthermore, the RAM 806 may receive instructions from the secondary memory 808 using the communication path 818. Also, in the embodiment of FIG. 8, the RAM 806 is shown as including software instructions constituting a shared operating environment 820 and/or other user programs 822 (such as other applications, DBMS, and the like). In addition to the shared operating environment 820, the RAM 806 may also include other software programs such as device drivers, virtual machines, and the like, which provide a (common) run time environment for execution of other/user programs. Moreover, in certain embodiments, the RAM 806 may also include a model 824. The model 824 may be the task-specific model 126 (see FIG. 1).
[0125] With continuing reference to FIG. 8, the graphics controller 810 is configured to generate display signals (e.g., in RGB format) for display on the display unit 812 based on data/instructions received from the CPU 804. The display unit 812 may include a display screen to display images defined by the display signals. Furthermore, the input interface 816 may correspond to a keyboard and a pointing device (e.g., a touchpad, a mouse, and the like) and may be used to provide inputs. In addition, the network interface 814 may be configured to provide connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems connected to a network, for example.
[0126] Moreover, the secondary memory 808 may include a hard drive 826, a flash memory 828, and a removable storage drive 830. The secondary memory 808 may store data generated by the system 100 (see FIG. 1) and software instructions (for example, for implementing the various features of the present specification), which enable the digital processing system 802 to provide several features in accordance with the present specification. The code/instructions stored in the secondary memory 808 may either be copied to the RAM 806 prior to execution by the CPU 804 for higher execution speeds or may be directly executed by the CPU 804.
[0127] Some or all of the data and/or instructions may be provided on a removable storage unit 832, and the data and/or instructions may be read and provided by the removable storage drive 830 to the CPU 804. Further, the removable storage unit 832 may be implemented using medium and storage format compatible with the removable storage drive 830 such that the removable storage drive 830 can read the data and/or instructions. Thus, the removable storage unit 832 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can also be in other forms (e.g., non-removable, random access, and the like.).
[0128] It may be noted that as used herein, the term “computer program product” is used to generally refer to the removable storage unit 832 or a hard disk installed in the hard drive 826. These computer program products are means for providing software to the digital processing system 802. The CPU 804 may retrieve the software instructions and execute the instructions to provide various features of the present specification.
[0129] Also, the term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may include non-volatile media and/or volatile media. Non-volatile media include, for example, optical disks, magnetic disks, or solid-state drives, such as the secondary memory 808. Volatile media include dynamic memory, such as the RAM 806. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
[0130] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, the transmission media may include coaxial cables, copper wire, and fiber optics, including the wires that form the communication path 818. Moreover, the transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0131] Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present specification. Thus, appearances of the phrases “in one embodiment,” “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0132] Furthermore, the described features, structures, or characteristics of the specification may be combined in any suitable manner in one or more embodiments. In the description presented hereinabove, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, and the like, to provide a thorough understanding of embodiments of the specification.
[0133] The aforementioned components may be dedicated hardware elements such as circuit boards with digital signal processors or may be software running on a general-purpose computer or processor such as a commercial, off-the-shelf personal computer (PC). The various components may be combined or separated according to various embodiments of the invention.
[0134] Furthermore, the foregoing examples, demonstrations, and process steps such as those that may be performed by the system may be implemented by suitable code on a processor-based system, such as a general-purpose or special-purpose computer. It should also be noted that different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently, that is, in parallel. Furthermore, the functions may be implemented in a variety of programming languages, including but not limited to C++, Python, and Java. Such code may be stored or adapted for storage on one or more tangible, machine readable media, such as on data repository chips, local or remote hard disks, optical disks (that is, CDs or DVDs), memory or other media, which may be accessed by a processor-based system to execute the stored code. Note that the tangible media may include paper or another suitable medium upon which the instructions are printed. For instance, the instructions may be electronically captured via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in the data repository or memory.
[0137] Embodiments of the systems and methods for in-vehicle wellness monitoring described hereinabove advantageously present a robust framework for non-invasively monitoring the short-term and long-term wellness of the driver 102 in real-time. Specifically, the in-vehicle wellness prediction system 118 is designed to improve the wellbeing and safety of the drivers 102 while driving by predicting clinically relevant wellness metrics using only non-invasive biological parameters associated with the driver 102 and providing feedback to the driver 102 in real-time, thereby allowing the drivers 102 to make lifestyle and/or behavioral changes to improve their overall health and wellbeing. By way of example, the systems and methods allow non-invasive monitoring of chronic inflammation of the driver 102 in real-time, which in turn can have significant long-term benefits, including reducing the risk of chronic diseases and improving quality of life.
[0138] Moreover, use of the systems and methods described herein allow the continuous acquisition and/or computation of the non-invasive biological parameters and hence facilitate the continuous prediction of the wellness metrics.
[0139] The system for in-vehicle wellness monitoring is designed to be user-friendly, easy to use, and non-intrusive, thereby allowing the drivers to focus on driving while still receiving real-time feedback on their wellbeing. Further, the system is designed to be customizable, thereby allowing the drivers to select tasks and tailor the output to their preferences. Moreover, the system is designed to operate seamlessly with the vehicle’s data systems, thereby providing an integrated experience to the driver. Additionally, the multimodal nature of the system and method for in-vehicle wellness monitoring ensures accurate measurement of driver wellness under diverse lighting and driving conditions.
[0140] The wellness metrics are appropriately communicated to the driver 102 and/or passengers, thereby allowing the drivers 102 to make lifestyle and behavioral changes to improve their overall health and wellbeing. This can have significant long-term benefits, including reducing the risk of chronic diseases and improving quality of life. Moreover, the system may also be configured to appropriately facilitate partnering the driver 102 and/or passengers with fitness centers, clinical centers, hospitals, nutraceuticals, and the like, to enhance the wellness of the driver 102 and/or passengers.
[0141] Furthermore, the systems and methods for in-vehicle wellness monitoring entail use of machine learning/artificial intelligence to directly map the measured non-invasive biological parameters and/or the determined non-invasive biological parameters to predicted wellness metrics. Additionally, the input set of non-invasive biological parameters may be advantageously expanded or modified based on the state of the art, thereby facilitating further enhancement of the reliability of the wellness metrics. Also, intelligence inferred from the survey-based data and clinical data to generate the task-specific models provides a robust framework for use in predicting the wellness metrics.
[0142] Although specific features of embodiments of the present specification may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments.
[0143] While only certain features of the present specification have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the present specification is intended to cover all such modifications and changes as fall within the true spirit of the invention.
CLAIMS:
1. A system (100) for real-time in-vehicle wellness monitoring of a user (102) of a vehicle (104), the system (100) comprising:
a non-invasive biological parameter acquisition system disposed in the vehicle (104) and configured to acquire a set of measured non-invasive biological parameters (116, 602) corresponding to the user (102), wherein the non-invasive biological parameter acquisition system comprises:
one or more cameras (106) positioned in the vehicle (104), wherein the one or more cameras (106) are configured to capture video of the user (102);
one or more non-contact sensors (108, 404) positioned on a seat (110, 402) in the vehicle (104), in the seat (110, 402) in the vehicle (104), or a combination thereof, wherein the one or more non-contact sensors (108, 404) are configured to obtain non-contact sensor information corresponding to the user (102);
one or more touch-based sensors (112, 504) positioned on a steering wheel (114, 502) in the vehicle (104), in the steering wheel (114, 502) in the vehicle (104), or a combination thereof, wherein the one or more touch-based sensors (112, 504) are configured to obtain touch-based sensor information corresponding to the user (102),
wherein the set of measured non-invasive biological parameters (116, 602) comprises an age, a gender, height, weight, an ethnicity, the captured video, the non-contact sensor information, the touch-based sensor information of the user (102), or combinations thereof;
an in-vehicle wellness prediction system (118) communicatively coupled to an infotainment system (128) in the vehicle (104), integrated with an infotainment system (128) in the vehicle (104), or a combination thereof, wherein the in-vehicle wellness prediction system (118) comprises:
an acquisition subsystem (120) configured to obtain the set of measured non-invasive biological parameters (116, 602) corresponding to the user (102);
a processing subsystem (122) in operative association with the acquisition subsystem (120) and comprising a prediction platform (124) configured to:
process the set of measured non-invasive biological parameters (116, 602) to generate a set of determined non-invasive biological parameters (218) corresponding to the user (102);
receive an input corresponding to a selected task (604);
retrieve a model (126, 310, 606) based on the input;
predict an outcome based on the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof and the model (126, 310, 606), wherein the outcome corresponds to a wellness metric (226, 608), including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and wherein the wellness metric (226, 608) is representative of a quantified biological parameter of the user (102); and
an interface unit (130, 132) integrated with the infotainment system (128) in the vehicle (104) and configured to provide, in real-time, the outcome to the user (102),
wherein the in-vehicle wellness prediction system (118) is configured to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user (102) of the vehicle (104) based on the set of measured non-invasive biological parameters (116, 602) and the set of determined non-invasive biological parameters (218) and provide feedback to the user (102) in real-time to promote health and wellbeing of the user (102).
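By way of non-limiting illustration of the flow recited in claim 1, the following Python sketch chains the acquisition, processing, model retrieval, and prediction steps. It is an assumption-laden sketch only: the names (MeasuredParameters, derive_parameters, load_model, predict_outcome) are hypothetical placeholders, and the returned values are dummy data rather than outputs of any real sensor pipeline or trained model.

```python
# Illustrative sketch only: hypothetical names, dummy values, no real sensor I/O.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class MeasuredParameters:
    """Measured non-invasive biological parameters (cf. 116, 602)."""
    age: int
    gender: str
    video_frames: List            # frames captured by the in-cabin camera(s)
    seat_sensor_signal: List      # non-contact sensor samples from the seat
    steering_sensor_signal: List  # touch-based sensor samples from the wheel


def derive_parameters(measured: MeasuredParameters) -> Dict[str, float]:
    """Derive determined parameters (cf. 218) such as heart rate or bioimpedance.
    Placeholder values stand in for the real signal-processing stages."""
    return {"heart_rate_bpm": 72.0, "respiratory_rate_bpm": 14.0, "bioimpedance_ohm": 480.0}


def load_model(task: str):
    """Retrieve a model tuned for the selected task (e.g. 'chronic_inflammation')."""
    # In practice this would load trained weights from storage on the edge device.
    def model(features: Dict[str, float]) -> float:
        return 0.1 * features["heart_rate_bpm"] / features["respiratory_rate_bpm"]
    return model


def predict_outcome(measured: MeasuredParameters, task: str) -> float:
    determined = derive_parameters(measured)   # process measured parameters
    model = load_model(task)                   # retrieve task-specific model
    return model(determined)                   # predict wellness metric


if __name__ == "__main__":
    sample = MeasuredParameters(45, "F", [], [], [])
    print("Predicted wellness metric:", predict_outcome(sample, "chronic_inflammation"))
```

In a deployment consistent with claim 1, derive_parameters would be replaced by the video, seat-sensor, and steering-wheel signal-processing stages, and load_model would return a task-specific artificial intelligence model.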
2. The system (100) of claim 1, wherein the non-invasive biological parameter acquisition system is further configured to obtain other information (115) corresponding to the user (102), and wherein the other information (115) comprises an age, a gender, height, weight, an ethnicity, or combinations thereof of the user (102).
3. The system (100) of claim 1, wherein the prediction platform (124) is configured to process the captured video to facilitate non-invasive determination of heart rate, heart rate variability, respiratory rate, skin tone information, skin conductance information, eye color information, eye movement information, hand movements, posture of the user (102), or combinations thereof.
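Claim 3 recites, among other parameters, non-invasive determination of heart rate from the captured video. One widely used technique for this, offered purely as an illustrative assumption rather than as the claimed implementation, is remote photoplethysmography: the mean skin-pixel intensity of a facial region is band-pass filtered and its dominant frequency is read out as the pulse rate. The sketch below assumes a pre-extracted per-frame green-channel mean and a known frame rate, both hypothetical inputs.

```python
# Illustrative rPPG-style heart-rate estimate from a per-frame green-channel mean.
# Assumes the face region has already been detected and averaged per frame.
import numpy as np
from scipy.signal import butter, filtfilt


def heart_rate_from_video_signal(green_means: np.ndarray, fps: float) -> float:
    """Return an estimated heart rate in beats per minute."""
    signal = green_means - np.mean(green_means)            # remove DC component
    # Band-pass 0.7-4.0 Hz (~42-240 bpm), the plausible pulse band.
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    peak_hz = freqs[np.argmax(spectrum)]                   # dominant frequency
    return float(peak_hz * 60.0)


# Synthetic example: a 1.2 Hz (72 bpm) pulse sampled at 30 fps for 10 s.
t = np.arange(0, 10, 1 / 30)
demo = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)
print(round(heart_rate_from_video_signal(demo, fps=30.0)))  # ~72
```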
4. The system (100) of claim 3, wherein the prediction platform (124) is configured to process the non-contact sensor information to facilitate non-invasive measurement of respiratory rate, heart rate, heart rate variability of the user (102), or combinations thereof.
5. The system (100) of claim 4, wherein the prediction platform (124) is configured to process the touch-based sensor information to facilitate non-invasive measurement of heart rate, heart rate variability, bioimpedance of the user (102), or combinations thereof.
6. The system (100) of claim 5, wherein the set of determined non-invasive parameters (218) corresponding to the user (102) comprises bioimpedance values, a heart rate, a respiratory rate, heart rate variability, skin tone information, skin conductance information, a face characterization, eye color information, eye movement information, hand movements, posture, eye characterization, or combinations thereof, wherein the one or more invasive parameters comprise a C-reactive protein value, a salivary cortisol value, a serum protein electrophoresis value, or combinations thereof, and wherein the wellness metric (226, 608) comprises an acute inflammation, a chronic inflammation, stress, or combinations thereof.
7. The system (100) of claim 6, wherein to predict the outcome the prediction platform (124) is configured to provide the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof as input to the model (126, 310, 606) to cause the model (126, 310, 606) to process the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof to generate the outcome.
8. The system (100) of claim 1, wherein the prediction platform (124) is further configured to:
generate one or more indicators (610) representative of the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof, and wherein the one or more indicators (610) provide metrics corresponding to a state of the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof; and
provide, in real-time, the one or more indicators (610), corresponding to the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof to facilitate further analysis or lifestyle recommendations to promote health and wellbeing of the user (102).
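Claim 8 recites generating indicators representative of the state of the wellness metric or of the predicted invasive-parameter values. As a minimal sketch, assuming the predicted value behaves like a high-sensitivity C-reactive-protein-like concentration in mg/L, the function below maps it to a three-level indicator; the thresholds and the traffic-light scheme are illustrative choices, not part of the claims.

```python
# Illustrative mapping of a predicted wellness metric to a simple indicator.
# The input is assumed to be a predicted CRP-like value in mg/L; thresholds are
# hypothetical and shown only to make the indicator concept concrete.
def inflammation_indicator(predicted_crp_mg_per_l: float) -> str:
    if predicted_crp_mg_per_l < 1.0:
        return "green: low inflammation risk"
    if predicted_crp_mg_per_l < 3.0:
        return "amber: moderate risk; lifestyle review suggested"
    return "red: elevated risk; further analysis recommended"


print(inflammation_indicator(2.4))  # amber indicator shown on the infotainment display
```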
9. The system (100) of claim 1, wherein the system (100) is configured to continuously acquire, via the non-invasive biological parameter acquisition system, the set of measured non-invasive biological parameters (116, 602).
10. The system (100) of claim 1, wherein the prediction platform (124) is configured to maintain a model (126, 310, 606), and wherein to maintain the model (126, 310, 606) the prediction platform (124) is configured to generate one or more models (126, 310, 606), wherein the one or more models (126, 310, 606) are tuned for performing one or more tasks, and wherein the one or more models (126, 310, 606) are artificial intelligence models.
11. The system (100) of claim 10, wherein to generate the one or more models (126, 310, 606) the prediction platform (124) is configured to:
obtain a plurality of sets of measured non-invasive parameters (116, 602) corresponding to a plurality of users, a plurality of sets of determined non-invasive parameters (218) corresponding to the plurality of users, or a combination thereof;
obtain a plurality of sets of invasive parameters (308) corresponding to the plurality of users, a plurality of blood test-based wellness scores (308) corresponding to the plurality of users, a plurality of survey-based wellness scores (308) corresponding to the plurality of users, a plurality of survey-based alertness and mood scores (308) corresponding to the plurality of users, or combinations thereof;
receive an input corresponding to one or more tasks (306) to be performed;
receive an input (304) corresponding to the plurality of sets of measured non-invasive parameters (116, 602), the plurality of sets of determined non-invasive parameters (218), or a combination thereof;
receive an input corresponding to one or more desired outcomes, wherein the one or more desired outcomes correspond to the one or more survey-based wellness scores (308), the one or more survey-based alertness and mood scores (308), one or more blood test-based wellness scores (308), the predicted values of one or more invasive parameters (308), or combinations thereof;
optimize model parameters of a neural network (302) based on the information corresponding to the inputs associated with the one or more tasks (306), the plurality of sets of measured non-invasive parameters (116, 602), the plurality of sets of determined non-invasive parameters (218), the one or more desired outcomes (308), or combinations thereof; and
train the neural network (302) to perform the one or more tasks (306) to generate the one or more models (126, 310, 606).
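Claims 10 and 11 recite generating task-tuned artificial intelligence models by optimizing the parameters of a neural network against desired outcomes such as blood test-based wellness scores. The following is a minimal training sketch, assuming PyTorch, random stand-in tensors in place of the collected parameter sets and scores, and a small fully connected network; none of these choices is dictated by the claims.

```python
# Minimal training sketch for a task-specific wellness model (hypothetical data shapes).
import torch
from torch import nn

# Each row: non-invasive features for one user (e.g. heart rate, HRV, respiratory rate,
# bioimpedance, age); each target: a blood-test-based wellness score for the same user.
features = torch.randn(256, 5)                   # stand-in for the collected training set
targets = torch.randn(256, 1)                    # stand-in for the desired outcomes (308)

model = nn.Sequential(nn.Linear(5, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(100):                             # optimize model parameters (claim 11)
    optimizer.zero_grad()
    loss = loss_fn(model(features), targets)
    loss.backward()
    optimizer.step()

torch.save(model.state_dict(), "wellness_model.pt")  # model retrieved later at prediction time
```

The saved state dictionary corresponds, in this sketch, to the model (126, 310, 606) that the prediction platform later retrieves for the selected task.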
12. The system (100) of claim 1, wherein the system (100) is configured to perform automated in-vehicle wellness monitoring of the user (102), in real-time, on an edge device, and wherein the edge device is in the vehicle (104), integrated in the vehicle (104), or a combination thereof.
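Claim 12 recites performing the monitoring in real time on an edge device in or integrated with the vehicle. One possible way to prepare a trained model for such on-device use, assumed here for illustration only, is to trace it to TorchScript so that inference does not depend on the Python training code; the architecture and file names continue the hypothetical training sketch above.

```python
# One possible (assumed) way to package the trained model for in-vehicle edge inference:
# trace it to TorchScript so it can be loaded without the Python training code.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(5, 16), nn.ReLU(), nn.Linear(16, 1))
model.load_state_dict(torch.load("wellness_model.pt"))  # weights from the training sketch above
model.eval()

example = torch.randn(1, 5)                 # one feature vector from the in-cabin sensors
traced = torch.jit.trace(model, example)
traced.save("wellness_model_edge.pt")       # deployable artifact for the edge runtime
```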
13. A method (200) for real-time in-vehicle wellness monitoring of a user (102) of a vehicle (104), the method (200) comprising:
(a) receiving (204, 206, 208, 210) a set of measured non-invasive biological parameters (116, 602) corresponding to the user (102);
(b) processing (212, 214, 216) the set of measured non-invasive biological parameters (116, 602) to generate a set of determined non-invasive biological parameters (218) corresponding to the user (102);
(c) receiving (220) an input corresponding to a selected task (604);
(d) retrieving (222) at least one model (126, 310, 606) based on the input corresponding to the selected task (604);
(e) predicting (224) an outcome based on the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof and the model (126, 310, 606), wherein the outcome corresponds to a wellness metric (226, 608), including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and wherein the wellness metric (226, 608) is representative of a quantified biological parameter of the user (102); and
(f) providing (228), in real-time, the outcome to the user (102),
wherein the method (200) is configured to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user (102) of the vehicle (104) based on the set of measured non-invasive biological parameters (116, 602) and the set of determined non-invasive biological parameters (218) and provide feedback to the user (102) in real-time to promote health and wellbeing of the user (102).
14. The method (200) of claim 13, wherein steps (a)-(f) are performed in real-time on an edge device to automatically provide in-vehicle wellness monitoring of the user (102), and wherein the edge device is in the vehicle (104), integrated in the vehicle (104), or a combination thereof.
15. The method (200) of claim 13, wherein receiving (204, 206, 208, 210) the set of measured non-invasive biological parameters (116, 602) corresponding to the user (102) comprises:
capturing video of the user (102) using one or more cameras (106) positioned in the vehicle (104);
obtaining non-contact sensor information corresponding to the user (102) using one or more non-contact sensors (108, 404) positioned on a seat (110, 402) in the vehicle (104), in the seat (110, 402) in the vehicle (104), or a combination thereof;
obtaining touch-based sensor information corresponding to the user (102) using one or more touch-based sensors (112, 504) positioned on a steering wheel (114, 502) in the vehicle (104), in the steering wheel (114, 502) in the vehicle (104), or a combination thereof; and
receiving other information (115) corresponding to an age, a gender, height, weight, an ethnicity, or combinations thereof of the user (102),
wherein the set of measured non-invasive biological parameters (116, 602) comprises the age, the gender, height, weight, the ethnicity, the captured video, the non-contact sensor information, the touch-based sensor information of the user (102), or combinations thereof.
16. The method (200) of claim 13, wherein processing (212, 214, 216) the set of measured non-invasive biological parameters (116, 602) to generate the set of determined non-invasive biological parameters (218) corresponding to the user (102) comprises:
processing the captured video to facilitate non-invasive determination of heart rate, heart rate variability, respiratory rate, skin tone information, skin conductance information, eye color information, eye movement information, hand movements, posture of the user (102), or combinations thereof;
processing the non-contact sensor information to facilitate non-invasive measurement of respiratory rate, heart rate, heart rate variability, or combinations thereof; and
processing the touch-based sensor information to facilitate non-invasive measurement of heart rate, heart rate variability, bioimpedance of the user (102), or combinations thereof.
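Claim 16 includes processing the touch-based sensor information to measure heart rate variability. A common variability summary is the RMSSD, the root mean square of successive differences between inter-beat intervals; the sketch below detects beats in a synthetic contact signal with a simple peak detector, an assumption standing in for whatever beat-detection stage an actual steering-wheel sensor chain would use.

```python
# Illustrative heart-rate-variability (RMSSD) computation from beats detected in a
# contact signal (e.g. steering-wheel electrodes). Peak detection and units are assumptions.
import numpy as np
from scipy.signal import find_peaks


def rmssd_from_contact_signal(signal: np.ndarray, fs: float) -> float:
    """Return RMSSD in milliseconds from a beat-to-beat contact signal sampled at fs Hz."""
    peaks, _ = find_peaks(signal, distance=int(0.4 * fs))  # at most ~150 bpm
    ibi_ms = np.diff(peaks) / fs * 1000.0                  # inter-beat intervals (ms)
    return float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))


# Synthetic example: beats ~0.8 s apart with small jitter, sampled at 100 Hz.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
beat_times = np.cumsum(0.8 + 0.02 * np.random.randn(70))
contact = sum(np.exp(-((t - bt) ** 2) / 0.0005) for bt in beat_times if bt < 60)
print(round(rmssd_from_contact_signal(contact, fs), 1), "ms")
```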
17. The method (200) of claim 16, wherein the set of determined non-invasive parameters (218) corresponding to the user (102) comprises bioimpedance values, a heart rate, a respiratory rate, a heart rate variability, skin tone information, skin conductance information, a face characterization, eye color information, eye movement information, hand movements, posture, eye characterization, or combinations thereof, wherein the one or more invasive parameters comprise a C-reactive protein value, a salivary cortisol value, a serum protein electrophoresis value, or combinations thereof, and wherein the wellness metric (226, 608) comprises an acute inflammation, a chronic inflammation, stress, or combinations thereof.
18. The method (200) of claim 17, wherein predicting (224) the outcome comprises providing the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof as input to the model (126, 310, 606) to cause the model (126, 310, 606) to process the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof to generate the outcome.
19. The method (200) of claim 13, further comprising:
generating one or more indicators (610) representative of the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof, wherein the one or more indicators (610) provide metrics corresponding to a state of the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof; and
providing the one or more indicators (610), corresponding to the wellness metric (226, 608), the predicted values of the invasive parameters, or a combination thereof to facilitate further analysis or lifestyle recommendations.
20. The method (200) of claim 13, further comprising continuously acquiring the set of measured non-invasive biological parameters (116, 602) corresponding to the user (102) of the vehicle (104).
21. The method (200) of claim 13, further comprising maintaining a model (126, 310, 606), wherein maintaining the model (126, 310, 606) comprises generating one or more models (126, 310, 606), wherein the one or more models (126, 310, 606) are tuned for performing one or more tasks, and wherein the one or more models (126, 310, 606) are artificial intelligence models.
22. The method (200) of claim 21, wherein generating the one or more models (126, 310, 606) comprises:
obtaining a plurality of sets of measured non-invasive parameters (116, 602) corresponding to a plurality of users, a plurality of sets of determined non-invasive parameters (218) corresponding to the plurality of users, or a combination thereof;
obtaining a plurality of sets of invasive parameters (308) corresponding to the plurality of users, a plurality of blood test-based wellness scores (308) corresponding to the plurality of users, a plurality of survey-based wellness scores (308) corresponding to the plurality of users, a plurality of survey-based alertness and mood scores (308) corresponding to the plurality of users, or combinations thereof;
receiving an input corresponding to one or more tasks (306) to be performed;
receiving an input (304) corresponding to the plurality of sets of measured non-invasive parameters (116, 602), the plurality of sets of determined non-invasive parameters (218), or a combination thereof;
receiving an input corresponding to one or more desired outcomes, wherein the one or more desired outcomes correspond to the one or more survey-based wellness scores (308), the one or more survey-based alertness and mood scores (308), one or more blood test-based wellness scores (308), the predicted values of one or more invasive parameters (308), or combinations thereof;
optimizing model parameters of a neural network (302) based on the information corresponding to the inputs associated with the one or more tasks (306), the plurality of sets of measured non-invasive parameters (116, 602), the plurality of sets of determined non-invasive parameters (218), the one or more desired outcomes (308), or combinations thereof; and
training the neural network (302) to perform the one or more tasks (306) to generate the one or more models (126, 310, 606).
23. A system (100) for real-time in-vehicle wellness monitoring of a user (102) of a vehicle (104), the system (100) comprising:
a prediction platform (124) configured to:
process a set of measured non-invasive biological parameters (116, 602) to generate a set of determined non-invasive biological parameters (218) corresponding to the user (102);
receive an input corresponding to a selected task (604);
retrieve a model (126, 310, 606) based on the input;
predict an outcome based on the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof and the model (126, 310, 606), wherein the outcome corresponds to a wellness metric (226, 608), including chronic inflammation, predicted values of one or more invasive parameters, or a combination thereof, and wherein the wellness metric (226, 608) is representative of a quantified biological parameter of the user (102); and
provide, in real-time, the outcome to the user (102) to facilitate non-invasive monitoring of the short-term and long-term wellbeing of the user (102) of the vehicle (104) based on the set of measured non-invasive biological parameters (116, 602), the set of determined non-invasive biological parameters (218), or a combination thereof and deliver feedback to the user (102) in real-time to promote health and wellbeing of the user (102).
| # | Name | Date |
|---|---|---|
| 1 | 202341029912-PROVISIONAL SPECIFICATION [25-04-2023(online)].pdf | 2023-04-25 |
| 2 | 202341029912-FORM FOR STARTUP [25-04-2023(online)].pdf | 2023-04-25 |
| 3 | 202341029912-FORM FOR SMALL ENTITY(FORM-28) [25-04-2023(online)].pdf | 2023-04-25 |
| 4 | 202341029912-FORM 1 [25-04-2023(online)].pdf | 2023-04-25 |
| 5 | 202341029912-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [25-04-2023(online)].pdf | 2023-04-25 |
| 6 | 202341029912-EVIDENCE FOR REGISTRATION UNDER SSI [25-04-2023(online)].pdf | 2023-04-25 |
| 7 | 202341029912-DRAWINGS [25-04-2023(online)].pdf | 2023-04-25 |
| 8 | 202341029912-FORM 3 [26-04-2023(online)].pdf | 2023-04-26 |
| 9 | 202341029912-Request Letter-Correspondence [08-09-2023(online)].pdf | 2023-09-08 |
| 10 | 202341029912-Power of Attorney [08-09-2023(online)].pdf | 2023-09-08 |
| 11 | 202341029912-FORM-26 [08-09-2023(online)].pdf | 2023-09-08 |
| 12 | 202341029912-Form 1 (Submitted on date of filing) [08-09-2023(online)].pdf | 2023-09-08 |
| 13 | 202341029912-Covering Letter [08-09-2023(online)].pdf | 2023-09-08 |
| 14 | 202341029912-DRAWING [23-04-2024(online)].pdf | 2024-04-23 |
| 15 | 202341029912-CORRESPONDENCE-OTHERS [23-04-2024(online)].pdf | 2024-04-23 |
| 16 | 202341029912-COMPLETE SPECIFICATION [23-04-2024(online)].pdf | 2024-04-23 |
| 17 | 202341029912-STARTUP [03-05-2024(online)].pdf | 2024-05-03 |
| 18 | 202341029912-FORM28 [03-05-2024(online)].pdf | 2024-05-03 |
| 19 | 202341029912-FORM-9 [03-05-2024(online)].pdf | 2024-05-03 |
| 20 | 202341029912-FORM 18A [03-05-2024(online)].pdf | 2024-05-03 |
| 21 | 202341029912-FER.pdf | 2024-06-10 |
| 22 | 202341029912-FORM 3 [12-09-2024(online)].pdf | 2024-09-12 |
| 23 | 202341029912-Proof of Right [05-12-2024(online)].pdf | 2024-12-05 |
| 24 | 202341029912-PETITION UNDER RULE 137 [09-12-2024(online)].pdf | 2024-12-09 |
| 25 | 202341029912-OTHERS [09-12-2024(online)].pdf | 2024-12-09 |
| 26 | 202341029912-FER_SER_REPLY [09-12-2024(online)].pdf | 2024-12-09 |
| 27 | 202341029912-CLAIMS [09-12-2024(online)].pdf | 2024-12-09 |
| 28 | 202341029912-PatentCertificate25-03-2025.pdf | 2025-03-25 |
| 29 | 202341029912-IntimationOfGrant25-03-2025.pdf | 2025-03-25 |
| 1 | 202341029912E_21-05-2024.pdf | 2024-05-21 |