
System And Method For Monitoring Behavioral Changes Using Motion Sensor Data By Machine Learning Model

Abstract: A system and method for monitoring behavioral changes of a subject 102 based on motion sensor data using a machine learning model 110 that enables evaluation of one or more conditions of the subject 102 is provided. The method includes (i) obtaining the motion sensor data from a digital device 104, where the motion sensor data includes three-dimensional (3D) accelerometer data, 3D magnetometer data, and 3D gyroscope data; (ii) pre-processing the motion sensor data to obtain filtered motion sensor data; (iii) extracting one or more domain features from the filtered motion sensor data; (iv) analyzing the one or more domain features to determine a plurality of analyzed features; (v) deriving a new set of features by merging the plurality of analyzed features; (vi) categorizing a level of intensity of behavioral changes of the subject 102; and (vii) evaluating one or more conditions of the subject using the machine learning model 110. FIG. 1


Patent Information

Application #:
Filing Date: 08 January 2021
Publication Number: 28/2022
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Status:
Email: ipo@myipstrategy.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-11-26
Renewal Date:

Applicants

BRAINSIGHT TECHNOLOGY PRIVATE LIMITED
No 22, 4th Floor, Hosur Road, Koramangala, 7th Block, Salarpuria Towers I, Bangalore, Karnataka, India 560095

Inventors

1. Rimjhim Agrawal
640, 14TH CROSS, JP NAGAR, 2ND PHASE, BANGALORE-560078, Karnataka, India
2. Dilip Rajeswari
1st G-Cross, 4th Block, 2nd Phase, Banashankari 3rd Stage, Bangalore, Karnataka, India - 560085
3. Neha Sara John
309, 14th cross, Indiranagar 2nd Stage, Bangalore, Karnataka, India - 560038
4. Shreyash Singh
Near bobby electronics, In front of shiv mandir, Main Road Parasia, Chhindwara Parasia, Madhya Pradesh, India - 480441

Specification

SYSTEM AND METHOD FOR MONITORING BEHAVIORAL CHANGES USING MOTION SENSOR DATA BY MACHINE LEARNING MODEL
CROSS-REFERENCE TO PRIOR-FILED PATENT APPLICATIONS
[0001] This application claims priority from the Indian provisional application no. 202041028975 filed on January 08, 2021, which is herein incorporated by reference.
Technical Field
[0002] The embodiments herein generally relate to machine learning models, and more particularly, to a system and method for monitoring behavioral changes of a subject based on motion sensor data using a machine learning model that enables evaluation of the conditions of the subject.
Description of the Related Art
[0003] Technology has changed the world drastically. In particular, artificial intelligence is driving rapid improvements across technologies. Mobile applications are now benefiting from advances in artificial intelligence, and the combination of artificial intelligence and mobile applications is commonly known as intelligent applications. In intelligent mobile applications, the backend of the mobile application needs to be a multi-tasking machine. Such an application is expected to review how the user is responding to the application, which is typically analyzed through user engagement levels and retention rates.
[0004] The healthcare segment is an integral part of the world economy, and intelligent mobile applications are shaping its future and goals as well. Mobile applications may even monitor mental health status, which helps medical practitioners with easier diagnosis. Existing systems are focused on analyzing active data collected from users when monitoring mental health status. The analysis of active data provides only a partial picture of the mental status of the user. For a medical practitioner, it is essential to obtain a clear observation of the user before proceeding with treatment.
[0005] Existing systems collect some physiological data through wearable devices for mental health status assessment. This assessment also helps the medical practitioner only partially, as it provides incomplete knowledge for assessing the user's mental health status. If the wearable device has any technical issue, the collection of data about the user is halted, which again creates a confusing situation for the assessor.
[0006] Accordingly, there remains a need to develop an efficient system that allows observing the changes in the behavior of the user based on sensor data.
SUMMARY
[0007] In view of the foregoing, an embodiment herein provides a system for monitoring behavioral changes of a subject based on motion sensor data using a machine learning model that enables evaluation of one or more conditions of the subject. The system includes a digital device. The digital device includes at least one of a camera, a gyroscope, a magnetometer, a global positioning system (GPS), or an accelerometer. The digital device obtains the motion sensor data during a dynamic time interval. The motion sensor data includes at least one of three-dimensional accelerometer data, three-dimensional magnetometer data, three-dimensional gyroscope data, or passive sensor data of the subject. The system includes a motion sensor analyzing server that acquires the motion sensor data from the digital device and processes the motion sensor data using the machine learning model. The motion sensor analyzing server includes a memory that stores a database and a processor that is configured to execute the machine learning model and is configured to (i) pre-process, using a normalization method and a band pass filtering method, the motion sensor data to obtain filtered motion sensor data; (ii) extract, using a domain feature technique, one or more domain features from the filtered motion sensor data, where the one or more domain features include time-domain features and frequency-domain features; (iii) determine one or more analyzed features by analyzing the one or more domain features, where the one or more analyzed features are determined from relations among the one or more domain features using a statistical method; (iv) derive, using a dimensionality reduction technique, a new set of features by merging the one or more analyzed features, where the one or more analyzed features include at least one of data points of acceleration and gyroscope, an angle of the digital device, an altitude, and each of the x-dimension, y-dimension, and z-dimension of acceleration and gyroscope; and (v) evaluate, using a trained machine learning model, the one or more conditions of the subject by categorizing a level of intensity of behavioral changes of the subject based on the new set of features using a clustering method, where the level of intensity of behavioral changes includes at least one of routine disruptions, action/mobility intensity changes, or behavioral markers.
[0008] In some embodiments, the processor is configured to train the machine learning model by providing one or more historical levels of intensities of behavioral changes associated with historical new sets of features, and one or more historical conditions associated with a historical plurality of subjects, as training data to obtain the trained machine learning model.
[0009] In some embodiments, the processor is configured to reduce dimensions of the one or more domain features using the dimensionality reduction techniques to reduce a count of the one or more domain features while deriving the new set of features, the count is used to provide the one or more domain features that are required for training the machine learning model.
[0010] In some embodiments, the one or more domain features include three-dimensional angles of acceleration, a three-dimensional attitude of the digital device, a three-dimensional gravity of the digital device, a magnitude of a vector of an angular velocity, an orientation of the angular velocity, a distance travelled, or an index of overall movement.
[0011] In some embodiments, the processor is configured to obtain the three-dimensional accelerometer data by activating the accelerometer of the digital device when the subject exerts an acceleration on the digital device, the three-dimensional accelerometer data is obtained at a range of frequencies of 10 Hertz (Hz) to 50 Hz.
[0012] In some embodiments, the processor is configured to obtain the three-dimensional gyroscope data when the subject rotates, the three-dimensional gyroscope data is obtained at a range of frequencies of 10 Hz to 50 Hz.
[0013] In some embodiments, the motion sensor data is pre-processed using a low pass filter to smoothen the signal and remove unnecessary high-frequency components that appear as noise captured from jerky movements by the subject, the cut-off frequency of the low pass filter being 0.3 Hz.
[0014] In some embodiments, the three dimensional attitude of the digital device represents roll-attitude, pitch-attitude, and yaw-attitude, the three dimensional gravity of the digital device represents x-gravity, y-gravity, and z-gravity, the three dimensional angles of acceleration is calculated as acceleration angles of orientation along x-axis, y-axis, z-axis, the orientation of the angular velocity is calculated by pitch, yaw, and roll from the angular velocity, the distance traveled is a total distance traveled by the subject in the dynamic time interval, and the index of overall movement is obtained by calculating an euclidean metric of the motion sensor data of the accelerometer and the gyroscope.
[0015] In one aspect, a processor-implemented method for monitoring behavioral changes of a subject based on motion sensor data using a machine learning model that enables evaluation of one or more conditions of the subject is provided. The method includes obtaining the motion sensor data, during a dynamic time interval, from a digital device that includes at least one of a camera, a gyroscope, a magnetometer, a global positioning system (GPS), or an accelerometer, where the motion sensor data includes at least one of three-dimensional accelerometer data, three-dimensional magnetometer data, three-dimensional gyroscope data, or passive sensor data of the subject. The method includes pre-processing, using a normalization method and a band pass filtering method, the motion sensor data to obtain filtered motion sensor data. The method includes extracting one or more domain features from the filtered motion sensor data using a domain feature technique, where the one or more domain features include time-domain features and frequency-domain features. The method includes determining one or more analyzed features by analyzing the one or more domain features, where the one or more analyzed features are determined from relations among the one or more domain features using a statistical method. The method includes deriving a new set of features by merging the one or more analyzed features using dimensionality reduction techniques, where the one or more analyzed features include at least one of data points of acceleration and gyroscope, an angle of the digital device, an altitude, and each of the x-dimension, y-dimension, and z-dimension of acceleration and gyroscope. The method includes evaluating, using a trained machine learning model, the one or more conditions of the subject by categorizing a level of intensity of behavioral changes of the subject based on the new set of features using a clustering method, where the level of intensity of behavioral changes includes at least one of routine disruptions, action/mobility intensity changes, or behavioral markers.
[0016] In some embodiments, the dimensionality reduction technique that is used to derive the new set of features is at least one of principal component analysis, factor analysis, linear discriminant analysis, or other dimensionality reduction techniques.
[0017] The system and/or method is used for generating suggestions to healthcare experts related to psychiatric, neuropsychiatric, mental illness, neurological, and neuro-psychotic disorders using sleep pattern analysis through motion sensor analysis for a patient. The system or method may help experts in smartphone-based motion sensor assessments and interventions that require an emphasis on promoting long-term adherence, exploring possibilities of adaptive and personalized systems to predict risk/relapse, and determining the impact of sleep monitoring on improving patients' quality of life and clinically meaningful outcomes. In psychiatric disorders, motion sensing, including quantification of slow-wave motion and spindles, may have potential diagnostic and prognostic applications. Changes in motion and sleep are also early warning signs of relapse in schizophrenia or of conversion in the schizophrenia prodrome. The system or method may also be beneficial for assessing motion analysis through the motion sensor analysis for a patient.
[0018] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0020] FIG. 1 is a block diagram of a system for monitoring behavioral changes of a subject based on motion sensor data that enables evaluation of conditions of the subject using a machine learning model according to some embodiments herein;
[0021] FIG. 2 is a block diagram of a motion sensor analyzing server according to some embodiments herein;
[0022] FIG. 3 is a block diagram of a pre-processing module according to some embodiments herein;
[0023] FIG. 4A is an exemplary data table that includes three-dimensional accelerometer data points collected from an accelerometer sensor for motion sensor analysis according to some embodiments herein;
[0024] FIG. 4B is an exemplary data table that includes three-dimensional gyroscope data points collected from a gyroscope sensor for the motion sensor analysis according to some embodiments herein;
[0025] FIG. 4C is an exemplary data table that includes three-dimensional magnetometer data points collected from a magnetometer sensor for the motion sensor analysis according to some embodiments herein;
[0026] FIG. 5A is a graphical representation of an output of action intensities of a subject according to some embodiments herein;
[0027] FIG. 5B is a graphical representation of an output of a step counter of the subject according to some embodiments herein;
[0028] FIG. 6 is a flow diagram that illustrates a method for monitoring behavioral changes of a subject based on motion sensor data that enables evaluation of one or more conditions of the subject using a machine learning model according to some embodiments herein; and
[0029] FIG. 7 is a schematic diagram of a computer architecture in accordance with the embodiments herein.
DETAILED DESCRIPTION OF THE DRAWINGS
[0030] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0031] As mentioned, there is a need for a system and method for monitoring behavioral changes of a subject based on motion sensor data that enables evaluation of the conditions of the subject using a machine learning model. Referring now to the drawings, and more particularly to FIGS. 1 through 7, where similar reference characters denote corresponding features consistently throughout the figures, preferred embodiments are shown.
[0032] FIG. 1 is a block diagram of a system 100 for monitoring behavioral changes of a subject 102 based on motion sensor data that enables evaluation of conditions of the subject 102 using a machine learning model 110 according to some embodiments herein. The system 100 includes a digital device 104 and a motion sensor analyzing server 108. In some embodiments, the motion sensor system 100 includes an Android application package (APK), an iOS App Store Package (IPA), or any such application package that is installed in the digital device 104 of the subject 102. In some embodiments, the digital device 104 may include a mobile phone, a Kindle, a PDA (Personal Digital Assistant), a tablet, or a smartphone. In some embodiments, the system 100 may include an application that may be installed on Android-based devices, Windows-based devices, or devices running any such mobile operating system. The motion sensor analyzing server 108 is configured to connect with a charging port, an accelerometer, a camera, a gyroscope, and an inbuilt speaker of the digital device 104.
[0033] The digital device 104 obtains the motion sensor data of the subject 102 during a dynamic time interval and communicates it to the motion sensor analyzing server 108 through the network 106 at the end of each day. In some embodiments, the digital device 104 stores the motion sensor data of the subject 102 in its local database before sending it to the motion sensor analyzing server 108. In some embodiments, the network 106 is a wired network or a wireless network such as Bluetooth, Wi-Fi, ZigBee, a cloud network, or any other communication network. The digital device 104 includes at least one of a camera, a gyroscope, a magnetometer, a global positioning system (GPS), or an accelerometer. The motion sensor data includes at least one of three-dimensional accelerometer data, three-dimensional magnetometer data, three-dimensional gyroscope data, or passive sensor data of the subject 102.
[0034] In some embodiments, the motion sensor data is collected actively in real time when the subject 102 is interacting with the digital device 104. In some embodiments, verification labels are assigned to such interactive subjects 102. In some embodiments, the motion sensor data is collected from the digital device 104 passively in real time, for example, for users at high risk, such as individuals prone to erratic behavior, and for a few collaborative users. In some embodiments, the motion sensor data is collected from the data points obtained from real-time user events. In some embodiments, the data points obtained from the real-time user events may be data collected from a 3D accelerometer of the digital device 104, a 3D gyroscope of the digital device 104, and a 3D magnetometer/GPS of the digital device 104. In some embodiments, the user ID is a distinct identity given to each subject 102 who installs the system 100 on the digital device 104.
[0035] In some embodiments, the 3D accelerometer data is collected by activating an accelerometer sensor when the subject 102 exerts an acceleration on the digital device 104 along the three axis points X, Y, and Z. In some embodiments, the 3D accelerometer data is recorded based on personalized timing for each subject 102 based on trigger accessing. In some embodiments, the trigger accessing is based on personalized triggers in the digital device 104. In some embodiments, the 3D accelerometer data is collected at frequencies ranging from 10 Hertz (Hz) to 50 Hz based on the capability of the digital device 104. In some embodiments, the 3D gyroscope data points are collected during the dynamic time interval. In some embodiments, the 3D gyroscope data is collected when the subject 102 rotates the digital device 104 along the three axis points X, Y, and Z. In some embodiments, the 3D gyroscope data is recorded based on personalized timing for each subject 102 based on trigger accessing. In some embodiments, the trigger accessing is based on personalized triggers in the digital device 104. In some embodiments, the 3D gyroscope data is collected at frequencies ranging from 10 Hz to 50 Hz based on the capability of the digital device 104. In some embodiments, the collected three-dimensional gyroscope data points from the subject 102 are used to train the machine learning model 110. In some embodiments, the 3D magnetometer/GPS data points are collected during a dynamic time interval. In some embodiments, the 3D magnetometer/GPS data is collected by locating the digital device 104 along the three positional coordinates of the subject 102, that is, latitude, longitude, and altitude. In some embodiments, the 3D magnetometer/GPS data is recorded based on personalized timing for each subject 102 based on trigger accessing. In some embodiments, the trigger accessing is based on personalized triggers in the digital device 104. In some embodiments, the 3D magnetometer/GPS data is collected at a frequency of 1 Hz for a pre-determined amount of time based on trigger-based rules.
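For illustration only, the sketch below shows one way the trigger-driven samples described above might be organized on the digital device 104 before upload. The record layout, field names, and the MotionSample class are assumptions introduced here rather than part of the specification; they simply mirror the listed data points (per-axis accelerometer and gyroscope values at 10-50 Hz and magnetometer/GPS coordinates at 1 Hz).

    # Minimal sketch (assumed schema, not from the specification): one record per
    # sampled instant, mirroring the 3D accelerometer/gyroscope streams (10-50 Hz)
    # and the magnetometer/GPS positional stream (1 Hz) described above.
    from dataclasses import dataclass

    @dataclass
    class MotionSample:
        user_id: str            # distinct identity assigned to each subject 102
        timestamp: float        # seconds since epoch, recorded per trigger rule
        acc_x: float            # 3D accelerometer sample (m/s^2)
        acc_y: float
        acc_z: float
        gyro_x: float           # 3D gyroscope sample (rad/s)
        gyro_y: float
        gyro_z: float
        latitude: float = 0.0   # magnetometer/GPS positional coordinates
        longitude: float = 0.0
        altitude: float = 0.0

Batches of such records would be buffered in the local database of the digital device 104 and forwarded to the motion sensor analyzing server 108 at the end of each day, as described above.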
[0036] In some embodiments, the motion sensor analyzing server 108 collects the data of the three-dimensional accelerometer. If the subject 102 is prone to walking, or if the subject 102 is prone to jogging, and if the subject 102 is not prone to wandering behavior, then the data is not collected.
[0037] The motion sensor analyzing server 108 normalizes the motion sensor data from the digital device 104 based on the trigger rule approach.
[0038] The motion sensor analyzing server 108 pre-processes the normalized motion sensor data through a low pass filter to smoothen it and remove any unnecessary high-frequency components that might appear as noise.
[0039] The motion sensor analyzing server 108 extracts features from the filtered motion sensor data. The extracted features are time-domain features and frequency-domain features. In some embodiments, the extracted features are an acceleration vector magnitude, acceleration 3D angles, a phone 3D attitude, a phone 3D gravity, an angular velocity vector magnitude, an angular velocity orientation, a distance traveled, and an index of the overall movement. In some embodiments, the acceleration vector magnitude is the vector sum of the three accelerometer axes data. In some embodiments, the acceleration 3D angles are calculated as acceleration angles of orientation along the three axes. In some embodiments, the phone 3D attitude refers to the orientation of a body relative to a given frame of reference. The phone 3D attitude data represents the roll-attitude, pitch-attitude, and yaw-attitude library variables of the digital device 104. In some embodiments, the 3D attitude may also be derived from accelerometer and gyroscope data. In some embodiments, the phone 3D gravity refers to the gravity acceleration vector expressed in the reference frame of the digital device 104. The phone 3D gravity represents the x-gravity, y-gravity, and z-gravity library variables of the digital device 104. In some embodiments, the angular velocity vector magnitude is the vector sum of the gyroscopic angular velocity data. In some embodiments, the angular velocity orientation is calculated as pitch, yaw, and roll from the angular velocity (gyroscope data). In some embodiments, the distance traveled is the total distance traveled by the subject 102 in a defined period and is derived from the GPS data. In some embodiments, the index of overall movement is obtained by calculating the Euclidean metric of the accelerometer and gyroscope sensor data.
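A minimal sketch of a few of these domain features is given below, assuming the filtered accelerometer and gyroscope samples are available as (N, 3) NumPy arrays; the function name extract_domain_features and the returned keys are hypothetical, and only the vector magnitudes, the acceleration 3D angles, and the Euclidean index of overall movement are shown.

    # Minimal sketch (assumptions: `acc` and `gyro` are (N, 3) arrays of filtered
    # accelerometer and gyroscope samples; names are illustrative only).
    import numpy as np

    def extract_domain_features(acc: np.ndarray, gyro: np.ndarray) -> dict:
        acc_mag = np.linalg.norm(acc, axis=1)       # acceleration vector magnitude
        gyro_mag = np.linalg.norm(gyro, axis=1)     # angular velocity vector magnitude
        # Acceleration 3D angles: orientation of each axis relative to the magnitude.
        cosines = acc / (acc_mag[:, None] + 1e-9)
        angles = np.degrees(np.arccos(np.clip(cosines, -1.0, 1.0)))
        # Index of overall movement: Euclidean metric over accelerometer and gyroscope data.
        movement_index = float(np.linalg.norm(np.hstack([acc, gyro]), axis=1).mean())
        return {
            "acc_mag_mean": float(acc_mag.mean()),
            "gyro_mag_mean": float(gyro_mag.mean()),
            "acc_angle_mean": angles.mean(axis=0),  # one mean angle per axis (x, y, z)
            "movement_index": movement_index,
        }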
[0040] The motion sensor analyzing server 108 analyzes the domain features to determine analyzed features. The motion sensor analyzing server 108 determines the analyzed features using relations among the domain features using a statistical method. The analyzed features include at least one of data points of acceleration and gyroscope, an angle of the digital device 104, an altitude, and each of the x-dimension, y-dimension, and z-dimension of acceleration and gyroscope. The analyzed features may provide the advantage of working with fewer data points while explaining the structure of the original motion sensor data.
[0041] The motion sensor analyzing server 108 derives a new set of features by merging the analyzed features using dimensionality reduction techniques. The dimensionality reduction technique is at least one of principal component analysis, factor analysis, linear discriminant analysis, or other dimensionality reduction techniques. In some embodiments, the merged features include acceleration and gyroscope data points, the angle of the phone, the altitude, and each of the X-Y-Z dimensions of the acceleration and gyroscope. In some embodiments, the dimensionality reduction technique is used to reduce the dimensions of the one or more analyzed features. In some embodiments, for example, the data points of the three-dimensional gyroscope, the three-dimensional accelerometer, and their corresponding features, that is, the angle of the digital device 104 and the altitude, are each taken as X-Y-Z dimensions of acceleration and gyroscope for the reduction of dimensionality.
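As an illustration of this step, the sketch below merges a matrix of analyzed features with principal component analysis from scikit-learn; PCA here stands in for any of the listed techniques, and the function name and component count are assumptions.

    # Minimal sketch (assumption: `analyzed` is an (n_windows, n_features) array of
    # analyzed features; PCA stands in for any listed dimensionality reduction technique).
    import numpy as np
    from sklearn.decomposition import PCA

    def derive_new_feature_set(analyzed: np.ndarray, n_components: int = 3) -> np.ndarray:
        pca = PCA(n_components=n_components)
        new_features = pca.fit_transform(analyzed)   # merged, lower-dimensional feature set
        # The explained variance ratio indicates how much of the structure of the
        # original motion sensor data the reduced feature count still captures.
        print("explained variance:", pca.explained_variance_ratio_)
        return new_features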
[0042] The motion sensor analyzing server 108 categorizes a level of intensity of behavioral changes of the subject based on the new set of features using a clustering method. The new set of features may be related to motion intensity or behavioral markers using the clustering method. In some embodiments, the clustering method is at least one of k-means, t-distributed stochastic neighbor embedding (t-SNE), a Gaussian mixture model, etc.
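A minimal sketch of this categorization step is shown below using k-means from scikit-learn; a Gaussian mixture model could be substituted, and the number of intensity levels chosen here is an assumption for illustration only.

    # Minimal sketch (assumptions: k-means stands in for the clustering method and
    # four intensity levels are chosen only for illustration).
    import numpy as np
    from sklearn.cluster import KMeans

    def categorize_intensity(new_features: np.ndarray, n_levels: int = 4) -> np.ndarray:
        kmeans = KMeans(n_clusters=n_levels, n_init=10, random_state=0)
        return kmeans.fit_predict(new_features)      # one intensity-level label per window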
[0043] The level of intensity of behavioral changes includes at least one of routine disruptions, action/mobility intensity changes, or behavioral markers. In some embodiments, disruptions in routines are detected using anomaly detection techniques. In some embodiments, behavioral markers for the various psychiatric disorders are detected using a dimensionality reduction method. In some embodiments, action/mobility intensity changes that are drastically different from the patterns normally detected could potentially be used as a mood change behavioral marker.
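Since the specification names anomaly detection techniques without fixing a particular one, the sketch below uses an isolation forest purely as an example of flagging days whose feature profile deviates from the subject's usual routine; the function name and contamination rate are assumptions.

    # Minimal sketch (assumption: IsolationForest is one possible anomaly detection
    # technique; the contamination rate is illustrative).
    import numpy as np
    from sklearn.ensemble import IsolationForest

    def flag_routine_disruptions(daily_features: np.ndarray) -> np.ndarray:
        detector = IsolationForest(contamination=0.05, random_state=0)
        flags = detector.fit_predict(daily_features)  # -1 = anomalous day, 1 = typical day
        return flags == -1                            # True where a routine disruption is suspected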
[0044] The motion sensor analyzing server 108 evaluates the conditions of the subject 102 based on the categorization of the level of intensity of behavioral changes using a trained machine learning model. The conditions may be sedentary, mild physical activity, or strenuous physical activity. In some embodiments, the motion sensor analyzing server 108 predicts personal attributes and modifications in behaviors, post-diagnosis, from the mobility patterns and intensities. The motion sensor analyzing server 108 may relate mobility intensity to depression (slower-paced walking) and anxiety. The motion sensor analyzing server 108 may use a depression/anxiety questionnaire to set a baseline/psychomotor metric. The motion sensor analyzing server 108 may observe frenetic pacing as a potential behavior marker for a subject 102 suffering from generalized anxiety. The motion sensor analyzing server 108 may observe typical slow pacing as a potential behavior marker for a subject 102 suffering from major depression. The motion sensor analyzing server 108 may observe that subjects 102 suffering from schizophrenia are known to have lower levels of movement as compared to healthy controls; for example, tardive dyskinesia could potentially be detected to see if the medication is having extreme side effects.
[0045] The motion sensor analyzing server 108 may capture the intensity of the activity, which yields the probability of the subject 102 pacing. In some embodiments, the motion sensor analyzing server 108 assesses the performance, sensitivity, specificity, and accuracy scores based on a few user labels provided by the subject 102. In some embodiments, the motion sensor analyzing server 108 improves accuracy based on the labels provided by the subject 102. In some embodiments, the sleep pattern of the subject 102 is used for the required assessment of the subject's 102 sleep/wake timing.
[0046] The machine learning model 110 is trained by providing one or more historical levels of intensities of behavioral changes associated with historical new sets of features, and one or more historical conditions associated with a historical plurality of subjects, as training data to obtain the trained machine learning model. In some embodiments, labeled data associated with the subject 102 is used to relate the action intensities to the activity types of the subject 102. In some embodiments, the labeled data helps to set up data for training the machine learning model 110. The motion sensor analyzing server 108 relates mobility intensity and depression for slow-paced walking. In some embodiments, anxiety scores are related to the semi-required questionnaire from the subject 102. In some embodiments, the anxiety scores help to set up data for training the machine learning model 110.
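The specification does not name a particular model family, so the sketch below trains a random forest classifier on historical new sets of features and their associated historical condition labels; the function and variable names are hypothetical.

    # Minimal sketch (assumptions: a random forest stands in for the unspecified
    # model family; `historical_features` are historical new sets of features and
    # `historical_conditions` the associated condition labels, e.g. sedentary,
    # mild physical activity, strenuous physical activity).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def train_condition_model(historical_features: np.ndarray,
                              historical_conditions: np.ndarray) -> RandomForestClassifier:
        model = RandomForestClassifier(n_estimators=200, random_state=0)
        model.fit(historical_features, historical_conditions)
        return model

The trained model can then score new feature sets, for example via model.predict(new_features), to evaluate the conditions of the subject 102.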
[0047] FIG. 2 is a block diagram of the motion sensor analyzing server 108 according to some embodiments herein. The motion sensor analyzing server 108 includes a database 202, a motion sensor data obtaining module 204, a motion sensor data pre-processing module 206, a domain feature extraction module 208, a feature analyzing module 210, a new set of features deriving module 212, a level of intensity categorization module 214, and the machine learning model 110. The motion sensor data obtaining module 204 obtains the motion sensor data during a dynamic time interval and stores it in the database 202 at the end of each day. The motion sensor data includes at least one of three-dimensional accelerometer data, three-dimensional magnetometer data, three-dimensional gyroscope data, or passive sensor data of the subject.
[0048] The motion sensor data pre-processing module 206 pre-processes the motion sensor data to obtain filtered motion sensor data using a normalization method and a band pass filtering method. In some embodiments, the cut-off frequency of the low pass filter is around 0.3 Hz. In some embodiments, the low pass filtered signals are passed through a high pass filter to stabilize the signal. In some embodiments, the cut-off frequency of the high pass filter is around 20 Hz if the sampling frequency is 40 Hz or more.
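A minimal sketch of this filtering stage is shown below, using Butterworth filters from SciPy as a stand-in for the unspecified filter design; the cut-off values are those stated in the text, the filter order is an assumption, and the high-pass cut-off is capped just below the Nyquist frequency so the sketch also runs at a 40 Hz sampling rate.

    # Minimal sketch (assumptions: 4th-order Butterworth filters stand in for the
    # unspecified design; cut-offs follow the text: ~0.3 Hz low pass, ~20 Hz high
    # pass applied only when the sampling frequency is 40 Hz or more).
    import numpy as np
    from scipy.signal import butter, filtfilt

    def preprocess_signal(x: np.ndarray, fs: float) -> np.ndarray:
        # x: (N, 3) array of raw samples; N should exceed a few dozen samples for filtfilt.
        b, a = butter(N=4, Wn=0.3, btype="low", fs=fs)        # smooth jerky-movement noise
        y = filtfilt(b, a, x, axis=0)
        if fs >= 40:                                          # stabilize the signal
            hp_cut = min(20.0, 0.49 * fs)                     # keep the cut-off below Nyquist
            b, a = butter(N=4, Wn=hp_cut, btype="high", fs=fs)
            y = filtfilt(b, a, y, axis=0)
        return y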
[0049] The domain feature extraction module 208 extracts one or more domain features from the filtered motion sensor data using a domain feature technique. The one or more domain features include time-domain features and frequency-domain features. In some embodiments, the extracted domain features are an acceleration vector magnitude, acceleration 3D angles, a phone 3D attitude, a phone 3D gravity, an angular velocity vector magnitude, an angular velocity orientation, a distance traveled, and an index of the overall movement.
[0050] The feature analyzing module 210 analyzes the one or more domain features to determine analyzed features. The analyzed features are determined using relations among the one or more domain features using a statistical method. The feature analyzing module 210 relates mobility intensity and depression for slow-paced walking. The new set of features deriving module 212 derives a new set of features by merging the one or more analyzed features using a dimensionality reduction technique, where the one or more analyzed features include at least one of the data points of acceleration and gyroscope, an angle of the digital device, an altitude, and each of the x-dimension, y-dimension, and z-dimension of acceleration and gyroscope. The dimensionality reduction technique is at least one of principal component analysis, factor analysis, linear discriminant analysis, or other dimensionality reduction techniques. In some embodiments, the dimensions of the one or more domain features are reduced using at least one of principal component analysis or factor analysis to reduce a count of the one or more domain features while deriving the new set of features. The count is used to provide the one or more domain features that are required for training the machine learning model.
[0051] The level of intensity categorization module 214 categorizes a level of intensity of behavioral changes of the subject 102 based on the new set of features. The level of intensity of behavioral changes includes at least one of routine disruptions, action/mobility intensity changes, or behavioral markers using a clustering method. The condition evaluation module 216 evaluates one or more conditions of the subject 102 based on the categorization of the level of intensity of behavioral changes using a trained machine learning model.
[0052] The machine learning model 110 is trained by providing one or more historical levels of intensities of behavioral changes associated with historical new sets of features, and one or more historical conditions associated with a historical plurality of subjects, as training data to obtain the trained machine learning model. In some embodiments, the trained machine learning model detects disruptions in routines using anomaly detection techniques. In some embodiments, behavioral markers for the various psychiatric disorders are detected using a dimensionality reduction method. In some embodiments, action/mobility intensity changes that are drastically different from the patterns normally detected could potentially be used as a mood change behavioral marker. The machine learning model 110 may assess sedentary behavior, mild physical activity, and strenuous physical activity to derive acceleration intensities. In some embodiments, the machine learning model 110 predicts personal attributes and modifications in behaviors, post-diagnosis, from the mobility patterns and intensities.
[0053] FIG. 3 is a block diagram of the pre-processing module 206 according to some embodiments herein. The pre-processing module 206 includes a normalization module 302 and a band pass filtering module 304. The normalization module 302 normalizes the motion sensor data from the digital device 104 based on the trigger rule approach. The normalization module 302 may use the trigger rule approach to organize the motion sensor data in the database 202 by reducing redundancy due to duplicate data. In some embodiments, the database 202 stores the motion sensor data for several subjects at several dynamic time intervals, and the steps of the subject 102 are counted based on a threshold value for the three-dimensional accelerometer. The band pass filtering module 304 pre-processes the normalized motion sensor data through a low pass filter to smoothen it and remove any unnecessary high-frequency components that might appear as noise. In some embodiments, the noise could be jerky motions captured by the accelerometer while the subject 102 walks, or movement vibrations. In some embodiments, the cut-off frequency of the low pass filter is around 0.3 Hz. In some embodiments, the low pass filtered signals are passed through a high pass filter to stabilize the signal. In some embodiments, the cut-off frequency of the high pass filter is around 20 Hz if the sampling frequency is 40 Hz or more.
[0054] FIG. 4A is an exemplary data table 400A that comprises three-dimensional accelerometer data points collected from an accelerometer sensor for motion sensor analysis according to some embodiments herein. The exemplary data table 400A includes three-dimensional accelerometer data points associated with the subject 102 during the usage of the digital device 104. In some embodiments, the three-dimensional accelerometer data points are collected at dynamic time intervals as shown in the data table 400A.
[0055] FIG. 4B is an exemplary data table 400B that comprises three-dimensional gyroscope data points collected from a gyroscope sensor for the motion sensor analysis according to some embodiments herein. The exemplary data table 400B includes three-dimensional gyroscope data points associated with the subject 102 during the usage of the digital device 104. In some embodiments, the three-dimensional gyroscope data points are collected at pre-specified time intervals as shown in the data table 400B.
[0056] FIG. 4C is an exemplary data table 400C that comprises three-dimensional magnetometer data points collected from a magnetometer sensor for the motion sensor analysis according to some embodiments herein. The exemplary data table 400C includes three-dimensional magnetometer data points associated with the subject 102 during the usage of the digital device 104. In some embodiments, the three-dimensional magnetometer data points are collected at pre-specified time intervals as shown in the data table 400C.
[0057] FIG. 5A is an exemplary user interface view 500A of the digital device 104 that depicts an output of action intensities of the subject 102 according to some embodiments herein. The exemplary user interface view 500A depicts the output of four types of action intensities of the subject 102, for example, a sustained activity at 502, a sustained inactivity at 504, a light physical activity at 506, and a vigorous physical activity at 508. The output of the four types of action intensities of the subject 102 has the number of hours spent by the subject 102 on the Y-axis and the number of days on the X-axis.
[0058] FIG. 5B is an exemplary user interface view 500B of the digital device 104 that depicts an output of the step counter of the subject 102 according to some embodiments herein. The exemplary user interface view 500B depicts the number of steps of the subject 102 as it varies over time.
[0059] FIG. 6 is a flow diagram 600 that illustrates a method for monitoring behavioral changes of a subject based on motion sensor data using a machine learning model that enables evaluation of one or more conditions of the subject 102 according to some embodiments herein. At step 602, the method includes the step of obtaining the motion sensor data from a digital device during a dynamic time interval. The digital device includes at least one of a camera, a gyroscope, a magnetometer, or an accelerometer. The motion sensor data includes at least one of three-dimensional accelerometer data, three-dimensional magnetometer data, three-dimensional gyroscope data, or passive sensor data of the subject. At step 604, the method includes the step of pre-processing the motion sensor data to obtain filtered motion sensor data using a normalization method and a band pass filtering method. At step 606, the method includes the step of extracting one or more domain features from the filtered motion sensor data using a domain feature technique. The one or more domain features include time-domain features and frequency-domain features. At step 608, the method includes the step of determining one or more analyzed features by analyzing the one or more domain features. The one or more analyzed features are determined using relations among the one or more domain features using a statistical method. At step 610, the method includes the step of deriving a new set of features by merging the one or more analyzed features using a principal component analysis dimensionality reduction technique. The one or more analyzed features include at least one of data points of acceleration and gyroscope, an angle of the digital device, an altitude, and each of the x-dimension, y-dimension, and z-dimension of acceleration and gyroscope. At step 612, the method includes the step of evaluating, using a trained machine learning model, the one or more conditions of the subject by categorizing a level of intensity of behavioral changes of the subject based on the new set of features using a clustering method, where the level of intensity of behavioral changes includes at least one of routine disruptions, action/mobility intensity changes, or behavioral markers.
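Tying the steps of FIG. 6 together, the sketch below outlines one possible end-to-end pass over a batch of sensing windows. It assumes the hypothetical helpers sketched earlier in this description (preprocess_signal, extract_domain_features, derive_new_feature_set, categorize_intensity, and a model returned by train_condition_model) are in scope, and that several windows are available so that the dimensionality reduction and clustering steps have enough samples; none of these names comes from the specification.

    # Minimal end-to-end sketch of steps 602-612 (assumes the hypothetical helper
    # functions from the earlier sketches are defined in the same module).
    import numpy as np

    def monitor_behavioral_changes(windows, fs, model):
        """windows: list of (acc, gyro) pairs, each an (N, 3) array for one time window."""
        rows = []
        for acc, gyro in windows:
            acc_f = preprocess_signal(np.asarray(acc, dtype=float), fs)    # step 604
            gyro_f = preprocess_signal(np.asarray(gyro, dtype=float), fs)
            f = extract_domain_features(acc_f, gyro_f)                     # step 606
            rows.append(np.hstack([f["acc_angle_mean"],                    # step 608
                                   [f["acc_mag_mean"], f["gyro_mag_mean"],
                                    f["movement_index"]]]))
        analyzed = np.vstack(rows)
        new_features = derive_new_feature_set(analyzed, n_components=3)    # step 610
        intensity_levels = categorize_intensity(new_features, n_levels=4)  # step 612 (clustering)
        # model is assumed to have been trained on features of the same dimensionality.
        conditions = model.predict(new_features)                           # step 612 (evaluation)
        return intensity_levels, conditions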
[0060] A representative hardware environment for practicing the embodiments herein is depicted in FIG. 7, with reference to FIGS. 1 through 6. This schematic drawing illustrates a hardware configuration of a motion sensor analyzing server 108/computer system/ computing device in accordance with the embodiments herein. The system includes at least one processing device CPU 10 that may be interconnected via system bus 15 to various devices such as a random-access memory (RAM) 12, read-only memory (ROM) 16, and an input/output (I/O) adapter 18. The I/O adapter 18 can connect to peripheral devices, such as disk units 58 and program storage devices 50 that are readable by the system. The system can read the inventive instructions on the program storage devices 50 and follow these instructions to execute the methodology of the embodiments herein. The system further includes a user interface adapter 22 that connects a keyboard 28, mouse 50, speaker 52, microphone 55, and/or other user interface devices such as a touch screen device (not shown) to the bus 15 to gather user input. Additionally, a communication adapter 20 connects the bus 15 to a data processing network 52, and a display adapter 25 connects the bus 15 to a display device 26, which provides a graphical user interface (GUI) 56 of the output data in accordance with the embodiments herein, or which may be embodied as an output device such as a monitor, printer, or transmitter, for example.
[0061] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the scope of the appended claims.

CLAIMS
I/We Claim:

1. A system for monitoring behavioral changes of a subject based on a motion sensor data using a machine learning model (110) that enables to evaluate a plurality of conditions of the subject (102), the system comprising:
a digital device (104) that comprises at least one of a camera, a gyroscope, a magnetometer, a global positioning system (GPS), or an accelerometer, wherein the digital device (104) obtains the motion sensor data during a dynamic time interval, wherein the motion sensor data comprises at least one of a three-dimensional accelerometer data, a three-dimensional magnetometer data, a three-dimensional gyroscope data or a passive sensor data of the subject (102);
a motion sensor analyzing server (108) that acquires the motion sensor data from the digital device (104), and processes, using the machine learning model (110), the motion sensor data, wherein the motion sensor analyzing server comprises:
a memory that stores a database;
a processor that is configured to execute the machine learning model and is configured to,
pre-process, using a normalization method and band pass filtering method, the motion sensor data to obtain filtered motion sensor data;
characterized in that,
extract, using a domain feature technique, a plurality of domain features from filtered motion sensor data, wherein the plurality of domain features comprises time-domain features, and frequency domain features;
determine a plurality of analyzed features by analyzing the plurality of domain features, wherein the plurality of analyzed features are determined using relations among the plurality of domain features using a statistical method;
derive, using dimensionality reduction techniques, a new set of features by merging the plurality of analyzed features, wherein the plurality of analyzed features comprise at least one of data points of acceleration and gyroscope, an angle of the digital device, an altitude, each of x-dimension, y-dimension, z-dimension of acceleration and gyroscope; and
evaluate, using a trained machine learning model, the plurality of conditions of the subject (102) by categorizing a level of intensity of behavioral changes of the subject (102) based on the new set of features using a clustering method, wherein the level of intensity of behavioral changes comprises at least one of routine disruptions, action/mobility intensity changes, or behavioral markers.


2. The system as claimed in claim 1, wherein the processor is configured to train the machine learning model by providing a plurality of historical levels of intensities of behavioral changes associated with a historical new set of features and a plurality of historical conditions associated with historical plurality of subjects as training data to obtain the trained machine learning model.


3. The system as claimed in claim 1, wherein the processor is configured to reduce dimensions of the plurality of domain features using the dimensionality reduction techniques to reduce a count of the plurality of domain features while deriving the new set of features, wherein the count is used to provide the plurality of domain features that are required for training the machine learning model.


4. The system as claimed in claim 1, wherein the plurality of domain features comprise a three dimensional angles of acceleration, a three dimensional attitude of the digital device, a three dimensional gravity of the digital device, a magnitude of vector of an angular velocity, an orientation of the angular velocity, a distance travelled, or an index of overall movement.


5. The system as claimed in claim 1, wherein the processor is configured to obtain the three-dimensional accelerometer data by activating the accelerometer of the digital device (104) when the subject (102) exerts an acceleration on the digital device (104), wherein the three-dimensional accelerometer data is obtained at a range of frequencies of 10 Hertz (Hz) to 50 Hz.


6. The system as claimed in claim 1, wherein the processor is configured to obtain the three-dimensional gyroscope data when the subject (102) rotates, wherein the three-dimensional gyroscope data is obtained at a range of frequencies of 10 Hertz (Hz) to 50 Hz.


7. The system as claimed in claim 1, wherein the motion sensor data is pre-processed using a low pass filter to smoothen and remove unnecessary high-frequency components as noise that are captured by jerky movements by the subject (102), wherein the cut-off frequency of the low pass filter is 0.3Hz.


8. The system as claimed in claim 4, wherein the three dimensional attitude of the digital device represents roll-attitude, pitch-attitude, and yaw-attitude, wherein the three dimensional gravity of the digital device represents x-gravity, y-gravity, and z-gravity, wherein the three dimensional angles of acceleration is calculated as acceleration angles of orientation along x-axis, y-axis, z-axis, wherein the orientation of the angular velocity is calculated by pitch, yaw, and roll from the angular velocity, wherein the distance traveled is a total distance traveled by the subject (102) in the dynamic time interval, and wherein the index of overall movement is obtained by calculating an euclidean metric of the motion sensor data of the accelerometer and the gyroscope.



9. A processor-implemented method for monitoring behavioral changes of a subject based on a motion sensor data using a machine learning model (110) that enables to evaluate a plurality of conditions of the subject (102), the method comprising:
obtaining the motion sensor data from a digital device that comprises at least one of a camera, a gyroscope, a magnetometer, a global positioning system (GPS), or an accelerometer during a dynamic time interval, wherein the motion sensor data comprises at least one of a three-dimensional accelerometer data, a three-dimensional magnetometer data, a three-dimensional gyroscope data or a passive sensor data of the subject (102);
pre-processing, using a normalization method and band pass filtering method, the motion sensor data to obtain filtered motion sensor data;
extracting, using a domain feature technique, a plurality of domain features from filtered motion sensor data, wherein the plurality of domain features comprises time-domain features, and frequency domain features;
analyzing the plurality of domain features to determine a plurality of analyzed features, wherein the plurality of analyzed features are determined using relations among the plurality of domain features using a statistical method;
deriving, using dimensionality reduction techniques, a new set of features by merging the plurality of analyzed features, wherein the plurality of analyzed features comprise at least one of data points of acceleration and gyroscope, an angle of the digital device, an altitude, each of x-dimension, y-dimension, z-dimension of acceleration and gyroscope; and
evaluating, using a trained machine learning model, the plurality of conditions of the subject (102) by categorizing a level of intensity of behavioral changes of the subject (102) based on the new set of features, using a clustering method, wherein the level of intensity of behavioral changes comprises at least one of routine disruptions, action/mobility intensity changes, or behavioral markers.


10. The method as claimed in claim 1, wherein the dimensionality reduction techniques that are used to derive the new set of features is at least one of principal component analysis, factor analysis, linear discriminant analysis or other dimensionality reduction techniques.
Dated this 6th day of January, 2021.

Arjun Karthik Bala
IN/PA-1021

Documents

Application Documents

# Name Date
1 202041028975-STATEMENT OF UNDERTAKING (FORM 3) [08-07-2020(online)].pdf 2020-07-08
2 202041028975-PROVISIONAL SPECIFICATION [08-07-2020(online)].pdf 2020-07-08
3 202041028975-PROOF OF RIGHT [08-07-2020(online)].pdf 2020-07-08
4 202041028975-FORM FOR STARTUP [08-07-2020(online)].pdf 2020-07-08
5 202041028975-FORM FOR SMALL ENTITY(FORM-28) [08-07-2020(online)].pdf 2020-07-08
6 202041028975-FORM 1 [08-07-2020(online)].pdf 2020-07-08
7 202041028975-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [08-07-2020(online)].pdf 2020-07-08
8 202041028975-EVIDENCE FOR REGISTRATION UNDER SSI [08-07-2020(online)].pdf 2020-07-08
9 202041028975-DRAWINGS [08-07-2020(online)].pdf 2020-07-08
10 202041028975-FORM-26 [24-07-2020(online)].pdf 2020-07-24
11 202041028975-PostDating-(22-06-2021)-(E-6-178-2021-CHE).pdf 2021-06-22
12 202041028975-APPLICATIONFORPOSTDATING [22-06-2021(online)].pdf 2021-06-22
13 202041028975-Abstract.jpg 2021-10-18
14 202041028975-DRAWING [08-01-2022(online)].pdf 2022-01-08
15 202041028975-CORRESPONDENCE-OTHERS [08-01-2022(online)].pdf 2022-01-08
16 202041028975-COMPLETE SPECIFICATION [08-01-2022(online)].pdf 2022-01-08
17 202041028975-STARTUP [22-07-2022(online)].pdf 2022-07-22
18 202041028975-FORM28 [22-07-2022(online)].pdf 2022-07-22
19 202041028975-FORM 18A [22-07-2022(online)].pdf 2022-07-22
20 202041028975-FER.pdf 2022-11-21
21 202041028975-OTHERS [21-05-2023(online)].pdf 2023-05-21
22 202041028975-FER_SER_REPLY [21-05-2023(online)].pdf 2023-05-21
23 202041028975-CORRESPONDENCE [21-05-2023(online)].pdf 2023-05-21
24 202041028975-CLAIMS [21-05-2023(online)].pdf 2023-05-21
25 202041028975-US(14)-HearingNotice-(HearingDate-18-01-2024).pdf 2023-12-29
26 202041028975-Correspondence to notify the Controller [09-01-2024(online)].pdf 2024-01-09
27 202041028975-Correspondence to notify the Controller [17-01-2024(online)].pdf 2024-01-17
28 202041028975-Annexure [17-01-2024(online)].pdf 2024-01-17
29 202041028975-Written submissions and relevant documents [01-02-2024(online)].pdf 2024-02-01
30 202041028975-PatentCertificate26-11-2024.pdf 2024-11-26
31 202041028975-IntimationOfGrant26-11-2024.pdf 2024-11-26

Search Strategy

1 SearchHistory(13)E_15-11-2022.pdf

ERegister / Renewals

3rd: 18 Jan 2025 (From 08/01/2023 to 08/01/2024)
4th: 18 Jan 2025 (From 08/01/2024 to 08/01/2025)
5th: 18 Jan 2025 (From 08/01/2025 to 08/01/2026)
6th: 18 Jan 2025 (From 08/01/2026 to 08/01/2027)
7th: 18 Jan 2025 (From 08/01/2027 to 08/01/2028)
8th: 18 Jan 2025 (From 08/01/2028 to 08/01/2029)
9th: 18 Jan 2025 (From 08/01/2029 to 08/01/2030)
10th: 18 Jan 2025 (From 08/01/2030 to 08/01/2031)