Abstract: A method of classifying activities performed by a user using motion data is disclosed. First, raw motion data of a user performing an exercise activity and a non-exercise activity over a period of time is captured. The exercise activity comprises at least a first exercise activity and a second exercise activity. The first exercise activity or the second exercise activity is classified using classifiers. The classifiers are trained beforehand using motion data stored in a test dataset or a master dataset. The test dataset indicates a real-time motion pattern specified by the user, and the master dataset indicates a pre-determined motion pattern corresponding to the first exercise activity and the second exercise activity. Further, labels corresponding to the first exercise activity and the second exercise activity are presented to the user. The classifiers, the test dataset, and the master dataset are further improved using the motion data. Fig. 5B
WE CLAIM:
1. A method for training of classifiers for identifying activities based on motion data, the method
comprising:
capturing, by a sensor, raw motion data of a user performing an exercise activity and a non-exercise activity over a period of time, wherein the exercise activity comprises at least one of a first exercise activity and a second exercise activity;
processing, by a processor, the raw motion data to train classifiers, wherein the classifiers are trained to identify a motion pattern of each of at least one of the first exercise activity and the second exercise activity, wherein the motion pattern of each of at least one of the first exercise activity and the second exercise activity is identified by comparing the motion pattern with a master dataset or a user's training dataset, wherein the master dataset indicates a pre-determined motion pattern corresponding to at least one of the first exercise activity and the second exercise activity, wherein the user's training dataset indicates a user-specific motion pattern corresponding to at least one of the first exercise activity and the second exercise activity, wherein at least one of the first exercise activity and the second exercise activity is identified by segregating specific motion data samples corresponding to at least one of the first exercise activity and the second exercise activity, respectively, from the raw motion data using an unsupervised learning technique, and wherein each of the specific motion data samples is further processed to label at least one of the first exercise activity and the second exercise activity corresponding to the motion pattern identified;
receiving, by the sensor, subsequent motion data from the user, wherein the subsequent motion data corresponds to a subsequent exercise activity;
classifying, by the processor, the subsequent motion data corresponding to the subsequent exercise activity into at least one of the first exercise activity and the second exercise activity in one of the user's training dataset and the master dataset, wherein the subsequent motion data is further added to the master dataset and the user's training dataset, wherein the subsequent motion data is further used to train the classifiers for identifying the subsequent exercise activity as at least one of the first exercise activity and the second exercise activity using a machine learning technique; and
predicting, by the processor, the label corresponding to at least one of the first exercise activity and the second exercise activity for the subsequent motion data using voting-based prediction based on a supervised learning technique.
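For illustration only, the training flow of claim 1 might be sketched as follows; this is not the patented implementation. The sketch assumes NumPy and scikit-learn, uses k-means for the unsupervised segregation step and a random forest as one of the classifiers, and every function and variable name (segment_activities, label_by_master_pattern, raw_windows, master_patterns) is hypothetical. Nearest-pattern matching against the master dataset stands in for the claim's comparison of identified motion patterns with the pre-determined patterns.

```python
# Illustrative sketch of the claim-1 training flow (assumed libraries:
# numpy, scikit-learn; all function and variable names are hypothetical).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def segment_activities(raw_windows, n_activities=3):
    """Unsupervised step: segregate motion-data samples into clusters
    corresponding to the first exercise activity, the second exercise
    activity, and the non-exercise activity."""
    km = KMeans(n_clusters=n_activities, n_init=10, random_state=0)
    return km.fit_predict(raw_windows)

def label_by_master_pattern(raw_windows, cluster_ids, master_patterns):
    """Label each cluster by comparing its mean motion pattern with the
    pre-determined patterns in the master dataset (nearest pattern wins)."""
    labels = np.empty(len(raw_windows), dtype=int)
    for c in np.unique(cluster_ids):
        centroid = raw_windows[cluster_ids == c].mean(axis=0)
        dists = [np.linalg.norm(centroid - p) for p in master_patterns]
        labels[cluster_ids == c] = int(np.argmin(dists))
    return labels

# raw_windows: one feature vector per fixed-length window of raw motion data
raw_windows = np.random.randn(200, 12)       # placeholder motion data
master_patterns = np.random.randn(3, 12)     # placeholder master dataset
cluster_ids = segment_activities(raw_windows)
labels = label_by_master_pattern(raw_windows, cluster_ids, master_patterns)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(raw_windows, labels)                 # supervised classifier training
```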
2. The method as claimed in claim 1, wherein the raw motion data and the subsequent motion
data of the user performing at least one of the first exercise activity and the second exercise
activity are captured by tracking acceleration, angular velocity, and altimeter data from the
sensor worn by the user.
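A minimal sketch of how the streams recited in claim 2 (acceleration, angular velocity, altimeter data) might be windowed into feature vectors before clustering or classification; the window and step sizes, the mean/std features, and the function name make_windows are all assumptions for illustration.

```python
# Hypothetical sketch: turning streamed accelerometer, gyroscope, and
# altimeter samples into fixed-length windows of summary features.
import numpy as np

def make_windows(accel, gyro, alt, window=100, step=50):
    """accel, gyro: (N, 3) arrays; alt: (N,) array. Returns one
    mean/std feature vector per sliding window."""
    feats = []
    for start in range(0, len(alt) - window + 1, step):
        sl = slice(start, start + window)
        chunk = np.hstack([accel[sl], gyro[sl], alt[sl, None]])  # (window, 7)
        feats.append(np.concatenate([chunk.mean(axis=0), chunk.std(axis=0)]))
    return np.array(feats)  # shape (n_windows, 14)
```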
3. The method as claimed in claim 1, wherein the unsupervised learning technique comprises one of k-means clustering, mixture models, and hierarchical clustering.
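Either of the other two techniques recited in claim 3 could replace k-means in the sketch above; assuming scikit-learn, a Gaussian mixture model and agglomerative (hierarchical) clustering are drop-in alternatives. The function names are hypothetical.

```python
# Alternative unsupervised techniques recited in claim 3, assuming
# scikit-learn; interchangeable with the k-means step sketched earlier.
from sklearn.mixture import GaussianMixture
from sklearn.cluster import AgglomerativeClustering

def cluster_with_mixture(raw_windows, n_activities=3):
    # Mixture model: soft clusters, hard-assigned to the most likely one.
    gmm = GaussianMixture(n_components=n_activities, random_state=0)
    return gmm.fit_predict(raw_windows)

def cluster_hierarchically(raw_windows, n_activities=3):
    # Hierarchical (agglomerative) clustering cut at n_activities clusters.
    return AgglomerativeClustering(n_clusters=n_activities).fit_predict(raw_windows)
```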
4. The method as claimed in claim 1, wherein the supervised learning technique comprises one of a neural network, a boosted tree network, and a random forest network.
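A hedged sketch of the voting-based prediction recited in claims 1 and 4, assuming scikit-learn: MLPClassifier stands in for the neural network, GradientBoostingClassifier for the boosted tree network, and a hard (majority) vote combines the three predictions. Hyperparameters are placeholders, not values from the specification.

```python
# Majority vote over the three classifier families recited in claim 4
# (assumed library: scikit-learn; hyperparameters are illustrative).
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.neural_network import MLPClassifier

voter = VotingClassifier(
    estimators=[
        ("nn", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
        ("boost", GradientBoostingClassifier()),
        ("forest", RandomForestClassifier(n_estimators=100)),
    ],
    voting="hard",  # each classifier votes; the majority label is predicted
)
# Usage: voter.fit(raw_windows, labels); voter.predict(subsequent_windows)
```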
5. A device for training of classifiers for identifying activities based on motion data, the device comprising:
at least one sensor communicatively coupled to the device, wherein the at least one sensor captures raw motion data of a user performing at least one of a first exercise activity and a second exercise activity over a period of time, and wherein the device further comprises a memory and a processor coupled to the memory, wherein the processor executes program instructions stored in the memory, to:
process the raw motion data to train classifiers, wherein the classifiers are trained to identify a motion pattern of each of the first exercise activity and the second exercise activity, wherein the motion pattern of each of the first exercise activity and the second exercise activity is identified by comparing the motion pattern with a master dataset or a user's training dataset, wherein the master dataset indicates a pre-determined motion pattern corresponding to at least one of the first exercise activity and the second exercise activity, wherein the user's training dataset indicates a real-time motion pattern, specified by the user, corresponding to at least one of the first exercise activity and the second exercise activity, wherein at least one of the first exercise activity and the second exercise activity is identified by segregating specific motion data samples corresponding to at least one of the first exercise activity and the second exercise activity, respectively, from the raw motion data using an unsupervised learning technique, and wherein each of the specific motion data samples is further processed to label at least one of the first exercise activity and the second exercise activity corresponding to the motion pattern identified;
receive subsequent motion data from the user, wherein the subsequent motion data corresponds to a subsequent exercise activity;
classify the subsequent motion data corresponding to the subsequent exercise activity into at least one of the first exercise activity and the second exercise activity in one of the user's training dataset and the master dataset, wherein the subsequent motion data is further added to the master dataset and the user's training dataset, wherein the subsequent motion data is further used to train the classifiers for identifying the subsequent exercise activity as at least one of the first exercise activity and the second exercise activity using a machine learning technique; and
predict the label corresponding to at least one of the first exercise activity and the second exercise activity for the subsequent motion data using voting-based prediction based on a supervised learning technique.
6. The device as claimed in claim 5, wherein the at least one sensor comprises one of an accelerometer, a magnetometer, a gyroscope, a Micro Electronic Mechanical System (MEMS), and a Nano Electronic Mechanical System (NEMS).
7. A system for training of classifiers for identifying activities based on motion data, the system comprising:
at least one sensor to capture raw motion data of a user performing at least one of a first exercise activity and a second exercise activity over a period of time;
a device coupled to the at least one sensor for transmitting the raw motion data; and
a server communicatively coupled to the device, wherein the server comprises a memory and a processor coupled to the memory, wherein the processor executes program instructions stored in the memory, to:
process the raw motion data to train classifiers, wherein the classifiers are trained to identify a motion pattern of each of the first exercise activity and the second exercise activity, wherein the motion pattern of each of the first exercise activity and the second exercise activity is identified by comparing the motion pattern with a master dataset or a user's training dataset, wherein the master dataset indicates a pre-determined motion pattern corresponding to at least one of the first exercise activity and the second exercise activity, wherein the user's training dataset indicates a real-time motion pattern, specified by the user, corresponding to at least one of the first exercise activity and the second exercise activity, wherein at least one of the first exercise activity and the second exercise activity is identified by segregating specific motion data samples corresponding to at least one of the first exercise activity and the second exercise activity, respectively, from the raw motion data using an unsupervised learning technique, and wherein each of the specific motion data samples is further processed to label at least one of the first exercise activity and the second exercise activity corresponding to the motion pattern identified;
receive subsequent motion data from the user, wherein the subsequent motion data corresponds to a subsequent exercise activity;
classify the subsequent motion data corresponding to the subsequent exercise activity into at least one of the first exercise activity and the second exercise activity in one of the user's training dataset and the master dataset, wherein the subsequent motion data is further added to the master dataset and the user's training dataset, wherein the subsequent motion data is further used to train the classifiers for identifying the subsequent exercise activity as at least one of the first exercise activity and the second exercise activity using a machine learning technique; and
predict the label corresponding to at least one of the first exercise activity and the second exercise activity for the subsequent motion data using voting-based prediction based on a supervised learning technique.
8. The system as claimed in claim 7, wherein the at least one sensor comprises one of an accelerometer, a magnetometer, a gyroscope, a Micro Electronic Mechanical System (MEMS), and a Nano Electronic Mechanical System (NEMS).
9. The system as claimed in claim 7, wherein the device is one of an electronic device, a mobile phone, a smart watch, fitness equipment, a display device, and a wearable garment.
10. A method for deriving motion pattern of activities based on motion data, the method
comprising:
capturing, by a sensor, raw motion data of a user performing an exercise activity and a non-exercise activity over a period of time, wherein the exercise activity comprises at least one of a first exercise activity and a second exercise activity;
processing, by a processor, the raw motion data to train classifiers, wherein the classifiers are trained to identify a motion pattern of each of at least one of the first exercise activity and the second exercise activity by segregating specific motion data samples corresponding to at least one of the first exercise activity and the second exercise activity, respectively, from the raw motion data, wherein at least one of the first exercise activity and the second exercise activity is identified by segregating the specific motion data samples from the raw motion data using an unsupervised learning technique, and wherein each of the specific motion data samples is further processed to label at least one of the first exercise activity and the second exercise activity corresponding to the motion pattern identified;
receiving, by the sensor, subsequent motion data from the user, wherein the subsequent motion data corresponds to a subsequent exercise activity;
deriving, by the processor, the motion pattern of the subsequent motion data corresponding to the subsequent exercise activity as at least one of the first exercise activity and the second exercise activity, wherein the motion pattern of the subsequent motion data derived is used to identify the motion pattern of each of the first exercise activity and the second exercise activity; and
predicting, by the processor, the label corresponding to at least one of the first exercise activity and the second exercise activity for the subsequent motion data using voting-based prediction based on a supervised learning technique.
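The incremental loop common to claims 1, 5, 7, and 10 (classify subsequent motion data, add it to both the master dataset and the user's training dataset, retrain the classifiers) might look like the following sketch; the function name incorporate_subsequent and the retrain-from-scratch strategy are assumptions, and voter is the ensemble from the earlier sketch.

```python
# Hypothetical sketch of the incremental update loop: classify subsequent
# motion data, grow both datasets, and retrain (assumed library: numpy).
import numpy as np

def incorporate_subsequent(voter, master_X, master_y, user_X, user_y, new_windows):
    predicted = voter.predict(new_windows)           # label the new activity
    master_X = np.vstack([master_X, new_windows])    # grow master dataset
    master_y = np.concatenate([master_y, predicted])
    user_X = np.vstack([user_X, new_windows])        # grow user's dataset
    user_y = np.concatenate([user_y, predicted])
    voter.fit(master_X, master_y)                    # retrain the classifiers
    return voter, master_X, master_y, user_X, user_y
```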
| # | Name | Date |
|---|---|---|
| 1 | 201947028878.pdf | 2019-07-18 |
| 2 | 201947028878-STATEMENT OF UNDERTAKING (FORM 3) [18-07-2019(online)].pdf | 2019-07-18 |
| 3 | 201947028878-REQUEST FOR EXAMINATION (FORM-18) [18-07-2019(online)].pdf | 2019-07-18 |
| 4 | 201947028878-POWER OF AUTHORITY [18-07-2019(online)].pdf | 2019-07-18 |
| 5 | 201947028878-MARKED COPIES OF AMENDEMENTS [18-07-2019(online)].pdf | 2019-07-18 |
| 6 | 201947028878-FORM FOR SMALL ENTITY(FORM-28) [18-07-2019(online)].pdf | 2019-07-18 |
| 7 | 201947028878-FORM 18 [18-07-2019(online)].pdf | 2019-07-18 |
| 8 | 201947028878-FORM 13 [18-07-2019(online)].pdf | 2019-07-18 |
| 9 | 201947028878-FORM 13 [18-07-2019(online)]-1.pdf | 2019-07-18 |
| 10 | 201947028878-FORM 1 [18-07-2019(online)].pdf | 2019-07-18 |
| 11 | 201947028878-FIGURE OF ABSTRACT [18-07-2019(online)].jpg | 2019-07-18 |
| 12 | 201947028878-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-07-2019(online)].pdf | 2019-07-18 |
| 13 | 201947028878-DRAWINGS [18-07-2019(online)].pdf | 2019-07-18 |
| 14 | 201947028878-DECLARATION OF INVENTORSHIP (FORM 5) [18-07-2019(online)].pdf | 2019-07-18 |
| 15 | 201947028878-COMPLETE SPECIFICATION [18-07-2019(online)].pdf | 2019-07-18 |
| 16 | 201947028878-AMMENDED DOCUMENTS [18-07-2019(online)].pdf | 2019-07-18 |
| 17 | 201947028878-Proof of Right (MANDATORY) [29-07-2019(online)].pdf | 2019-07-29 |
| 18 | Correspondence by Agent _Power of Attorney _13-08-2019.pdf | 2019-08-13 |
| 19 | Correspondence by Agent _Form-26 _13-08-2019.pdf | 2019-08-13 |
| 20 | 201947028878-FER.pdf | 2021-10-18 |
| 21 | 201947028878-OTHERS [06-03-2022(online)].pdf | 2022-03-06 |
| 22 | 201947028878-FER_SER_REPLY [06-03-2022(online)].pdf | 2022-03-06 |
| 23 | 201947028878-CLAIMS [06-03-2022(online)].pdf | 2022-03-06 |
| 24 | 201947028878-PatentCertificate02-06-2023.pdf | 2023-06-02 |
| 25 | 201947028878-IntimationOfGrant02-06-2023.pdf | 2023-06-02 |