Abstract: The present disclosure relates to a method and a system for authenticating a user. In one embodiment, one or more input and target data samples extracted from a plurality of physiological and movement signals of the user are processed to train one or more regression models. During real-time authentication, the input and target data samples are extracted from the plurality of physiological and movement signals and mapped onto the trained regression models to determine a regression error. Based on the regression error, an appropriate authentication signal is generated and transmitted to the user. Using dynamically selected multiple input and target data samples for user authentication increases the accuracy of authentication, thereby reducing the possibility of invalid authentication. Further, the power consumed by the sensors and the computational load are reduced by dynamically powering the one or more sensors up and down based on their usage during the authentication process. FIG. 3
CLAIMS
We Claim:
1. A method of authenticating a subject, the method comprising:
receiving in real time, by a processor of a wearable device, at least a plurality of physiological and movement signals of the subject from one or more physiological and activity sensors of the wearable device;
deriving, by the processor, one or more input and target data samples associated with the plurality of received physiological and movement signals;
determining, by the processor, a regression error value based on the derived input and target data samples; and
transmitting, by the processor, a signal authenticating the subject based on the comparison of the determined regression error value with a predetermined threshold regression error value.
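The authentication flow of claim 1 can be sketched as below; the mean-squared-error metric, the toy linear model, and the `authenticate` helper name are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def authenticate(input_samples, target_samples, model, threshold):
    """Hypothetical sketch of claim 1: map input samples through a trained
    regression model and accept the subject when the regression error
    (here, mean squared error -- an assumption) stays within the threshold."""
    predicted = model(input_samples)
    regression_error = float(np.mean((predicted - target_samples) ** 2))
    signal = "AUTHENTICATED" if regression_error <= threshold else "REJECTED"
    return signal, regression_error

# Toy model trained (offline) to predict the target as twice the input.
model = lambda x: 2.0 * x
signal, err = authenticate(np.array([1.0, 2.0]), np.array([2.1, 3.9]), model, threshold=0.05)
```

In practice the threshold would be the predetermined threshold regression error value established during enrolment.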
2. The method as claimed in claim 1, further comprising:
receiving at least a plurality of physiological and movement signals of the subject from the one or more physiological and activity sensors;
deriving one or more input and target data samples associated with the plurality of received physiological and movement signals;
generating one or more combinations of input and target data samples by randomly selecting an input and a target data sample from the corresponding derived input and target data samples; and
training one or more regression models based on the one or more generated combinations of the input and target data samples to generate one or more trained regression models associated with the subject.
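One way to realize the training of claim 2 is sketched below; the random pairing of signals and the simple least-squares line fit are assumptions chosen for illustration, since the claim does not fix the regression family.

```python
import random
import numpy as np

def train_models(signals, n_models, seed=0):
    """Sketch of claim 2: repeatedly pick a random (input, target) pair of
    signals and fit a linear regression model to each combination.
    `signals` maps a signal name to a 1-D sample array (hypothetical layout)."""
    rng = random.Random(seed)
    names = sorted(signals)
    models = {}
    for _ in range(n_models):
        x_name, y_name = rng.sample(names, 2)  # random input/target combination
        slope, intercept = np.polyfit(signals[x_name], signals[y_name], 1)
        models[(x_name, y_name)] = (slope, intercept)
    return models

signals = {"heart_rate": np.array([60.0, 70.0, 80.0]),
           "step_rate": np.array([90.0, 105.0, 120.0])}
models = train_models(signals, n_models=1)
```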
3. The method as claimed in claim 2, further comprising:
determining a model regression error value for each of the trained regression models;
calculating a training progress value at a time based on the determined model regression error value and a predetermined training threshold value; and
displaying the determined training progress value.
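The training-progress calculation of claim 3 admits several metrics; the fraction-of-converged-models formula below is one plausible assumption, not the claimed formula.

```python
def training_progress(model_errors, training_threshold):
    """Sketch of claim 3: report progress as the percentage of trained
    regression models whose model regression error value is already below
    the predetermined training threshold value (an assumed metric)."""
    converged = sum(1 for error in model_errors if error < training_threshold)
    return 100.0 * converged / len(model_errors)

# Two of four models have converged at this point in training.
progress = training_progress([0.02, 0.08, 0.01, 0.30], training_threshold=0.05)
```

The returned percentage could then be displayed on the wearable device as the training progress value.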
4. The method as claimed in claims 1 and 2, wherein deriving the one or more input and target data samples comprises:
extracting, at a predetermined time period, input data samples associated with at least a first subset of the plurality of received physiological and movement signals; and
extracting, at a predetermined time period, target data samples associated with at least a second subset of the plurality of received physiological and movement signals, wherein the first subset and the second subset comprise the plurality of physiological and movement signals.
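The windowed extraction of claim 4 can be sketched as follows; the window-mean feature and the signal names are assumptions used only to make the example concrete.

```python
import numpy as np

def extract_samples(signals, input_names, target_names, window):
    """Sketch of claim 4: at each predetermined period (`window` samples),
    extract input samples from a first subset of the signals and target
    samples from a second subset. `signals` maps name -> 1-D array."""
    n = min(len(v) for v in signals.values())
    starts = range(0, n - window + 1, window)
    inputs = np.array([[signals[k][i:i + window].mean() for k in input_names]
                       for i in starts])
    targets = np.array([[signals[k][i:i + window].mean() for k in target_names]
                        for i in starts])
    return inputs, targets

signals = {"ppg": np.arange(8.0), "accel": 2.0 * np.arange(8.0)}
X, y = extract_samples(signals, ["ppg"], ["accel"], window=4)
```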
5. The method as claimed in claim 1, wherein determining the regression error value comprises:
mapping the derived input data samples onto one or more trained regression models associated with the subject;
obtaining the one or more trained target data samples corresponding to the trained regression models that are mapped;
comparing the one or more trained target data samples with the derived target data samples; and
determining the regression error value based on the comparison.
6. The method as claimed in claims 1 and 2, further comprising:
determining a first regression score of each of the trained regression models associated with an authorized subject based on the corresponding regression error of each trained regression model associated with the authorized subject;
determining a second regression score of each of the trained regression models associated with an unauthorized subject based on the corresponding regression error of each trained regression model associated with the unauthorized subject;
determining a significance score for each of the trained regression models based on the first and the second regression scores thus determined;
selecting one or more trained regression models having the maximum significance score; and
dynamically performing at least one of enabling and disabling the one or more physiological and activity sensors based on the usage within the selected trained regression models.
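The model-selection and sensor-gating steps of claim 6 can be sketched as below; the error-gap significance score and the model-to-sensor mapping are assumptions, since the claim only requires a score derived from both regression scores.

```python
def select_models(auth_errors, imposter_errors, top_k):
    """Sketch of claim 6: score each trained model by the gap between its
    error on an unauthorized subject (ideally high) and on the authorized
    subject (ideally low), then keep the top-k models by that score."""
    scores = {m: imposter_errors[m] - auth_errors[m] for m in auth_errors}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

def active_sensors(selected, model_sensors):
    """Enable only the sensors used by the selected models; the rest may be
    powered down to save energy (the model -> sensor map is hypothetical)."""
    return set().union(*(model_sensors[m] for m in selected))

auth = {"m1": 0.02, "m2": 0.05, "m3": 0.04}
imposter = {"m1": 0.40, "m2": 0.10, "m3": 0.35}
kept = select_models(auth, imposter, top_k=2)
enabled = active_sensors(kept, {"m1": {"ppg"}, "m2": {"gyro"}, "m3": {"ppg", "accel"}})
```

Here "m2" separates the two subjects poorly, so it is dropped and its gyroscope can be powered down.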
7. A wearable device for authenticating a subject, said device comprising:
one or more physiological and activity sensors;
a processor coupled with the one or more physiological and activity sensors; and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to:
receive in real time at least a plurality of physiological and movement signals of the subject from the one or more physiological and activity sensors;
derive one or more input and target data samples associated with the plurality of received physiological and movement signals;
determine a regression error value based on the derived input and target data samples; and
transmit a signal authenticating the subject based on the comparison of the determined regression error value with a predetermined threshold regression error value.
8. The wearable device as claimed in claim 7, wherein the instructions, on execution, further cause the processor to:
receive at least a plurality of physiological and movement signals of the subject from the one or more physiological and activity sensors;
derive one or more input and target data samples associated with the plurality of received physiological and movement signals;
generate one or more combinations of input and target data samples by randomly selecting an input and a target data sample from the corresponding derived input and target data samples; and
train one or more regression models based on the one or more generated combinations of the input and target data samples to generate one or more trained regression models associated with the subject.
9. The wearable device as claimed in claim 8, wherein the instructions, on execution, further cause the processor to:
determine a model regression error value for each of the trained regression models;
calculate a training progress value at a time based on the determined model regression error value and a predetermined training threshold value; and
display the determined training progress value.
10. The wearable device as claimed in claims 7 and 8, wherein the instructions, on execution, further cause the processor to derive the one or more input and target data samples by:
extracting, at a predetermined time period, input data samples associated with at least a first subset of the plurality of received physiological and movement signals; and
extracting, at a predetermined time period, target data samples associated with at least a second subset of the plurality of received physiological and movement signals, wherein the first subset and the second subset comprise the plurality of physiological and movement signals.
11. The wearable device as claimed in claim 7, wherein the instructions, on execution, further cause the processor to determine the regression error value by:
mapping the derived input data samples onto one or more trained regression models associated with the subject;
obtaining the one or more trained target data samples corresponding to the trained regression models that are mapped;
comparing the one or more trained target data samples with the derived target data samples; and
determining the regression error value based on the comparison.
12. The wearable device as claimed in claims 7 and 8, wherein the instructions, on execution, further cause the processor to:
determine a first regression score of each of the trained regression models associated with an authorized subject based on the corresponding regression error of each trained regression model associated with the authorized subject;
determine a second regression score of each of the trained regression models associated with an unauthorized subject based on the corresponding regression error of each trained regression model associated with the unauthorized subject;
determine a significance score for each of the trained regression models based on the first and the second regression scores thus determined;
select one or more trained regression models having the maximum significance score; and
dynamically perform at least one of enabling and disabling the one or more physiological and activity sensors based on the usage within the selected trained regression models.
13. A non-transitory computer readable medium including instructions stored thereon that when processed by at least one processor cause a system to perform acts of:
receiving at least a plurality of physiological and movement signals of a subject from one or more physiological and activity sensors of a wearable device;
deriving one or more input and target data samples associated with the plurality of received physiological and movement signals at a predetermined time period;
determining a regression error value based on the derived input and target data samples; and
transmitting a signal authenticating the subject based on the comparison of the determined regression error value with a predetermined threshold regression error value.
14. A method of authenticating a subject, the method comprising:
receiving, by a processor of an authenticating device, one or more input and target data samples from a wearable device, wherein the one or more input and target data samples are derived from a plurality of physiological and movement signals of the subject received by one or more physiological and activity sensors of the wearable device;
determining, by the processor, a regression error value based on the derived input and target data samples; and
authenticating, by the processor, the subject based on the comparison of the determined regression error value with a predetermined threshold regression error value.
15. A system for authenticating a subject, said system comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions, which, on execution, cause the processor to:
receive one or more input and target data samples from a wearable device, wherein the one or more input and target data samples are associated with at least a plurality of physiological and movement signals of the subject received by one or more physiological and activity sensors of the wearable device;
determine a regression error value based on the received input and target data samples; and
authenticate the subject based on the comparison of the determined regression error value with a predetermined threshold regression error value.
Dated this 10th day of November 2014
M.S. Devi
Of K&S Partners
Agent for the Applicant
FIELD OF THE DISCLOSURE
The present subject matter relates, in general, to a system and method for authenticating a user and, more particularly but not exclusively, to authentication using biometric data.
| # | Document | Date |
|---|---|---|
| 1 | 5655-CHE-2014 FORM-9 10-11-2014.pdf | 2014-11-10 |
| 2 | 5655-CHE-2014 FORM-18 10-11-2014.pdf | 2014-11-10 |
| 3 | 5655-CHE-2014-Request For Certified Copy-Online(13-11-2014).pdf | 2014-11-13 |
| 4 | IP28309-spec.pdf | 2014-11-14 |
| 5 | IP28309-fig.pdf | 2014-11-14 |
| 6 | FORM 5-IP28309.pdf | 2014-11-14 |
| 7 | FORM 3-IP28309.pdf | 2014-11-14 |
| 8 | 5655CHE2014_certifiedcopyrequest.pdf | 2014-11-14 |
| 9 | abstract5655-CHE-2014.jpg | 2014-11-17 |
| 10 | 5655-CHE-2014 POWER OF ATTORNEY 11-05-2015.pdf | 2015-05-11 |
| 11 | 5655-CHE-2014 FORM-1 11-05-2015.pdf | 2015-05-11 |
| 12 | 5655-CHE-2014 CORRESPONDENCE OTHERS 11-05-2015.pdf | 2015-05-11 |
| 13 | 5655CHE2014_05-08-2019.pdf | |
| 14 | 5655-CHE-2014-FER.pdf | 2019-08-09 |
| 15 | 5655-CHE-2014-ABSTRACT [07-02-2020(online)].pdf | 2020-02-07 |
| 16 | 5655-CHE-2014-CLAIMS [07-02-2020(online)].pdf | 2020-02-07 |
| 17 | 5655-CHE-2014-DRAWING [07-02-2020(online)].pdf | 2020-02-07 |
| 18 | 5655-CHE-2014-FER_SER_REPLY [07-02-2020(online)].pdf | 2020-02-07 |
| 19 | 5655-CHE-2014-FORM 3 [07-02-2020(online)].pdf | 2020-02-07 |
| 20 | 5655-CHE-2014-FORM-26 [17-05-2021(online)].pdf | 2021-05-17 |
| 21 | 5655-CHE-2014-Correspondence to notify the Controller [17-05-2021(online)].pdf | 2021-05-17 |
| 22 | 5655-CHE-2014-Written submissions and relevant documents [11-06-2021(online)].pdf | 2021-06-11 |
| 23 | 5655-CHE-2014-Annexure [11-06-2021(online)].pdf | 2021-06-11 |
| 24 | 5655-CHE-2014-IntimationOfGrant25-06-2021.pdf | 2021-06-25 |
| 25 | 5655-CHE-2014-PatentCertificate25-06-2021.pdf | 2021-06-25 |
| 26 | 5655-CHE-2014-US(14)-HearingNotice-(HearingDate-28-05-2021).pdf | 2021-10-17 |
| 27 | 5655-CHE-2014-PROOF OF ALTERATION [08-11-2022(online)].pdf | 2022-11-08 |
| 28 | 5655-CHE-2014-RELEVANT DOCUMENTS [20-09-2023(online)].pdf | 2023-09-20 |