Abstract: A method for identifying an action, from amongst a plurality of known actions, performed by a person includes receiving n skeleton frames of the person performing the action from a skeleton recording device (104), where each skeleton frame represents data of N skeleton joints of the person. The method further includes computing a covariance matrix of a feature matrix based on joint coordinates of the skeleton joints from each skeleton frame. The method also includes identifying an error covariance matrix based on the value of n, the value of N, and the value of the variance of the joint coordinates of one of the skeleton joints. Further, the method includes evaluating a noise-free covariance matrix based on the covariance matrix and the error covariance matrix. Furthermore, an active joint matrix is determined, based on an active joint threshold and the noise-free covariance matrix, to identify the action from amongst the plurality of known actions.
CLAIMS:
1. A method for identifying an action, from amongst a plurality of known actions, performed by a person, the method comprising:
receiving n skeleton frames of the person performing the action, wherein the n skeleton frames are received from a skeleton recording device (104), and wherein each of the n skeleton frames represents data of N skeleton joints of the person;
computing, by a processor (108), a covariance matrix of a feature matrix for the action, wherein the feature matrix is based on joint coordinates of the N skeleton joints from each of the n skeleton frames for the action;
identifying, by the processor (108), an error covariance matrix for the action based on value of n, value of N, and value of variance of the joint coordinates of one of the N skeleton joints, wherein the error covariance matrix relates to noise present in the joint coordinates of the N skeleton joints of the person;
evaluating, by the processor (108), a noise-free covariance matrix for the action based on the covariance matrix and the error covariance matrix for the action;
determining, by the processor (108), an active joint matrix for the action based on an active joint threshold and the noise-free covariance matrix for the action, wherein the active joint matrix represents a plurality of most active joint coordinates that uniquely identifies the action; and
identifying, by the processor (108), the action, from amongst the plurality of known actions, based on the active joint matrix for the action.
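The core of claim 1 can be sketched in code. This is a minimal illustration only, assuming the feature matrix holds one joint coordinate per row and one skeleton frame per column; the function names (feature_covariance, noise_free_covariance, active_joints) are illustrative and not from the claims, and the thresholding of diagonal entries is one plausible reading of "active joint threshold".

```python
# Illustrative sketch of the claimed pipeline core. F is the feature matrix:
# rows are joint coordinates, columns are the n skeleton frames.

def feature_covariance(F):
    """Sample covariance of the rows of F (each row = one joint coordinate)."""
    rows, cols = len(F), len(F[0])
    means = [sum(r) / cols for r in F]
    C = [[0.0] * rows for _ in range(rows)]
    for i in range(rows):
        for j in range(rows):
            C[i][j] = sum((F[i][k] - means[i]) * (F[j][k] - means[j])
                          for k in range(cols)) / (cols - 1)
    return C

def noise_free_covariance(C, T):
    """Subtract the error covariance T (the noise model) from the raw covariance C."""
    n = len(C)
    return [[C[i][j] - T[i][j] for j in range(n)] for i in range(n)]

def active_joints(C_clean, threshold):
    """Indices of coordinates whose noise-free variance exceeds the threshold."""
    return [i for i in range(len(C_clean)) if C_clean[i][i] > threshold]
```

A coordinate that barely moves across frames has near-zero variance on the diagonal of the noise-free covariance matrix and is filtered out; the surviving indices feed the active joint matrix.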
2. The method as claimed in claim 1 further comprising:
evaluating a histogram-based feature vector for the action based on the active joint matrix, wherein the histogram-based feature vector represents a histogram of occurrences of the plurality of most active joint coordinates in the active joint matrix for the action;
computing an identification feature vector that uniquely identifies the action, wherein the identification feature vector is of a dimension smaller than that of the histogram-based feature vector for the action and is computed based on a pre-defined dimension reduction technique performed on the histogram-based feature vector for the action; and
wherein the action, from amongst the plurality of known actions, performed by the person, is identified based on comparison of the identification feature vector for the action with training feature vectors for the plurality of known actions, and wherein a training feature vector for a known action uniquely identifies the known action.
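The feature-vector steps of claim 2 above can be sketched as follows. This is an assumption-laden illustration: the histogram is taken over coordinate indices in the active joint matrix, the "pre-defined dimension reduction technique" is left abstract (the claim does not name one), and identification is shown as a nearest-training-vector comparison, which is only one way to realize the claimed comparison.

```python
# Illustrative sketch of claim 2: histogram of active-coordinate occurrences,
# then comparison against per-action training vectors. Names are hypothetical.

from collections import Counter

def histogram_feature(active_coords, n_coords):
    """Histogram of how often each coordinate index appears as 'most active'."""
    counts = Counter(active_coords)
    return [counts.get(i, 0) for i in range(n_coords)]

def identify(ident_vec, training_vecs):
    """Pick the known action whose training vector is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training_vecs, key=lambda name: dist(ident_vec, training_vecs[name]))
```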
3. The method as claimed in claim 1 further comprising:
obtaining the joint coordinates of the N skeleton joints from each of the n skeleton frames for the action;
subtracting, for each of the n skeleton frames, joint coordinates of a stable joint from the joint coordinates of the N skeleton joints of the person to obtain a plurality of deviated joint coordinates; and
evaluating the feature matrix for the action based on arranging, for each of the n skeleton frames, the plurality of deviated joint coordinates column-wise.
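The feature-matrix construction of claim 3 above can be sketched as below, assuming each frame is a list of (x, y, z) tuples, one per joint, and that the stable joint is identified by its index (e.g. the hip centre joint); both assumptions are illustrative, not stated in the claim.

```python
# Hedged sketch of claim 3: per frame, subtract the stable joint from every
# joint, then stack the deviated coordinates as one column per frame.

def feature_matrix(frames, stable_idx):
    columns = []
    for joints in frames:              # joints: list of (x, y, z) tuples
        sx, sy, sz = joints[stable_idx]
        col = []
        for (x, y, z) in joints:
            col.extend([x - sx, y - sy, z - sz])
        columns.append(col)
    # Transpose so rows = 3N deviated coordinates, columns = n frames.
    return [list(row) for row in zip(*columns)]
```

Subtracting a stable joint makes the representation invariant to where the person stands relative to the recording device.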
4. The method as claimed in claim 1, wherein the identifying the error covariance matrix for the action comprises:
determining elements of the error covariance matrix based on:
T'(i,j) = n((6N-5)/(3(N-1)))σ², for i = j;
T'(i,j) = n((3N-2)/(3(N-1)))σ², for i ≠ j and (i mod 3) = (j mod 3); and
T'(i,j) = nσ²/(3(N-1)), for i ≠ j and (i mod 3) ≠ (j mod 3),
wherein T'(i,j) represents an (i,j)th element of the error covariance matrix for the action, and wherein σ² represents the value of variance of the joint coordinates of one of the N skeleton joints of the person.
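The element-wise definition in claim 4 above transcribes directly to code. The sketch assumes a 3N x 3N matrix in which each joint contributes three consecutive rows (x, y, z), so that (i mod 3) identifies the coordinate axis; the function name is illustrative.

```python
# Direct transcription of the claimed error covariance matrix T'.
# n: number of skeleton frames, N: number of skeleton joints,
# sigma2: variance of the joint coordinates of one skeleton joint.

def error_covariance(n, N, sigma2):
    size = 3 * N
    T = [[0.0] * size for _ in range(size)]
    for i in range(size):
        for j in range(size):
            if i == j:
                T[i][j] = n * (6 * N - 5) / (3 * (N - 1)) * sigma2
            elif i % 3 == j % 3:        # same axis, different entry
                T[i][j] = n * (3 * N - 2) / (3 * (N - 1)) * sigma2
            else:                        # different axes
                T[i][j] = n * sigma2 / (3 * (N - 1))
    return T
```

For n = 1, N = 2, σ² = 1 the three cases evaluate to 7/3 on the diagonal, 4/3 for same-axis off-diagonal entries, and 1/3 otherwise.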
5. The method as claimed in claim 1, wherein the N skeleton joints of the person comprise a head joint, a shoulder centre joint, a shoulder left joint, a shoulder right joint, a spine joint, a hand left joint, a hand right joint, an elbow right joint, an elbow left joint, a wrist right joint, a wrist left joint, a hip left joint, a hip right joint, a hip centre joint, a knee right joint, a knee left joint, a foot left joint, a foot right joint, an ankle right joint, and an ankle left joint.
6. The method as claimed in claim 1, wherein the method further comprises:
receiving n skeleton frames of a training person for each of the plurality of known actions being performed by the training person, wherein the n skeleton frames are received from the skeleton recording device (104), and wherein each of the n skeleton frames represents data of N skeleton joints of the training person;
obtaining joint coordinates of the N skeleton joints from each of the n skeleton frames for a respective known action;
subtracting, for each of the n skeleton frames, joint coordinates of a stable joint from the joint coordinates of the N skeleton joints to obtain a plurality of deviated joint coordinates;
arranging, for each of the n skeleton frames, the plurality of deviated joint coordinates column-wise to form a feature matrix for the respective known action;
computing a covariance matrix of the feature matrix for each of the plurality of known actions;
identifying an error covariance matrix for each of the plurality of known actions based on value of n, value of N, and value of variance of the joint coordinates of one of the N skeleton joints, wherein the error covariance matrix relates to noise present in the joint coordinates of the N skeleton joints;
evaluating a noise-free covariance matrix for each of the plurality of known actions based on the covariance matrix and the error covariance matrix for the respective known action;
determining an active joint matrix for each of the plurality of known actions based on an active joint threshold and the noise-free covariance matrix for each of the plurality of known actions, wherein the active joint matrix represents a plurality of most active joint coordinates that uniquely identifies a known action;
computing a histogram-based feature vector for each of the plurality of known actions based on the active joint matrix for the respective known action, wherein the histogram-based feature vector represents a histogram of occurrences of the plurality of most active joint coordinates in the active joint matrix for the respective known action;
determining, for each of the plurality of known actions, a training feature vector that uniquely identifies the respective known action, wherein the training feature vector is of a dimension smaller than that of the histogram-based feature vector for the respective known action and is determined based on a pre-defined dimension reduction technique performed on the histogram-based feature vector for the respective known action; and
training an action identification system (102) using the training feature vector for each of the plurality of known actions to identify the action, from amongst the plurality of known actions, performed by the person.
7. The method as claimed in claim 6, wherein the method further comprises:
receiving n skeleton frames of the training person for each of the plurality of known actions being performed by the training person multiple times;
determining the training feature vector for each of the plurality of known actions performed for each of the multiple times; and
training the action identification system (102) using the training feature vector for each of the plurality of known actions performed for the multiple times.
8. The method as claimed in claim 6, wherein the action identification system (102) is trained using a classifier, wherein the classifier is a Support Vector Machine (SVM) classifier.
9. The method as claimed in claim 6, wherein identifying the error covariance matrix for each of the plurality of known actions comprises:
determining elements of the error covariance matrix based on:
T(i,j) = n((6N-5)/(3(N-1)))σ², for i = j;
T(i,j) = n((3N-2)/(3(N-1)))σ², for i ≠ j and (i mod 3) = (j mod 3); and
T(i,j) = nσ²/(3(N-1)), for i ≠ j and (i mod 3) ≠ (j mod 3),
wherein T(i,j) represents an (i,j)th element of the error covariance matrix for the respective known action, and wherein σ² represents the value of variance of the joint coordinates of one of the N skeleton joints of the training person.
10. An action identification system (102) for identifying an action, from amongst a plurality of known actions, performed by a person, the action identification system (102) comprising:
a processor (108);
a skeleton data processing module (118) coupled to, and executable by, the processor (108) to,
receive n skeleton frames of the person performing the action, wherein the n skeleton frames are received from a skeleton recording device, and wherein each of the n skeleton frames represents data of N skeleton joints of the person;
an identification module (120) coupled to, and executable by, the processor (108) to,
compute a covariance matrix of a feature matrix for the action, wherein the feature matrix is based on joint coordinates of the N skeleton joints from each of the n skeleton frames for the action;
identify an error covariance matrix for the action based on value of n, value of N, and value of variance of the joint coordinates of one of the N skeleton joints, wherein the error covariance matrix represents noise present in the joint coordinates of the N skeleton joints;
evaluate a noise-free covariance matrix for the action based on the covariance matrix and the error covariance matrix for the action;
determine an active joint matrix for the action based on an active joint threshold and the noise-free covariance matrix for the action, wherein the active joint matrix represents a plurality of most active joint coordinates that uniquely identifies the action; and
identify the action, from amongst the plurality of known actions, performed by the person based on the active joint matrix for the action.
11. The action identification system (102) as claimed in claim 10, wherein the identification module (120) further:
subtracts, for each of the n skeleton frames, joint coordinates of a stable joint from the joint coordinates of the N skeleton joints to obtain a plurality of deviated joint coordinates; and
evaluates the feature matrix for the action based on arranging, for each of the n skeleton frames, the plurality of deviated joint coordinates column-wise.
12. The action identification system (102) as claimed in claim 10, wherein the identification module (120) further:
determines elements of the error covariance matrix based on:
T'(i,j) = n((6N-5)/(3(N-1)))σ², for i = j;
T'(i,j) = n((3N-2)/(3(N-1)))σ², for i ≠ j and (i mod 3) = (j mod 3); and
T'(i,j) = nσ²/(3(N-1)), for i ≠ j and (i mod 3) ≠ (j mod 3),
wherein T'(i,j) represents an (i,j)th element of the error covariance matrix, and wherein σ² represents the value of variance of the joint coordinates of one of the N skeleton joints.
13. The action identification system (102) as claimed in claim 10, wherein the identification module (120) further:
determines a histogram-based feature vector for the action based on the active joint matrix, wherein the histogram-based feature vector represents a histogram of occurrences of the plurality of most active joint coordinates in the active joint matrix for the action; and
computes an identification feature vector that uniquely identifies the action, wherein the identification feature vector is of a dimension smaller than that of the histogram-based feature vector for the action and is computed based on a pre-defined dimension reduction technique performed on the histogram-based feature vector for the action; and
wherein the action, from amongst the plurality of known actions, performed by the person, is identified based on comparison of the identification feature vector for the action with training feature vectors for the plurality of known actions, and wherein a training feature vector for a known action uniquely identifies the known action.
14. The action identification system (102) as claimed in claim 10, wherein the identification module (120) further:
evaluates a feature matrix for each of the plurality of known actions being performed by a training person, wherein the feature matrix is evaluated based on joint coordinates of N skeleton joints from each of n skeleton frames of the training person;
computes a covariance matrix of the feature matrix for each of the plurality of known actions;
identifies an error covariance matrix for each of the plurality of known actions based on value of n, value of N, and value of variance of the joint coordinates of one of the N skeleton joints, wherein the error covariance matrix relates to noise present in the joint coordinates of the N skeleton joints;
evaluates a noise-free covariance matrix for each of the plurality of known actions based on the covariance matrix and the error covariance matrix for the respective known action;
determines an active joint matrix for each of the plurality of known actions based on an active joint threshold and the noise-free covariance matrix for each of the plurality of known actions, wherein the active joint matrix represents a plurality of most active joint coordinates that uniquely identifies a known action;
computes a histogram-based feature vector for each of the plurality of known actions based on the active joint matrix for the respective known action, wherein the histogram-based feature vector represents a histogram of occurrences of the plurality of most active joint coordinates in the active joint matrix for the respective known action;
determines, for each of the plurality of known actions, a training feature vector that uniquely identifies the respective known action, wherein the training feature vector is of a dimension smaller than that of the histogram-based feature vector for the respective known action and is determined based on a pre-defined dimension reduction technique performed on the histogram-based feature vector for the respective known action; and
trains the action identification system (102) using the training feature vector for each of the plurality of known actions to identify the action, from amongst the plurality of known actions, performed by the person.
15. A non-transitory computer-readable medium having embodied thereon a computer program for executing a method comprising:
receiving n skeleton frames of a person performing an action, wherein the n skeleton frames are received from a skeleton recording device (104), and wherein each of the n skeleton frames represents data of N skeleton joints of the person;
computing a covariance matrix of a feature matrix for the action, wherein the feature matrix is based on joint coordinates of the N skeleton joints from each of the n skeleton frames for the action;
identifying an error covariance matrix for the action based on value of n, value of N, and value of variance of the joint coordinates of one of the N skeleton joints, wherein the error covariance matrix relates to noise present in the joint coordinates of the N skeleton joints of the person;
evaluating a noise-free covariance matrix for the action based on the covariance matrix and the error covariance matrix for the action;
determining an active joint matrix for the action based on an active joint threshold and the noise-free covariance matrix for the action, wherein the active joint matrix represents a plurality of most active joint coordinates that uniquely identifies the action; and
identifying the action, from amongst a plurality of known actions, based on the active joint matrix for the action.