
A Method And System For Detecting Pose Of A Subject In Real Time

Abstract: Disclosed herein is a method and system for detecting the pose of a subject in real time. In an embodiment, nodal points corresponding to the subject may be identified and used for identifying a skeleton pose of the subject. Thereafter, a feature descriptor for the skeleton pose may be computed based on the nodal points. Further, the feature descriptor of the skeleton pose may be compared with predetermined feature descriptors for detecting the pose of the subject as a predefined pose corresponding to one of the predetermined feature descriptors used for the comparison. The method of the present disclosure enables accurate pose detection from a two-dimensional image of the subject, using a pose detection model that is pre-trained with predetermined feature descriptors and deep learning techniques. FIG. 1


Patent Information

Application #:
Filing Date: 28 September 2018
Publication Number: 14/2020
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-01-10
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore, Karnataka, India, Pin Code-560 035.

Inventors

1. Dr. GOPICHAND AGNIHOTRAM
A-207, S.K. Aster, Doddathogur Village, Electronics City, Near Narashimha Swami Temple, Bangalore, Karnataka, India, Pin Code-560 100.
2. PRASANTH KUMAR P
Ponnarassery House, Gireesh Nager, Nallalam PO, Kozhikode, Kerala, India, Pin Code-673 027.
3. PANDURANG NAIK
"Sunit Isha", #302, Parkhe Mala, Baner, Pune, Maharashtra, India, Pin Code-411 045.

Specification

Claims:

WE CLAIM:
1. A method for detecting pose of a subject in real-time, the method comprising:
identifying, by a pose detection system (103), a plurality of nodal points (211) corresponding to the subject from an input image frame (102) of the subject;
identifying, by the pose detection system (103), a skeleton pose (213) of the subject based on the plurality of nodal points (211);
computing, by the pose detection system (103), a feature descriptor for the skeleton pose (213) based on the plurality of nodal points (211);
comparing, by the pose detection system (103), the feature descriptor of the skeleton pose (213) with a plurality of predetermined feature descriptors (107) using a pose detection model (105),
wherein the pose detection model (105) is pre-trained with each of the plurality of predetermined feature descriptors (107), corresponding to a predefined pose of the subject, using predetermined deep learning techniques,
wherein each of the plurality of predetermined feature descriptors (107) is computed based on skeleton key points obtained by rotating a plurality of nodal points (211), corresponding to a reference image frame (102) of the subject, in a three-dimensional vector space; and
detecting, by the pose detection system (103), the pose of the subject as the predefined pose corresponding to one of the plurality of predetermined feature descriptors (107) based on the comparison.

2. The method as claimed in claim 1, wherein the skeleton pose (213) of the subject is identified by joining the plurality of nodal points (211) based on one or more nodal arrangements retrieved from predefined node detection libraries.
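The "nodal arrangements" of claim 2 can be read as predefined limb pairings of the kind exposed by common keypoint-detection libraries. The sketch below is a minimal illustration under that reading; the `NODAL_ARRANGEMENT` pairs and joint names are hypothetical stand-ins, not taken from the specification.

```python
# Sketch of claim 2: build a skeleton pose by joining detected nodal
# points according to a predefined nodal arrangement. The pairs below
# are a hypothetical COCO-style subset; a real system would retrieve
# them from its node detection library.
NODAL_ARRANGEMENT = [
    (0, 1),  # head -> neck
    (1, 2),  # neck -> right shoulder
    (1, 3),  # neck -> left shoulder
    (2, 4),  # right shoulder -> right elbow
    (3, 5),  # left shoulder -> left elbow
]

def identify_skeleton_pose(nodal_points, arrangement=NODAL_ARRANGEMENT):
    """Return the skeleton as a list of line segments ((x1, y1), (x2, y2)),
    skipping limbs whose endpoints were not detected (None)."""
    skeleton = []
    for a, b in arrangement:
        if nodal_points[a] is not None and nodal_points[b] is not None:
            skeleton.append((nodal_points[a], nodal_points[b]))
    return skeleton

# The right elbow (index 4) is undetected, so the (2, 4) limb is skipped.
points = [(50, 10), (50, 30), (70, 35), (30, 35), None, (20, 60)]
pose = identify_skeleton_pose(points)
```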

3. The method as claimed in claim 1, wherein computing the feature descriptor for the skeleton pose (213) comprises:
identifying a subset of the plurality of nodal points (211) based on spatial difference between each of the plurality of nodal points (211);
generating a feature vector corresponding to each pair of the plurality of nodal points (211) in the subset of the plurality of nodal points (211);
computing distance values between each of the feature vectors; and
ordering the distance values in a predefined sequence for computing the feature descriptor.
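The four steps of claim 3 can be sketched in a few lines. The spacing threshold, the choice of displacement vectors as pairwise features, and ascending order as the "predefined sequence" are all illustrative assumptions, not details from the specification.

```python
import itertools
import math

def compute_feature_descriptor(nodal_points, min_spacing=5.0):
    """Sketch of claim 3: subset selection, pairwise feature vectors,
    inter-vector distances, and ordering into a descriptor."""
    # 1. Keep a subset of points at least `min_spacing` apart
    #    (one reading of "based on spatial difference").
    subset = []
    for p in nodal_points:
        if all(math.dist(p, q) >= min_spacing for q in subset):
            subset.append(p)
    # 2. One feature vector per pair of points in the subset:
    #    here, simply the displacement between the pair.
    vectors = [(b[0] - a[0], b[1] - a[1])
               for a, b in itertools.combinations(subset, 2)]
    # 3. Distance values between each pair of feature vectors.
    distances = [math.dist(u, v)
                 for u, v in itertools.combinations(vectors, 2)]
    # 4. Order the distances in a predefined sequence (ascending here).
    return sorted(distances)

# (1, 1) falls within min_spacing of (0, 0) and is dropped from the subset.
descriptor = compute_feature_descriptor([(0, 0), (10, 0), (0, 10), (1, 1)])
```

Ordering the distances makes the descriptor independent of the order in which nodal points were detected, which is what allows a direct comparison against stored descriptors.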

4. The method as claimed in claim 1, wherein rotating the plurality of nodal points (211) comprises rotating the plurality of nodal points (211) through a predetermined angle with respect to the reference image frame (102) of the subject.
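The rotation in claim 4 generates additional skeleton key points from a single reference frame, effectively augmenting the training views. A minimal sketch, assuming a rotation about the vertical (y) axis; the claim only requires rotation in a three-dimensional vector space, so the axis choice here is illustrative.

```python
import math

def rotate_nodal_points(points, angle_deg):
    """Rotate 3-D nodal points (x, y, z) about the y axis by a
    predetermined angle, a sketch of the augmentation in claim 4."""
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    rotated = []
    for x, y, z in points:
        # Standard y-axis rotation matrix applied to each point.
        rotated.append((x * cos_t + z * sin_t, y, -x * sin_t + z * cos_t))
    return rotated

# A point on the +x axis rotated 90 degrees ends up on the -z axis.
rotated = rotate_nodal_points([(1.0, 0.0, 0.0)], 90.0)
```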

5. The method as claimed in claim 1, wherein detecting the pose of the subject comprises:
assigning a similarity score for each of the plurality of predetermined feature descriptors (107) based on the comparison; and
detecting, as the pose of the subject, the predefined pose corresponding to the predetermined feature descriptor, among the plurality of predetermined feature descriptors (107), having the highest similarity score.
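Claim 5's scoring-and-selection step can be sketched as a nearest-descriptor search. The inverse-distance similarity below is an illustrative choice, and the pose labels are hypothetical; the specification does not fix a particular scoring function.

```python
import math

def detect_pose(descriptor, predetermined_descriptors):
    """Sketch of claim 5: score each predetermined feature descriptor
    against the input descriptor and return the predefined pose with
    the highest similarity score."""
    best_pose, best_score = None, -1.0
    for pose_label, ref in predetermined_descriptors.items():
        distance = math.dist(descriptor, ref)  # descriptors of equal length
        score = 1.0 / (1.0 + distance)         # higher score = more similar
        if score > best_score:
            best_pose, best_score = pose_label, score
    return best_pose, best_score

# Hypothetical reference descriptors for two predefined poses.
pose, score = detect_pose(
    [1.1, 2.1],
    {"standing": [1.0, 2.0], "sitting": [5.0, 6.0]},
)
```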

6. A pose detection system (103) for detecting pose of a subject in real-time, the pose detection system (103) comprising:
a processor (203); and
a memory (205), communicatively coupled to the processor (203), wherein the memory (205) stores processor-executable instructions, which on execution, cause the processor (203) to:
identify a plurality of nodal points (211) corresponding to the subject from an input image frame (102) of the subject;
identify a skeleton pose (213) of the subject based on the plurality of nodal points (211);
compute a feature descriptor for the skeleton pose (213) based on the plurality of nodal points (211);
compare the feature descriptor of the skeleton pose (213) with a plurality of predetermined feature descriptors (107) using a pose detection model (105),
wherein the pose detection model (105) is pre-trained with each of the plurality of predetermined feature descriptors (107), corresponding to a predefined pose of the subject, using predetermined deep learning techniques,
wherein each of the plurality of predetermined feature descriptors (107) is computed based on skeleton key points obtained by rotating a plurality of nodal points (211), corresponding to a reference image frame (102) of the subject, in a three-dimensional vector space; and
detect the pose of the subject as the predefined pose corresponding to one of the plurality of predetermined feature descriptors (107) based on the comparison.

7. The pose detection system (103) as claimed in claim 6, wherein the processor (203) identifies the skeleton pose (213) of the subject by joining the plurality of nodal points (211) based on one or more nodal arrangements retrieved from predefined node detection libraries.

8. The pose detection system (103) as claimed in claim 6, wherein to compute the feature descriptor for the skeleton pose (213), the processor (203) is configured to:
identify a subset of the plurality of nodal points (211) based on spatial difference between each of the plurality of nodal points (211);
generate a feature vector corresponding to each pair of the plurality of nodal points (211) in the subset of the plurality of nodal points (211);
compute distance values between each of the feature vectors; and
order the distance values in a predefined sequence for computing the feature descriptor.

9. The pose detection system (103) as claimed in claim 6, wherein the instructions cause the processor (203) to rotate the plurality of nodal points (211) along a predetermined angle with respect to the reference image frame (102) of the subject.

10. The pose detection system (103) as claimed in claim 6, wherein to detect the pose of the subject, the processor (203) is configured to:
assign a similarity score for each of the plurality of predetermined feature descriptors (107) based on the comparison; and
detect, as the pose of the subject, the predefined pose corresponding to the predetermined feature descriptor, among the plurality of predetermined feature descriptors (107), having the highest similarity score.

Documents

Application Documents

# Name Date
1 201841036851-STATEMENT OF UNDERTAKING (FORM 3) [28-09-2018(online)].pdf 2018-09-28
2 201841036851-REQUEST FOR EXAMINATION (FORM-18) [28-09-2018(online)].pdf 2018-09-28
3 201841036851-POWER OF AUTHORITY [28-09-2018(online)].pdf 2018-09-28
4 201841036851-FORM 18 [28-09-2018(online)].pdf 2018-09-28
5 201841036851-FORM 1 [28-09-2018(online)].pdf 2018-09-28
6 201841036851-FIGURE OF ABSTRACT [28-09-2018].jpg 2018-09-28
7 201841036851-DRAWINGS [28-09-2018(online)].pdf 2018-09-28
8 201841036851-DECLARATION OF INVENTORSHIP (FORM 5) [28-09-2018(online)].pdf 2018-09-28
9 201841036851-COMPLETE SPECIFICATION [28-09-2018(online)].pdf 2018-09-28
10 201841036851-Request Letter-Correspondence [09-10-2018(online)].pdf 2018-10-09
11 201841036851-Power of Attorney [09-10-2018(online)].pdf 2018-10-09
12 201841036851-Form 1 (Submitted on date of filing) [09-10-2018(online)].pdf 2018-10-09
13 201841036851-Proof of Right (MANDATORY) [21-12-2018(online)].pdf 2018-12-21
14 Correspondence by Agent_Form 1_31-12-2018.pdf 2018-12-31
15 201841036851-PETITION UNDER RULE 137 [22-07-2021(online)].pdf 2021-07-22
16 201841036851-OTHERS [22-07-2021(online)].pdf 2021-07-22
17 201841036851-FORM 3 [22-07-2021(online)].pdf 2021-07-22
18 201841036851-FER_SER_REPLY [22-07-2021(online)].pdf 2021-07-22
19 201841036851-DRAWING [22-07-2021(online)].pdf 2021-07-22
20 201841036851-COMPLETE SPECIFICATION [22-07-2021(online)].pdf 2021-07-22
21 201841036851-CLAIMS [22-07-2021(online)].pdf 2021-07-22
22 201841036851-FER.pdf 2021-10-17
23 201841036851-US(14)-HearingNotice-(HearingDate-22-12-2023).pdf 2023-11-23
24 201841036851-POA [28-11-2023(online)].pdf 2023-11-28
25 201841036851-FORM 13 [28-11-2023(online)].pdf 2023-11-28
26 201841036851-Correspondence to notify the Controller [28-11-2023(online)].pdf 2023-11-28
27 201841036851-AMENDED DOCUMENTS [28-11-2023(online)].pdf 2023-11-28
28 201841036851-Written submissions and relevant documents [06-01-2024(online)].pdf 2024-01-06
29 201841036851-FORM-26 [06-01-2024(online)].pdf 2024-01-06
30 201841036851-PatentCertificate10-01-2024.pdf 2024-01-10
31 201841036851-IntimationOfGrant10-01-2024.pdf 2024-01-10

Search Strategy

1 2020-12-1616-59-02E_16-12-2020.pdf

ERegister / Renewals

3rd: 09 Apr 2024 (from 28/09/2020 to 28/09/2021)
4th: 09 Apr 2024 (from 28/09/2021 to 28/09/2022)
5th: 09 Apr 2024 (from 28/09/2022 to 28/09/2023)
6th: 09 Apr 2024 (from 28/09/2023 to 28/09/2024)
7th: 26 Sep 2024 (from 28/09/2024 to 28/09/2025)
8th: 25 Sep 2025 (from 28/09/2025 to 28/09/2026)