
Skeleton Based Identification Of Individuals

Abstract: Systems and methods for skeleton based identification of individuals are described. In one embodiment, the method comprises obtaining skeleton data from a three dimensional (3D) skeleton of an individual, and identifying one or more gait cycles based on the skeleton data. For each of the one or more gait cycles, a plurality of gait features of the individual is further extracted from the 3D skeleton based on the skeleton data. The plurality of gait features include at least one of area related features and dynamic centroid distance related features. Based on the extracted gait features, the individual is identified.


Patent Information

Application #:
Filing Date: 31 October 2012
Publication Number: 18/2014
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021

Inventors

1. SINHA, Aniruddha
Tata Consultancy Services, Plot A2, M2 & N2, Sector V, Block GP, Salt Lake Electronics Complex, Kolkata - 700091, West Bengal
2. CHAKRAVARTY, Kingshuk
Tata Consultancy Services, Plot A2, M2 & N2, Sector V, Block GP, Salt Lake Electronics Complex, Kolkata - 700091, West Bengal
3. BHOWMICK, Brojeshwar
Tata Consultancy Services, Plot A2, M2 & N2, Sector V, Block GP, Salt Lake Electronics Complex, Kolkata - 700091, West Bengal

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. Title of the invention: SKELETON BASED IDENTIFICATION OF INDIVIDUALS
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The present subject matter relates, in general, to identification of individuals
and, in particular, to a system and a method for skeleton based identification of individuals.
BACKGROUND
[0002] A rapid growth of interest in individual recognition has necessitated the
use of systems for providing accurate identification and verification of individuals. One such system is a biometric identification system for identifying an individual based on extracting biometric features peculiar to an individual’s body, such as a fingerprint and a palm print, and comparing the extracted features with corresponding pre-stored biometric features, such as a pre-stored fingerprint sample and palm print sample, to identify the individual. Such a biometric identification system is useful in many applications, such as verifying the identity of individuals prior to allowing access to a secure area or a restricted area, secure computer and network based transactions, and video surveillance.
[0003] In recent years, individuals’ walking movements have emerged as another useful
biometric feature peculiar to the individual’s body. Various identification systems have been developed to recognize individuals based on their walking movements. Such identification systems capture two dimensional (2D) images depicting the walking movements of the individual, and extract peculiar gait features of the individual from the 2D images, based on which the individual is identified.
SUMMARY
[0004] This summary is provided to introduce concepts related to skeleton based
identification of individuals. These concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0005] In one embodiment, the method for skeleton based identification of individuals
comprises obtaining skeleton data from a three dimensional (3D) skeleton of an individual,

and detecting one or more gait cycles based on the skeleton data. For each of the one or more gait cycles, a plurality of gait features of the individual is further extracted from the 3D skeleton based on the skeleton data. The plurality of gait features include at least one of area related features and dynamic centroid distance related features. Based on the extracted gait features, the individual is identified.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The detailed description is described with reference to the accompanying
figure(s). In the figure(s), the left-most digit(s) of a reference number identifies the figure in
which the reference number first appears. The same numbers are used throughout the figure(s)
to reference like features and components. Some embodiments of systems and/or methods in
accordance with embodiments of the present subject matter are now described, by way of
example only, and with reference to the accompanying figure(s), in which:
[0007] Fig. 1 illustrates a network environment implementing an identification
system, according to an embodiment of the present subject matter.
[0008] Fig. 2a illustrates components of the identification system, according to an
embodiment of the present subject matter.
[0009] Fig. 2b illustrates an exemplary 3D skeleton of an individual to be identified.
[0010] Fig. 2c illustrates an Adaptive Neural Network (ANN) classifier with a feature
selection layer, according to an embodiment of the present subject matter.
[0011] Fig. 2d illustrates exemplary gait feature selection using the ANN classifier.
[0012] Fig. 3 illustrates a method for skeleton based identification of individuals,
according to an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0013] Conventionally, various biometric identification systems for identifying
individuals based on biometric features, such as fingerprints, palm prints, etc., are available. Such biometric identification systems, however, suffer from numerous drawbacks. For example, the individual to be identified needs to be in physical contact with the biometric identification system. Also, the individual’s co-operation and awareness is desired for the

identification. Moreover, such biometric identification systems usually fail to identify the
individuals, if the resolution of images containing the fingerprints, or palm prints is low.
[0014] Other advanced biometric identification systems, capable of identifying an
individual based on behavioral characteristics, such as walking style or gait features, have been developed in the past few years. However, such advanced biometric identification systems identify the individuals based on camera captured 2D images of the individual’s walking movements. The camera captured 2D images are usually sensitive to the camera view angle, and may fail to accurately identify the individual if the walking movement of the individual is not parallel to the plane of the image. Moreover, various dynamic gait features, such as the angle between two legs or two arms that varies from individual to individual, cannot be accurately identified from the 2D images.
[0015] In accordance with the present subject matter, a system and a method for
skeleton based identification of individuals are described. In one embodiment, a 3D skeleton of an individual is received in the form of one or more skeleton frames from a skeleton recording device, such as Kinect®. Thereafter, skeleton data comprising 3D coordinates, i.e., x, y, and z coordinates, of a plurality of skeleton joints of the individual is obtained from the received 3D skeleton. In one implementation, 3D coordinates of around 20 skeleton joints including head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left of the individual, can be obtained from the 3D skeleton.
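The skeleton data described above can be pictured as a simple mapping from joint names to 3D coordinates. The following sketch is purely illustrative; the joint identifiers and the `make_frame` helper are assumptions for exposition and are not taken from the specification:

```python
# Illustrative only: one skeleton frame as a mapping from joint name to (x, y, z).
# The 20 joint names follow the list given in paragraph [0015].
SKELETON_JOINTS = [
    "head", "shoulder_centre", "shoulder_left", "shoulder_right", "spine",
    "hand_left", "hand_right", "elbow_right", "elbow_left", "wrist_right",
    "wrist_left", "hip_left", "hip_right", "hip_centre", "knee_right",
    "knee_left", "foot_left", "foot_right", "ankle_right", "ankle_left",
]

def make_frame(coords):
    """Build a skeleton frame from a list of 20 (x, y, z) tuples, one per joint."""
    if len(coords) != len(SKELETON_JOINTS):
        raise ValueError("expected one (x, y, z) triple per joint")
    return dict(zip(SKELETON_JOINTS, coords))
```

A sequence of such frames then constitutes the 3D skeleton from which the gait cycles and gait features are derived.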
[0016] Based on the skeleton data, one or more gait cycles are detected. In one
implementation, full-gait cycles are detected based on the skeleton data. In another implementation, half-gait cycles are detected based on the skeleton data. A full-gait cycle may be understood as a cycle that starts with the right-ankle or left-ankle forward and ends with the same right-ankle or left-ankle forward, and a half-gait cycle may be understood as a cycle that starts with the right-ankle or left-ankle forward and ends with the left-ankle or right-ankle forward, respectively. In one implementation, a gait cycle can be detected based on computing a horizontal distance between the left ankle and the right ankle of the individual, in other words, the distance between the x coordinates of the left ankle and the right ankle, x being the horizontal axis.
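One plausible reading of the distance-based detection described above is to track the sign of the ankle distance: the leading ankle alternates at each half step, so sign changes in the signed distance bound half-gait cycles. The sketch below is illustrative only; the function names and the zero-crossing criterion are assumptions, not the specification's definitive method:

```python
def ankle_distance(frames):
    """Signed horizontal (x-axis) distance between left and right ankle, per frame.
    Each frame maps joint names to (x, y, z) coordinates (an assumed layout)."""
    return [f["ankle_left"][0] - f["ankle_right"][0] for f in frames]

def half_gait_cycles(distances):
    """Frame indices where the signed distance changes sign, i.e. where the
    leading ankle alternates; consecutive crossings bound one half-gait cycle."""
    crossings = [k for k in range(1, len(distances))
                 if distances[k - 1] * distances[k] < 0]
    return list(zip(crossings, crossings[1:]))
```

A full-gait cycle would then simply span two consecutive half-gait cycles.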

[0017] For each of the one or more gait cycles, a plurality of gait features of an
individual is extracted from the 3D skeleton based on the skeleton data. The gait features may include a combination of static body-shape parameters, and dynamic time-varying gait signals. Examples of the gait features that are extracted, in accordance with one embodiment, include area related features, such as mean of area occupied by upper body portion and mean of area occupied by lower body portion of the individual. Further, the gait features may include angle related features, such as mean, standard deviation and maximum of angle of the upper left leg relative to the vertical axis, angle of the lower left leg relative to the upper left leg, and angle of the left ankle relative to horizontal axis, and mean, standard deviation and maximum of angle of the upper right leg relative to the vertical axis, angle of the lower right leg relative to the upper right leg, and angle of the right ankle relative to horizontal axis.
[0018] Furthermore, the gait features may include dynamic centroid distance related
features, such as mean, standard deviation, and maximum of Euclidean distances between centroid of upper body portion and centroids of right hand, left hand, right leg, and left leg.
[0019] According to one implementation, the gait features may also include other
static and dynamic features, such as step length, walking speed, and distance between the joints, such as height and length of the legs, torso, both lower legs, both thighs, both forearms, and both upper arms.
[0020] Subsequent to extraction of the gait features, the individual is identified based
on the gait features. For example, the gait features that can uniquely identify the individual are selected from amongst the plurality of gait features to identify the individual. Selection of the gait features and identification of the individual are carried out using a conventionally known Adaptive Neural Network (ANN) classifier with a feature selection layer.
[0021] According to the systems and the methods of the present subject matter, the
individuals are identified based on the 3D skeleton of the individual. Thus, accurate measurement of the gait features based on the 3D coordinates of the skeleton joints is achieved. Further, the system and the method use a combination of static and dynamic gait features, including area related features, dynamic centroid distance related features, angle related features, and other static and dynamic features, which further improves the accuracy of the individual’s identification.

[0022] The following disclosure describes a system and a method for skeleton based
identification of individuals. While aspects of the described system and method can be implemented in any number of different computing systems, environments, and/or configurations, embodiments for the skeleton based identification of individuals are described in the context of the following exemplary system(s) and method(s).
[0023] Fig. 1 illustrates a network environment 100 implementing an identification
system 102, in accordance with an embodiment of the present subject matter.
[0024] In one implementation, the network environment 100 can be a public network
environment, including thousands of personal computers, laptops, various servers, such as blade servers, and other computing devices. In another implementation, the network environment 100 can be a private network environment with a limited number of computing devices, such as personal computers, servers, laptops, and/or communication devices, such as mobile phones and smart phones.
[0025] The identification system 102 is communicatively connected to a plurality of
user devices 104-1, 104-2, 104-3...,and, 104-N, collectively referred to as user devices 104 and individually referred to as a user device 104, through a network 106. In one implementation, a plurality of users may use the user devices 104 to communicate with the identification system 102. In said implementation, the identification system 102 is further connected to a skeleton recording device 105 through the network 106. Though the skeleton recording device 105 is shown external to the identification system 102, it will be appreciated that the skeleton recording device 105, in another implementation, can be integrated within the identification system 102.
[0026] The identification system 102 and the user devices 104 may be implemented in
a variety of computing devices, including, servers, a desktop personal computer, a notebook or portable computer, a workstation, a mainframe computer, a laptop and/or communication device, such as mobile phones and smart phones. Further, in one implementation, the identification system 102 may be a distributed or centralized network system in which different computing devices may host one or more of the hardware or software components of the identification system 102.

[0027] The identification system 102 may be connected to the user devices 104 over
the network 106 through one or more communication links. The communication links between the identification system 102 and the user devices 104 are enabled through a desired form of communication, for example, via dial-up modem connections, cable links, digital subscriber lines (DSL), wireless, or satellite links, or any other suitable form of communication.
[0028] The network 106 may be a wireless network, a wired network, or a
combination thereof. The network 106 can also be an individual network or a collection of many such individual networks, interconnected with each other and functioning as a single large network, e.g., the Internet or an intranet. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and such. The network 106 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), etc., to communicate with each other. Further, the network 106 may include network devices, such as network switches, hubs, routers, for providing a link between the identification system 102 and the user devices 104. The network devices within the network 106 may interact with the identification system 102, and the user devices 104 through the communication links.
[0029] According to the present subject matter, the identification system 102 obtains
skeleton data comprising 3D coordinates, i.e., x, y, and z coordinates of various skeleton joints of the individual to be recognized from a 3D skeleton of the individual from the skeleton recording device 105. Based on the skeleton data, the identification system 102 detects one or more gait cycles. For each of the gait cycles, a plurality of gait features of the individual is extracted from the 3D skeleton based on the skeleton data. In one embodiment, the identification system 102 includes a feature extraction module 116, which is configured to extract the gait features of the individual. The gait features referred herein may include a plurality of area related features, dynamic centroid distance related features, angle related features, and other static and dynamic features.

[0030] The area related features may include mean of area occupied by the upper body
portion and mean of area occupied by the lower body portion of the individual. The angle related features may include mean, standard deviation and maximum of angle of the upper left leg relative to the vertical axis, angle of the lower left leg relative to the upper left leg, and angle of the left ankle relative to horizontal axis. The angle related features may also include mean, standard deviation and maximum of angle of the upper right leg relative to the vertical axis, angle of the lower right leg relative to the upper right leg, and angle of the right ankle relative to horizontal axis. Further, the dynamic centroid distance related features may include mean, standard deviation, and maximum of Euclidean distances between centroid of the upper body portion and centroids of right hand, left hand, right leg, and left leg. The other static and dynamic features may include step length, walking speed, and distance between the joints, such as height and length of the legs, torso, both lower legs, both thighs, both forearms, and both upper arms. Based on the above mentioned extracted gait features, the identification system 102 identifies an individual.
[0031] The manner in which the individual identification takes place is explained in
greater detail in conjunction with Fig. 2a.
[0032] Fig. 2a illustrates various components of the identification system 102,
according to an embodiment of the present subject matter.
[0033] In said embodiment, the identification system 102 includes one or more
processor(s) 202, interfaces 204, and a memory 206 coupled to the processor(s) 202. The processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 202 are configured to fetch and execute computer-readable instructions and data stored in the memory 206.
[0034] The functions of the various elements shown in the figure, including any
functional blocks labeled as “processor(s)”, may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of

which may be shared. Moreover, explicit use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), non-volatile storage. Other hardware, conventional and/or custom, may also be included.
[0035] The interface(s) 204 may include a variety of software and hardware
interfaces, for example, interface for peripheral device(s), such as a keyboard, a mouse, an external memory, and a printer. Further, the interface(s) 204 may enable the identification system 102 to communicate over the network 106, and may include one or more ports for connecting the identification system 102 with other computing devices, such as web servers and external databases. The interface(s) 204 may facilitate multiple communications within a wide variety of protocols and networks, such as a network, including wired networks, e.g., LAN, cable, etc., and wireless networks, e.g., WLAN, cellular, satellite, etc.
[0036] The memory 206 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The identification system 102 also includes module(s) 208 and data 210.
[0037] The module(s) 208 include routines, programs, objects, components, data
structures, etc., which perform particular tasks or implement particular abstract data types. The module(s) 208 further include, in addition to the feature extraction module 116, a gait cycle detection module 212, a feature selection module 214, and other module(s) 216.
[0038] The data 210 serves, amongst other things, as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 210 includes skeleton data 220, feature data 222, and other data 224. The skeleton data 220 includes the 3D skeleton received from the skeleton recording device 105, and the skeleton data obtained from the 3D skeleton. The feature data 222 includes the extracted gait features. The other data 224 includes data generated as a result of the execution of one or more other modules 216.

[0039] According to the present subject matter, the gait cycle detection module 212 of
the identification system 102 receives a 3D skeleton of an individual to be identified in the form of one or more skeleton frames. In one implementation, the gait cycle detection module 212 receives the 3D skeleton of the individual from the skeleton recording device 105 (shown in Fig. 1), such as Kinect®, connected to the identification system 102. Each of the skeleton frames may contain one complete skeleton, or a plurality of skeleton frames in combination may constitute one complete skeleton. From the 3D skeleton, the gait cycle detection module 212 may obtain skeleton data comprising 3D coordinates, i.e., x, y, and z coordinates, of a plurality of skeleton joints of the individual. According to one implementation, the gait cycle detection module 212 obtains 3D coordinates of around 20 skeleton joints including head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left as the skeleton data.
[0040] In one implementation, the gait cycle detection module 212 receives the 3D
skeleton of the individual in sideways walking patterns. For example, the skeleton recording device 105 can be configured to generate the 3D skeleton of the individual walking from the left direction to the right direction, and/or the skeleton data of the individual walking from the right direction to the left direction, and the 3D skeleton generated in such sideways walking patterns can be exported to the identification system 102, where the gait cycle detection module 212 receives such 3D skeleton. In one implementation, the gait cycle detection module 212 may store the received 3D skeleton and the skeleton data obtained from the 3D skeleton within the skeleton data 220.
[0041] Based on the skeleton data, the gait cycle detection module 212 of the
identification system 102 further detects at least one gait cycle, say a half-gait cycle. The description hereinafter is explained with reference to the half-gait cycle for the purpose of explanation only; it should not be construed as a limitation, as full-gait cycles can also be computed for the purpose of identifying the individual. In one implementation, the gait cycle detection module 212 computes the horizontal distance between the left ankle and the right ankle of the individual to identify the half-gait cycle. For example, the gait cycle detection module 212 computes the horizontal distance using the equation (1) provided below:

dk = xla(k) − xra(k), for k = 1, 2, …, K …. (1)
wherein, dk represents the horizontal distance for the kth skeleton frame,
xla(k) represents the x coordinate of the left ankle in the kth skeleton frame,
xra(k) represents the x coordinate of the right ankle in the kth skeleton frame, and
K represents the total number of skeleton frames, such that K is greater than 1.
[0042] The gait cycle detection module 212 may smoothen the computed
horizontal distance (dk) using a conventionally known moving average algorithm with a small window size. For each of the half-gait cycles, the feature extraction module 116 is configured to extract a plurality of gait features of an individual from the 3D skeleton based on the skeleton data. As indicated previously, the gait features may include a plurality of area related features, angle related features, dynamic centroid distance related features, and other static and dynamic features.
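The moving-average smoothing mentioned above might be sketched as follows; the centered window and the `moving_average` name are illustrative assumptions rather than the specification's exact algorithm:

```python
def moving_average(values, window=3):
    """Smooth a sequence of distances dk with a simple centered moving
    average of odd window size, shrinking the window at the boundaries."""
    half = window // 2
    out = []
    for k in range(len(values)):
        lo, hi = max(0, k - half), min(len(values), k + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out
```

Smoothing with a small window suppresses frame-to-frame jitter in the ankle distance without shifting the sign changes that mark the half-gait cycles.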
[0043] The area related features may include, without limitation, mean of area
occupied by upper body portion and mean of area occupied by lower body portion of the individual. In one implementation, the feature extraction module 116 extracts the area occupied by the upper body portion of the individual using the equation (2) provided below:
fau = (1/2) | Σ (from i = 1 to nu) (xi * y(i+1) − x(i+1) * yi) | …. (2)
wherein, fau represents the area occupied by the upper body portion of the individual,
nu represents the number of skeleton joints related to the upper body portion, and the skeleton joints related to the upper body portion form a closed polygon, such that the coordinates of the ith skeleton joint are (xi, yi) and (x(nu+1), y(nu+1)) = (x1, y1).
[0044] Similarly, the feature extraction module 116 extracts the area occupied by the
lower body portion of the individual using the equation (3) provided below:
fal = (1/2) | Σ (from i = 1 to nl) (xi * y(i+1) − x(i+1) * yi) | …. (3)
wherein, fal represents the area occupied by the lower body portion of the individual,
nl represents the number of skeleton joints related to the lower body portion, and the skeleton joints related to the lower body portion form a closed polygon, such that the coordinates of the ith skeleton joint are (xi, yi) and (x(nl+1), y(nl+1)) = (x1, y1).
[0045] The feature extraction module 116 calculates the area occupied by the upper
body portion and the lower body portion of the individual based on forming a closed polygon of skeleton joints of the upper body and the lower body portions. For example, the skeleton joints considered for extracting the area occupied by the upper body portion (fau) may include shoulder centre, shoulder left, hip left, hip centre, hip right, and shoulder right, and the skeleton joints considered for extracting the area occupied by the lower body portion (fal) may include hip centre, hip right, knee right, ankle right, ankle left, knee left, and hip left.
[0046] Subsequent to the calculation, the feature extraction module 116 computes
mean of area occupied both by the upper body portion and the lower body portion.
[0047] According to one implementation, the area related features are mathematically
represented by the expression (4) provided below:
fA = (μ(fau), μ(fal)) …. (4)
[0048] In the above expression, μ(fau) represents mean of area occupied by the
upper body portion (fau), μ(fal) represents mean of area occupied by the lower body portion
(fal), and fA represents the area related features.
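The closed-polygon areas of equations (2) and (3), and the means of expression (4), can be illustrated with the standard shoelace formula. The sketch assumes 2D (x, y) projections of the joint coordinates; the helper names are not from the specification:

```python
def polygon_area(points):
    """Shoelace formula: area of the closed polygon through `points`,
    given as (x, y) pairs in traversal order; the last vertex is
    implicitly joined back to the first."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1]
            for i in range(n))
    return abs(s) / 2.0

def mean_area(polygons):
    """Mean of the polygon areas over the frames of one gait cycle,
    e.g. mean of fau (or fal) per equation (4)."""
    areas = [polygon_area(p) for p in polygons]
    return sum(areas) / len(areas)
```

For the upper body portion the polygon would run, for example, through shoulder centre, shoulder left, hip left, hip centre, hip right, and shoulder right, as listed in paragraph [0045].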
[0049] The angle related features may include, without limitation, mean, standard
deviation and maximum of angle of the upper left leg relative to the vertical axis, angle of the lower left leg relative to the upper left leg, and angle of the left ankle relative to horizontal axis. The angle related features may also include mean, standard deviation and maximum of angle of the upper right leg relative to the vertical axis, angle of the lower right leg relative to the upper right leg, and angle of the right ankle relative to horizontal axis. The angle related features may be represented by (fAG).
[0050] The feature extraction module 116 is further configured to extract dynamic
centroid distance related feature based on computing mean, standard deviation, and maximum of Euclidean distances between centroid of the upper body portion and centroids of right hand, left hand, right leg, and left leg. According to one implementation, the feature extraction module 116 computes the centroids using the equation (5) provided below:

C = (1/n) Σ (from i = 1 to n) (xi, yi, zi) …. (5)
wherein, C represents the centroid, (xi, yi, zi) represents the coordinates of the ith skeleton joint, and
n represents the number of skeleton joints.
[0051] The skeleton joints may be selected in such a manner that they together form a
closed polygon. The centroids (C) may be computed for the upper body portion, right hand, left hand, left leg, and right leg of the individual. The upper body portion covers the skeleton joints, such as shoulder centre, shoulder left, hip left, hip right, and shoulder right. The right hand covers the skeleton joints, such as shoulder right, elbow right, wrist right and hand right. Similarly, the left hand covers the shoulder left, elbow left, wrist left, and hand left. The right leg covers the skeleton joints, such as hip right, knee right, ankle right, and foot right. The left leg covers the hip left, knee left, ankle left, and foot left. According to one implementation, the feature extraction module 116 computes the Euclidean distances between the centroid of the upper body portion and the centroids of the right hand, the left hand, the right leg, and the left leg using the equation (6) provided below:
fcj = ||Cu − Cj|| …. (6)
wherein, fcj represents the Euclidean distance between the centroid of the upper body portion and the centroid of the jth limb,
Cu represents the centroid coordinates of the upper body portion,
Cj represents the centroid coordinates of the right hand, left hand, right leg, and left leg of the individual, and
j represents the left hand, right hand, left leg and right leg joints, such that j = 1 for left hand, j = 2 for right hand, j = 3 for left leg, and j = 4 for right leg.
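Equations (5) and (6) might be sketched as follows; the helper names are illustrative, and `math.dist` is used for the Euclidean norm:

```python
import math

def centroid(points):
    """Centroid of the skeleton joints forming a closed polygon
    (equation (5)): the per-dimension mean of the joint coordinates."""
    n = len(points)
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / n for d in range(dims))

def centroid_distance(upper, limb):
    """Euclidean distance between the upper-body centroid and a limb
    centroid (equation (6))."""
    return math.dist(upper, limb)
```

Per half-gait cycle, the mean, standard deviation, and maximum of these four distances over the frames would then yield the dynamic centroid distance related features.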
[0052] Upon computation of the Euclidean distances between centroid of the upper
body portion and centroids of the right hand, the left hand, the right leg, and the left leg of the individual, the feature extraction module 116 computes mean, standard deviation, and maximum of Euclidean distances between the centroid of the upper body portion and the centroids of right hand, left hand, right leg, and left leg. The dynamic centroid distance related features are mathematically represented by the expression (7) provided below:

fD = (μ(fcj), σ(fcj), max(fcj)), for j = 1, 2, 3, 4 …. (7)
[0053] In the above expression, μ(fcj) represents mean of the Euclidean distances
between the centroid of the upper body portion and the centroids of right hand, left hand, right
leg, and left leg, σ(fcj) represents standard deviation of the Euclidean distances, max(fcj) represents maximum of the Euclidean distances, and fD represents the
dynamic centroid distance related features.
[0054] The other static and dynamic features may include step length, walking speed,
and distance between the joints, such as height and length of the legs, torso, both lower legs, both thighs, both forearms, and both upper arms. The other static and dynamic features may be represented by (fOSD).
[0055] Thus, the gait features are extracted for each of the one or more gait cycles. In
one example, for each of the gait cycles, 2 area related features (fA), 18 angle related features (fAG), 12 dynamic centroid distance related features (fD), and 14 other static and dynamic features (fOSD) are extracted. In the context of the present subject matter, the extracted gait features are mathematically represented by the expression provided below:
f = (fA, fAG, fD, fOSD) …. (8)
[0056] In the above expression, f represents the gait features, fA represents the area
related features, fAG represents the angle related features, fD represents the dynamic centroid
distance related features, and fOSD represents the other static and dynamic features. In one
implementation, the feature extraction module 116 may store the extracted gait features within the feature data 222.
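Assembling the per-cycle feature vector of expression (8) then amounts to concatenating the four groups; with the 2 + 18 + 12 + 14 features mentioned in paragraph [0055], this yields 46 components. The sketch below is illustrative (the function name is an assumption):

```python
def assemble_features(f_a, f_ag, f_d, f_osd):
    """Concatenate the area (fA), angle (fAG), dynamic centroid distance
    (fD), and other static/dynamic (fOSD) feature groups into one gait
    feature vector f, per expression (8)."""
    return list(f_a) + list(f_ag) + list(f_d) + list(f_osd)
```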
[0057] Fig. 2b illustrates an exemplary 3D skeleton 250 of an individual to be
identified. As shown in Fig. 2b, the 3D skeleton 250 of the individual illustrates a plurality of skeleton joints of the individual starting from head to the ankle, such as head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left.

[0058] The 3D skeleton 250 of the individual also represents centroid 252 of the upper
body portion, and centroid 254 of the left hand. The upper body portion covers the skeleton joints, such as shoulder centre, shoulder left, hip left, hip right, and shoulder right. The left hand covers the skeleton joints, such as shoulder left, elbow left, hand left, and wrist left. As mentioned in the detailed description, the feature extraction module 116 extracts the dynamic centroid distance related features (fD) by computing the mean, standard deviation, and maximum of the Euclidean distances between the centroid of the upper body portion and the centroids of the right hand, left hand, right leg, and left leg. As shown in Fig. 2b, the 3D skeleton 250 of the individual depicts Euclidean distance 256, which is the distance between the centroid 252 of the upper body portion and the centroid 254 of the left hand.
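The computation in paragraph [0058] can be sketched as below, assuming the upper body and each limb are given as per-frame lists of 3D joint coordinates over a gait cycle; the function names are hypothetical.

```python
import numpy as np

def centroid(joints):
    """Mean 3D position of a set of joint coordinates (N x 3)."""
    return np.mean(np.asarray(joints, dtype=float), axis=0)

def centroid_distance_features(upper_body_frames, limb_frames):
    """Mean, standard deviation, and maximum of the Euclidean distances
    between the upper-body centroid and each limb centroid, computed
    over the frames of one gait cycle.

    upper_body_frames: per-frame lists of upper-body joint coordinates
    limb_frames: dict mapping limb name -> per-frame joint coordinates
    """
    features = {}
    for name, frames in limb_frames.items():
        dists = [np.linalg.norm(centroid(u) - centroid(l))
                 for u, l in zip(upper_body_frames, frames)]
        features[name] = (np.mean(dists), np.std(dists), np.max(dists))
    return features
```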
[0059] Once the plurality of gait features (f) is extracted, the gait features that can
uniquely identify the individual are selected, and the individual is identified based on the selected gait features. To select amongst the plurality of gait features (f), the gait features (f) as shown in expression (8) are provided as an input to the feature selection module 214, which is configured to select the gait features and identify the individual using an Adaptive Neural Network (ANN) classifier.
[0060] In one implementation, the ANN classifier is preconfigured/trained for various
known individuals. An exemplary ANN classifier with a feature selection layer is illustrated in Fig. 2c. As shown in the Fig. 2c, the ANN classifier comprises a plurality of layers including a feature selection layer, an input layer, a hidden layer, and an output layer. Each of the layers includes a plurality of nodes. Specifically, the input layer includes a plurality of input nodes corresponding to the plurality of gait features of the known individuals, and the output layer includes a plurality of output nodes associated with a plurality of known individuals.
[0061] At the feature selection layer of the ANN classifier, an attenuation function
{F(Mi)} is calculated for the plurality of input nodes that represent the gait features. Based on the attenuation function {F(Mi)}, one or more input nodes are selected amongst the plurality of input nodes. For example, the input nodes having higher values of the arguments (Mi) of the attenuation function {F(Mi)} are selected and passed through subsequent layers of the ANN classifier. Fig. 2d illustrates a graph 260 depicting an exemplary gait feature selection using the ANN classifier. As shown in the graph 260, the value of the attenuation function {F(Mi)} is plotted on the Y-axis, and the gait features, i.e., the input nodes (i), are plotted on the X-axis, where i=1 indicates the first input node, and i=46 indicates the 46th input node.
[0062] As indicated in the foregoing description, the feature extraction module 116
extracts a plurality of gait features of the individual. The plurality of gait features includes 2 area related features, 12 dynamic centroid distance related features, 18 angle related features, and 14 other static and dynamic features, i.e., a total of 46 features. At the feature selection layer of the ANN classifier, an attenuation function {F(Mi)} is calculated for the plurality of input nodes that represent the gait features (f). The gait features having higher values of the arguments (Mi) of the attenuation function {F(Mi)} are selected and passed through subsequent layers of the ANN classifier. Referring to the graph 260, the 28th, 29th, 30th, 31st, and 32nd gait features have higher values of the attenuation function {F(Mi)} as compared to the remaining gait features.
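The selection of input nodes with higher attenuation arguments (Mi) might be sketched as a simple threshold gate. The threshold-based rule and the names below are assumptions, as the specification does not give the exact selection criterion.

```python
import numpy as np

def select_input_nodes(features, m, threshold):
    """Gate the input nodes of the feature selection layer: nodes whose
    attenuation argument Mi exceeds the threshold pass through to the
    subsequent layers; the remaining nodes are attenuated to zero.
    Returns the gated feature vector and the selected node indices."""
    m = np.asarray(m, dtype=float)
    features = np.asarray(features, dtype=float)
    mask = m > threshold
    return np.where(mask, features, 0.0), np.flatnonzero(mask)
```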
[0063] In the subsequent layers, a weight is assigned to each of the connections
between the nodes of the successive layers. The arguments (Mi) of the attenuation function, and the weights associated with the connections, are initially learned for a known set of individuals using the gait features derived for those individuals. This is called the learning phase. After the learning phase, a testing phase starts, during which the extracted gait features of an unknown individual are fed into the ANN classifier. After the feed forward, using the arguments (Mi) and the weights associated with the connections between the nodes, the output node having the highest value is considered for identifying the individual. It is to be noted that each output node corresponds to an individual trained in the learning phase.
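After training, the testing-phase feed forward described above reduces to propagating the gait features through the weighted layers and taking the output node with the highest value. A minimal sketch, assuming a single hidden layer with a tanh activation (the specification does not fix the activation or layer sizes):

```python
import numpy as np

def identify_individual(features, w_hidden, w_out):
    """Feed the (selected) gait features forward through the trained
    weights; the index of the output node with the highest activation
    corresponds to one of the individuals from the learning phase."""
    hidden = np.tanh(np.asarray(features, dtype=float) @ w_hidden)
    output = hidden @ w_out
    return int(np.argmax(output))
```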
[0064] According to an example, a comparison of the accuracy of individual
identification using different combinations of gait features and classifiers is depicted in Table 1 and Table 2 (provided below). The accuracy of individual identification is determined based on the F-score associated with the different combinations of the gait features and classifiers. The F-score as referred to herein is computed using a conventionally known F-score calculation method based on precision and recall. According to said example, gait features of 10 individuals are considered.
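The conventionally known F-score referred to above is the harmonic mean of precision and recall; a straightforward sketch:

```python
def f_score(precision, recall):
    """F-score (F1): harmonic mean of precision and recall, used here
    to compare the classifier/feature combinations of Tables 1 and 2."""
    if precision + recall == 0.0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)
```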
Table 1

Classifier and input features                            F-score
ANN classifier 1 with (fAG) as input feature             0.27
ANN classifier 1 with (fA, fD) as input features         0.52
ANN classifier 1 with (fAG, fA, fD) as input features    0.55
Naive Bayes classifier with (fOSD) as input feature      0.513
[0065] As shown in Table 1 above, the F-score obtained using an ANN classifier
1, which is the ANN classifier without the feature selection layer, taking angle related features (fAG) as input is 0.27; the F-score obtained using the ANN classifier 1 with area related features (fA) and dynamic centroid distance related features (fD) as input is 0.52; the F-score obtained using the ANN classifier 1 with angle related features (fAG), area related features (fA), and dynamic centroid distance related features (fD) as input is 0.55; and the F-score obtained using the Naive Bayes classifier with other static and dynamic features (fOSD) as input is 0.513.
[0066] It is clear from Table 1 that the ANN classifier 1 with input features {fAG,
fA, fD} provides better accuracy of individual identification than the remaining combinations of the gait features and classifiers.
Table 2

Classifier and input features                                  F-score
ANN classifier 1 with (fAG, fA, fD, fOSD) as input features    0.53
ANN classifier 2 with (fAG, fA, fD, fOSD) as input features    0.62
[0067] As shown in Table 2 above, the F-score obtained using the ANN classifier 1
with angle related features (fAG), area related features (fA), dynamic centroid distance related features (fD), and other static and dynamic features (fOSD) as input is 0.53, and the F-score obtained using ANN classifier 2 (depicted in Fig. 2c), which is the ANN classifier with the feature selection layer, taking angle related features (fAG), area related features (fA), dynamic centroid distance related features (fD), and other static and dynamic features (fOSD) as input is 0.62.
[0068] It is clear from Table 2 that the ANN classifier 2 with gait features (fAG, fA,
fD, fOSD) as input provides better accuracy of individual identification. Thus, the identification system described according to the present subject matter, which utilizes the ANN classifier having the feature selection layer and uses the above mentioned gait features, achieves higher accuracy of individual identification.
[0069] Fig. 3 illustrates a method 300 for skeleton based identification of individuals,
in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0070] The order in which the method 300 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300, or alternative methods. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0071] Referring to Fig. 3, at block 302, the method 300 includes obtaining skeleton
data from a 3D skeleton of an individual. The skeleton data includes information, i.e., 3D coordinates of skeleton joints, such as head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left. In one implementation, the gait cycle detection module 212 of the identification system 102 receives the 3D skeleton of the individual from the skeleton recording device 105, such as a Kinect® connected to the identification system 102. The gait cycle detection module 212 then obtains the skeleton data from the 3D skeleton of the individual.
[0072] At block 304, the method 300 includes detecting one or more gait cycles based
on the skeleton data. For example, half-gait cycles are detected based on the skeleton data. In one implementation, the gait cycle detection module 212 detects the half-gait cycles by computing the horizontal distance between the left ankle and the right ankle of the individual.
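The half-gait cycle detection at block 304 might be sketched as below, assuming per-frame horizontal (walking direction) ankle coordinates and taking local maxima of the inter-ankle distance as cycle boundaries; the local-maxima heuristic and the function name are assumptions, as the specification only states that the horizontal ankle distance is computed.

```python
import numpy as np

def detect_half_gait_cycles(left_ankle_x, right_ankle_x):
    """Detect half-gait cycle boundaries from the horizontal distance
    between the left and right ankles: local maxima of the inter-ankle
    distance correspond to full stride extension, so each pair of
    consecutive maxima delimits one half-gait cycle (frame indices)."""
    d = np.abs(np.asarray(left_ankle_x, dtype=float)
               - np.asarray(right_ankle_x, dtype=float))
    # Interior local maxima of the inter-ankle distance signal.
    peaks = [i for i in range(1, len(d) - 1) if d[i - 1] < d[i] >= d[i + 1]]
    return list(zip(peaks[:-1], peaks[1:]))
```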
[0073] At block 306, the method 300 includes extracting a plurality of gait features of
the individual, for each of the one or more gait cycles. The plurality of gait features include at least area related features and dynamic centroid distance related features. The plurality of gait features may also include angle related features and other static and dynamic features. The feature extraction module 116 extracts the plurality of gait features of the individual from the 3D skeleton based on the skeleton data.
[0074] At block 308, the method 300 includes identification of the individual based on
the extracted gait features. For example, the gait features having higher values of the arguments (Mi) amongst the gait features are selected for identification of the individual. In one implementation, the feature selection module 214 of the identification system 102 is configured to identify the individual based on the extracted gait features using an Adaptive Neural Network (ANN) classifier with a feature selection layer, described with reference to Fig. 2c.
[0075] Although embodiments for the skeleton based identification of individuals
have been described in language specific to structural features and/or methods, it is to be understood that the invention is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for the skeleton based identification of individuals.

I/We claim:
1. A method for skeleton based identification of individuals, the method comprising:
obtaining skeleton data from a three dimensional (3D) skeleton of an individual;
detecting one or more gait cycles based on the skeleton data;
extracting a plurality of gait features of the individual, for each of the one or more gait cycles, from the 3D skeleton based on the skeleton data, wherein the plurality of gait features include at least one of area related features and dynamic centroid distance related features; and
identifying the individual based on the plurality of gait features.
2. The method as claimed in claim 1, wherein the method further comprises receiving the 3D skeleton of the individual from a skeleton recording device.
3. The method as claimed in claim 1, wherein the identifying is based on an Adaptive Neural Network (ANN) classifier having a feature selection layer.
4. The method as claimed in claim 1, wherein the skeleton data comprises 3D coordinates of a plurality of skeleton joints including head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left.
5. The method as claimed in claim 1, wherein the extracting comprises computing a mean of area occupied by an upper body portion and a mean of area occupied by a lower body portion of the individual to extract the area related features.
6. The method as claimed in claim 1, wherein the extracting comprises computing at least one of a mean, a standard deviation, and a maximum of Euclidean distances between a centroid of an upper body portion and centroids of a right hand, a left hand, a right leg, and a left leg to extract the dynamic centroid distance related features.
7. An identification system (102) for identifying individuals, the identification system (102) comprising:
a processor (202);
a gait cycle detection module (212) coupled to the processor (202), the gait cycle detection module (212) configured to detect one or more gait cycles based on skeleton data of an individual, wherein the skeleton data comprises three dimensional (3D) coordinates of a plurality of skeleton joints;
a feature extraction module (116) coupled to the processor (202), the feature extraction module (116) configured to extract a plurality of gait features of the individual for each of the one or more gait cycles based on the skeleton data, wherein the plurality of gait features include at least one of area related features and dynamic centroid distance related features; and
a feature selection module (214) coupled to the processor (202), the feature selection module (214) configured to identify the individual based on the extracted gait features.
8. The identification system (102) as claimed in claim 7, wherein the gait cycle detection module (212) is further configured to receive a 3D skeleton of the individual from a skeleton recording device (105), and obtain the skeleton data from the 3D skeleton.
9. The identification system (102) as claimed in claim 7, wherein the plurality of skeleton joints include head, shoulder centre, shoulder left, shoulder right, spine, hand left, hand right, elbow right, elbow left, wrist right, wrist left, hip left, hip right, hip centre, knee right, knee left, foot left, foot right, ankle right, and ankle left.
10. The identification system (102) as claimed in claim 7, wherein the plurality of gait features further include angle related features and other static and dynamic features.
11. The identification system (102) as claimed in claim 7, wherein the feature extraction module (116) is configured to extract the area related features based on computing a mean of area occupied by an upper body portion and a mean of area occupied by a lower body portion of the individual.
12. The identification system (102) as claimed in claim 7, wherein the feature extraction module (116) is configured to extract the dynamic centroid distance related features based on computing a mean, a standard deviation, and a maximum of Euclidean distances between a centroid of an upper body portion and centroids of a right hand, a left hand, a right leg, and a left leg.
13. The identification system (102) as claimed in claim 7, wherein the one or more gait cycles include half-gait cycles.

14. The identification system (102) as claimed in claim 7, wherein the feature selection
module (214) is configured to identify the individual based on an Adaptive Neural
Network (ANN) classifier having a feature selection layer.
15. A computer-readable medium having embodied thereon a computer program for
executing a method comprising:
obtaining skeleton data from a three dimensional (3D) skeleton of an individual;
detecting one or more gait cycles based on the skeleton data;
extracting a plurality of gait features of the individual, for each of the one or more gait cycles, from the 3D skeleton based on the skeleton data, wherein the plurality of gait features include at least one of area related features and dynamic centroid distance related features; and
identifying the individual based on the extracted gait features.

Documents

Application Documents

# Name Date
1 3169-MUM-2012-CORRESPONDENCE(2-11-2012).pdf 2018-08-11
2 ABSTRACT1.jpg 2018-08-11
3 3169-MUM-2012-CORRESPONDENCE(4-12-2012).pdf 2018-08-11
4 3169-MUM-2012-FORM 26(4-12-2012).pdf 2018-08-11
5 3169-MUM-2012-CORRESPONDENCE(6-11-2012).pdf 2018-08-11
6 3169-MUM-2012-FORM 18(2-11-2012).pdf 2018-08-11
7 3169-MUM-2012-FORM 1(6-11-2012).pdf 2018-08-11