
System And Method For Tracking Body Joints

Abstract: Body joint tracking is applied across various industries and in the medical field, where markerless devices play an important role. However, markerless devices face challenges in providing optimal tracking due to occlusion, ambiguity, lighting conditions, dynamic objects and the like. The system and method of the present disclosure provide optimized body joint tracking. Here, motion data pertaining to a first set of motion frames is received from a motion sensor. Further, the motion data is processed to obtain a plurality of 3 dimensional cylindrical models, where each cylindrical model among the plurality of 3 dimensional cylindrical models represents a body segment. The coefficients associated with the plurality of 3 dimensional cylindrical models are initialized to obtain a set of initialized cylindrical models. A set of dynamic coefficients associated with the initialized cylindrical models is utilized to track joint motion trajectories over a set of subsequent frames.


Patent Information

Application #
Filing Date
07 July 2017
Publication Number
02/2019
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application
Patent Number
Legal Status
Grant Date
2024-03-11
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. SINHA, Sanjana
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata - 700160, West Bengal, India
2. BHOWMICK, Brojeshwar
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata - 700160, West Bengal, India
3. SINHA, Aniruddha
Tata Consultancy Services Limited, Building 1B, Ecospace, Innovation Labs, Kolkata - STP, Kolkata - 700160, West Bengal, India
4. DAS, Abhijit
122, Hafiz Md. Ishaque Road, Writers Para, Haridevpur - 700082, West Bengal, India

Specification

Claims:

1. A method for tracking body joints, the method comprising: receiving, by one or more hardware processors, motion data pertaining to a first set of motion frames from a motion sensor device, wherein the motion data comprises initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2 dimensional image coordinates and a depth value; initializing, by the one or more hardware processors, a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames, wherein the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints; and tracking, by the one or more hardware processors, the plurality of initialized cylindrical models to obtain one or more optimized motion trajectories of the plurality of body joints in a plurality of motion frames received in succession to the first set of motion frames, based on one or more direction angles and one or more base coordinates associated with each of the plurality of cylindrical models, wherein the tracking is performed by utilizing a particle filter mechanism.

2.
The method as claimed in claim 1, wherein initializing the plurality of cylindrical models for each motion frame among the first set of motion frames comprises: obtaining a plurality of 3 dimensional point clouds corresponding to the plurality of depth frames by mapping each set of 2 dimensional image coordinates to the 3 dimensional world coordinates; segmenting the plurality of 3 dimensional point clouds based on a segmentation threshold to obtain a first set of segmented 3 dimensional point clouds, wherein the first set of segmented 3 dimensional point clouds corresponds to the plurality of body segments, and wherein the segmentation threshold is obtained based on a Euclidean distance computed for the 3 dimensional world coordinates corresponding to the plurality of body joints of the initial joint motion data; smoothening the first set of segmented 3 dimensional point clouds corresponding to the plurality of body segments prior to model fitting to reduce one or more model outliers; fitting the plurality of cylindrical models to the set of smoothened point clouds corresponding to the plurality of body segments associated with each motion frame among the first set of motion frames, wherein a set of cylinder coefficients of each cylindrical model among the plurality of cylindrical models includes a radius, a length, the one or more direction angles and the one or more base coordinates; initializing the radius of the plurality of cylindrical models to a mean radius value obtained from the cylinder radii corresponding to the plurality of motion frames; initializing the one or more direction angles of the plurality of cylindrical models to a mean direction value obtained from a set of cylinder directions corresponding to the plurality of motion frames; initializing the one or more base coordinates of the plurality of cylindrical models to an average base value obtained by projecting a plurality of joint center coordinates in the initial joint motion data onto an axis of a corresponding cylindrical model for the plurality of motion frames; and initializing the length of the plurality of cylindrical models to a mean length value obtained from the distance between adjacent body joints in the initial joint motion data stream for the plurality of motion frames.

3. The method as claimed in claim 1, wherein tracking the plurality of initialized cylindrical models to obtain the optimized motion trajectories comprises: receiving the plurality of initialized cylindrical models and a second set of segmented point cloud data corresponding to the plurality of motion frames received in succession to the first set of motion frames from the motion sensor device, wherein the plurality of initialized cylindrical models is represented as a plurality of particles, and wherein each particle among the plurality of particles comprises a state, represented by the one or more direction angles and the one or more base coordinates, and a weight; propagating the plurality of particles to a new state and updating the state based on a state propagation rule by utilizing a state space model, wherein the state propagation rule is based on the plurality of initialized cylindrical models; updating the weight associated with the plurality of particles based on a likelihood function and a weight update rule by utilizing an observation model, wherein the weight update rule is based on the second set of segmented point cloud data and the plurality of initialized cylindrical models; and tracking the plurality of initialized cylindrical models based on the state and the weight of the plurality of particles associated with the plurality of initialized cylindrical models to obtain the optimized motion trajectories of the plurality of body joints.

4.
A body joint tracking system, the system comprising: one or more memories comprising programmed instructions and a repository for storing the motion data, a database, at least one parameter associated with the motion data and prior information available for the motion data; one or more hardware processors operatively coupled to the one or more memories, wherein the one or more hardware processors are capable of executing the programmed instructions stored in the one or more memories; a motion sensor device; and a joint tracking unit, wherein the joint tracking unit is configured to: receive motion data pertaining to a first set of motion frames from the motion sensor device, wherein the motion data comprises initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2 dimensional image coordinates and a depth value; initialize a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames, wherein the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints; and track the plurality of initialized cylindrical models to obtain one or more optimized motion trajectories of the plurality of body joints in a plurality of motion frames received in succession to the first set of motion frames, based on a set of direction angles and a set of base coordinates associated with each of the plurality of cylindrical models, wherein the tracking is performed by utilizing a particle filter mechanism.

5.
The system as claimed in claim 4, wherein the joint tracking unit is configured to initialize the plurality of cylindrical models for each motion frame among the first set of motion frames by: obtaining a plurality of 3 dimensional point clouds corresponding to the plurality of depth frames by mapping each set of 2 dimensional image coordinates to the 3 dimensional world coordinates; segmenting the plurality of 3 dimensional point clouds based on a segmentation threshold to obtain a first set of segmented 3 dimensional point clouds, wherein the first set of segmented 3 dimensional point clouds corresponds to the plurality of body segments, and wherein the segmentation threshold is obtained based on a Euclidean distance computed for the 3 dimensional world coordinates corresponding to the plurality of body joints of the initial joint motion data; smoothening the first set of segmented 3 dimensional point clouds corresponding to the plurality of body segments prior to model fitting to reduce one or more model outliers; fitting the plurality of cylindrical models to the set of smoothened point clouds corresponding to the plurality of body segments associated with each motion frame among the first set of motion frames, wherein a set of cylinder coefficients of each cylindrical model among the plurality of cylindrical models includes a radius, a length, the one or more direction angles and the one or more base coordinates; initializing the radius of the plurality of cylindrical models to a mean radius value obtained from the cylinder radii corresponding to the plurality of motion frames; initializing the one or more direction angles of the plurality of cylindrical models to a mean direction value obtained from a set of cylinder directions corresponding to the plurality of motion frames; initializing the one or more base coordinates of the plurality of cylindrical models to an average base value obtained by projecting a plurality of joint center coordinates in the initial joint motion data onto an axis of a corresponding cylindrical model for the plurality of motion frames; and initializing the length of the plurality of cylindrical models to a mean length value obtained from the distance between adjacent body joints in the initial joint motion data stream for the plurality of motion frames.

6. The system as claimed in claim 4, wherein the joint tracking unit is configured to track the plurality of initialized cylindrical models to obtain the optimized motion trajectories by: receiving the plurality of initialized cylindrical models and a second set of segmented point cloud data corresponding to the plurality of motion frames received in succession to the first set of motion frames from the motion sensor device, wherein the plurality of initialized cylindrical models is represented as a plurality of particles, and wherein each particle among the plurality of particles comprises a state, represented by the one or more direction angles and the one or more base coordinates, and a weight; propagating the plurality of particles to a new state and updating the state based on a state propagation rule by utilizing a state space model, wherein the state propagation rule is based on the plurality of initialized cylindrical models; updating the weight associated with the plurality of particles based on a likelihood function and a weight update rule by utilizing an observation model, wherein the weight update rule is based on the second set of segmented point cloud data and the plurality of initialized cylindrical models; and tracking the plurality of initialized cylindrical models based on the state and the weight of the plurality of particles associated with the plurality of initialized cylindrical models to obtain the optimized motion trajectories of the plurality of body joints.
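The initialization recited in claims 2 and 5 (fit a cylinder per body segment per frame, then average the fitted coefficients over the first set of motion frames) can be sketched as follows. This is a minimal sketch, not the implementation of the disclosure: the per-frame cylinder fitting itself (e.g. by RANSAC) is abstracted away, the names initialize_cylinder, per_frame_coeffs and joint_pairs_3d are illustrative, and the base coordinates are averaged directly here rather than obtained by projection onto the cylinder axis as the claims recite.

```python
import numpy as np

def initialize_cylinder(per_frame_coeffs, joint_pairs_3d):
    """Average per-frame cylinder coefficients over the first set of
    motion frames for one body segment.

    per_frame_coeffs: list of dicts with keys 'radius', 'direction'
        (unit axis vector) and 'base' (3D base coordinates), one per
        motion frame, as produced by a cylinder-fitting step (not shown).
    joint_pairs_3d: list of (joint_i, joint_j) 3D coordinate pairs for
        the two adjacent body joints, from the initial joint motion data.
    """
    radius = float(np.mean([c["radius"] for c in per_frame_coeffs]))
    direction = np.mean([c["direction"] for c in per_frame_coeffs], axis=0)
    direction /= np.linalg.norm(direction)          # keep the axis a unit vector
    base = np.mean([c["base"] for c in per_frame_coeffs], axis=0)
    # Length: mean distance between the adjacent body joints across frames.
    length = float(np.mean([np.linalg.norm(j2 - j1)
                            for j1, j2 in joint_pairs_3d]))
    return {"radius": radius, "direction": direction,
            "base": base, "length": length}
```

Averaging over the first frames, rather than trusting a single frame, is what makes the initialized coefficients robust to per-frame fitting noise before tracking begins.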
Description: FORM 2 THE PATENTS ACT, 1970 (39 of 1970) & THE PATENT RULES, 2003 COMPLETE SPECIFICATION (See Section 10 and Rule 13) Title of invention: SYSTEM AND METHOD FOR TRACKING BODY JOINTS Applicant: Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956, having address: Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India. The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD The embodiments herein relate, in general, to motion tracking and, in particular, to a system and method for tracking body joints.

BACKGROUND Tracking body joints is a key research field, applied in a variety of applications including computer animation, video games, medical therapy, surveillance, human machine interaction and athlete performance analysis. Moreover, joint motion analysis is an important aspect of health monitoring for patients suffering from neurological disorders, post-stroke patients and elderly subjects. In real-time body joint tracking, achieving accuracy is a challenging task, owing to changes in appearance caused by the non-rigid motion of the subject undergoing a test, clothing, viewpoint and lighting. Conventional real-time body joint tracking methods are marker-based and require the subject undergoing a test to wear obtrusive devices. Moreover, conventional marker-based joint tracking methods are complex, difficult to maintain, extremely expensive and may not be viable for prolonged rehabilitation therapy. Hence, markerless joint tracking systems are used for tracking body joints in real time. However, conventional markerless real-time body joint tracking systems face challenges in obtaining optimized body joint tracking due to occlusion, ambiguity, lighting conditions, dynamic objects and the like.
SUMMARY Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a method for tracking body joints is provided. The method includes receiving, by one or more hardware processors, motion data pertaining to a first set of motion frames from a motion sensor device, wherein the motion data comprises initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2 dimensional image coordinates and a depth value. Further, the method includes initializing, by the one or more hardware processors, a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames, wherein the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints. Furthermore, the method includes tracking, by the one or more hardware processors, the plurality of initialized cylindrical models to obtain one or more optimized motion trajectories of the plurality of body joints in a plurality of motion frames received in succession to the first set of motion frames, based on one or more direction angles and one or more base coordinates associated with each of the plurality of cylindrical models, wherein the tracking is performed by utilizing a particle filter mechanism. In another aspect, a system for tracking body joints is provided.
The system includes one or more memories comprising programmed instructions and a repository for storing the motion data and a database; one or more hardware processors operatively coupled to the one or more memories, wherein the one or more hardware processors are capable of executing the programmed instructions stored in the one or more memories; a motion sensor device; and a joint tracking unit. The joint tracking unit is configured to receive motion data pertaining to a first set of motion frames from the motion sensor device, wherein the motion data comprises initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2 dimensional image coordinates and a depth value. Further, the joint tracking unit is configured to initialize a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames, wherein the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints. Furthermore, the joint tracking unit is configured to track the plurality of initialized cylindrical models to obtain one or more optimized motion trajectories of the plurality of body joints in a plurality of motion frames received in succession to the first set of motion frames, based on one or more direction angles and one or more base coordinates associated with each of the plurality of cylindrical models, wherein the tracking is performed by utilizing a particle filter mechanism.
In yet another aspect, a computer program product comprising a non-transitory computer-readable medium having embodied therein a computer program for tracking body joints is provided. The computer readable program, when executed on a computing device, causes the computing device to receive motion data pertaining to a first set of motion frames from a motion sensor device, wherein the motion data comprises initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2 dimensional image coordinates and a depth value. Further, the computer readable program, when executed on a computing device, causes the computing device to initialize a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames, wherein the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints. Furthermore, the computer readable program, when executed on a computing device, causes the computing device to track the plurality of initialized cylindrical models to obtain one or more optimized motion trajectories of the plurality of body joints in a plurality of motion frames received in succession to the first set of motion frames, based on one or more direction angles and one or more base coordinates associated with each of the plurality of cylindrical models, wherein the tracking is performed by utilizing a particle filter mechanism. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:

FIG. 1 illustrates a network environment implementing a system and method for tracking body joints, according to some embodiments of the present disclosure;

FIG. 2 illustrates a block diagram of a system for tracking body joints, according to some embodiments of the present disclosure;

FIG. 3 depicts an architecture diagram for a system and method for tracking body joints, according to some embodiments of the present disclosure;

FIG. 4 illustrates a detailed flow diagram for tracking body joints, according to some embodiments of the present disclosure;

FIG. 5A illustrates an example 3 dimensional temporal trajectory of a wrist joint during shoulder abduction, according to some embodiments of the present disclosure;

FIG. 5B illustrates an example 3 dimensional temporal trajectory of an elbow joint during shoulder flexion, according to some embodiments of the present disclosure; and

FIG. 5C illustrates an example 3 dimensional temporal trajectory of a wrist joint during elbow flexion, according to some embodiments of the present disclosure.

It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems and devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such a computer or processor is explicitly shown.

DETAILED DESCRIPTION OF EMBODIMENTS Exemplary embodiments are described with reference to the accompanying drawings.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of the disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.

In the field of motion tracking, tracking body joints plays a vital role. Since conventional methods track body joints by utilizing marker-based techniques, which require obtrusive devices to be worn by the subjects undergoing tests, markerless body joint tracking methods have become prominent. Markerless body joint tracking methods include a motion sensor to capture a plurality of motion data. For example, the Kinect® V1 is a markerless motion sensing device mainly used for tracking body joints. However, motion sensor based body joint tracking methods are prone to occlusion, IR interference, lighting conditions and distance from the motion sensor. The present subject matter overcomes the limitations of conventional joint tracking methods by modeling a set of 3 dimensional point clouds associated with a body (for example, a human body or an animal body). Here, a set of 3 dimensional cylindrical models is initialized by utilizing the set of 3 dimensional point cloud data. Further, the set of initialized 3 dimensional cylindrical models is tracked based on a set of dynamic parameters associated with the set of initialized cylindrical models to obtain optimized joint motion trajectories associated with the body.
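The particle filter mechanism outlined above, in which particle states carry the cylinder's direction angles and base coordinates, can be sketched as follows. This is an illustrative sketch only: the random-walk propagation noise, the spherical-angle parameterization of the axis, and the squared-distance likelihood against the segmented point cloud are assumptions, since the disclosure does not fix the exact form of its state space and observation models.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(particles, sigma=0.01):
    """State propagation: perturb each particle's state (two direction
    angles and three base coordinates) with Gaussian noise. A random-walk
    model is an assumption here, standing in for the state space model."""
    return particles + rng.normal(0.0, sigma, particles.shape)

def update_weights(particles, point_cloud, radius):
    """Observation model: weight each particle by how well its cylinder
    explains the segmented point cloud. The likelihood used here, the
    mean squared deviation of point-to-axis distance from the cylinder
    radius, is illustrative."""
    weights = np.empty(len(particles))
    for i, state in enumerate(particles):
        theta, phi = state[0], state[1]            # direction angles
        base = state[2:5]                          # base coordinates
        axis = np.array([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)])           # unit cylinder axis
        rel = point_cloud - base
        along = rel @ axis                         # projection onto the axis
        radial = np.linalg.norm(rel - np.outer(along, axis), axis=1)
        err = np.mean((radial - radius) ** 2)
        weights[i] = np.exp(-err / 1e-3)
    s = weights.sum()
    return weights / s if s > 0 else np.full(len(particles), 1.0 / len(particles))

def estimate(particles, weights):
    """Weighted mean of the particle states gives the tracked cylinder
    for the current frame; the joint locations then follow from the base
    coordinates and the axis scaled by the initialized length."""
    return weights @ particles
```

Running propagate, update_weights and estimate once per incoming motion frame yields the per-frame cylinder states whose endpoints trace the optimized joint motion trajectories.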
Here, the optimized joint locations are derived directly in 3 dimensional coordinate space to reduce the adverse effects of noise associated with the depth data from the depth sensor of a motion sensor device. An implementation of the system and method for tracking body joints is described further in detail with reference to FIGS. 1 through 5C. Referring now to the drawings, and more particularly to FIGS. 1 through 5C, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments, and these embodiments are described in the context of the following exemplary system and/or method.

FIG. 1 illustrates a network environment 100 implementing a system 102 for tracking body joints, according to an example embodiment of the present subject matter. The system for tracking body joints 102, hereinafter referred to as the system 102, is configured for tracking body joints using motion data received from a motion sensor device 120. In an embodiment, the motion sensor device 120 can be a Kinect® V1 device for monitoring a subject’s movements. The system 102 may be embodied in a computing device, for instance a computing device 104. Although the present disclosure is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a cloud-based computing environment and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 106-1, 106-2... 106-N, collectively referred to as user devices 106 hereinafter, or applications residing on the user devices 106.
Examples of the user devices 106 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a Smartphone, a Tablet Computer, a workstation and the like. The user devices 106 are communicatively coupled to the system 102 through a network 108. In an embodiment, the network 108 may be a wireless or a wired network, or a combination thereof. In an example, the network 108 can be implemented as a computer network, as one of the different types of networks, such as a virtual private network (VPN), intranet, local area network (LAN), wide area network (WAN), the internet, and such. The network 108 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), and Wireless Application Protocol (WAP), to communicate with each other. Further, the network 108 may include a variety of network devices, including routers, bridges, servers, computing devices and storage devices. The network devices within the network 108 may interact with the system 102 through communication links. As discussed above, the system 102 may be implemented in a computing device 104, such as a hand-held device, a laptop or other portable computer, a tablet computer, a mobile phone, a PDA, a smartphone, or a desktop computer. The system 102 may also be implemented in a workstation, a mainframe computer, a server, or a network server. In an embodiment, the system 102 may be coupled to a data repository, for example, a repository 112. The repository 112 may store data processed, received, and generated by the system 102. In an alternate embodiment, the system 102 may include the data repository 112. The components and functionalities of the system 102 are described further in detail with reference to FIG. 2. FIG.
2 illustrates a block diagram of a body joint tracking system 200 for tracking body joints, according to some embodiments of the present disclosure. The body joint tracking system 200 (hereinafter referred to as system 200) may be an example of the system 102 (FIG. 1). In an example embodiment, the system 200 may be embodied in, or is in direct communication with, the system 102 (FIG. 1). The system 200 includes or is otherwise in communication with one or more hardware processors such as a processor 202, at least one memory such as a memory 204, an I/O interface 206 and a joint tracking unit 250. In an embodiment, the joint tracking unit 250 can be implemented as a standalone unit in the system 200 comprising a point cloud segmentation module (not shown in FIG. 2), a cylinder model fitting module (not shown in FIG. 2), a state propagation module (not shown in FIG. 2) and a likelihood estimation module (not shown in FIG. 2). In another embodiment, the joint tracking unit 250 can be implemented as a module in the memory 204 comprising the point cloud segmentation module, the cylinder model fitting module, the state propagation module and the likelihood estimation module (none shown in FIG. 2). The processor 202, the memory 204, and the I/O interface 206 may be coupled by a system bus such as a system bus 208 or a similar mechanism. The I/O interface 206 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The interfaces 206 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, an external memory, a camera device, and a printer. Further, the interfaces 206 may enable the system 102 to communicate with other devices, such as web servers and external databases.
The interfaces 206 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, local area network (LAN), cable, etc., and wireless networks, such as Wireless LAN (WLAN), cellular, or satellite. For this purpose, the interfaces 206 may include one or more ports for connecting a number of computing systems with one another or to another server computer. The I/O interface 206 may include one or more ports for connecting a number of devices to one another or to another server. The hardware processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the hardware processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 204. The memory 204 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. In an embodiment, the memory 204 includes a plurality of modules 220 and a repository 240 for storing data processed, received, and generated by one or more of the modules 220 and the joint tracking unit 250. The modules 220 may include routines, programs, objects, components, data structures, and so on, which perform particular tasks or implement particular abstract data types. The module(s) 220 include programs or coded instructions that supplement applications or functions performed by the body joint tracking system 200.
The modules 220, amongst other things, can include routines, programs, objects, components, and data structures, which perform particular tasks or implement particular abstract data types. The modules 220 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the modules 220 can be implemented in hardware, as computer-readable instructions executed by a processing unit, or by a combination thereof. The modules 220 can include various sub-modules (not shown), and may include computer-readable instructions that supplement applications or functions performed by the body joint tracking system 200. The data repository 240 may include received input signals 242, a signal database 244, a wavelet database 246 and other data 248. Further, the other data 248, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 220 and the modules associated with the joint tracking unit 250. The repository 240 is further configured to maintain a plurality of parameters and prior information associated with the data stored in the data repository 240. Although the data repository 240 is shown internal to the body joint tracking system 200, it will be noted that, in alternate embodiments, the data repository 240 can also be implemented external to the body joint tracking system 200, where the data repository 240 may be stored within a database (not shown in FIG. 2) communicatively coupled to the body joint tracking system 200. The data contained within such an external database may be periodically updated. For example, new data may be added into the database, existing data may be modified, and/or non-useful data may be deleted.
In one example, the data may be stored in an external system, such as a Lightweight Directory Access Protocol (LDAP) directory or a Relational Database Management System (RDBMS). In another embodiment, the data stored in the data repository 240 may be distributed between the body joint tracking system 200 and the external database. The joint tracking unit 250 of the body joint tracking system 200 can be configured to receive a plurality of motion data pertaining to a first set of motion frames from the motion sensor device 120. The motion data comprises an initial joint motion data and a plurality of depth frames, wherein the initial joint motion data comprises locations in 3 dimensional world coordinates, represented as C_t^j = [X, Y, Z]^T, of a plurality of body joints associated with the plurality of depth frames, and each depth frame among the plurality of depth frames comprises a set of 2D image coordinates and a depth value. The depth value can be a distance in millimeters between the subject undergoing exercise and the motion sensor device 120. A line joining two adjacent body joints i and j is denoted b_t^((i,j)). The line b_t^((i,j)) represents a body segment B^((i,j)) connecting the two adjacent body joints i and j, and is associated with direction information. In an embodiment, the Kinect® V1 device is utilized for capturing the plurality of motion data, including the initial joint motion data and the plurality of depth frames, at 30 fps, at a distance of 1.8 m to 2.5 m from the subject undergoing exercise. In an embodiment, a plurality of active joint Range Of Motion (ROM) data pertaining to the upper body portion (for example, shoulder abduction, shoulder flexion and extension, and elbow flexion and extension) of the subject undergoing exercise is received from the motion sensor device 120.
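The joint and segment representation described above can be sketched in code as follows. This is a minimal illustration, not an implementation from the disclosure: the joint names and coordinate values are assumed examples of 3D joint locations C_t^j = [X, Y, Z]^T in millimeters, with the body segment b_t^((i,j)) joining two adjacent joints and its direction information expressed as a unit vector.

```python
import numpy as np

# Illustrative 3D joint locations C_t^j = [X, Y, Z]^T in millimeters for two
# adjacent joints i and j (values are assumed, not from the disclosure).
C_i = np.array([120.0, 350.0, 2100.0])  # e.g. a shoulder joint (mm)
C_j = np.array([140.0, 80.0, 2150.0])   # e.g. an elbow joint (mm)

# The line b_t^(i,j) joining the two joints represents body segment B^(i,j).
segment = C_j - C_i
length_mm = np.linalg.norm(segment)     # segment length in mm
direction = segment / length_mm         # direction information (unit vector)
```

The direction vector is what distinguishes the segment from an unordered pair of joints: it orients the body segment from joint i toward joint j.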
In the context of the present disclosure, ROM refers to any kind of movement of a body during dynamic postures involving joints in the body, such as ROM exercises. Further, the joint tracking unit 250 of the body joint tracking system 200 can be further configured to initialize a plurality of cylindrical models for each motion frame among the first set of motion frames by processing the initial joint motion data and the plurality of depth frames. Here, the plurality of cylindrical models represents a plurality of body segments connecting adjacent body joints among the plurality of body joints. A plurality of 3 dimensional point clouds corresponding to the plurality of depth frames are obtained by mapping every 2D pixel coordinate (x, y) to the 3 dimensional world coordinates (X, Y, Z) based on a perspective projection as given below: X = ((x - c_x)/f_x) d, Y = ((y - c_y)/f_y) d, Z = d, where f_x and f_y refer to the focal lengths of a depth sensor associated with the motion sensor device 120 along the x axis and y axis respectively, d is the depth value, and (c_x, c_y) are the coordinates of the image center. Further, the plurality of 3 dimensional point clouds P_t are segmented based on a segmentation threshold to obtain a first set of segmented 3 dimensional point clouds S_t representing a body segment B^((i,j)), using the equation S_t = { P ∈ P_t : ||P - b_t^((i,j))|| < ε }, where ε is the segmentation threshold and ||P - b_t^((i,j))|| denotes the distance of a point P from the line b_t^((i,j)).
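The back-projection and point-cloud segmentation steps above can be sketched as follows. This is an illustrative sketch, not the patented implementation: the camera intrinsics, the choice of point-to-line-segment distance, and the threshold value eps are assumptions introduced for the example.

```python
import numpy as np

def back_project(depth, fx, fy, cx, cy):
    """Map every 2D pixel (x, y) with depth d to 3D world coordinates via the
    perspective projection X = (x - cx)/fx * d, Y = (y - cy)/fy * d, Z = d.
    Returns an (N, 3) point cloud P_t."""
    h, w = depth.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    X = (xs - cx) / fx * depth
    Y = (ys - cy) / fy * depth
    return np.stack([X, Y, depth], axis=-1).reshape(-1, 3)

def point_to_segment_dist(P, a, b):
    """Distance from each point in P (N, 3) to the line segment from a to b
    (one assumed way to realize ||P - b_t^(i,j)||)."""
    ab = b - a
    t = np.clip((P - a) @ ab / (ab @ ab), 0.0, 1.0)
    return np.linalg.norm(P - (a + t[:, None] * ab), axis=1)

def segment_cloud(P, a, b, eps):
    """S_t = { P in P_t : ||P - b_t^(i,j)|| < eps }: keep points within the
    segmentation threshold eps of the body segment joining joints a and b."""
    return P[point_to_segment_dist(P, a, b) < eps]
```

For instance, back-projecting a flat 4x4 depth frame at 2000 mm with assumed Kinect-like intrinsics (fx = fy = 580) yields 16 points all at Z = 2000, which can then be filtered against a joint-to-joint segment with `segment_cloud`.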

Documents

Application Documents

# Name Date
1 Form 3 [07-07-2017(online)].pdf 2017-07-07
2 Form 20 [07-07-2017(online)].jpg 2017-07-07
3 Form 18 [07-07-2017(online)].pdf_66.pdf 2017-07-07
4 Form 18 [07-07-2017(online)].pdf 2017-07-07
5 Drawing [07-07-2017(online)].pdf 2017-07-07
6 Description(Complete) [07-07-2017(online)].pdf_67.pdf 2017-07-07
7 Description(Complete) [07-07-2017(online)].pdf 2017-07-07
8 201721024066-FORM-26 [28-08-2017(online)].pdf 2017-08-28
9 201721024066-Proof of Right (MANDATORY) [14-09-2017(online)].pdf 2017-09-14
10 201721024066-REQUEST FOR CERTIFIED COPY [10-05-2018(online)].pdf 2018-05-10
11 201721024066-FORM 3 [11-07-2018(online)].pdf 2018-07-11
12 Abstract1.jpg 2018-08-11
13 201721024066-ORIGINAL UNDER RULE 6 (1A)-FORM 1-180917.pdf 2018-08-11
14 201721024066-CORRESPONDENCE(IPO)-(CERTIFIED COPY )-(21-5-2018).pdf 2018-08-11
15 201721024066-ORIGINAL UR 6( 1A) FORM 26-310817.pdf 2018-11-26
16 201721024066-OTHERS [22-07-2021(online)].pdf 2021-07-22
17 201721024066-FER_SER_REPLY [22-07-2021(online)].pdf 2021-07-22
18 201721024066-COMPLETE SPECIFICATION [22-07-2021(online)].pdf 2021-07-22
19 201721024066-CLAIMS [22-07-2021(online)].pdf 2021-07-22
20 201721024066-FER.pdf 2021-10-18
21 201721024066-US(14)-HearingNotice-(HearingDate-05-02-2024).pdf 2023-09-12
22 201721024066-FORM-26 [02-02-2024(online)].pdf 2024-02-02
23 201721024066-FORM-26 [02-02-2024(online)]-1.pdf 2024-02-02
24 201721024066-Correspondence to notify the Controller [02-02-2024(online)].pdf 2024-02-02
25 201721024066-Written submissions and relevant documents [16-02-2024(online)].pdf 2024-02-16
26 201721024066-PatentCertificate11-03-2024.pdf 2024-03-11
27 201721024066-IntimationOfGrant11-03-2024.pdf 2024-03-11
28 201721024066-FORM 4 [12-06-2024(online)].pdf 2024-06-12

Search Strategy

1 Searchstrategy201721024066E_27-01-2021.pdf

ERegister / Renewals

3rd: 12 Jun 2024 (From 07/07/2019 - To 07/07/2020)
4th: 12 Jun 2024 (From 07/07/2020 - To 07/07/2021)
5th: 12 Jun 2024 (From 07/07/2021 - To 07/07/2022)
6th: 12 Jun 2024 (From 07/07/2022 - To 07/07/2023)
7th: 12 Jun 2024 (From 07/07/2023 - To 07/07/2024)
8th: 12 Jun 2024 (From 07/07/2024 - To 07/07/2025)
9th: 30 Jun 2025 (From 07/07/2025 - To 07/07/2026)