Abstract: The present subject matter relates to a computing device (104) and method for providing fine-grained, continuous location tracking for non-real-time uses (e.g., retracing the movement of office employees). As per the subject matter, knowledge of an office layout is combined with the application of particle filters (PFs) over inertial tracking data (from compass, accelerometer, and gyroscope sensors). Viterbi-based path-likelihood maximization is integrated with the PF framework to progressively reduce the uncertainty in the movement track of an individual computing device (104).
CLAIMS: 1. A computing device (104) for backtracking a user's movement trajectory, the computing device (104) comprising:
a processor (202);
a particle filtering module (110), coupled to the processor (202), to apply a particle filter on an indoor map (302, 402) for estimating the user's movement trajectory, based on accelerometer information and heading information associated with the computing device (104), the computing device (104) being transportable by a user, wherein the accelerometer information and the heading information are based on readings of sensor data (224) from each of a plurality of sensors (208) of the computing device (104); and
a backtracking module (112), coupled to the processor (202), to apply the particle filter on the indoor map (302, 402) for performing a Viterbi-based backtracking of the user’s movement trajectory on the indoor map (302, 402).
2. The computing device (104) as claimed in claim 1, wherein the plurality of sensors comprises an accelerometer, a compass magnetometer, and a gyroscope.
3. The computing device (104) as claimed in claim 2, wherein the accelerometer provides acceleration data (228) for estimating the accelerometer information, and the compass magnetometer and the gyroscope provide orientation data (230), and wherein the heading information is estimated from the acceleration data (228) and the orientation data (230).
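By way of illustration only (not part of the claim language), the following minimal Python sketch shows one way the acceleration data (228) and orientation data (230) could yield step events and heading information; the step threshold, complementary-filter weight, and sample interval `dt` are assumed values, not taken from the disclosure:

```python
# Illustrative sketch only (not the claimed implementation): step detection from
# accelerometer magnitude and heading fusion from gyroscope/compass readings.
import numpy as np

STEP_THRESHOLD = 1.5   # m/s^2 above the gravity-free baseline (assumed)
ALPHA = 0.98           # complementary-filter weight (assumed)

def detect_steps(accel_xyz, dt):
    """Return step times as upward crossings of the acceleration magnitude."""
    mag = np.linalg.norm(accel_xyz, axis=1) - 9.81       # remove gravity
    above = mag > STEP_THRESHOLD
    return np.flatnonzero(above[1:] & ~above[:-1]) * dt  # rising edges

def fuse_heading(gyro_z, compass_heading, dt, heading0=0.0):
    """Complementary filter: integrate gyro yaw rate, correct with the compass."""
    h, headings = heading0, []
    for w, m in zip(gyro_z, compass_heading):
        h = ALPHA * (h + w * dt) + (1 - ALPHA) * m
        headings.append(h)
    return np.asarray(headings)
```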
4. The computing device (104) as claimed in claim 1, further comprising a preprocessing module (218) to:
identify the indoor map (302, 402) of an indoor space based on the accelerometer information and the heading information; and
preprocess the indoor map (302, 402), wherein the indoor map (302, 402) includes a plurality of physical constraints.
5. The computing device (104) as claimed in claim 4, wherein the preprocessing module (218) preprocesses the indoor map (302, 402) by uniformly dividing the indoor map (302, 402) into equal-sized cells, and wherein cells respecting one of the plurality of constraints are labeled as valid cells and cells violating one of the plurality of constraints are labeled as invalid cells.
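The cell labeling of claim 5 can be pictured as rasterizing the floor plan into a boolean grid. In the hypothetical sketch below, `walkable(x, y)` stands in for the map's physical constraints (walls, furniture, and similar); it is an assumed predicate, not part of the disclosure:

```python
# Hypothetical rasterization of a floor plan into equal-sized cells.
import numpy as np

def preprocess_map(width_m, height_m, cell_m, walkable):
    """Label each cell valid (True) or invalid (False) by testing its centre."""
    nx, ny = int(width_m / cell_m), int(height_m / cell_m)
    valid = np.zeros((nx, ny), dtype=bool)
    for i in range(nx):
        for j in range(ny):
            valid[i, j] = walkable((i + 0.5) * cell_m, (j + 0.5) * cell_m)
    return valid
```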
6. The computing device (104) as claimed in claim 5, wherein the particle filtering module (110) estimates the user’s movement trajectory by:
determining a coarse-level initial location of the computing device (104) on the indoor map (302, 402);
uniformly distributing particles over the determined initial location on the indoor map (302, 402);
propagating the distributed particles based on the sensor data (224) and the plurality of constraints;
progressively updating the distribution of particles on the indoor map (302, 402), based on validity of the propagated particles; and
based on the progressive updating, estimating the user’s movement trajectory.
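A minimal sketch of the claim-6 loop, assuming one heading per detected step (from the sensor sketch above) and the validity grid from `preprocess_map()`; the particle count, the 0.7 m mean stride, and the noise levels are illustrative assumptions:

```python
# Minimal particle-filter sketch for claim 6 (assumed parameters throughout).
import numpy as np

rng = np.random.default_rng(0)

def run_particle_filter(init_xy, step_headings, valid, cell_m, n=1000):
    # uniformly distribute particles around the coarse-level initial location
    p = np.asarray(init_xy) + rng.uniform(-1.0, 1.0, size=(n, 2))
    track = []
    for heading in step_headings:                       # one update per step
        stride = rng.normal(0.7, 0.1, n)                # per-particle step length
        theta = heading + rng.normal(0.0, 0.05, n)      # per-particle heading
        p = p + np.c_[stride * np.cos(theta), stride * np.sin(theta)]
        # propagate under the map constraints: keep particles in valid cells
        i = np.clip((p[:, 0] // cell_m).astype(int), 0, valid.shape[0] - 1)
        j = np.clip((p[:, 1] // cell_m).astype(int), 0, valid.shape[1] - 1)
        ok = valid[i, j]
        if ok.any():                                    # resample from survivors
            p = p[ok][rng.integers(0, ok.sum(), n)]
        track.append(p.mean(axis=0))                    # trajectory estimate
    return np.asarray(track)
```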
7. The computing device (104) as claimed in claim 6, wherein the backtracking module (112) is configured to:
estimate a transition probability matrix of valid particles by taking into account estimates of transition probabilities of the valid particles at each moving step of the user carrying the computing device (104), wherein the valid particles are particles lying in the valid cells; and
estimate an observation density matrix of the valid particles by taking into account estimates of observation densities of the valid particles at each moving step of the user carrying the computing device (104).
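The bookkeeping of claim 7 can be approximated as follows, under the simplifying assumption that the flattened cell index of every valid particle is recorded after each step with particle order preserved between steps (resampling breaks exact correspondence in practice, so this is a sketch, not the claimed estimator):

```python
# Sketch of the claim-7 transition probability and observation density matrices.
import numpy as np

def estimate_matrices(cells_per_step, n_cells):
    T = len(cells_per_step)
    trans = np.zeros((n_cells, n_cells))
    obs = np.zeros((T, n_cells))
    for t, cells in enumerate(cells_per_step):
        counts = np.bincount(cells, minlength=n_cells)
        obs[t] = counts / max(counts.sum(), 1)          # observation density
        if t > 0:
            for a, b in zip(cells_per_step[t - 1], cells):
                trans[a, b] += 1                        # count cell-to-cell moves
    row = trans.sum(axis=1, keepdims=True)              # normalize rows
    return np.divide(trans, row, out=np.zeros_like(trans), where=row > 0), obs
```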
8. The computing device (104) as claimed in claim 7, wherein the backtracking module (112) estimates the transition probability matrix and the observation density matrix, in parallel to the estimation of the user’s movement trajectory.
9. The computing device (104) as claimed in claim 7, wherein the backtracking module (112) performs the Viterbi-based backtracking by:
using the particle filter to combine identified valid cells, the transition probability matrix, and the observation density matrix, wherein the particle filter is a real-time Bayesian tracking algorithm;
based on the combination, computing by a Viterbi algorithm a most probable sequence of the identified valid cells that explains inertial measurements represented by each valid particle; and
backtracking the user’s movement trajectory on the indoor map (302, 402) of the indoor space, based on the most probable sequence of the valid cells.
10. The computing device (104) as claimed in claim 9, wherein an output of the Viterbi algorithm is a sequence of the valid cells on the indoor map (302, 402).
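Claims 9 and 10 describe a Viterbi decoding over the valid cells; a textbook log-space implementation over the matrices sketched above would look like the following (the `1e-12` floor is an assumed guard against `log(0)`):

```python
# Log-space Viterbi: trans is the transition probability matrix, obs holds the
# per-step observation densities; returns the most probable cell sequence.
import numpy as np

def viterbi(trans, obs):
    T, n = obs.shape
    logA, logB = np.log(trans + 1e-12), np.log(obs + 1e-12)
    delta = logB[0].copy()                   # scores after the first step
    back = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA       # best predecessor for each cell
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[t]
    path = [int(delta.argmax())]             # backtrack from the best final cell
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]                        # most probable valid-cell sequence
```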
11. The computing device (104) as claimed in claim 1, wherein the Viterbi-based backtracking is performed offline.
12. The computing device (104) as claimed in claim 9, wherein the computing device (104) further corrects an estimated state when particles are abruptly generated at invalid cells.
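One possible reading of the correction in claim 12, sketched here as snapping an estimate that lands in an invalid cell to the nearest valid cell centre; the nearest-centre rule is an assumption, not necessarily the disclosed correction:

```python
# Assumed nearest-valid-cell correction for claim 12.
import numpy as np

def correct_state(estimate_xy, valid, cell_m):
    e = np.asarray(estimate_xy, dtype=float)
    i = int(np.clip(e[0] // cell_m, 0, valid.shape[0] - 1))
    j = int(np.clip(e[1] // cell_m, 0, valid.shape[1] - 1))
    if valid[i, j]:
        return e                                  # already in a valid cell
    ii, jj = np.nonzero(valid)                    # indices of all valid cells
    centres = (np.c_[ii, jj] + 0.5) * cell_m      # cell-centre coordinates
    return centres[np.linalg.norm(centres - e, axis=1).argmin()]
```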
13. A method for backtracking a user's movement trajectories, the method comprising:
reading sensor data (224) from each of a plurality of sensors (208) of a computing device (104), the computing device (104) being transportable by a user, wherein based on the sensor data (224), accelerometer information and heading information of the computing device (104) are estimated;
applying, by a processor (202), a particle filter on an indoor map (302, 402) of an indoor space for estimating the user’s movement trajectories on the indoor map (302, 402), based on the accelerometer information and the heading information; and
applying, by the processor (202), the particle filter on the indoor map (302, 402) for performing a Viterbi-based backtracking of the user’s movement trajectories on the indoor map (302, 402).
14. The method as claimed in claim 13, wherein the plurality of sensors comprises an accelerometer, a compass magnetometer, and a gyroscope.
15. The method as claimed in claim 14, wherein the accelerometer provides acceleration data (228) for estimating the accelerometer information, and the compass magnetometer and the gyroscope provide orientation data (230), and wherein the heading information is estimated from the acceleration data (228) and the orientation data (230).
16. The method as claimed in claim 13, further comprising:
identifying the indoor map (302, 402) of the indoor space based on the accelerometer information and the heading information; and
preprocessing the indoor map (302, 402), wherein the indoor map (302, 402) includes a plurality of physical constraints.
17. The method as claimed in claim 16, wherein the preprocessing comprises uniformly dividing the indoor map (302, 402) into equal-sized cells, and wherein cells respecting one of the plurality of constraints are labeled as valid cells and cells violating one of the plurality of constraints are labeled as invalid cells.
18. The method as claimed in claim 17, further comprising:
determining a coarse-level initial location of the computing device (104) on the indoor map (302, 402);
uniformly distributing particles over the determined initial location on the indoor map (302, 402);
propagating the distributed particles based on the sensor data (224) and the plurality of constraints;
progressively updating the distribution of particles on the indoor map (302, 402), based on validity of the propagated particles; and
based on the progressive updating, estimating the user’s movement trajectory.
19. The method as claimed in claim 18, further comprising:
estimating a transition probability matrix of valid particles by taking into account estimates of transition probabilities of the valid particles at each moving step of the user, wherein the valid particles are particles lying in the valid cells; and
estimating an observation density matrix of the valid particles by taking into account estimates of observation densities of the valid particles at each moving step of the user.
20. The method as claimed in claim 19, further comprising estimating the transition probability matrix and the observation density matrix, in parallel to the estimation of the user’s movement trajectory.
21. The method as claimed in claim 19, further comprising:
using the particle filter to combine identified valid cells, the transition probability matrix, and the observation density matrix, wherein the particle filter is a real-time Bayesian tracking algorithm;
based on the combination, computing by a Viterbi algorithm a most probable sequence of the identified valid cells that explains inertial measurements represented by each valid particle; and
backtracking the user’s movement trajectory on the indoor map (302, 402) of the indoor space, based on the most probable sequence of the valid cells.
22. The method as claimed in claim 21, wherein an output of the Viterbi algorithm is a sequence of valid cells on the indoor map (302, 402).
23. The method as claimed in claim 13, wherein the Viterbi-based backtracking is performed offline.
24. The method as claimed in claim 21, further comprising correcting an estimated state when particles are abruptly generated at the invalid cells.
SPECIFICATION: As attached
| # | Name | Date |
|---|---|---|
| 1 | REQUEST FOR CERTIFIED COPY [31-03-2016(online)].pdf | 2016-03-31 |
| 2 | Form 3 [14-07-2016(online)].pdf | 2016-07-14 |
| 3 | Form 3 [14-10-2016(online)].pdf | 2016-10-14 |
| 4 | PD015529IN-SC SPEC FOR FILING.pdf | 2018-08-11 |
| 5 | PD015529IN-SC FORM 5.pdf | 2018-08-11 |
| 6 | PD015529IN-SC FORM 3.pdf | 2018-08-11 |
| 7 | PD015529IN-SC FIGURES FOR FILING.pdf | 2018-08-11 |
| 8 | ABSTRACT1.jpg | 2018-08-11 |
| 9 | 1406-MUM-2015-Power of Attorney-060116.pdf | 2018-08-11 |
| 10 | 1406-MUM-2015-Form 1-011015.pdf | 2018-08-11 |
| 11 | 1406-MUM-2015-Correspondence-060116.pdf | 2018-08-11 |
| 12 | 1406-MUM-2015-Correspondence-011015.pdf | 2018-08-11 |
| 13 | 1406-MUM-2015-FER.pdf | 2019-11-29 |
| 14 | 1406-MUM-2015-CLAIMS [22-05-2020(online)].pdf | 2020-05-22 |
| 15 | 1406-MUM-2015-COMPLETE SPECIFICATION [22-05-2020(online)].pdf | 2020-05-22 |
| 16 | 1406-MUM-2015-FER_SER_REPLY [22-05-2020(online)].pdf | 2020-05-22 |
| 17 | 1406-MUM-2015-OTHERS [22-05-2020(online)].pdf | 2020-05-22 |
| 18 | 1406-MUM-2015-PETITION UNDER RULE 137 [27-05-2020(online)].pdf | 2020-05-27 |
| 19 | 1406-MUM-2015-Proof of Right [27-05-2020(online)].pdf | 2020-05-27 |
| 20 | 1406-MUM-2015-US(14)-HearingNotice-(HearingDate-13-04-2022).pdf | 2022-03-18 |
| 21 | 1406-MUM-2015-Correspondence to notify the Controller [05-04-2022(online)].pdf | 2022-04-05 |
| 22 | 1406-MUM-2015-FORM-26 [13-04-2022(online)].pdf | 2022-04-13 |
| 23 | 1406-MUM-2015-FORM 3 [25-04-2022(online)].pdf | 2022-04-25 |
| 24 | 1406-MUM-2015-Written submissions and relevant documents [25-04-2022(online)].pdf | 2022-04-25 |
| 25 | 1406-MUM-2015-PatentCertificate10-06-2022.pdf | 2022-06-10 |
| 26 | 1406-MUM-2015-IntimationOfGrant10-06-2022.pdf | 2022-06-10 |
| 27 | SearchStrategyMatrix-converted(5)_25-11-2019.pdf | |