Abstract: The present subject matter discloses a mobile device (100) and a method for detection and classification of an acceleration-based step activity of a user. According to the present subject matter, the mobile device (100) implements the described method, where the method includes receiving acceleration signals from an inertial sensor (130) in the mobile device (100) carried by the user performing a step activity. The acceleration signals are divided into data windows. Peaks are identified within a frequency spectrum of each of the data windows, and non-stationary activity windows are identified from amongst the data windows, based on the peaks within the frequency spectrum of each of the data windows. Further, valid peaks are determined within each of the non-stationary activity windows, and the step activity of the user is classified into one of predefined acceleration-based step activities based on the valid peaks within each non-stationary activity window.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: ACCELERATION-BASED STEP ACTIVITY DETECTION AND
CLASSIFICATION ON MOBILE DEVICES
2. Applicant(s)
NAME NATIONALITY ADDRESS
TATA CONSULTANCY Indian Nirmal Building, 9th Floor, Nariman
SERVICES LIMITED Point, Mumbai, Maharashtra 400021,
India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.
TECHNICAL FIELD
[0001] The present subject matter relates, in general, to detection and classification of step activity of a user and, particularly but not exclusively, to detection and classification of acceleration-based step activity of a user on a mobile device.
BACKGROUND
[0002] Individuals are increasingly becoming conscious of their health and are incorporating a variety of activities in their daily routine to maintain their health. The activities may include walking, jogging, and running. Such activities performed by an individual are tracked for monitoring health of the individual. The tracking of activities includes determining a status of their activities including the type of activity being performed and the rate at which the activity is being performed.
BRIEF DESCRIPTION OF DRAWINGS
[0003] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to the same features and components.
[0004] Figure 1 illustrates a mobile device for detection and classification of an acceleration-based step activity of a user, in accordance with an implementation of the present subject matter.
[0005] Figure 2 illustrates a method for detection and classification of an acceleration-based step activity of a user on a mobile device, in accordance with an implementation of the present subject matter.
[0006] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like
represent various processes which may be substantially represented in computer readable medium and so executed by a computing device or processor, whether or not such computing device or processor is explicitly shown.
DETAILED DESCRIPTION
[0007] The subject matter disclosed herein relates to mobile device(s) for detection and classification of acceleration-based step activity of a user and method(s) for detection and classification of acceleration-based step activity of a user on a mobile device.
[0008] For the purposes of the present subject matter, the user can be an individual performing an acceleration-based step activity, and the performed acceleration-based step activity can be detected and classified on a mobile device carried by the user. An acceleration-based step activity refers to an activity, performed by an individual, in which the individual takes steps to set himself in motion and remain in motion. Each of the steps taken by the individual has multiple acceleration values associated with it, which can be used for step detection. The acceleration-based step activity can include walking, brisk walking, running, jogging, and the like. For the sake of simplicity, the acceleration-based step activity is hereinafter interchangeably referred to as step activity.
[0009] In order to detect acceleration-based motion or movements of the user, inertial sensors, for example, accelerometers, are commonly used. Micro-Electro-Mechanical Systems (MEMS) technology has enabled the manufacture of inertial sensors of a size that fits into portable or mobile electronic devices, such as cellular phones, portable music players, pedometers, game controllers, and portable computers. Such inertial sensors have low cost and low power consumption. A step activity performed by the user can be detected by the inertial sensor in a mobile device carried by the user.
[0010] The detection and classification of a step activity of a user through a mobile device is known. Conventionally, the mobile device is required to be trained or pre-run before the detection and classification of the step activity of the
actual user. In the training or the pre-running, data sets of acceleration signals, from the inertial sensor, are generally taken for a variety of test-users and for a variety of step activities performed by the test-users. The data sets of acceleration signals for the test-users and the step activities are processed in the mobile device to classify the step activities into a predefined set of step activities. Since the acceleration signals detected for the actual user may be significantly different from those for the test-users considered during the training stage, the conventional methodologies may lead to inaccuracy in step activity classification.
[0011] Further, in some conventional methodologies for the detection and classification of the step activity, the user himself has to spend time and train the algorithm, used for the classification, using his own data sets of acceleration signals for each of the different step activities. The motion of the user during each of the activities is detected by the inertial sensors and its characteristics are stored in the memory of the mobile device. The stored characteristics are used as references for the detection and classification of the user's step activities in real-time.
[0012] Further, some conventional methodologies for detection and classification of the step activity of the user are dependent on the placement and orientation of the mobile device. For this, the user has to ensure that the mobile device is placed at a predefined position and with a predefined orientation while he is performing the activity. In such cases, depending on how and where the mobile device is kept, any change in the placement and the orientation of the mobile device from its desired position while performing the activity changes the acceleration signal profile, which leads to a substantially inaccurate classification of the activity of the user. In addition to the restrictions on the placement and orientation of the mobile device, conventionally, prior training is also required over data sets for different positions and orientations of the mobile device and for each of the step activities.
[0013] Further, during the detection and classification of step activity for a user, there may exist stationary periods in which either no activity is being performed or other non-step activities are being performed. The acceleration signals received during these stationary periods may be significant and may have
amplitudes comparable to the acceleration signals due to the step activity. The acceleration signals during the stationary periods are unwanted signals and are referred to as noise. Such noise in the acceleration signal may lead to false detection of step activity for the user.
[0014] The present subject matter describes mobile device(s) for detection and classification of step activity of a user, and method(s) for detection and classification of step activity of the user on the mobile device. In an implementation, the step activity may include walking, brisk walking, running, jogging, and a combination thereof. The mobile device may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer and the like, having an inertial sensor.
[0015] In an implementation of the present subject matter, for detection and classification of the step activity of the user, motion of the user is detected by the inertial sensor which then converts the motion into acceleration signals. The acceleration signals contain information about the step activity of the user and are processed in order to classify the step activity of the user.
[0016] In an implementation, the acceleration signals, representing the motion of the user, are generated by the inertial sensors in the form of a data stream. The data stream of the acceleration signals from the inertial sensors is divided into data windows of a predetermined time period. The obtained data windows are then processed one by one for the classification of the step activity of the user. In an implementation, the processing of each data window includes zero normalization, linear interpolation, and low-pass filtration. The low-pass filtration removes high frequency signals, which improves the signals within the data window for subsequent processing.
[0017] Further, for each of the data windows, a frequency spectrum is obtained, and peaks are identified within the frequency spectrum of each of the data windows. Based on the peaks within the frequency spectrum of a data window, it is identified whether the data window is a non-stationary activity window or a stationary activity window. A non-stationary activity window is a data window in which the signal contains peaks indicative of the step activity of
the user. A stationary activity window is a data window in which the signal, including any peaks therein, is indicative of noise and not the step activity of the user. In an implementation, the identification of non-stationary activity windows is based on a maximum amplitude peak, a number of peaks, and a peak sharpness measure of the maximum amplitude peak within the frequency spectrum of a corresponding data window. The maximum amplitude peak is the peak with the maximum amplitude from amongst all peaks in the frequency spectrum of the data window. The number of peaks is based on peaks whose amplitude is within a predefined percentage, for example, 75 %, of the amplitude of the maximum amplitude peak.
[0018] Unlike the conventional methodology, the present subject matter is based on the identification of non-stationary activity windows. The non-stationary activity windows are identified and subsequently analyzed for classification of the step activity of the user. Since the classification of step activity, in accordance with the present subject matter, is based on the non-stationary activity windows, and the stationary activity windows are not considered for the classification, the classification of the step activity is not affected by the noise in the acceleration signals. This facilitates in substantial elimination of any false detection and classification of the step activity which was present in the conventional methodologies.
[0019] Once the non-stationary activity windows are identified from amongst the data windows, peaks within the frequency spectrum of each non-stationary activity window are validated for being indicative of true activity steps. Each of the valid peaks is indicative of a step which the user takes while performing the step activity. Thus, the number of valid peaks is indicative of the number of steps, or the step count, of the user performing the step activity. In an implementation, the determination of valid peaks is based on a Dynamic Time Warping (DTW) process and an Individual Peak Analysis (IPA) performed on each non-stationary activity window.
[0020] Further, based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows, the step activity of the user is
classified into one of predefined acceleration-based step activities. In an implementation, the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like.
[0021] For classifying the step activity into one of the predefined acceleration-based step activities, activity weights are calculated based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows and predefined threshold frequencies of each of the predefined acceleration-based step activities. Each activity weight is indicative of the contribution of the peaks in the frequency spectrum of the non-stationary activity window to the corresponding predefined acceleration-based step activity. The number of predefined threshold frequencies required and the number of activity weights to be calculated varies depending on the number of predefined acceleration-based step activities into which the step activity of the user is to be classified.
[0022] Further, in an implementation, the predefined acceleration-based step activity, into which the step activity being performed by the user is classified, is displayed on the mobile device. In an implementation, the predefined acceleration-based step activity, into which the step activity being performed by the user is classified, and a step count determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity windows, are displayed on the mobile device. This facilitates in providing information to the user about which step activity he is performing and at what rate the step activity is being performed.
[0023] The present subject matter can be put into practice without any prior training with data sets of acceleration signals for a variety of test-users or from the user himself. It saves time of the user since the classification is carried out in real-time and does not require training prior to actual use by the user, which was otherwise required in the conventional methodologies. Also, in the present subject matter, the mobile device is agnostic of position and orientation, and thus can be placed in any position and with any orientation for the detection and classification of the step activity of the user. This removes restrictions on the placement and orientation of the mobile device and also removes the need for training for the different placements and orientations of the mobile device, which was present in the conventional methodologies.
[0024] The manner in which the mobile device(s) and method(s) shall be implemented has been explained in detail with respect to Figure 1 and Figure 2. Although the description herein is with reference to hand-held mobile device(s), the method(s) may be implemented in other portable device(s) and system(s) as well, albeit with a few variations, as will be understood by a person skilled in the art. While aspects of the described methods can be implemented in any number of different mobile devices, transmission environments, and/or configurations, the implementations are described in the context of the following hand-held mobile device(s).
[0025] Figure 1 illustrates a mobile device 100 for detection and classification of an acceleration-based step activity, in accordance with an implementation of the present subject matter. For the purpose of description and simplicity, the acceleration-based step activity is hereinafter referred to as the step activity. In an implementation, the mobile device 100 is a device having an inertial sensor 130 and can be carried while performing the step activity. The inertial sensor 130 may include an accelerometer.
[0026] The mobile device 100 may include a mobile phone, a smart phone, a portable music player, a pedometer, a game controller, a portable computer, and the like. As shown in Figure 1, the mobile device 100 is carried by a user 102 performing the step activity. The mobile device 100 may belong to the user 102. The user 102 may hold the mobile device 100 in his hand, place the mobile device 100 in a pocket or a bag, or couple the mobile device 100 to himself using a coupling means, while performing the step activity.
[0027] In an implementation, the mobile device 100 includes processor(s) 104. The processor(s) 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 104
is configured to fetch and execute computer-readable instructions stored in a memory.
[0028] The mobile device 100 includes interface(s) 106. The interfaces may include a variety of machine readable instruction-based and hardware-based interfaces that allow the mobile device 100 to communicate with other devices, including servers, data sources, and external repositories. Further, the interface(s) 106 may enable the mobile device 100 to communicate with other communication devices, such as network entities, over a communication network.
[0029] Further, the mobile device 100 includes a memory 108. The memory 108 may be coupled to the processor(s) 104. The memory 108 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM (EPROM), flash memories, hard disks, optical disks, and magnetic tapes.
[0030] Further, the mobile device 100 includes module(s) 110 and data 112. The module(s) 110 and the data 112 may be coupled to the processor(s) 104. The modules 110, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The modules 110 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. The data 112 serves, amongst other things, as a repository for storing data that may be fetched, processed, received, or generated by the module(s) 110. Although the data 112 is shown internal to the mobile device 100, it may be understood that the data 112 can reside in an external repository (not shown in the Figure), which may be coupled to the mobile device 100. The mobile device 100 may communicate with the external repository through the interface(s) 106.
[0031] Further, the module(s) 110 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, a state machine, a logic
array or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform tasks or, the processing unit can be dedicated to perform the required functions. In another aspect of the present subject matter, the module(s) 110 may be machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the desired functionalities. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium. In an implementation, the machine-readable instructions can also be downloaded to the storage medium via a network connection.
[0032] In an implementation, the module(s) 110 include a signal processing module 114, a step activity classifier 116, a display module 118, and other module(s) 120. The other module(s) 120 may include programs or coded instructions that supplement applications or functions performed by the mobile device 100. In said implementation, the data 112 includes signal data 122, signal processing data 124, activity classifier data 126, and other data 128. The other data 128 amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 110.
[0033] The description hereinafter describes the detection and classification of the step activity of the user 102 on the mobile device 100 carried by the user 102. In an implementation, while the user 102 is performing the step activity, the inertial sensor 130 in the mobile device 100 detects the motion of the user 102 due to the step activity and generates a data stream of the acceleration signals corresponding to the motion of the user 102. The data stream may be generated by the inertial sensor 130 at a sampling frequency of 80 Hz.
[0034] The signal processing module 114 receives the data stream of the acceleration signals generated by the inertial sensor 130, and processes the data stream for the classification of the step activity of the user 102. For the purpose of processing, the signal processing module 114 generates data windows of a
predetermined time period by dividing the data stream of the acceleration signals. In an implementation, the predetermined time period is 2 seconds. In another implementation, the predetermined time period may be a value between 1.5 seconds and 2.5 seconds. The data windows thus generated are stored in the signal data 122.
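By way of illustration, the following Python sketch shows one way this windowing step may be realized. The function name, the use of a one-dimensional stream (for example, the magnitude of the acceleration), and the non-overlapping windows are illustrative assumptions and are not mandated by the present subject matter.

```python
import numpy as np

def make_windows(accel_stream, fs=80, window_seconds=2.0):
    """Divide a 1-D acceleration data stream into fixed-length data windows.

    accel_stream   : 1-D NumPy array of acceleration samples.
    fs             : sampling frequency in Hz (80 Hz per the description).
    window_seconds : predetermined time period per window (2 s here).
    """
    samples_per_window = int(fs * window_seconds)
    n_windows = len(accel_stream) // samples_per_window
    # Drop any trailing partial window, then reshape into one row per window.
    usable = accel_stream[: n_windows * samples_per_window]
    return usable.reshape(n_windows, samples_per_window)
```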
[0035] In the implementation, each of the data windows is processed for zero normalization, linear interpolation, and low-pass filtration. The low-pass filtration may be performed using a low-pass discrete time filter. With this, the noise and the disturbances incorporated by the high frequency signals are removed, and the signal in the data windows is substantially improved for subsequent processing.
[0036] After processing the data windows, for each of the data windows, a frequency spectrum is obtained and peaks are identified within the frequency spectrum of each of the data windows. The frequency spectrum may be obtained by performing a Fast Fourier Transform (FFT), or a Discrete Fourier Transform (DFT), on the data windows. The frequency spectrum of a data window may have one or more peaks corresponding to a true step activity and may have one or more peaks corresponding to noise or other non-step based activities, such as movements of the mobile device 100. Further, based on the peaks within the frequency spectrum of each of the data windows, the signal processing module 114 identifies the non-stationary activity windows from amongst all the data windows. By the identification of non-stationary activity windows, any false detection and classification of the step activity of the user 102 can be substantially eliminated.
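A minimal sketch of the per-window processing and peak identification is given below, assuming NumPy and SciPy. The Butterworth filter and the 5 Hz cutoff are illustrative choices, since the description specifies only a low-pass discrete time filter, and the linear interpolation step is assumed to have been applied upstream to place the samples on a uniform time grid.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def preprocess_window(window, fs=80, cutoff_hz=5.0):
    """Zero-normalize and low-pass filter one data window."""
    x = window - np.mean(window)                 # zero normalization
    b, a = butter(N=2, Wn=cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, x)                     # low-pass filtration

def spectrum_peaks(window, fs=80):
    """Return the frequency axis, amplitude spectrum, and peak indices."""
    amp = np.abs(np.fft.rfft(window))            # FFT-based frequency spectrum
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    peak_idx, _ = find_peaks(amp)                # local maxima in the spectrum
    return freqs, amp, peak_idx
```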
[0037] In an implementation, for the purpose of identifying whether a data window is a non-stationary activity window, a maximum amplitude peak in the frequency spectrum is identified. The maximum amplitude peak is the peak which has the highest amplitude from amongst all peaks in the frequency spectrum of the data window. Let the amplitude of the maximum amplitude peak be denoted by A_d, and the frequency at which the amplitude is maximum be denoted by f_d. In addition to the maximum amplitude peak, a number of peaks within 75% of the amplitude A_d is determined. Let the number of peaks be denoted by P_n.
[0038] Further, a measure of peak sharpness of the maximum amplitude peak in the frequency spectrum of the corresponding data window is computed. The peak sharpness measure is denoted by P_sharpness. The greater the peak sharpness measure P_sharpness, the sharper the peak. The peak sharpness measure P_sharpness is computed based on the below equation (1):
where k is the number of DFT components about the maximum amplitude A_d, and f_d is the frequency corresponding to the amplitude A_d. A(i) is the amplitude at the ith DFT component, and A(i + 1) is the amplitude at the DFT component next to the ith DFT component for the maximum amplitude peak in the frequency spectrum. In an example, k may be equal to 1, and the summation in equation (1) is taken for i = -1, 0, 1. A(0) is the amplitude A_d.
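Since equation (1) itself is not reproduced above, the sketch below implements one plausible reading of the peak sharpness measure consistent with the stated ingredients: a normalized sum of adjacent-component differences over the DFT components about the maximum amplitude peak. The exact form is an assumption, not the claimed formula.

```python
def peak_sharpness(amp, d, k=1):
    """One plausible P_sharpness for the maximum amplitude peak.

    amp : amplitude spectrum of the data window.
    d   : index of the maximum amplitude peak, so amp[d] = A_d = A(0).
    k   : number of DFT components taken about the peak (k = 1 in the text,
          giving the summation over i = -1, 0, 1).
    The peak index d must not lie at the edge of the spectrum.
    """
    total = 0.0
    for i in range(-k, k + 1):                   # i = -1, 0, 1 for k = 1
        total += abs(amp[d + i] - amp[d + i + 1])
    # Normalizing by A_d makes sharper (steeper) peaks score higher,
    # matching the stated behaviour of the measure.
    return total / ((2 * k + 1) * amp[d])
```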
[0039] Based on the above identifications and computations, the amplitude A_d is compared with a predefined threshold amplitude, and the peak sharpness measure P_sharpness is compared with a predefined threshold sharpness. The predefined threshold amplitude may be 1 and the predefined threshold sharpness may be 0.3. The predefined threshold sharpness of 0.3 differentiates distinct sharp peaks from otherwise wide peaks which may be caused by noise or any other non-step activity. If the amplitude A_d is less than 1, the peak sharpness measure P_sharpness is greater than 0.3, and the number of peaks P_n is 0 or 1, then the data window is identified as a non-stationary activity window; else the data window is identified as a stationary activity window. Further, if the amplitude A_d is greater than or equal to 1, the peak sharpness measure P_sharpness is greater than 0.3, and the number of peaks P_n is 0, 1, or 2, then the data window is identified as a non-stationary activity window; else the data window is identified as a stationary activity window.
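The two decision rules above translate directly into the short predicate below; a minimal sketch, assuming the stated thresholds of 1 for the amplitude and 0.3 for the sharpness.

```python
def is_non_stationary(a_d, p_sharpness, p_n,
                      amp_threshold=1.0, sharp_threshold=0.3):
    """Decide whether a data window is a non-stationary activity window.

    a_d         : amplitude A_d of the maximum amplitude peak.
    p_sharpness : peak sharpness measure P_sharpness of that peak.
    p_n         : number of peaks P_n within 75% of A_d.
    """
    if a_d < amp_threshold:
        # Low-amplitude windows require a sharp, nearly unique dominant peak.
        return p_sharpness > sharp_threshold and p_n in (0, 1)
    # Higher-amplitude windows tolerate one additional comparable peak.
    return p_sharpness > sharp_threshold and p_n in (0, 1, 2)
```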
[0040] After the identification of the non-stationary activity windows from amongst the data windows, the signal processing module 114 analyzes the non-stationary activity windows for the classification of the step activity of the user 102. The frequency spectra of the non-stationary activity windows have peaks, of which either some or all may be valid peaks. For the classification of the step activity, valid peaks in the frequency spectra of the non-stationary activity windows are determined. The determination of valid peaks is based on a Dynamic Time Warping (DTW) process and an Individual Peak Analysis (IPA). Each of the valid peaks is indicative of a true activity step, which may represent the step taken by the user 102 while performing the acceleration-based step activity.
[0041] In the DTW process of determination of valid peaks, DTW_t is an indicator value used to determine whether the DTW process has succeeded or failed to determine the valid peaks. When the DTW process fails for a non-stationary activity window, it is reflected in a confidence score CS. The confidence score is indicative of the percentage of peaks determined as valid in the frequency spectrum of the non-stationary activity window. When all the peaks in the frequency spectrum of the non-stationary window are inferred as valid peaks, the value of CS is 100%; otherwise, the value of CS is less than 100%.
[0042] In the DTW process, for the ith peak in the frequency spectrum of the non-stationary activity window, if the sample distance between the ith peak and the (i - 2)th peak lies within a predefined sample distance range of DTW_t ± a tolerance of t%, then both the ith peak and the (i - 2)th peak are assigned a status of 'valid'. If the sample distance between the ith peak and the (i - 2)th peak does not lie within the predefined sample distance range of DTW_t ± a tolerance of t%, then the ith peak is assigned a status of 'check' and the (i - 2)th peak is assigned a status of 'invalid', if its status is not already 'valid'. In an implementation, the tolerance t is 5%.
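A minimal sketch of this validity assignment is given below; the representation of peaks by their sample indices and the returned confidence score are assumptions made for illustration.

```python
def dtw_validate(peak_samples, dtw_t, tolerance=0.05):
    """Assign 'valid', 'invalid', or 'check' statuses per the DTW test.

    peak_samples : sample indices of the peaks in a non-stationary window.
    dtw_t        : expected sample distance DTW_t between the ith and
                   (i - 2)th peaks; tolerance is t = 5% per the description.
    Returns the status list and the confidence score CS (% of valid peaks).
    """
    status = ["check"] * len(peak_samples)
    lo, hi = dtw_t * (1 - tolerance), dtw_t * (1 + tolerance)
    for i in range(2, len(peak_samples)):
        dist = peak_samples[i] - peak_samples[i - 2]
        if lo <= dist <= hi:
            status[i] = status[i - 2] = "valid"
        else:
            status[i] = "check"
            if status[i - 2] != "valid":
                status[i - 2] = "invalid"
    cs = 100.0 * status.count("valid") / max(len(status), 1)
    return status, cs
```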
[0043] In the IPA, the peaks are individually validated. For validating the ith peak in the frequency spectrum of the non-stationary window as a true activity step, a product Prod_i for the ith peak is computed based on the below equation (2):
where A_i is the amplitude of the ith peak, and A_PM is the preceding peak minima. The preceding peak minima refers to the minimum amplitude of acceleration between the ith peak and the last occurred valid peak. Further, SD_i is the sample distance between the ith peak and the last occurred valid peak before the ith peak. Further, f_s is the sampling frequency. Based on the value of the product Prod_i, a criterion for validating the ith peak in the frequency spectrum of the non-stationary window as a true activity step is checked. The criterion is given by:
where A_d is the amplitude of the maximum amplitude peak in the frequency spectrum of the corresponding non-stationary window, and f_d is the frequency at which the amplitude A_d occurs. T is a constant which is less than 1. In an example, T is taken as 0.75 as it substantially differentiates the valid peaks from the invalid peaks. The valid peaks correspond to actual activity steps, and the invalid peaks represent false peaks generated by noise. If the criterion given by equation (3) is satisfied for the ith peak, then that peak is a valid peak.
[0044] After the peak validation process for the frequency spectrum of a non-stationary activity window is completed, a step cycle length (SCL) is calculated for each valid peak. SCL_i is the sample distance between the ith valid peak in the frequency spectrum of the non-stationary activity window and the (i - 2)th valid peak in the frequency spectrum of the non-stationary activity window. SCL_i has units of samples per step. For i = 1, 2, valid peaks from the frequency spectra of the previous non-stationary windows are used.
[0045] The process of identification of valid peaks within the frequency spectrum of the non-stationary activity windows is described hereinafter. For a non-stationary activity window, DTW_t of the previous non-stationary activity window is checked and compared with 0. For the first non-stationary activity window, the value of DTW_t is set to 0. If DTW_t is 0, the IPA is performed on the frequency spectrum of the current non-stationary activity window to identify the valid peaks as described above. Based on the values of SCL for the peaks obtained during the IPA, the value of DTW_t is updated as the mean of all SCL values found for the current non-stationary activity window. If the DTW_t of the last non-stationary activity window is not 0, then the DTW process is performed on the frequency spectrum of the current non-stationary activity window. After performing the DTW process, the confidence score CS is checked. If the confidence score CS for that window is 100%, then the value of DTW_t is updated as the mean of all SCL values found for the current non-stationary activity window. If the confidence score CS for that window is not 100%, then DTW_t is updated as 0, and the IPA is performed on the frequency spectrum of the non-stationary activity window. Based on the SCL values obtained after the IPA, the value of DTW_t is updated as the mean of all SCL values found for the current non-stationary activity window. The above procedure is iteratively repeated for all the non-stationary activity windows. The data thus generated by the signal processing module 114 for the non-stationary activity windows is stored in the signal processing data 124.
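The control flow of this iterative procedure may be sketched as below. The callables `ipa` and `dtw_process` stand in for the Individual Peak Analysis and the DTW process described above (whose equations are not reproduced here), so the sketch captures only the switching logic and the DTW_t updates.

```python
def process_windows(windows, ipa, dtw_process):
    """Iterate the DTW/IPA procedure over successive non-stationary windows.

    ipa(window)            -> list of SCL values from Individual Peak Analysis.
    dtw_process(window, t) -> (list of SCL values, confidence score CS in %).
    """
    dtw_t = 0.0                                  # DTW_t is 0 for the first window
    for window in windows:
        if dtw_t == 0:
            scl_values = ipa(window)             # no reliable DTW_t: use IPA
        else:
            scl_values, cs = dtw_process(window, dtw_t)
            if cs != 100.0:                      # DTW failed for this window
                scl_values = ipa(window)         # reset and fall back to IPA
        # In every branch, DTW_t becomes the mean of the SCL values found.
        dtw_t = sum(scl_values) / len(scl_values) if scl_values else 0.0
        yield scl_values
```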
[0046] Further, based on the valid peaks within the frequency spectrum of each of the non-stationary activity windows, the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities. In an implementation, for the classification of the step activity of the user 102, the frequency spectra of at least two consecutive non-stationary activity windows are processed by the signal processing module 114 as described above. In an implementation, the predefined acceleration-based step activities may include a combination of activities from amongst walking, brisk walking, running, jogging, and the like. For this, the step activity classifier 116 calculates activity weights based on the data for the valid peaks and threshold frequencies of the predefined acceleration-based step activities. The process of calculation of activity weights and subsequent classification of the step activity of the user 102 is described below.
[0047] Let us consider a case where the predefined acceleration-based step activities include "walking", "brisk walking", and "running". With this set of predefined acceleration-based step activities, two predefined threshold frequencies, namely a brisk-walking threshold frequency f_bw and a running threshold frequency f_r, are used. The brisk-walking threshold frequency f_bw refers to the threshold frequency marking the separation between walking and brisk walking, and the running threshold frequency f_r refers to the threshold frequency marking the separation between brisk walking and running. Further, with this set of predefined acceleration-based step activities, four activity weights, w_1, w_2, w_3, and w_4, are defined. The activity weight w_1 is a measure of the amount of walking activity, the activity weights w_2 and w_3 are measures of the amount of brisk-walking activity, and the activity weight w_4 is a measure of the amount of running activity. Initially, all the activity weights w_1, w_2, w_3, and w_4 are set to 0.
[0048] For the purpose of classification of the step activity of the user 102 into one of "walking", "brisk walking", and "running", the step activity classifier 116 calculates a step frequency SF_i for the ith valid peak in the frequency spectrum of the non-stationary activity window. The step frequency SF_i is computed based on the value of SCL_i of the ith valid peak and the sampling frequency f_s. The value of the step frequency SF_i for the ith valid peak is calculated based on equation (5) below:
SF_i = f_s / SCL_i …………… (5)
SF_i has the unit of steps per second.
[0049] Further, for calculating the activity weights, the step activity classifier 116 compares the step frequency SF_i with the threshold frequencies f_bw and f_r. If the step frequency SF_i is greater than 1 and less than the brisk-walking threshold frequency f_bw, then the activity weight w_1 is calculated as:
w_1 = w_1 + (f_bw - SF_i)^2 …………… (6)
where the initial value of w_1 is 0.
[0050] If the step frequency SF_i is greater than or equal to the brisk-walking threshold frequency f_bw and less than the running threshold frequency f_r, then the activity weights w_2 and w_3 are calculated as follows:
w_2 = w_2 + (f_bw - SF_i)^2 …………… (7)
w_3 = w_3 + (f_r - SF_i)^2 …………… (8)
where the initial values of w_2 and w_3 are 0.
[0051] Further, if the step frequency SF_i is greater than or equal to the running threshold frequency f_r, then the activity weight w_4 is calculated as follows:
w_4 = w_4 + (f_r - SF_i)^2 …………… (9)
where the initial value of w_4 is 0.
[0052] The above procedure of computation of activity weights is iteratively repeated for all the valid peaks of the non-stationary activity window under consideration.
[0053] After computation of the activity weights, the step activity classifier 116 classifies the step activity of the user 102 into one of the predefined acceleration-based step activities based on the above calculated activity weights. For the classification, the activity weight w_1 is compared to the activity weight w_4, the activity weight w_2 is compared to the activity weight w_1, and the activity weight w_3 is compared to the activity weight w_4. If the value of the activity weight w_1 is greater than the activity weight w_4 and the value of the activity weight w_1 is greater than or equal to the activity weight w_2, then the step activity of the user 102 is classified as "walking". If the value of the activity weight w_1 is greater than the activity weight w_4 and the value of the activity weight w_1 is less than the activity weight w_2, then the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w_1 is less than or equal to the activity weight w_4 and the value of the activity weight w_3 is greater than w_4, then also the step activity of the user 102 is classified as "brisk walking". If the value of the activity weight w_1 is less than or equal to the activity weight w_4 and the value of the activity weight w_3 is less than or equal to w_4, then the step activity of the user 102 is classified as "running". The data associated with the classified activity is stored in the activity classifier data 126.
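Putting equations (5) to (9) and the decision rules of this paragraph together, a sketch of the classifier over one window's valid peaks could look as follows. The threshold frequencies f_bw = 2.0 and f_r = 2.5 steps per second are illustrative assumptions; the specification leaves their values open.

```python
def classify_step_activity(scl_values, fs=80, f_bw=2.0, f_r=2.5):
    """Classify one window as walking, brisk walking, or running.

    scl_values : step cycle lengths SCL_i (samples per step) of valid peaks.
    fs         : sampling frequency f_s in Hz.
    """
    w1 = w2 = w3 = w4 = 0.0                      # all weights start at 0
    for scl in scl_values:
        sf = fs / scl                            # step frequency, equation (5)
        if 1.0 < sf < f_bw:
            w1 += (f_bw - sf) ** 2               # equation (6): walking
        elif f_bw <= sf < f_r:
            w2 += (f_bw - sf) ** 2               # equation (7): brisk walking
            w3 += (f_r - sf) ** 2                # equation (8): brisk walking
        elif sf >= f_r:
            w4 += (f_r - sf) ** 2                # equation (9): running
    # Decision rules of paragraph [0053].
    if w1 > w4:
        return "walking" if w1 >= w2 else "brisk walking"
    return "brisk walking" if w3 > w4 else "running"
```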
[0054] The number of predefined threshold frequencies and the number of activity weights to be calculated vary depending on the number of predefined acceleration-based step activities used for the classification. In an implementation using n predefined acceleration-based step activities, (n - 1) threshold frequencies are used and (n + 1) activity weights are calculated. In the implementation, all the (n + 1) activity weights have 0 as their initial values.
[0055] In an implementation, the display module 118 displays, on the mobile device 100, the acceleration-based step activity under which the step activity being performed by the user 102 is classified. Further, in an implementation, a step count, determined based on the number of valid peaks within the frequency spectrum of the non-stationary activity window, is displayed on the mobile device 100 by the display module 118. This display enables the user 102 to monitor the step activity he is performing and the rate at which he is performing that step activity.
[0056] In an implementation, if no valid peak is identified in the frequency spectrum of the non-stationary activity windows, then the display module 118 may display “stationary” as the activity.
[0057] In an implementation, the step activity performed by the user 102 along with the step count may be stored in the activity classifier data 126 to generate a report for the user 102 stating the type of step activities performed by the user 102 in a session. Further, in the implementation, the step count for a step activity may be calibrated with respect to calories burnt, and the report may state the number of calories burnt by the user 102 in the session.
[0058] In an implementation, the acceleration signals generated by the inertial sensor, the data generated based on the processing done by the signal processing module 114, and the data associated with the type of step activity and the step count may be transmitted from the mobile device 100 to another device, such as a personal computer, a laptop, and the like, for various purposes, including keeping a track or a history of the step activities and the step counts performed by the user 102. The mobile device 100 may be equipped with compatible input/output interfaces for such communication. In an implementation, the communication may take place via a wired communication link, such as a data cable, or via a wireless communication link, such as Bluetooth™, IR, and WiFi.
[0059] Figure 2 illustrates a method 200 for detection and classification of an acceleration-based step activity of a user 102. The method 200 is implemented in a mobile device 100. The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200, or any alternative methods. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 200 can be implemented in any suitable hardware.
[0060] The method 200 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. Further, although the method 200 may be implemented in any mobile device having an inertial sensor, in the example described in Figure 2 the method 200 is explained in the context of the aforementioned mobile device 100, for the ease of explanation.
[0061] Referring to Figure 2, at block 202, acceleration signals generated by the inertial sensor 130 in the mobile device 100, while the user 102 is performing a step activity, are received from the inertial sensor 130 for processing. As mentioned earlier, while performing the step activity, the user 102 takes steps which set the user 102 into motion. This motion of the user 102 performing the step activity has an acceleration associated with it. The inertial sensor 130 in the mobile device 100 detects the motion of the user 102 and generates a data stream of acceleration signals for the motion of the user 102.
[0062] At block 204, the acceleration signal is divided into data windows of a predetermined period. In an implementation, the predetermined period is 2 seconds. In another implementation, the predetermined period may be between
1.5 seconds and 2.5 seconds. After obtaining the data windows, the data windows are processed for zero normalization, linear interpolation, and low pass filtration.
[0063] At block 206, peaks are identified within a frequency spectrum of each of the data windows. In an implementation, the frequency spectrum of the data windows may be obtained using a Fast Fourier Transform (FFT). Further, at block 208, non-stationary activity windows are identified from amongst the data windows of the acceleration signals by obtaining a maximum amplitude peak, a number of peaks, and a peak sharpness measure from the frequency spectrum of each data window. The computation of the peak sharpness measure and the identification of the non-stationary activity windows are performed in a manner as described earlier in the description.
[0064] Further, at block 210, within the frequency spectrum of the non-stationary activity window, valid peaks are determined. These valid peaks are indicative of true activity steps and are determined based on Dynamic Time Warping (DTW) and Individual Peak Analysis (IPA). The valid peaks are determined as described earlier in the description.
[0065] At block 212, the step activity of the user 102 is classified into one of the predefined acceleration-based step activities based on the valid peaks obtained within the frequency spectrum of each of the non-stationary activity windows. In an implementation, the classification of the step activity includes computing step frequencies for the valid peaks and computing activity weights based on the step frequencies and predefined threshold frequencies for the predefined acceleration-based step activities. The step activity of the user 102 is then classified based on the activity weights. The computation of the step frequencies and the activity weights and the classification are done in a manner as described earlier in the description.
[0066] In an implementation, the classified activity and the step count determined based on the valid peaks in the frequency spectrum of the non-stationary activity windows are displayed on the mobile device 100 for the user 102.
[0067] Although implementations for mobile device(s) for detection and classification of acceleration-based step activity of a user 102 and method(s) for
detection and classification of acceleration-based step activity of a user 102 on a mobile device are described, it is to be understood that the present subject matter is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as implementations to detect and classify the step activity of the user 102 on the mobile device 100.
I/We claim:
1. A method for detection and classification of an acceleration-based step
activity of a user (102) on a mobile device (100), the method comprising:
receiving acceleration signals from an inertial sensor (130) in the mobile device (100) carried by the user (102) performing a step activity;
dividing the acceleration signals into data windows of a predetermined time period;
identifying peaks within a frequency spectrum of each of the data windows;
identifying non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows;
determining valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and
classifying the step activity of the user (102) into one of predefined acceleration-based step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
2. The method as claimed in claim 1 further comprising processing the data windows by performing zero normalization, linear interpolation, and low pass filtration on the data windows.
3. The method as claimed in claim 1, wherein the identifying of each of the non-stationary activity windows is based on a maximum amplitude peak, a number of peaks, and a measure of peak sharpness of the peaks within the frequency spectrum of a corresponding data window.
4. The method as claimed in claim 1, wherein the determining of the valid peaks within the frequency spectrum of the each of the non-stationary activity
windows is based on Dynamic Time Warping process and Individual Peak Analysis.
5. The method as claimed in claim 1, wherein the classifying the step activity of the user (102) into one of the predefined acceleration-based step activities comprises calculating activity weights for the each of the non-stationary activity windows based on the valid peaks within the frequency spectrum of that non-stationary activity window and predefined threshold frequencies for the predefined acceleration-based step activities.
6. The method as claimed in claim 1 further comprising displaying on the mobile device (100) the one of the predefined acceleration-based step activities being performed by the user (102).
7. The method as claimed in claim 6 further comprising displaying on the mobile device (100) a step count of the one of the predefined acceleration-based step activity being performed by the user (102), wherein the step count is based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
8. The method as claimed in claim 1, wherein the predefined acceleration-based step activities comprise walking, brisk walking, running, and jogging.
9. A mobile device (100) for detection and classification of an acceleration-based step activity of a user (102), the mobile device (100) comprising:
a processor (104);
an inertial sensor (130) to detect motion of the user (102) performing a step activity and to generate acceleration signals based on the motion of the user (102);
a signal processing module (114) coupled to the processor (104), to:
generate data windows of a predetermined time period from the acceleration signals;
identify peaks within a frequency spectrum of each of the data windows;
identify non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows;
determine valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and a step activity classifier (116) coupled to the processor (104), to classify the step activity of the user (102) into one of predefined acceleration-based step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
10. The mobile device (100) as claimed in claim 9, wherein the signal processing module (114) performs zero normalization, linear interpolation and low pass filtration on the data windows.
11. The mobile device (100) as claimed in claim 9, wherein the signal processing module (114) identifies each of the non-stationary activity windows based on a maximum amplitude peak, a number of peaks, and a measure of peak sharpness of the peaks within the frequency spectrum of a corresponding data window.
12. The mobile device (100) as claimed in claim 9, wherein the signal processing module (114) determines the valid peaks within the frequency spectrum of the each of the non-stationary activity windows based on Dynamic Time Warping process and Individual Peak Analysis.
13. The mobile device (100) as claimed in claim 9, wherein the step activity classifier (116) classifies the step activity of the user into one of the predefined acceleration-based step activities by calculating activity weights for the each of the non-stationary activity windows based on the valid peaks within the frequency spectrum of that non-stationary activity window and predefined threshold frequencies for the predefined acceleration-based step activities.
14. The mobile device (100) as claimed in claim 9 further comprising a display module (118) to display on the mobile device (100) the one of predefined acceleration-based step activities.
15. The mobile device (100) as claimed in claim 14, wherein the display module (118) displays on the mobile device (100) a step count of the one of predefined acceleration-based step activities being performed by the user (102), wherein the step count is based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.
16. The mobile device (100) as claimed in claim 9, wherein the inertial sensor (130) comprises an accelerometer.
17. A non-transitory computer readable medium having a set of computer readable instructions that, when executed, cause a mobile device to:
receive acceleration signals from an inertial sensor in the mobile device carried by a user performing an acceleration-based step activity;
divide the acceleration signals into data windows of a predetermined time period;
identify peaks within a frequency spectrum of each of the data windows;
identify non-stationary activity windows, from amongst the data windows, based on the peaks within the frequency spectrum of the each of the data windows;
determine valid peaks within the frequency spectrum of each of the non-stationary activity windows, wherein each of the valid peaks is indicative of a true activity step; and
classify the acceleration-based step activity of the user into one of predefined step activities based on the valid peaks within the frequency spectrum of the each of the non-stationary activity windows.