
A Method For Recognizing Gestures Using An Accelerometer Mounted Onto A Wearable Device

Abstract: Embodiments relate to recognizing gestures using an accelerometer mounted onto a wearable device. The method comprises obtaining the orientation of the wearable device with respect to a plurality of degrees of freedom when the wearable device is in a first static state and a second static state. The obtained orientation is compared with one or more prerequisite orientations to determine whether the wearable device is oriented in the prerequisite orientations. A dynamic gesture of the wearable device is detected in the plurality of degrees of freedom occurring between the first static state and the second static state, until the wearable device returns to the second static state. The detected dynamic gesture is compared with predetermined gesture patterns, which provide gesture forms. A valid gesture is recognized using the gesture forms and the prerequisite orientations from at least one of the first static state and the second static state. Fig. 6


Patent Information

Application #
Filing Date
10 December 2013
Publication Number
26/2016
Publication Type
INA
Invention Field
ELECTRICAL
Status
Parent Application
Patent Number
Legal Status
Grant Date
2019-11-15
Renewal Date

Applicants

INDIAN INSTITUTE OF SCIENCE
Bangalore 560012, Karnataka, India.

Inventors

1. Dhruv Saxena
Robert Bosch Centre for Cyber Physical Systems, Indian Institute of Science, Bangalore 560012, Karnataka, India.
2. Hiteshwar Rao
Robert Bosch Centre for Cyber Physical Systems, Indian Institute of Science, Bangalore 560012, Karnataka, India.
3. Pragati Mehrotra
Centre for Product Design and Manufacturing, Indian Institute of Science, Bangalore 560012, Karnataka, India.
4. Anand Puntambekar
Centre for Product Design and Manufacturing, Indian Institute of Science, Bangalore 560012, Karnataka, India.
5. G.K.Ananthasuresh
Department of Mechanical Engineering, Centre for Product Design and Manufacturing, Indian Institute of Science, Bangalore 560012, Karnataka, India.

Specification

CLAIMS:
We claim:
1. A method for recognizing gestures using an accelerometer mounted onto a wearable device, said method comprising:
obtaining orientation of the wearable device with respect to a plurality of degrees of freedom by the accelerometer when the wearable device is in a first static state and a second static state, wherein the obtained orientation is compared with one or more prerequisite orientations by a processing unit configured within the wearable device to determine whether the wearable device is oriented in at least one of the one or more prerequisite orientations;
detecting a dynamic gesture of the wearable device in at least one of the plurality of degrees of freedom occurring between the first static state and the second static state, wherein said dynamic gesture is detected until the wearable device returns to the second static state;
comparing, by the processing unit, the detected dynamic gesture with one or more predetermined gesture patterns, wherein the one or more predetermined gesture patterns provide at least one gesture form; and
recognizing, by the processing unit, a valid gesture using the at least one gesture form and the at least one of the one or more prerequisite orientations from at least one of the first static state and the second static state.

2. The method as claimed in claim 1, wherein the accelerometer comprises a three-axis accelerometer.

3. The method as claimed in claim 1, wherein the orientation of the wearable device is obtained by determining the effect of acceleration due to gravity on the plurality of degrees of freedom.

4. The method as claimed in claim 1, wherein the orientation of the wearable device is obtained when the wearable device is in the first static state and the second static state for a first predetermined amount of time and a third predetermined amount of time respectively.

5. The method as claimed in claim 1, wherein the dynamic gesture of the wearable device is detected when the wearable device moves for a second predetermined amount of time.
6. The method as claimed in claim 1, wherein the valid gesture is selected from at least one of a left flick gesture, a right flick gesture, an up flick gesture, and a down flick gesture.

7. The method as claimed in claim 1, wherein the valid gesture generates control signals to operate one or more devices capable of receiving wireless signals.

8. The method as claimed in claim 7, wherein the control signals are transmitted over a wireless medium.

9. The method as claimed in claim 7, wherein the one or more devices are selected from at least one of mobile devices, contactless devices, computer, television, transceivers, music players, and other electrical or electronic appliances capable of receiving wireless signals.

10. The method as claimed in claim 1, wherein the wearable device is embedded as jewelry.

11. A wearable device for recognizing gestures comprising:
an accelerometer configured to provide orientation of the wearable device; and
a processing unit coupled to the accelerometer configured to:
obtain the orientation of the wearable device with respect to a plurality of degrees of freedom from the accelerometer when the wearable device is in a first static state and a second static state, wherein the obtained orientation is compared with one or more prerequisite orientations by a processing unit configured in the wearable device to determine whether the wearable device is oriented in at least one of the one or more prerequisite orientations;
detect a dynamic gesture of the wearable device in at least one of the plurality of degrees of freedom occurring between the first static state and the second static state, wherein said dynamic gesture is detected until the wearable device returns to the second static state;
compare the detected dynamic gesture with one or more predetermined gesture patterns, wherein the one or more predetermined gesture patterns provide at least one gesture form; and
recognize a valid gesture using the at least one gesture form and the at least one of the one or more prerequisite orientations from at least one of the first static state and the second static state; and
a storage unit configured to store the one or more prerequisite orientations and the one or more predetermined gesture patterns.

12. The wearable device as claimed in claim 11, wherein the storage unit further stores a first predetermined amount of time, a second predetermined amount of time and a third predetermined amount of time set for determining that the wearable device is in the first static state, that the dynamic gesture is occurring between the first static state and the second static state, and that the wearable device is in the second static state, respectively.

13. The wearable device as claimed in claim 11, further comprising a wireless transmitter to transmit at least one of the valid gesture and control signals generated by the valid gesture.

14. The wearable device as claimed in claim 11, wherein the wearable device is embedded as jewelry.
TECHNICAL FIELD
The present disclosure relates to a wearable device with cognitive abilities. In particular, the present disclosure is related to a method for recognizing gestures using an accelerometer mounted on to the wearable device.
BACKGROUND
Presently, motion sensors are used to sense human experiences in daily life. The motion sensors can be incorporated into a form of jewelry that can be worn by a user, for example in a ring, locket, chain, necklace, or bangle. The motion sensors are typically used to control other electrical or electronic appliances by monitoring the subtle gestures made by the wearer, to monitor the health of the wearer, to infer the wearer's intentions, etc.
Conventional systems use Inertial Measurement Units (IMUs) to detect the dynamic gesture accomplished using the motion sensors. An IMU determines the dynamic gesture and detects the orientation of the motion sensors by using a gyroscope and a magnetometer in combination with an accelerometer. However, manufacturing and using an IMU is expensive and complicated. The IMU gathers information from the accelerometer and the gyroscope, and computation is performed over that information to determine the orientation. This computation is heavy and requires a processor with an advanced architecture.
Further, conventional systems have used accelerometers to recognize gestures, but only in combination with gyroscopes, because the orientation information and the dynamic gestures are always detected by using gyroscopes along with accelerometers; the accelerometers alone detect only static gestures. In addition, a separate computation system is involved to carry out the computation for detecting the gestures.
Also, the conventional systems fail to disclose a method and a motion sensor having only one accelerometer that can detect both the orientation and the dynamic gestures of the motion sensor.
Hence, there exists a need for a device with cognitive ability that uses only one accelerometer for recognizing the dynamic gestures along with determining the orientation of the sensor, and which eliminates the use of a separate computation system. This enables a small form factor for the wearable device.
SUMMARY
The shortcomings of the prior art are overcome and additional advantages are provided through the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
The present disclosure relates to a method for recognizing gestures using an accelerometer mounted onto a wearable device. The method comprises obtaining orientation of the wearable device with respect to a plurality of degrees of freedom by the accelerometer when the wearable device is in a first static state and a second static state. The obtained orientation is compared with one or more prerequisite orientations by a processing unit configured within the wearable device to determine whether the wearable device is oriented in at least one of the one or more prerequisite orientations. Next, a dynamic gesture of the wearable device is detected in at least one of the plurality of degrees of freedom occurring between the first static state and the second static state. The dynamic gesture is detected until the wearable device returns to the second static state. Then, the detected dynamic gesture is compared with one or more predetermined gesture patterns by the processing unit. The one or more predetermined gesture patterns provide at least one gesture form. Then, a valid gesture is recognized by the processing unit using the at least one gesture form and the at least one of the one or more prerequisite orientations from at least one of the first static state and the second static state.

A wearable device for recognizing gestures is also disclosed by the present disclosure. The wearable device comprises an accelerometer, a processing unit and a storage unit. The accelerometer is configured to provide the orientation of the wearable device. The processing unit is coupled to the accelerometer and is configured to obtain the orientation of the wearable device with respect to a plurality of degrees of freedom from the accelerometer when the wearable device is in a first static state and a second static state. The obtained orientation is compared with one or more prerequisite orientations by the processing unit to determine whether the wearable device is oriented in at least one of the one or more prerequisite orientations. Then, the processing unit detects a dynamic gesture of the wearable device in at least one of the plurality of degrees of freedom occurring between the first static state and the second static state. The dynamic gesture is detected until the wearable device returns to the second static state. After detecting, the processing unit compares the detected dynamic gesture with one or more predetermined gesture patterns, wherein the one or more predetermined gesture patterns provide at least one gesture form. Then, the processing unit recognizes a valid gesture using the at least one gesture form and the at least one of the one or more prerequisite orientations from at least one of the first static state and the second static state. The storage unit is configured to store the one or more prerequisite orientations and the one or more predetermined gesture patterns.

The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects and features described above, further aspects, and features will become apparent by reference to the drawings and the following detailed description.

BRIEF DESCRIPTION OF THE DRAWINGS

The novel features and characteristic of the disclosure are set forth in the appended claims. The embodiments of the disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings.

Fig. 1a shows an exemplary block diagram illustrating a wearable device for recognizing gestures in accordance with an embodiment of the present disclosure;

Fig. 1b shows a graph illustrating the first static state, the dynamic state and the second static state;

Figs. 2a-2c illustrate obtaining orientation of a wearable device in accordance with an embodiment of the present disclosure;

Fig. 3 shows an exemplary graph illustrating obtaining the orientation of a wearable device in accordance with an embodiment of the present disclosure;

Fig. 4a illustrates an exemplary gesture having a left flick gesture in accordance with an embodiment of the present disclosure;

Fig. 4b illustrates an exemplary gesture having an up flick gesture in accordance with an embodiment of the present disclosure;

Fig. 5 illustrates an exemplary block diagram of a system showing transmission of control signals to one or more devices over a wireless medium in accordance with an embodiment of the present disclosure; and

Fig. 6 is an exemplary diagram illustrating method for recognizing gestures using an accelerometer mounted onto a wearable device in accordance with an embodiment of the present disclosure.

The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.

DETAILED DESCRIPTION
The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. Additional features and advantages of the disclosure will be described hereinafter which form the subject of the claims of the disclosure. It should be appreciated by those skilled in the art that the conception and specific aspect disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the disclosure as set forth in the appended claims. The novel features which are believed to be characteristic of the disclosure, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
Embodiments of the present disclosure relate to a wearable device with cognitive abilities. The wearable device of the present disclosure is used to recognize gestures made by a user who wears the device. The wearable device uses only one accelerometer for determining gestures. Particularly, the accelerometer in the wearable device is configured to determine orientations of the wearable device as well as gestures made by the user wearing the wearable device. The accelerometer determines the orientation of the wearable device when the wearable device is in a first static state, which is the state before the initiation of a gesture by the user. The orientation is retrieved from the effect of acceleration due to gravity. This way, the present disclosure eliminates the use of gyroscopes for determining the orientations of the wearable device. After retrieving the orientation of the wearable device being in the first static state for a predetermined time, a check is made whether the retrieved orientation matches one of the prerequisite orientations preconfigured in the wearable device. Only if the retrieved orientation matches one of the prerequisite orientations is the dynamic gesture made by the user detected. The dynamic gesture is detected until the wearable device reaches a second static state, which is the state after the dynamic gesture has been made. In one embodiment, the dynamic state is monitored for a predefined time. The dynamic gesture is compared with predetermined gesture patterns to determine whether the dynamic gesture made by the user is a valid gesture. If the dynamic gesture matches one of the predetermined gesture patterns, the dynamic gesture made by the user is recognized as the valid gesture. The predetermined gesture patterns provide the type of gesture made by the user.
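The three-phase flow described above (first static state, then dynamic gesture, then second static state) can be sketched as a small state machine. The following is a minimal illustration only; the class and function names are assumptions made for demonstration and do not appear in the disclosure.

```python
# A minimal sketch of the three-phase flow described above; the names
# Phase and next_phase are illustrative assumptions, not from the text.
from enum import Enum, auto

class Phase(Enum):
    FIRST_STATIC = auto()   # device held still, orientation is read
    DYNAMIC = auto()        # gesture movement in progress
    SECOND_STATIC = auto()  # device still again, orientation re-read

def next_phase(phase, is_still):
    """Advance the phase from a per-sample stillness flag."""
    if phase is Phase.FIRST_STATIC and not is_still:
        return Phase.DYNAMIC          # gesture movement begins
    if phase is Phase.DYNAMIC and is_still:
        return Phase.SECOND_STATIC    # device returned to rest
    return phase
```

In this sketch the timing constraints on each phase, which the disclosure also requires, would be enforced separately.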
Based on the value of the valid gesture, electronic or electrical appliances are controlled. That is, the valid gesture initiates control signals to the electronic or electrical appliances for controlling them.
Henceforth, embodiments of the present disclosure are explained with the help of exemplary diagrams and one or more examples. However, such exemplary diagrams and examples are provided for the illustration purpose for better understanding of the present disclosure and should not be construed as limitation on scope of the present disclosure.

Fig. 1a shows an exemplary block diagram illustrating a wearable device 102 for recognizing gestures in accordance with an embodiment of the present disclosure. In one embodiment, the wearable device 102 is embedded into/as jewelry. Particularly, the wearable device 102 can be worn as jewelry by a user. For example, the user can embed the wearable device in a form of jewelry including, but not limited to, a ring, necklace, pendant, bracelet, locket, etc. In an embodiment, the present disclosure discloses the wearable device 102 embedded as a ring worn on one of the fingers of the user's hand. The wearable device 102 comprises an accelerometer 104, a processing unit 106 and a storage unit 108. In an embodiment, the accelerometer 104 is a three-axis accelerometer. The accelerometer 104 provides at least two pieces of information. The first is the orientation of the wearable device 102, i.e. determination of the orientation of the wearable device 102 in a first static state and a second static state. The second is the detection of a dynamic gesture when the wearable device 102 is in a dynamic state, that is, between the first static state and the second static state. The first static state is a static state of the wearable device 102 before the dynamic gesture is initiated. The dynamic state is the state in which the dynamic gesture is accomplished. The second static state is a static state of the wearable device 102 after the dynamic gesture is finished, i.e. when the movement or acceleration is completed. More particularly, the accelerometer 104 obtains the orientation of the wearable device 102 with respect to a plurality of degrees of freedom when the wearable device 102 is in the first static state and the second static state. In an embodiment, the orientation comprises alignments including, but not limited to, a vertical alignment of the wearable device 102 and a horizontal alignment of the wearable device 102.
The orientation of the wearable device 102 before the initiation of the dynamic gesture is obtained when the wearable device 102 is in the first static state for a first predetermined amount of time. For example, assume 500 milliseconds (ms) is the time period preset for obtaining the orientation of the wearable device 102. The orientation of the wearable device 102 is then obtained when the user keeps the hand wearing the wearable device 102 steady or static for the entire 500 ms. If the user moves the hand before 500 ms, say at 200 ms, which is less than the preset time period for obtaining the orientation, then the processing unit 106 ignores the value and does not treat it as the first static state, i.e. the processing unit 106 considers that the user is making an unintended hand movement not meant to initiate any gesture with respect to the wearable device 102. In one embodiment, the preset time can be a range of values, for example 300 ms to 5000 ms. Similarly, assume the user keeps the hand still for more than 5000 ms, say for 1 minute. Then the processing unit 106 does not obtain the orientation, as the detected time period of the first static state, i.e. 1 minute, is outside the preset time range of 300 ms to 5000 ms for the first static state. In an embodiment, the plurality of degrees of freedom comprises three degrees of freedom, i.e. three axes, namely the x-axis, y-axis and z-axis. The orientation of the wearable device 102 is obtained from the effect of acceleration due to gravity on the plurality of degrees of freedom.
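The first-static-state timing rule above reduces to a simple window check. The sketch below uses the example range of 300 ms to 5000 ms from the text; the function name is an illustrative assumption.

```python
# Sketch of the first-static-state timing check described above;
# the 300 ms to 5000 ms window is the example range from the text.
FIRST_STATIC_WINDOW_MS = (300, 5000)

def is_first_static_state(still_duration_ms):
    """Accept the stillness only if its duration lies in the preset range."""
    low, high = FIRST_STATIC_WINDOW_MS
    return low <= still_duration_ms <= high
```

A stillness of 500 ms would be accepted, while a 200 ms pause (hand moved too early) or a 60000 ms pause (hand held still far longer than the preset range) would be ignored.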

The processing unit 106 compares the orientation of the wearable device 102, using the values provided by the accelerometer 104, with one or more prerequisite orientations to determine whether the wearable device 102 is oriented in one of the one or more prerequisite orientations. In an embodiment, the processing unit 106 is a microcontroller, a microprocessor or a programmable logic array. In an embodiment, the microcontroller is used to recognize the gestures. The determination of whether the wearable device 102 is oriented in one of the one or more prerequisite orientations is performed to ensure that the wearable device 102 is about to make a valid gesture. Particularly, the correct orientation of the wearable device 102 (i.e. an orientation that is one of the one or more prerequisite orientations) is one of the factors used to recognize the gesture made by the user. The processing unit 106 detects the dynamic gesture of the wearable device 102 in at least one of the plurality of degrees of freedom occurring between the first static state and the second static state. The dynamic gesture is detected until the wearable device 102 reaches the second static state (a calm state, or the state in which the wearable device is steady). The dynamic gesture is detected when the wearable device 102 moves for a second predetermined amount of time before reaching the second static state. For example, assume 600 ms is the preset time period within which the dynamic gesture is to be detected. If the user makes the movement for the entire 600 ms, then the dynamic gesture is detected. If the user stops making the movement before the 600 ms elapses, then the movement is not detected, on the assumption that the user has not made any gesture or is just making random movements unrelated to a gesture relevant to the wearable device 102. In one embodiment, the preset time can be a range of values, for example 400 ms to 800 ms.
The processing unit 106 only accepts those dynamic gestures whose duration falls within the range of 400 ms to 800 ms. Any dynamic gesture outside this time range is not considered for gesture recognition. After the dynamic gesture lasting the second predetermined amount of time is accomplished, the wearable device 102 reaches the second static state, i.e. a calm state. In an embodiment, the orientation of the wearable device 102 is obtained when the wearable device 102 is in the second static state, i.e. the calm state after the dynamic gesture. The orientation after the dynamic gesture is obtained when the user keeps the hand wearing the wearable device 102 steady for a third predetermined amount of time. For example, assume 400 ms is the time period preset to detect the second static state after the dynamic gesture. If the user keeps the hand wearing the wearable device 102 steady or static for 400 ms after making the dynamic gesture, then the orientation of the wearable device 102 after the dynamic gesture is accomplished is obtained. If the user does not keep the hand steady or static for 400 ms after the dynamic gesture, then it is assumed that the user has terminated making the gesture. In one embodiment, the preset time can be a range of values, for example 300 ms to 400 ms. The processing unit 106 considers the device to be in the second static state only if static values are received for a time that falls within the range of 300 ms to 400 ms. If the user does not keep the wearable device 102 steady for a time within this range, the value is not considered for gesture recognition. Fig. 1b shows a graph illustrating the first static state, the dynamic state and the second static state. In the illustrated fig. 1b, the first static state has a steady curve of values. The dynamic state incurs variations in the curve because of the dynamic movement of the wearable device 102 by the user to accomplish the gesture. The second static state again incurs a steady curve after the dynamic movement.
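The three timing windows discussed above can also be checked together. The sketch below uses the example millisecond ranges from the text; the names are illustrative assumptions.

```python
# The three timing windows (first static state, dynamic gesture, second
# static state) checked together; bounds are the example ranges in the text.
WINDOWS_MS = {
    "first_static": (300, 5000),
    "dynamic": (400, 800),
    "second_static": (300, 400),
}

def phases_valid(first_ms, dynamic_ms, second_ms):
    """True only if every phase duration falls inside its preset window."""
    durations = {
        "first_static": first_ms,
        "dynamic": dynamic_ms,
        "second_static": second_ms,
    }
    return all(lo <= durations[p] <= hi for p, (lo, hi) in WINDOWS_MS.items())
```

For instance, durations of 500 ms, 600 ms and 350 ms would pass, while a 900 ms gesture (too slow) or a 200 ms first static state (too short) would be rejected.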

The processing unit 106 compares the detected dynamic gesture with one or more predetermined gesture patterns. The predetermined gesture patterns provide at least one gesture form. The gesture patterns are predetermined during manufacturing of the wearable device 102. The gesture form includes, but is not limited to, a left flick gesture, a right flick gesture, an up flick gesture and a down flick gesture. The gesture form is accomplished when the wearable device 102 is aligned in one of the vertical alignment and the horizontal alignment. For example, assuming the wearable device 102 is aligned vertically, then either the left flick gesture or the right flick gesture can be accomplished or made by the user wearing the wearable device 102. In case the wearable device 102 is aligned horizontally, then either the up flick gesture or the down flick gesture is accomplished. One skilled in the art can understand that other gestures can also be recognized by using the same method and pre-storing the corresponding gesture patterns in the wearable device 102.

After comparing the detected dynamic gesture with the one or more predetermined gesture patterns, recognition of the valid gesture is performed by the processing unit 106. The detected dynamic gesture is recognized to be the valid gesture when the dynamic gesture matches at least one of the one or more predetermined gesture patterns that has the at least one gesture form, and when the wearable device 102 is found to be oriented in one of the one or more prerequisite orientations in both the first static state and the second static state. For example, the wearable device 102 makes a left flick gesture, and the orientation is obtained before (i.e. in the first static state) and after (i.e. in the second static state) the left flick gesture is made. The valid gesture is recognized when the left flick gesture matches the predetermined left flick gesture pattern and the orientation before and after the left flick gesture is one of the one or more prerequisite orientations, i.e. the wearable device 102 has either the vertical alignment or the horizontal alignment.
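The final validity check combines the matched pattern with the prerequisite orientations of both static states. The sketch below is illustrative only; the orientation labels and the mapping of alignment to allowed flick gestures are assumptions based on the examples in the text, not a definitive implementation.

```python
# Sketch of the validity check: a gesture is valid only when both static
# states show a prerequisite orientation and the detected pattern is one
# allowed for the starting alignment. Mapping is an illustrative assumption.
PREREQUISITE_ORIENTATIONS = {"vertical", "horizontal"}

GESTURE_PATTERNS = {
    "vertical": {"left_flick", "right_flick"},
    "horizontal": {"up_flick", "down_flick"},
}

def recognize(first_orientation, dynamic_gesture, second_orientation):
    """Return the valid gesture, or None if any check fails."""
    # Both static states must show a prerequisite orientation.
    if first_orientation not in PREREQUISITE_ORIENTATIONS:
        return None
    if second_orientation not in PREREQUISITE_ORIENTATIONS:
        return None
    # The detected pattern must be allowed for the starting alignment.
    if dynamic_gesture in GESTURE_PATTERNS[first_orientation]:
        return dynamic_gesture
    return None
```

Under this sketch, a left flick made and ended in the vertical alignment is accepted, while an up flick attempted from the vertical alignment, or any gesture started from a non-prerequisite orientation, is rejected.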

The storage unit 108 of the wearable device 102 stores the one or more prerequisite orientations, the one or more predetermined gesture patterns, the first predetermined amount of time set for the first static state, the second predetermined amount of time set for the dynamic state and the third predetermined amount of time set for the second static state. The storage unit 108 includes, but is not limited to, a computer readable medium having executable instructions. Such computer readable media can be any available media which can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or network attached storage, or any other medium which can be used to store the desired executable instructions and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer readable media. Executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
The wearable device 102 also comprises a wireless transmitter (not shown in fig. 1a) to transmit at least one of the valid gesture and the control signals generated by the valid gesture. The control signals are transmitted to one or more devices for controlling the one or more devices. The one or more devices include, but are not limited to, mobile devices, contactless devices, computers, televisions, transceivers, music players, and other electrical or electronic appliances capable of receiving wireless signals. The wearable device 102 can further comprise actuators, additional sensors, wireless communication chips, and inlets for additional devices such as mobile devices, Universal Serial Bus (USB) devices and other related devices. In an embodiment, the valid gesture is transmitted to the one or more devices.
Figs. 2a-2c illustrate obtaining the orientation of a wearable device 102 in accordance with an embodiment of the present disclosure. The orientation of the wearable device 102 is obtained from the effect of acceleration due to gravity. Particularly, the orientation is determined by the accelerometer 104 by estimating the gravity vector on all three axes. A larger force exerted along one of the three axes provides information on the gravity vector to determine whether the wearable device 102 is oriented vertically or horizontally. For example, when the gravity vector coincides with the vertical axis, it is assumed that the wearable device 102 is oriented vertically. Similarly, if the gravity vector coincides with the horizontal axis, it is assumed that the wearable device 102 is oriented horizontally.
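The dominant-axis test above can be sketched as follows. This is a simplified illustration: static readings are assumed to be in units of g, the 0.8 g dominance threshold and the mapping of axes to alignments are assumptions for demonstration, not values from the disclosure.

```python
# Sketch of orientation from the gravity vector: in a static state, the
# axis carrying most of the ~1 g acceleration decides the alignment.
# The 0.8 g threshold and axis-to-alignment mapping are assumptions.
def classify_orientation(ax, ay, az):
    """Return 'vertical', 'horizontal', or None for an ambiguous pose.

    ax, ay, az: static accelerometer readings in g.
    """
    mags = {"x": abs(ax), "y": abs(ay), "z": abs(az)}
    dominant = max(mags, key=mags.get)
    # Require the gravity vector to clearly coincide with one axis.
    if mags[dominant] < 0.8:
        return None
    # Assumed mapping: gravity along y -> device vertical,
    # gravity along z -> device horizontal.
    if dominant == "y":
        return "vertical"
    if dominant == "z":
        return "horizontal"
    return None
```

A reading dominated by the y-axis would be classified as vertical, a reading dominated by the z-axis as horizontal, and a tilted pose where no axis clearly carries gravity would be rejected.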
Figs. 2a and 2b illustrate orientation of the wearable device 102 in accordance with an embodiment of the present disclosure. The force exerted on the wearable device 102 due to acceleration due to gravity is estimated i.e. gravity vector on the three-axes (x-axis, y-axis and the z-axis). The more force exerted on one of the three axes provides whether the wearable device 102 is oriented vertically or horizontally. Assuming, the wearable device 102 is oriented horizontally with respect to the surface of the ground. When the wearable device 102 is oriented horizontally then either up flick gesture or down flick gesture is assumed to be accomplished. In the illustrated fig. 2a, the user wears the wearable device 102 in his hand 202 on the finger 202a. When the user keeps his hand 202 steady or static (first static state) for the first predetermined time period say for 500ms horizontally to the surface of the ground, the orientation is obtained.
Fig. 2c illustrates the orientation of the wearable device 102 assuming it is oriented vertically. Here, the gravity vector is assumed to coincide with the vertical axis, and therefore the wearable device 102 is assumed to be oriented vertically. When the wearable device 102 is oriented vertically, either a left flick gesture or a right flick gesture is expected to be performed. In the illustrated fig. 2c, the user wears the wearable device 102 on the finger 202a of his hand 202. When the user keeps his hand 202 steady or static (first static state) vertically to the surface of the ground for the first predetermined time period, say 500 ms, the orientation is obtained. All these orientation values are stored as the one or more prerequisite orientations. These values are then used by the processing unit 106 for comparison during the first static state and the second static state.
Fig. 3 shows an exemplary graph illustrating obtaining the orientation of the wearable device 102 in accordance with an embodiment of the present disclosure. While the orientation of the wearable device 102 is being obtained, the values from the accelerometer 104 may vary due to minor shaking. These fluctuations in the values are ignored. Thus, the orientation is determined only when the signals related to the orientation information fall within certain threshold limits, thereby eliminating disturbances, noise and variations due to very minor shaking of the wearable device 102. When the signals fall within the threshold limits for the first predetermined amount of time, the orientation of the wearable device 102 is decided. For example, in the illustrated fig. 3, the threshold limit 304 is set between the values a1 and a3. Therefore, when the signal 302 falls within the threshold limit between a1 and a3, denoted by 304, for 500 ms (the first predetermined amount of time set for the first static state), the orientation of the wearable device 102 is obtained. The same is done for the second static state.
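The "stay within the band for a predetermined time" rule can be sketched as a simple consecutive-sample counter. This is an illustrative sketch only; the function name, the fixed sampling period, and the idea of counting samples rather than wall-clock time are assumptions for the example.

```python
def detect_static_state(samples, lower, upper, hold_ms, sample_period_ms=10):
    """Return the sample index at which a static state is confirmed, or None.

    A static state is declared once every sample has stayed inside the
    threshold band [lower, upper] for hold_ms milliseconds, which rejects
    noise spikes and minor hand shake.  Parameter names are illustrative.
    """
    needed = hold_ms // sample_period_ms  # consecutive in-band samples required
    run = 0
    for i, value in enumerate(samples):
        if lower <= value <= upper:
            run += 1
            if run >= needed:
                return i
        else:
            run = 0  # any out-of-band sample resets the hold timer
    return None
```

A single out-of-band sample resets the counter, so transient spikes from shaking never satisfy the predetermined hold time.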
Fig. 4a illustrates an exemplary gesture, a left flick gesture, in accordance with an embodiment of the present disclosure. The user makes the dynamic gesture wearing the wearable device 102 on his hand as a ring. Before initiating the dynamic gesture, the user has to bring the wearable device 102 into a first static state, depicted as 402a, for a first predetermined amount of time, say 500 ms, where the orientation of the wearable device 102 is obtained. Assume that in the first static state the wearable device 102 is in position I. In the illustrated fig. 4a, the wearable device 102 is assumed to be oriented vertically. Therefore, the user can make either a left flick gesture or a right flick gesture; in fig. 4a the user is making a left flick gesture. When the user moves the wearable device 102 into the dynamic state, initiating the left flick gesture, that is, the movement from position I to position J, the acceleration on the z-axis at first decreases and encounters a low spike. Then, as the gesture continues, the acceleration on the z-axis increases and encounters a high spike. Finally, the left flick gesture is detected within the second predetermined amount of time, for example 600 ms. After the left flick gesture in the dynamic state, the user brings the wearable device 102 to the second static state, depicted as 402b, at position J, where the orientation of the wearable device 102 is obtained. After detecting the second static state for the third predetermined amount of time, say 400 ms, the actual displacement of the gesture along one of the three axes is measured. Particularly, the gesture is filtered based on the values of the acceleration spikes on all three axes. The magnitude of the acceleration spikes on the axis perpendicular to the hand must be greater than specified thresholds, while the acceleration spikes on the other two axes must be less than the specified thresholds.
If the above conditions are met, another filter determines whether the positive acceleration spike occurred before or after the negative acceleration spike. Together, all this information is used to determine the direction of the gesture as well as to check whether the gesture made was a valid gesture.
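The two filters described above — the cross-axis magnitude check and the spike-ordering check — can be sketched as follows. The function name and the numeric thresholds are assumptions for illustration; the axis perpendicular to the hand is taken to be the z-axis, as in the left/right flick description.

```python
def classify_flick(z_samples, x_samples, y_samples,
                   spike_threshold=4.0, cross_axis_limit=2.0):
    """Classify a vertical-orientation flick as 'left' or 'right', or None.

    Sketch of the filtering described above, with assumed thresholds:
    the z-axis (perpendicular to the hand) must show spikes larger than
    spike_threshold, the other two axes must stay below cross_axis_limit,
    and the order of the negative vs. positive spike gives the direction.
    """
    # Cross-axis filter: reject gestures with large motion off the z-axis.
    if max(map(abs, x_samples)) >= cross_axis_limit:
        return None
    if max(map(abs, y_samples)) >= cross_axis_limit:
        return None
    low_i = min(range(len(z_samples)), key=lambda i: z_samples[i])
    high_i = max(range(len(z_samples)), key=lambda i: z_samples[i])
    # Both spikes must clear the magnitude threshold.
    if z_samples[high_i] < spike_threshold or z_samples[low_i] > -spike_threshold:
        return None
    # Low spike first -> left flick; high spike first -> right flick.
    return "left" if low_i < high_i else "right"
```

The same ordering logic, applied to the axis relevant for a horizontal orientation, would distinguish up and down flicks.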

The detected gesture is compared with the one or more predetermined gesture patterns stored in the storage unit 108. If the detected gesture matches one of the one or more predetermined gesture patterns, and if the wearable device 102 is detected to be oriented vertically, matching one of the one or more prerequisite orientations, then the detected gesture is recognized as a valid gesture.

In contrast, when the user makes the right flick gesture, the acceleration on the z-axis at first increases and encounters a high spike, which is the opposite of the left flick gesture.

Fig. 4b illustrates an exemplary gesture, an up flick gesture, in accordance with an embodiment of the present disclosure. The user makes the dynamic gesture wearing the wearable device 102 on his hand as a ring. Before initiating the dynamic gesture, the user has to bring the wearable device 102 into a first static state, depicted as 402a, for a first predetermined amount of time, say 500 ms, where the orientation of the wearable device 102 is obtained. Assume that in the first static state the wearable device 102 is in position L. In the illustrated fig. 4b, the wearable device 102 is assumed to be oriented horizontally. Therefore, the user making the dynamic gesture can make either an up flick gesture or a down flick gesture; in fig. 4b the user is making an up flick gesture. The user moves the wearable device 102 into the dynamic state, initiating the up flick gesture, that is, the movement from position L to position M. Finally, the up flick gesture is detected within the second predetermined amount of time, for example 600 ms. After the up flick gesture in the dynamic state, the user brings the wearable device 102 to the second static state, depicted as 402b, at position M, where the orientation of the wearable device 102 is obtained. Particularly, the gesture is filtered based on the values of the acceleration spikes on all three axes. The magnitude of the acceleration spikes on the axis perpendicular to the hand must be greater than specified thresholds, while the acceleration spikes on the other two axes must be less than the specified thresholds. If the above conditions are met, another filter determines whether the positive acceleration spike occurred before or after the negative acceleration spike. Together, all this information is used to determine the direction of the gesture as well as to check whether the gesture made was a valid gesture.
The detected gesture is compared with the one or more predetermined gesture patterns stored in the storage unit 108. If the detected gesture matches one of the one or more predetermined gesture patterns, and if the wearable device 102 is detected to be oriented horizontally, matching one of the one or more prerequisite orientations, then the detected gesture is recognized as a valid gesture.
Fig. 5 illustrates an exemplary block diagram of a system showing transmission of control signals to one or more devices 506 over a wireless medium 504 in accordance with an embodiment of the present disclosure. The control signals depend on the valid gesture recognized by the wearable device 102. The wearable device 102 also comprises a wireless transmitter 502 that transmits the control signals to the one or more devices 506 over the wireless medium 504. The control signals are transmitted to control the one or more devices 506. The one or more devices 506 include, but are not limited to, mobile devices, contactless devices, computers, televisions, transceivers, music players, and other electrical or electronic appliances capable of receiving wireless signals. The wireless medium 504 includes, but is not limited to, an e-commerce network, Bluetooth, infrared, a peer-to-peer (P2P) network, a Local Area Network (LAN), a Wide Area Network (WAN) and any wireless network such as the Internet, Wi-Fi, etc. For example, the up flick gesture made by the user using the wearable device 102 can increase the volume of a computer, change a slide in a slide show, turn on the lights, etc.
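The mapping from a recognized valid gesture to a transmitted control signal can be sketched as a small dispatch table. The gesture names, action codes, and message encoding below are entirely hypothetical; any real byte protocol would depend on the receiving device and the wireless medium used.

```python
# Hypothetical mapping from a recognized gesture to a control action code.
GESTURE_ACTIONS = {
    "up_flick": "VOLUME_UP",
    "down_flick": "VOLUME_DOWN",
    "left_flick": "PREVIOUS_SLIDE",
    "right_flick": "NEXT_SLIDE",
}

def build_control_signal(gesture):
    """Encode a valid gesture as a control message for transmission.

    Returns None for anything that is not a recognized valid gesture,
    so invalid gestures are never transmitted.
    """
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return None
    return f"CTRL:{action}".encode("ascii")
```

The same table could equally map gestures to light switches or media-player commands, matching the examples given above.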

Fig. 6 is an exemplary diagram illustrating the method for recognizing gestures using an accelerometer 104 mounted onto the wearable device 102 in accordance with an embodiment of the present disclosure. At step 602, the accelerometer 104 obtains the orientation of the wearable device 102 with respect to the plurality of degrees of freedom when the wearable device 102 is in the first static state and the second static state. The plurality of degrees of freedom includes the three axes, i.e. the x-axis, y-axis and z-axis. The orientation is obtained from the effect of acceleration due to gravity (zero dynamic acceleration) when the wearable device 102 is in the first static state and the second static state for the first predetermined amount of time and the third predetermined amount of time, respectively. At step 604, the processing unit 106 compares the obtained orientation with the one or more prerequisite orientations to determine whether the wearable device 102 is oriented in one of the one or more prerequisite orientations. This determination ensures that gestures are recognized only when the wearable device 102 is oriented in one of the one or more prerequisite orientations. At step 606, the dynamic gesture initiated by the user within the second predetermined amount of time is detected, that is, the dynamic gesture occurring between the first static state and the second static state. The dynamic gesture is detected when the wearable device 102 remains in that state for the predefined amount of time and then reaches the second static state. At step 608, the processing unit 106 compares the detected dynamic gesture (from step 606) with the one or more predetermined gesture patterns.
At step 610, the processing unit 106 recognizes the detected gesture as the valid gesture only when the dynamic gesture matches one of the one or more predetermined gesture patterns and when the wearable device 102 is oriented in at least one of the one or more prerequisite orientations in at least one of the first static state and the second static state. The valid gesture includes, but is not limited to, a left flick gesture, a right flick gesture, an up flick gesture and a down flick gesture. The recognized valid gesture initiates control signals that are transmitted to the one or more devices 506 by the wireless transmitter 502 over the wireless medium 504.
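The decision made at steps 604-610 can be condensed into a single check. This is a sketch under assumed names: the function signature, the default orientation and pattern sets, and the string labels are illustrative, not the claimed implementation.

```python
def recognize_gesture(first_orientation, detected_gesture, second_orientation,
                      prerequisite_orientations=("vertical", "horizontal"),
                      gesture_patterns=("left_flick", "right_flick",
                                        "up_flick", "down_flick")):
    """Condensed decision logic of steps 604-610 (names assumed).

    A detected dynamic gesture is recognized as valid only when it matches
    one of the predetermined gesture patterns AND the device was in one of
    the prerequisite orientations in at least one of the two static states.
    """
    in_prerequisite = (first_orientation in prerequisite_orientations
                       or second_orientation in prerequisite_orientations)
    if in_prerequisite and detected_gesture in gesture_patterns:
        return detected_gesture  # valid gesture: triggers a control signal
    return None
```

Both conditions must hold: a pattern match alone, or a prerequisite orientation alone, is not sufficient for a valid gesture.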

The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.

The terms "including", "comprising", "having" and variations thereof mean "including but not limited to", unless expressly specified otherwise.

The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.

The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.

Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.

A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.

Further, although process steps, method steps, algorithms or the like may be described in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.

The foregoing description of various embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto. The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.

Additionally, advantages of the present disclosure are illustrated herein.
An embodiment of the present disclosure makes use of only one accelerometer 104 for detecting both the acceleration and the orientation of the wearable device 102. Thus, the use of a gyroscope is eliminated.
An embodiment of the present disclosure provides the wearable device 102 embedded as jewelry with cognitive abilities in a small, compact structure. Thus, a separate computing unit to carry out the measurements or computations for recognizing gestures is eliminated. The computations are carried out by the processing unit 106 configured in the wearable device 102.
An embodiment of the present disclosure, being amenable to embedding into a personal piece of jewelry such as a ring, can be used to interface with a computer, smartphone, or any physical electrical or electronic device. The wearable device 102 is like a mouse, keyboard, or joystick, but one that is personalized and discreetly worn as jewelry.

Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.

With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.

While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.


Reference Numerals:
Reference Number Description
102 Wearable Device
104 Accelerometer
106 Processing Unit
108 Storage Unit
202 Hand of the user
202a Finger worn with the wearable device 102
402a First Static State
402b Second Static State
502 Wireless Transmitter
504 Wireless Medium
506 Devices
