Abstract: The present invention provides a method for gesture recognition using Deterministic Finite Automata (DFA). The method includes receiving input data from a sensor, dividing a gesture space into blocks, and assigning the blocks to states of the DFA in accordance with the input data. Further, the method includes constructing a gesture specific DFA and recognizing a gesture in accordance with the constructed gesture specific DFA. FIG. 1
FIELD OF INVENTION
[001] The present invention relates to gesture recognition and more
particularly to a generalized framework for the gesture recognition using
Deterministic Finite Automata (DFA).
BACKGROUND OF INVENTION
[002] Gesture recognition techniques are widely used for
interpreting human gestures. Gestures can be made through face, hand,
fingers, or any other body motion. The use of the gesture recognition
techniques can enable a system to recognize or identify the normal or
specific gestures and use them to convey information or facilitate device
control.
[003] Existing gesture recognition systems allow an electronic
device to capture input from a user using an input interface. The gesture
recognition techniques and algorithms used by the existing systems use a
predefined set of rules to recognize the gestures, which are often specific to
the type of the input interface. Further, the same set of rules is not
applicable for recognizing other types of gestures. Thus, most of the existing
methods are application specific and fail to recognize complex gestures
including movements of more than one object.
[004] In light of the above discussion, there remains a need for a
gesture recognition method for recognizing complex gestures, which is
independent of the input interface used by the device.
OBJECT OF INVENTION
[005] The principal object of the embodiments herein is to provide
a method and system for recognizing gestures using Deterministic Finite
Automata (DFA).
[006] Another object of the invention is to provide a generalized
framework for recognizing gestures independent of user interface.
[007] Another object of the invention is to provide a system and
method for recognizing complex gestures.
SUMMARY
[008] Accordingly, the invention provides a method for gesture
recognition using Deterministic Finite Automata (DFA). The method
includes receiving input data from a sensor. Further, the method includes
dividing a gesture space into blocks and assigning the blocks to states of
the DFA in accordance with the input data. Furthermore, the method includes
constructing a gesture specific DFA in accordance with an alphabet including
the valid and invalid strokes performed by the user, state transition rules, an
initial state, a set of final states, and a set of finite states. Furthermore, the
method includes recognizing a gesture in accordance with the constructed
gesture specific DFA.
[009] Furthermore, the method includes receiving a gesture input
from the user, constructing a string of symbols of the gesture in accordance
with the alphabet, and determining whether the string of symbols matches
the constructed gesture specific DFA. Furthermore, the method includes
recognizing the gesture in response to determining that the string of symbols
matches the constructed gesture specific DFA.
[0010] Furthermore, the method includes recognizing a multi-stroke
gesture by using a sequential representation of the strokes performed by the
user. Furthermore, the method includes transferring objects between a first
device and a second device using send and receive commands in
accordance with the recognized gesture.
[0011] Accordingly, the invention provides a system for gesture
recognition using Deterministic Finite Automata (DFA). The system
includes an interface module configured to receive input data including
gesture motion from a sensor. The system also includes a DFA module
configured to divide a gesture space into blocks and assign the blocks to
states of the DFA. The DFA module is also configured to construct a
gesture specific DFA in accordance with an alphabet including the valid and
invalid strokes performed by the user, state transition rules, an initial state, a
set of final states, and a set of finite states. The DFA module is further
configured to recognize a multi-stroke gesture by using a sequential
representation of the strokes. Further, the system includes a gesture
recognition module configured to recognize a gesture in accordance with the
constructed gesture specific DFA and a storage module configured to store
the constructed gesture specific DFA.
[0012] Furthermore, the system includes receiving a gesture input
from the user using the interface module and constructing a string of
symbols of the gesture in accordance with the alphabet using the DFA
module. Furthermore, the system includes determining whether the string of
symbols matches the constructed gesture specific DFA using the gesture
recognition module. Furthermore, the system includes recognizing the
gesture in response to determining that the string of symbols matches the
constructed gesture specific DFA.
[0013] Furthermore, the system includes transferring objects
between a first device and a second device using send and receive
commands in accordance with the recognized gesture.
[0014] These and other aspects of the embodiments herein will be
better appreciated and understood when considered in conjunction with the
following description and the accompanying drawings. It should be
understood, however, that the following descriptions, while indicating
preferred embodiments and numerous specific details thereof, are given by
way of illustration and not of limitation. Many changes and modifications
may be made within the scope of the embodiments herein without departing
from the spirit thereof, and the embodiments herein include all such
modifications.
BRIEF DESCRIPTION OF FIGURES
[0015] This invention is illustrated in the accompanying drawings,
throughout which like reference letters indicate corresponding parts in the
various figures. The embodiments herein will be better understood from the
following description with reference to the drawings, in which:
[0016] FIG. 1 illustrates an apparatus with multiple modules, in
accordance with various embodiments of the present invention;
[0017] FIG. 2 illustrates a virtual gesture space divided into
subspaces, in accordance with various embodiments of the present
invention;
[0018] FIG. 3 illustrates construction of alphabet using simplified
single stroke-gesture recognition, in accordance with various embodiments
of the present invention;
[0019] FIG. 4 illustrates a flow diagram for constructing a
Deterministic Finite Automata (DFA) for gesture recognition, in
accordance with various embodiments of the present invention;
[0020] FIG. 5 illustrates a flow diagram for validating an input
gesture using the constructed DFA, in accordance with various
embodiments of the present invention;
[0021] FIG. 6 illustrates an exemplary state transition diagram
representing a DFA for the single-stroke gestures, in accordance with
various embodiments of the present invention;
[0022] FIG. 7 illustrates an exemplary state transition diagram of a
DFA designed for describing a complex gesture, in accordance with various
embodiments of the present invention;
[0023] FIG. 8 illustrates a flow diagram of the DFA as described in
the FIG. 7, in accordance with various embodiments of the present
invention;
[0024] FIG. 9 illustrates possible scenarios in a multi-stroke
gesture recognition framework, in accordance with various embodiments of
the present invention;
[0025] FIG. 10 illustrates a generalized framework of the gesture
recognition used in an object transfer application, in accordance with
various embodiments of the present invention;
[0026] FIG. 11 illustrates a state transition diagram corresponding
to the gestures performed for the object transfer between devices depicted
in the FIG. 10, in accordance with various embodiments of the present
invention;
[0027] FIG. 12 illustrates a generalized framework of gesture
recognition used in Augmented Reality (AR) application, in accordance
with various embodiments of the present invention; and
[0028] FIG. 13 illustrates a computing environment implementing
the application, in accordance with various embodiments of the present
invention.
DETAILED DESCRIPTION OF INVENTION
[0029] The embodiments herein and the various features and
advantageous details thereof are explained more fully with reference to the
non-limiting embodiments that are illustrated in the accompanying
drawings and detailed in the following description. Descriptions of
well-known components and processing techniques are omitted so as to not
unnecessarily obscure the embodiments herein. The examples used herein
are intended merely to facilitate an understanding of ways in which the
embodiments herein can be practiced and to further enable those of skill in
the art to practice the embodiments herein. Accordingly, the examples
should not be construed as limiting the scope of the embodiments herein.
[0030] The embodiments herein achieve a method and system to
provide a generalized framework for gesture recognition using
Deterministic Finite Automata (DFA). A virtual space is divided into
subspaces that are assigned to independent states of a DFA module. A
single- or multi-stroke representation is determined based on the orientation
and movement of a pointer involved in a gesture. The present invention
provides a DFA based methodology to identify the single- or multi-stroke
based gestures. The method provides a complete set of possible strokes to
address any possible movement of the pointer involved in the gesture.
Further, the DFA module is used to construct a gesture specific DFA to
represent any complex gesture performed by a user. The constructed DFA
represents a predefined gesture, which is used by a gesture recognition
module to recognize any input gesture captured by a device.
[0031] Throughout the description, the terms subspace and block are
used interchangeably.
[0032] Throughout the description, the terms complex gesture and
multi-stroke gesture are used interchangeably.
[0033] Throughout the description, the terms invalid stroke and
unacceptable stroke are used interchangeably.
[001] The generalized framework disclosed by the method enhances the
input methods for recognizing the gestures performed by the user. The
generalized framework for the gesture recognition can be used for various
applications, for example, authentication, object movement, augmented
reality, gaming, user interface design, or any other application.
Similarly, the generalized framework for the gesture recognition can be
used in various electronic systems, for example, mobile phones, Personal
Digital Assistants (PDA), augmented reality systems, gaming systems, or
any other system.
[002] Referring now to the drawings, and more particularly to
FIGS. 1 through 13, where similar reference characters denote
corresponding features consistently throughout the figures, there are shown
preferred embodiments.
[003] FIG. 1 illustrates an apparatus with multiple modules, in
accordance with various embodiments of the present invention. The FIG. 1
shows an apparatus or system 100 including an interface module 102, a
Deterministic Finite Automata (DFA) module 104, a gesture recognition
module 106, a display module 108, and a storage module 110. The interface
module 102 can be configured to provide a user interface to capture a
gesture performed by a user over a real gesture space. In an example, the
input interface module 102 described herein can be a touch screen, touch
pad, camera, joystick, or any other input interface module. In an example,
the gesture described herein can include, but is not limited to, a hand
movement in front of a camera, a video tracking result, a pattern on a touch
screen or a touch pad using a stylus (or finger), or any other motion or
movement made by the user.
[004] The DFA module 104 can be configured to divide the gesture
space into multiple non-overlapping blocks. The DFA module 104 can be
configured to include different states, which are assigned to the multiple
non-overlapping blocks of the gesture space. The DFA module 104 can be
configured to construct a gesture specific DFA.
[005] The gesture recognition module 106 can be configured to
provide the generalized framework for gesture recognition using the DFA
module 104. The gesture recognition framework can be provided
independent of the interface module 102 used by the apparatus 100. The
display module 108 displays the gesture performed by the user on the
display screen, along with other display functions of the apparatus 100. The
storage module 110 can be configured to provide a storage space for storing
the constructed DFA and the captured user gesture, along with the standard
memory functions. The storage module 110 described herein can be
configured to include an internal memory or use an external memory.
[006] FIG. 2 illustrates a virtual gesture space divided into
subspaces, in accordance with various embodiments of the present
invention. The FIG. 2 depicts a virtual gesture space which is mapped to
the real gesture space. The real gesture space described herein can include a
touch panel, the view of a camera, a sensor, or any other input sensor. The
entire rectangular virtual space is divided into non-overlapping blocks
having M rows and N columns such that the device can create a total of M x
N subspaces. Further, these subspaces are assigned to independent states of
the finite automaton.
[007] The gesture performed by the user over the real gesture space
is sensed and mapped to the virtual gesture space. The representation of a
gesture can then be simplified as movement of a pointer from a source
(start) subspace to a destination (final) subspace through the intermediate
subspaces. Thus, the apparatus is enabled for tracking multi-stroke gesture
performed by the user. The movement of the pointer from a subspace to an
adjacent subspace represents a single-stroke gesture. Thus, the multi-stroke
gesture performed by the user during the movement of the pointer from the
source to the destination can be represented by a string of symbols. This
string includes the sequence of all the single-stroke gestures, which
represents the multi-stroke gesture. In an example, the number of subspaces
created can vary based on the user requirement. A higher number of
subspaces can enable more accurate gesture recognition. In an embodiment,
the gesture space can have any shape, which can then be divided into
non-overlapping subspaces.
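The mapping from a point in the real gesture space to one of the M x N subspaces can be sketched as follows. This is an illustrative sketch, not code from the specification; the function name and a rectangular, y-down coordinate space are assumptions:

```python
def point_to_subspace(x, y, width, height, rows, cols):
    """Return the (row, col) subspace of an M x N grid containing (x, y)."""
    # Clamp so that points on the far edge fall into the last row/column.
    row = min(int(y * rows / height), rows - 1)
    col = min(int(x * cols / width), cols - 1)
    return row, col

# A 3 x 3 grid over a 300 x 300 space: the centre point maps to block (1, 1).
print(point_to_subspace(150, 150, 300, 300, 3, 3))  # (1, 1)
```

A sensed gesture can then be reduced to the sequence of subspaces the pointer visits, which is the representation the DFA operates on.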
[008] FIG. 3 illustrates construction of the alphabet using simplified
single-stroke gesture recognition, in accordance with various embodiments
of the present invention. The FIG. 3 depicts a complete set of possible
single strokes in a gesture to address any possible movement of the pointer.
The complete set of possible single strokes described can be, for example, a,
b, c, d, e, f, g, and h as shown in the FIG. 3. These possible single strokes
in the gesture can be differentiated based on the orientation and movement
of the pointer. Further, the FIG. 3 represents an unacceptable stroke (also
referred to as an invalid movement of the pointer) 'u'. In an example, the
sequence of the single-stroke gestures can represent a multi-stroke gesture.
Further, a gesture specific DFA can be constructed to represent any
multi-stroke gesture.
[009] In an embodiment, the DFA (denoted by M) can generally be
defined as a five-tuple, given in the equation below:
M = {Σ, Q, δ, S, QF}
where Σ represents the alphabet (a set of finite symbols or number of
possible inputs), Q is the set of finite states, δ is the set of production rules
(or rule transition table), S is the start state, and QF is the set of final states
(or accept states).
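The five-tuple can be carried directly into code. The following is a minimal sketch (field and class names are hypothetical, not from the specification) of M = {Σ, Q, δ, S, QF} as a Python structure, with a toy two-state instance that accepts exactly one 'a' stroke:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DFA:
    alphabet: frozenset   # Sigma: finite set of input symbols
    states: frozenset     # Q: finite set of states
    delta: dict           # delta: maps (state, symbol) -> next state
    start: str            # S: start state
    finals: frozenset     # QF: set of final (accept) states

# Toy instance: from q0 an 'a' stroke is accepted, anything else blocks in q9.
toy = DFA(alphabet=frozenset('au'),
          states=frozenset({'q0', 'q1', 'q9'}),
          delta={('q0', 'a'): 'q1', ('q0', 'u'): 'q9'},
          start='q0',
          finals=frozenset({'q1'}))
```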
[0010] The method disclosed herein defines the input alphabet Σ
having a vector representation as depicted in the FIG. 3. The pointer of an
input gesture can enter into one of the eight possible sub-spaces, and the
strokes are represented by symbols such that Σ = {a, b, c, d, e, f, g, h, u},
where a, b, c, d, e, f, g, and h represent the set of possible (valid) single
strokes and 'u' represents any other stroke such as an invalid or
unacceptable stroke.
[0011] As depicted in the FIG. 3, the horizontal stroke in the right
direction is 'a', the upward diagonal stroke in the right direction is 'b', the
vertically upward stroke is 'c', the upward diagonal stroke in the left
direction is 'd', the horizontal stroke in the left direction is 'e', the
downward diagonal stroke in the left direction is 'f', the vertically
downward stroke is 'g', the downward diagonal stroke in the right direction
is 'h', and any stroke other than these defined strokes is an invalid stroke
represented by 'u'. In an embodiment, the symbols used to represent the
strokes can be any user-defined characters.
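One possible way to map a pointer movement onto this alphabet is to quantize its direction to the nearest of the eight 45-degree vectors. This sketch is illustrative only: the angular tolerance, the function name, and the mathematical y-up convention are assumptions not taken from the specification (screen coordinates are typically y-down and would flip b/h, c/g, d/f):

```python
import math

# Symbols every 45 degrees anticlockwise, starting from 'a' (right).
SYMBOLS = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h']

def classify_stroke(dx, dy, tolerance_deg=10.0):
    """Map a movement vector (dx, dy) to a stroke symbol, or 'u' when the
    direction is too far from every defined 45-degree vector."""
    if dx == 0 and dy == 0:
        return 'u'                        # no movement is not a valid stroke
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    nearest = round(angle / 45.0)         # index of the closest direction
    if abs(angle - 45.0 * nearest) > tolerance_deg:
        return 'u'                        # outside the tolerance band
    return SYMBOLS[nearest % 8]

print(classify_stroke(1, 0))    # 'a' (horizontal stroke to the right)
print(classify_stroke(0, 1))    # 'c' (vertically upward)
print(classify_stroke(1, 0.4))  # 'u' (too far from any defined direction)
```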
[0012] FIG. 4 illustrates a flow diagram 400 for constructing a
Deterministic Finite Automata (DFA) for gesture recognition, in
accordance with various embodiments of the present invention. As depicted
in the flow diagram 400, at step 402, the gesture space is divided into the
desired number of blocks.
[0013] At step 404, the possible gesture map based on the
composite (complex) strokes is obtained from the input gesture performed
by the user. Upon receiving the gesture, at step 406, the number of states
required to represent the gesture map is finalized. At step 408, the alphabet
Σ required for defining the DFA (M) is constructed. The alphabet Σ
includes all possible single strokes including the invalid stroke. At step 410,
a DFA (M) specific to the user gesture is constructed. The DFA (M) is
constructed by using the state transition rules, initial (start) state, and a set
of final states based on the user gesture.
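Steps 402-410 can be sketched as follows. This is an illustrative construction, not the claimed implementation: it assumes the registered gesture is given as a stroke string, generates one state per prefix, and routes every non-registered symbol into a single absorbing blocked state. All names are hypothetical:

```python
ALPHABET = set('abcdefgh') | {'u'}   # valid strokes plus the invalid stroke
BLOCKED = 'q_blocked'

def build_gesture_dfa(strokes):
    """Build (delta, start, finals) for a gesture given as a stroke string."""
    states = ['s%d' % i for i in range(len(strokes) + 1)]
    delta = {}
    for i, registered in enumerate(strokes):
        for symbol in ALPHABET:
            # Only the registered stroke advances; anything else blocks.
            delta[(states[i], symbol)] = states[i + 1] if symbol == registered else BLOCKED
    for symbol in ALPHABET:
        delta[(states[-1], symbol)] = BLOCKED  # extra strokes invalidate
        delta[(BLOCKED, symbol)] = BLOCKED     # the blocked state is absorbing
    return delta, states[0], {states[-1]}

# A registered two-stroke gesture: down ('g') then right ('a').
delta, start, finals = build_gesture_dfa('ga')
```

A separate DFA can be built this way for each registered gesture, matching the embodiment in which multiple DFAs each trigger a different function.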
[0014] In an embodiment, the method enables the apparatus to
construct multiple DFAs corresponding to the multiple user gestures. Each
constructed DFA can represent a different gesture and execute a
corresponding function.
[0015] FIG. 5 illustrates a flow diagram 500 for validating an input
gesture using the constructed DFA, in accordance with various
embodiments of the present invention. At step 502, the apparatus 100
accepts the input gesture performed by the user. At step 504, a string
comprising a combination of symbols of the alphabet Σ is constructed based
on the mapping of the input gesture with the symbols of the alphabet Σ.
These symbols of the string represent the multi-stroke gesture as a sequence
of single-stroke gestures. The constructed string of symbols represents the
input gesture performed.
[0016] At step 506, the string of symbols is compared with the
constructed DFA of the FIG. 4. The constructed DFA described herein is
the DFA of a predefined or registered gesture. If the input string is
accepted by the DFA, then at step 508, the gesture is recognized and the
apparatus 100 can execute a predefined function for the input gesture.
Otherwise, upon determining a mismatch, the apparatus repeats the steps
502-508.
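The validation of steps 502-508 reduces to running the symbol string through the transition table. The following is a minimal sketch under assumed names (the table layout and the blocked-state convention are illustrative, not from the specification):

```python
BLOCKED = 'q9'  # blocked / invalid state

def recognize(symbols, delta, start, finals):
    """Return True when the gesture-specific DFA accepts the symbol string."""
    state = start
    for symbol in symbols:
        state = delta.get((state, symbol), BLOCKED)  # undefined moves block
        if state == BLOCKED:
            return False   # the blocked state is absorbing: reject early
    return state in finals

# Example: a registered two-stroke gesture 'c' then 'c' (two upward strokes).
delta = {('q0', 'c'): 'q1', ('q1', 'c'): 'q2'}
print(recognize('cc', delta, 'q0', {'q2'}))  # True
print(recognize('cu', delta, 'q0', {'q2'}))  # False
```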
[0017] FIG. 6 illustrates an exemplary state transition diagram
representing a DFA for the single-stroke gestures, in accordance with
various embodiments of the present invention. The FIG. 6 depicts a
simplified representation of a divided virtual gesture space 602 with nine
non-overlapping sub-spaces assigned to individual states, and a state
transition diagram 604 of the DFA. The blocks of the divided virtual space
602 are assigned to the states q0, q1, q2, q3, q4, q5, q6, q7, and q8
as shown in the FIG. 6. The state q9 is a blocked / invalid state assigned to
the invalid stroke. All possible gestures begin from the central block
assigned to the state q0. The central block is alternatively denoted by ‘S’
representing the start state of a DFA. Starting from the block q0 there can
be eight possible gestures (single-stroke movement from the start block S/q0 to
any other adjacent block). The different single stroke gestures include
movement of the pointer from states q0 to q1, q0 to q2, q0 to q3, q0 to q4,
q0 to q5, q0 to q6, q0 to q7, and q0 to q8. Any other movement of the
pointer forces the state transition to enter into the blocked state q9.
[0018] The state transition diagram 604 of the DFA represents the
eight acceptable events (single strokes) within the divided virtual space
602. The state transition diagram of the DFA 604 defines a DFA (M) as
M = {Σ, Q, δ, S, QF}
where Σ = {a, b, c, d, e, f, g, h, u}. The characters a, b, c, d, e, f, g, and h
represent the valid strokes. The character 'u' represents an unacceptable
stroke, which leads the state transition to enter into the blocked state.
[0019] The set of possible states is given by Q = {q0, q1, q2, q3,
q4, q5, q6, q7, q8, q9}. The state q0 represents the start state (S).
[0020] The set of acceptable states is given by QF = {q1, q2, q3,
q4, q5, q6, q7, q8} and the production rules of the DFA in the FIG. 6 are
defined as δ.
[0021] The production rules for state transition diagram of the
DFA 604 are as follows:
δ(S, a) = q5 (rule states that the pointer movement from the S in direction
of vector 'a' allows the state transition to enter into the state q5, which is an
acceptable state),
δ(S, b) = q3 (rule states that the pointer movement from the S in direction
of vector ‘b’ allows the state transition to enter into the state q3, which is an
acceptable state),
δ(S, c) = q2 (rule states that the pointer movement from the S in direction
of vector ‘c’ allows the state transition to enter into the state 5 q2, which is an
acceptable state),
δ(S, d) = q1 (rule states that the pointer movement from the S in direction
of vector ‘d’ allows the state transition to enter into the state q1, which is an
acceptable state),
δ(S, e) = q4 (rule states that the pointer movement from the S in direction of
vector ‘e’ allows the state transition to enter into the state q4, which is an
acceptable state),
δ(S, f) = q6 (rule states that the pointer movement from the S in direction of
vector ‘f’ allows the state transition to enter into the state q6, which is an
acceptable state),
δ(S, g) = q7 (rule states that the pointer movement from the S in direction
of vector ‘g’ allow the state transition to enter into the state q7, which is an
acceptable state),
δ(S, h) = q8 (rule states the pointer movement from the S in direction of
20 vector ‘h’ allows the state transition to enter into the state q8, which is an
acceptable state), and
δ(S, u) = q9 (rule states that the pointer movement from the S in any other
direction termed vector ‘u’ allows the state transition to enter into the state
q9, which is an unacceptable state).
[0022] The rules stated below indicate any stroke starting from any
state other than the q0 (which include the states q1, q2, q3, q4, q5, q6, q7,
and q8) in direction of any vector such as a, b, c, d, e, g, f, h, or u allows the
state transition to enter into the state q9 representing the unacceptable state.
δ(q1, a | b | c | d | e | f | g | h | u) = q9
δ(q2, a | b | c | d | e | f | g | h | u) = q9
δ(q3, a | b | c | d | e | f | g | h | u) = q9
δ(q4, a | b | c | d | e | f | g | h | u) = q9
δ(q5, a | b | c | d | e | f | g | h | u) = q9
δ(q6, a | b | c | d | e | f | g | h | u) = q9
δ(q7, a | b | c | d | e | f | g | h | u) = q9
δ(q8, a | b | c | d | e | f | g | h | u) = q9
δ(q9, a | b | c | d | e | f | g | h | u) = q9
[0023] Once the state transition enters into a blocked state, then any
further movement (stroke) is considered to be an invalid stroke and the
state transition is held in the blocked state q9.
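The production rules of the FIG. 6 DFA can be encoded as a plain transition table. This sketch follows the rules above (from S each valid vector reaches its acceptable state; every stroke from any other state falls into q9); the dictionary representation itself is an illustrative assumption:

```python
# delta(S, .) rules: each valid vector from the start state S.
delta = dict(zip([('S', s) for s in 'abcdefgh'],
                 ['q5', 'q3', 'q2', 'q1', 'q4', 'q6', 'q7', 'q8']))
delta[('S', 'u')] = 'q9'   # unacceptable stroke blocks immediately

# delta(q1..q9, .) rules: any stroke from a non-start state blocks into q9.
for state in ('q1', 'q2', 'q3', 'q4', 'q5', 'q6', 'q7', 'q8', 'q9'):
    for symbol in 'abcdefghu':
        delta[(state, symbol)] = 'q9'

print(delta[('S', 'a')], delta[('q5', 'a')])  # q5 q9
```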
[0024] FIG. 7 illustrates an exemplary state transition diagram of
the DFA designed for describing a complex gesture, in accordance with
various embodiments of the present invention. The FIG. 7 shows a virtual
gesture space 702 divided into nine blocks, on which a complex gesture is
mapped. The FIG. 7 also depicts a state transition diagram 704 of a DFA.
Further, the FIG. 7 shows a complex gesture or a multi-stroke gesture
comprising the series of strokes q1 -> q4 -> q0 -> q5 -> q8. These single
strokes represent the movement of the gesture pointer from a subspace
represented by the state q1 to a subspace represented by the state q8 through
the intermediate subspaces q4, q0, and q5. The complex gesture starts at the
q1, moves in the direction of the stroke g and enters into the state q4, then
moves in the direction of the stroke a and enters into the state q0, further
moves in the direction of the stroke a and then enters into the state q5, and
thereafter moves in the direction of the stroke g and enters into the final
acceptable state q8.
[0025] The DFA (M1) for the complex gesture of the FIG. 7 can be
represented by M1 = {Σ, Q, δ, S, QF}, where Σ is the alphabet as
described and Σ = {a, b, c, d, e, f, g, h, u}, Q = {q1, q4, q0, q5, q8, q9} is the
set of states, S = q1 is the start state, QF = {q8} is the set of final states (or
acceptable state), and δ is the set of production rules as defined below:
δ(q1, g) = q4
δ(q1, a | b | c | d | e | f | h | u) = q9
δ(q4, a) = q0
δ(q4, b | c | d | e | f | g | h | u) = q9
δ(q0, a) = q5
δ(q0, b | c | d | e | f | g | h | u) = q9
δ(q5, g) = q8
δ(q5, a | b | c | d | e | f | h | u) = q9
δ(q8, a | b | c | d | e | f | g | h | u) = q9
δ(q9, a | b | c | d | e | f | g | h | u) = q9
[0026] The state transition enters into the unacceptable state q9 in
accordance with the rules defined in the rule table. Once the state transition
enters into a blocked state, any further stroke in the direction of any vector
is an invalid stroke and the state transition is held in the blocked state q9.
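The production rules of M1 can be exercised directly: listing only the advancing transitions and defaulting every other (state, symbol) pair to q9 reproduces the rule table above. The stroke string of the FIG. 7 gesture is g, a, a, g:

```python
# Advancing transitions of M1; every unlisted pair blocks into q9.
rules = {('q1', 'g'): 'q4', ('q4', 'a'): 'q0',
         ('q0', 'a'): 'q5', ('q5', 'g'): 'q8'}

def step(state, symbol):
    return rules.get((state, symbol), 'q9')  # default: blocked state q9

state = 'q1'                   # start state S = q1
for symbol in 'gaag':          # stroke string of the complex gesture
    state = step(state, symbol)
print(state, state in {'q8'})  # q8 True
```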
[0027] FIG. 8 illustrates a flow diagram 800 of the DFA as
described in the FIG. 7, in accordance with various embodiments of the
present invention. The flow diagram 800 represents the steps performed
during the verification (recognition) of the complex gesture q1 -> q4 -> q0 ->
q5 -> q8 as represented in the FIG. 7. The user performs a gesture, which is
captured by the interface module 102. The gesture is mapped to the divided
virtual gesture space. The string of symbols using the alphabet Σ is
generated by mapping the sequence of single strokes in a gesture to the
vectors a, b, c, d, e, f, g, h, and u of the alphabet Σ.
[0028] At step 802, the string of symbols based on the alphabet
mapping is parsed and the state transition enters into the starting state in
accordance to the received string of symbols. In an example, the string of
symbols for a gesture to be recognized as a valid gesture is g, a, a, g as
described in the state transition diagram of the FIG. 7.
[0029] At step 804, the start state is verified. At step 806, in
response to verifying that the start state is q1, the first symbol is accepted
else the state transition enters into a blocked state q9, as shown at step 826.
[0030] At step 808, the accepted first symbol of the string is
verified against the alphabet symbol 'g'. At step 810, in response to
verifying that the first symbol is 'g', the state transition enters into the state
q4 and accepts the second symbol of the string. At step 826, in response to
verifying that the first symbol is not 'g', the state transition enters into the
blocked state q9.
[0031] At step 812, the second symbol is verified. If the second
symbol is ‘a’, then at step 814, the state transition enters into the state q0.
At step 826, in response to verifying that the second symbol is not ‘a’, the
state transition enters into the blocked state q9.
[0032] At step 816, the third symbol is verified. If the third symbol
is ‘a’, at step 818, the execution enters into the state q5. At step 826, in
response to verifying that the third symbol is not ‘a’, the state transition
enters into the blocked state q9.
[0033] At step 820, the fourth symbol is verified. At step 822, in
response to verifying that the fourth symbol is 'g', the state transition
enters into the state q8. At step 826, in response to verifying that the
fourth symbol is not 'g', the state transition enters into a blocked state q9.
Upon a successful verification of all the symbols, the execution recognizes
the input gesture.
[0034] The various steps described with respect to FIG. 8 may be
performed in the order presented, in a different order or simultaneously.
Further, in some embodiments, some steps listed in the FIG. 8 may be
omitted or added without departing from the scope of the invention.
[0035] FIG. 9 illustrates possible scenarios in a multi-stroke
gesture recognition framework, in accordance with various embodiments of
the present invention. The FIG. 9 depicts a virtual space divided into
multiple non-overlapping blocks along with the possible scenarios of the
multi-stroke gestures. The eighth scenario depicted in the FIG. 9, along
with the scenarios mentioned in the FIG. 3, represents the boundary
conditions of any multi-stroke based gesture.
[0036] FIG. 10 illustrates a generalized framework of the gesture
recognition used in an object transfer application, in accordance with
various embodiments of the present invention. The FIG. 10 depicts devices
1000 and 1002 with their virtual spaces divided into nine non-overlapping
blocks and the corresponding nine states q0 to q8 along with the blocked
state q9. The devices 1000 and 1002 can communicate with each other
through any available communication channel. The virtual space of the
device 1000 depicts a multi-stroke gesture with the states q1 -> q4 -> q6 ->
q7 -> q8 -> q5 -> q3. The virtual space of the device 1002 depicts a
multi-stroke gesture with the states q3 -> q0 -> q6. For the devices 1000
and 1002, DFAs corresponding to the respective gestures can be
constructed. A user can send an object from the device 1000 by performing
the gesture q1 -> q4 -> q6 -> q7 -> q8 -> q5 -> q3. A valid gesture
performed by the user executes a send command and the selected object is
sent through the communication channel. A user can perform a valid
gesture q3 -> q0 -> q6 on the device 1002, which executes a receive
command and receives the object sent by the device 1000 over the
communication channel.
[0037] FIG. 11 illustrates a state transition diagram corresponding
to the gestures performed for the object transfer between the devices
depicted in the FIG. 10, in accordance with various embodiments of the
present invention. A state transition diagram 1102 of a DFA corresponds to
a send gesture command and a state transition diagram 1104 of a DFA
corresponds to a receive gesture command. The state transition diagram
1102 represents the DFA corresponding to the send gesture and defines a
start state q1. The valid stroke 'g' can allow the state transition to enter into
a state q4. Any other stroke such as a, b, c, d, e, f, h, and u can allow the
state transition to enter into the unacceptable state q9. If the second stroke
is 'g' then the state transition can enter into the state q6, else for all other
strokes the state transition can enter into the unacceptable or the blocked
state q9. At the state q6 with the third stroke 'a', the state transition can
enter into the state q7, else into the blocked state q9. At the state q7 with
the fourth stroke 'a', the state transition enters into the state q8, else for any
other stroke the state transition enters into the blocked state q9. At the state
q8 with the fifth stroke 'c', the state transition can enter into the state q5,
else into the blocked state q9. At the state q5 with the sixth stroke 'c', the
state transition enters into the final state q3, and for any other stroke the
state transition enters into the blocked state q9. Once a blocked state is
reached, any further stroke performed is an invalid stroke and the state
transition is held in the blocked state q9.
[0038] The state transition diagram 1104 represents the DFA
corresponding to the receive gesture, which defines the start state q3. The
valid stroke ‘f’ allows the state transition to enter the state q0. Any other
stroke (a, b, c, d, e, g, h, and u) causes the state transition to enter the
unacceptable state q9. At the state q0, with the stroke ‘f’, the state
transition enters the state q6; for any other stroke the state transition
enters the blocked state q9. Once the blocked state q9 is reached, any
further stroke performed by the user is an invalid stroke and the state
transition is held in the blocked state.
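The send and receive automata of diagrams 1102 and 1104 can be written down directly as transition tables. The following Python sketch is illustrative only: the state and stroke names are taken from the description above, while the table layout and the generic `accepts` runner are assumptions added for clarity, not part of the specification.

```python
# Illustrative sketch of the send and receive DFAs of FIG. 11.
# Any (state, stroke) pair missing from a table falls through to the
# blocked state q9, matching the description above.

# Send gesture DFA (diagram 1102): q1 -g-> q4 -g-> q6 -a-> q7 -a-> q8 -c-> q5 -c-> q3
SEND = {
    ("q1", "g"): "q4",
    ("q4", "g"): "q6",
    ("q6", "a"): "q7",
    ("q7", "a"): "q8",
    ("q8", "c"): "q5",
    ("q5", "c"): "q3",
}

# Receive gesture DFA (diagram 1104): q3 -f-> q0 -f-> q6
RECEIVE = {
    ("q3", "f"): "q0",
    ("q0", "f"): "q6",
}

def accepts(table, start, finals, strokes, blocked="q9"):
    """Run a stroke sequence through a DFA; undefined transitions block,
    and once blocked the automaton is held in the blocked state."""
    state = start
    for s in strokes:
        if state == blocked:
            return False
        state = table.get((state, s), blocked)
    return state in finals

# A valid send gesture executes the send command:
print(accepts(SEND, "q1", {"q3"}, "ggaacc"))   # True
# An invalid third stroke drives the automaton into q9:
print(accepts(SEND, "q1", {"q3"}, "ggbacc"))   # False
# A valid receive gesture:
print(accepts(RECEIVE, "q3", {"q6"}, "ff"))    # True
```

Because the lookup defaults to the blocked state, the single `accepts` runner serves every gesture-specific DFA; only the table, start state, and final states change.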
[0039] FIG. 12 illustrates a generalized framework of gesture
recognition used in Augmented Reality (AR) application, in accordance
with various embodiments of the present invention. FIG. 12 shows a
divided virtual space 1202 of a device with sixteen non-overlapping blocks
assigned to the states q1 to q16, and q17 representing a blocked state. The
generalized framework for gesture recognition can be used in AR
applications, where it is often required to fetch specific data related to an
object displayed on the device screen. The divided virtual space 1202
depicts a gesture q2 -> q6 -> q10 -> q14, which can execute a data fetch
operation. Further, FIG. 12 shows a state transition diagram 1204 of a
DFA corresponding to the gesture depicted in the divided virtual space
1202. The start state for the gesture is defined as q2. At the state q2, with
the stroke ‘g’, the state transition enters the state q6, else the state
transition enters the blocked state q17. At the state q6, with the stroke ‘g’,
the state transition enters the state q10; for any other stroke the state
transition enters the blocked state q17. At the state q10, with the stroke
‘g’, the state transition enters the state q14, which represents the
acceptable (final) state, else the state transition enters the blocked state
q17. Once the blocked state is reached, any further stroke performed by
the user is an invalid stroke and the state transition is held in the blocked
state. Similarly, the generalized framework for gesture recognition can be
used in various electronic systems, for example, mobile phones, Personal
Digital Assistants (PDAs), or any other system.
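The division of the virtual space 1202 into blocks can be sketched in code as well. In the sketch below, the 4x4 row-major numbering of the blocks (q1 at the top-left), the unit-square coordinates, and the choice of labelling a downward move between vertically adjacent blocks as the stroke ‘g’ are all assumptions made for illustration; the specification fixes none of these details.

```python
# Illustrative sketch: dividing a virtual space into a 4x4 grid of
# blocks assigned to states q1..q16, as in the divided space 1202.
# Grid numbering, coordinates, and stroke labels are assumptions.

def block_state(x, y, rows=4, cols=4):
    """Map a pointer position in the unit square to its block state."""
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return "q%d" % (row * cols + col + 1)

# Fetch gesture DFA of diagram 1204: q2 -g-> q6 -g-> q10 -g-> q14
FETCH = {("q2", "g"): "q6", ("q6", "g"): "q10", ("q10", "g"): "q14"}

def stroke(a, b, cols=4):
    """Derive a stroke symbol from two consecutive block states.
    Only the downward stroke 'g' is modelled here; any other move
    is treated as an invalid stroke 'u' (an assumed label)."""
    return "g" if int(b[1:]) - int(a[1:]) == cols else "u"

def recognize(path, table=FETCH, start="q2", final="q14", blocked="q17"):
    """Replay the visited blocks through the DFA; undefined
    transitions enter the blocked state q17."""
    state = start
    for a, b in zip(path, path[1:]):
        state = table.get((state, stroke(a, b)), blocked)
    return state == final

# A straight downward swipe through the second column visits
# the blocks q2, q6, q10, q14 and executes the data fetch:
path = [block_state(0.3, y) for y in (0.1, 0.35, 0.6, 0.85)]
print(path)              # ['q2', 'q6', 'q10', 'q14']
print(recognize(path))   # True
```

The same block-to-state mapping serves any gesture in the divided space; swapping in a different transition table yields a different gesture-specific DFA without changing the recognition loop.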
[0040] FIG. 13 illustrates a computing environment implementing
the application, in accordance with various embodiments of the present
invention. As depicted, the computing environment comprises at least one
processing unit that is equipped with a control unit and an Arithmetic Logic
Unit (ALU), a memory, a storage unit, a clock chip, a plurality of networking
devices, and a plurality of Input/Output (I/O) devices. The processing unit is
responsible for processing the instructions of the algorithm. The processing
unit receives commands from the control unit in order to perform its
processing. Further, any logical and arithmetic operations involved in the
execution of the instructions are computed with the help of the ALU.
[0041] The overall computing environment can be composed of
multiple homogeneous and/or heterogeneous cores, multiple CPUs of
different kinds, special media, and other accelerators. Further, the
plurality of processing units may be located on a single chip or over
multiple chips.
[0042] The algorithm, comprising the instructions and code required
for the implementation, is stored in the memory unit, the storage unit, or
both. At the time of execution, the instructions may be fetched from the
corresponding memory and/or storage and executed by the processing
unit. The processing unit synchronizes the operations and executes the
instructions based on the timing signals generated by the clock chip.
[0043] In the case of hardware implementations, various
networking devices or external I/O devices may be connected to the
computing environment to support the implementation through the
networking unit and the I/O device unit.
[0044] The embodiments disclosed herein can be implemented
through at least one software program running on at least one hardware
device and performing network management functions to control the
elements. The elements shown in FIGS. 1 and 13 include blocks which can
be at least one of a hardware device, or a combination of hardware device
and software module.
[0045] The foregoing description of the specific embodiments will
so fully reveal the general nature of the embodiments herein that others
can, by applying current knowledge, readily modify and/or adapt such
specific embodiments for various applications without departing from the
generic concept, and, therefore, such adaptations and modifications
should be and are intended to be comprehended within the meaning and
range of equivalents of the disclosed embodiments. It is to be understood
that the phraseology and terminology employed herein are for the
purpose of description and not of limitation. Therefore, while the
embodiments herein have been described in terms of preferred
embodiments, those skilled in the art will recognize that the embodiments
herein can be practiced with modification within the spirit and scope of
the embodiments as described herein.
STATEMENT OF CLAIMS
We Claim:
1. A method for gesture recognition using Deterministic Finite
Automata (DFA), the method comprising:
receiving an input data from a sensor, wherein said input
data comprises at least one gesture motion;
dividing a space of gesture into at least one block;
assigning said at least one block to at least one state of said
DFA in accordance with said input data;
constructing a gesture specific DFA; and
recognizing a gesture in accordance with said constructed
gesture specific DFA.
2. The method of claim 1, wherein said at least one gesture motion is
provided by a user based on at least one stroke, wherein said at
least one stroke comprises at least one of valid stroke and invalid
stroke.
3. The method of claim 2, wherein said at least one stroke further
comprises a pointer indicating at least one orientation of said
gesture motion.
4. The method of claim 1, wherein said method constructs said gesture
specific DFA in accordance with at least one of alphabet, state
transition rule, initial state, set of final states, and set of finite
states, wherein said alphabet comprises said at least one of valid
stroke and invalid stroke.
5. The method of claim 1, wherein recognizing a gesture in accordance
with said constructed gesture specific DFA further comprises:
receiving said gesture input from said user;
constructing at least one string of symbol of said gesture in
accordance with said alphabet;
determining whether said at least one string of symbol
matches with said constructed gesture specific DFA; and
recognizing said gesture in response to determining that said
at least one string of symbol matches with said constructed gesture
specific DFA.
6. The method of claim 1, wherein said space of gesture comprises at
least one of a real gesture space of said sensor and a virtual gesture
space.
7. The method of claim 1, wherein said method further comprises
recognizing a multi-stroke gesture by using a sequential
representation of said at least one stroke.
8. The method of claim 7, wherein said at least one stroke is spanned
over said at least one block.
9. The method of claim 1, wherein said method further comprises
transferring at least one object between a first device and a second
device in accordance with said at least one recognized gesture.
10. The method of claim 9, wherein said object is transferred using at
least one of send command and receive command executed by said
at least one of first device and second device in accordance with said
at least one recognized gesture.
11. A system for gesture recognition using Deterministic Finite
Automata (DFA), the system comprising:
an interface module configured to receive an input data from
a sensor, wherein said input data comprises at least one gesture
motion;
a DFA module configured to:
divide a space of gesture into at least one block,
assign said at least one block to at least one state of
said DFA in accordance with said input data, and
construct a gesture specific DFA; and
a gesture recognition module configured to recognize a
gesture in accordance with said constructed gesture specific DFA.
12. The system of claim 11, wherein said at least one gesture motion is
provided by a user based on at least one stroke, wherein said at
least one stroke comprises at least one of valid stroke and invalid
stroke.
13. The system of claim 12, wherein said at least one stroke further
comprises a pointer indicating at least one orientation of said
gesture motion.
14. The system of claim 11, wherein said DFA module is configured
to construct said gesture specific DFA in accordance with said at least
one of alphabet, state transition rule, initial state, set of final states,
and set of finite states, wherein said alphabet comprises said at
least one of valid stroke and invalid stroke.
15. The system of claim 11, wherein said gesture recognition module
is further configured to:
receive said gesture input from said user using said interface
module;
construct at least one string of symbol of said gesture in
accordance with said alphabet using said DFA module;
determine whether said at least one string of symbol matches
with said constructed gesture specific DFA; and
recognize said gesture in response to determining that said at
least one string of symbol matches with said constructed gesture
specific DFA.
16. The system of claim 11, wherein the system further comprises:
a storage module configured to store said constructed
gesture specific DFA; and
a display module configured to display said space of gesture
to said user, wherein said space of gesture comprises at least one of
a real gesture space of said sensor and a virtual gesture space.
17. The system of claim 11, wherein said DFA module is further
configured to recognize a multi-stroke gesture by using a sequential
representation of said at least one stroke, wherein said at least one
stroke is spanned over said at least one block.
18. The system of claim 11, wherein said interface module is
configured to transfer at least one object between a first device and
a second device in accordance with said at least one recognized
gesture.
19. The system of claim 18, wherein said object is transferred using at
least one of send and receive commands executed by said at least
one of first device and second device in accordance with said at least
one recognized gesture.