
System And Method For Using Mobile Sensors As Input Mechanism For Multiplicity Of Systems

Abstract: In an embodiment of the present invention there is provided a system for controlling remote applications, wherein a sender unit is configured to send data representing a set of gestures supported by the sender unit and to selectively forward predetermined sensor data representing a set of gestures performed by a handheld device. The system of the present invention comprises a receiver unit coupled with the sender unit and configured to register a pre-identified set of remote applications residing at said receiver unit to receive actionable input, to respond to the sender unit with a subset of said supported gestures as required sensor data, to receive and interpret said sensor data as actionable input, and to control said registered remote applications based on said interpretation of sensor data.


Patent Information

Application #
Filing Date
24 May 2011
Publication Number
23/2011
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

HCL Technologies Ltd.
50-53 Greams Road, Chennai - 600006, Tamil Nadu, India.

Inventors

1. Bala Aravind Ganesan
c/o HCL Technologies Ltd. of 50-53 Greams Road, Chennai - 600006, Tamil Nadu, India.
2. Ramprasath Venugopal
c/o HCL Technologies Ltd. of 50-53 Greams Road, Chennai - 600006, Tamil Nadu, India.

Specification

FORM-2
THE PATENTS ACT, 1970
(39 OF 1970)
AND
THE PATENTS RULES, 2003
(As Amended)
COMPLETE SPECIFICATION
(See section 10; rule 13)
"System and Method for Using Mobile Sensors as Input Mechanism for Multiplicity of Systems"
HCL Technologies Ltd., a corporation organized and existing under the laws of India, of 50-53 Greams Road, Chennai - 600006 Tamil Nadu, India.
The following specification particularly describes the nature of this invention and the manner in which it is to be performed:

Field of Technology
The present invention generally relates to the field of controlling remote applications, and more particularly to the field of controlling said remote applications via handheld devices using gesture recognition and interpretation.
Background
Mobile phones (and related handheld devices) are the most pervasive portable computers currently available. They contain various sensors, such as accelerometers, cameras, magnetometers, Global Positioning System (GPS) receivers, gyroscopes and microphones. These sensors allow users to interact with mobile applications in a natural way. E.g., the accelerometer is used to detect the shake/speed of the device; the light sensor is used to detect the surrounding brightness.
Further, gestures performed by the user, such as moving the phone, are also known to be taken as input by a mobile application in order to perform a specific action. A gesture is a motion of the body (human or a device) that contains information; e.g., waving goodbye is a gesture. Gesture recognition allows users to employ their bodies (or devices) as an input mechanism, without having to rely on the limited input capabilities of current mobile devices.
However, such gesture recognition capability is not available to desktop computers and related devices, because desktop or personal computers, unlike handheld devices, are not equipped with a variety of sensors. To provide such gesture recognition capability in desktop and related devices, a separate motion/gesture sensor device is required. Such a motion/gesture sensor device further increases the bulk as well as the cost associated with desktop devices and is not a feasible solution in most situations. For example, Nintendo, a gaming firm, uses the Wiimote, special motion-capture hardware, to control its games. Similarly, Microsoft's Kinect attempts to detect human movement gestures using a camera and sensor device.
US 2009/0217211 A1, ENHANCED INPUT USING RECOGNIZED GESTURES, describes a scheme of recognizing a user's gesture from a first and second set of images, but is admittedly limited to recognizing gestures from static images that need to be captured and stored. In contrast, the present invention enables handheld devices (including mobile devices) having sensors to act as an input device to control and communicate with the applications on a desktop computer. This is achieved either by feeding the raw inputs or by identifying the gesture event from the inputs.
US 6,947,975 B2, MOBILE DEVICE PERIPHERAL INTERFACE SYSTEM AND METHOD is merely concerned with transfer of program instructions from a mobile device to a less portable device such as desktop computer and execution of program instructions on the desktop computer, with the intent of utilizing rich computer resources of a typical desktop machine and fails to disclose gesture recognition or gesture interpretation.
Hence, there is a long-felt need to implement said gesture recognition ability, without using a separate motion sensing device, in desktop computers that lack the sensors needed to implement such desired technology.
Summary
The present invention provides a method and system for controlling remote applications at a desktop computer and related peripheral devices via gesture recognition and interpretation.

In an embodiment of the present invention there is provided a system for controlling remote applications, wherein a sender unit is configured to send data representing a set of gestures supported by the sender unit and to selectively forward predetermined sensor data representing a set of gestures performed by a handheld device. The system of the present invention comprises a receiver unit coupled with the sender unit and configured to register a pre-identified set of remote applications residing at said receiver unit to receive actionable input, to respond to the sender unit with a subset of said supported gestures as required sensor data, to receive and interpret said sensor data as actionable input, and to control said registered remote applications based on said interpretation of sensor data.
The present invention also provides a corresponding method for controlling remote applications, comprising the steps of: registering said remote applications residing at the receiver unit for receiving predetermined sensor data as an actionable input; pairing a sender unit with a receiver unit for establishing a communication channel thereon; sending a set of data representing a set of supported gestures by the sender unit towards the receiver unit; responding with a subset of said supported gestures as required sensor data towards the sender unit by said receiver unit; sending predetermined sensor data representing a set of gestures performed by a handheld device by the sender unit towards said receiver unit; and receiving and interpreting said sensor data for controlling said remote applications residing at the receiver unit based on said interpretation of sensor data, by said receiver unit. In another embodiment of the present invention, the sender unit and receiver unit are paired to exchange said sensor data.
In another embodiment of the present invention, the receiver unit is configured to host a registry that comprises a set of services corresponding to applications that are registered to receive notifications for actionable inputs.

In another embodiment of the present invention, multiple sender units can be paired with multiple receiver units.
In yet another embodiment of the present invention, the receiver unit comprises either an active device that is configured to provide two-way communication, to communicate and to keep the pairing alive with the sending unit, or a passive device configured to provide one-way communication to receive sensor data.
Brief Description of the Drawings
Figure 1 illustrates an exemplary architecture of the present invention.
Figure 2 illustrates exemplary embodiments of receiver units of the present invention and associated communication channels.
Figure 3 illustrates a sequence diagram for an exemplary communication between a sender unit and the receiver unit of the present invention.
Figure 4 illustrates exemplary gestures performed by a sender unit for controlling remote applications at the receiver unit.
Detailed Description
The various features of the preferred embodiment of the present invention, together with its objects and advantages, may be best understood by reference to the description taken in conjunction with the accompanying schematic drawing(s) of the architecture.
The present invention allows handheld devices (hereinafter referred to as sender units), like smart phones etc., having multiple sensors, to remotely control applications on a desktop device (hereinafter referred to as a receiver unit) that lacks such sensors. In particular, the present invention allows such remote controlling of applications via gesture recognition and interpretation. In a typical embodiment of the present invention, gestures are performed by a user, preferably by moving a handheld device in predefined directions, thereby controlling remote applications at the desktop computer (receiver unit) based on recognition and interpretation of said gestures.
In essence, the present invention provides for a natural remote human interfacing to desktop applications using hand held devices without making use of any specialized motion capturing hardware.
In the present invention, a receiver device is configured to act as either
1. An active device, e.g. including but not limited to desktop computers, laptops and like devices. Active devices comprise:
- Gesture recognizer module [105]
- Gesture controller module [107]
- Gesture service registry [109]
- Pairing module [103]
- Communication module [104]
2. A passive device, e.g. including but not limited to printers, FAX machines, etc. Passive devices comprise:
- Pairing module
- Communication module

Referring now to Figure 1, the exemplary architecture of the present invention is explained. Figure 1 shows an exemplary sender unit/device [101] and a receiver unit/device [102] in a block diagram representation.
The exemplary architecture of the present system comprises:
- Pairing module [103]: A pairing module [103] is resident on the sender unit [101] as well as the receiver unit [102] and is required to establish a dedicated logical connection between the two. A sender unit [101] can establish a logical connection with the receiver unit [102] using any of the below-mentioned networks/pairing channels, including but not limited to USB (Universal Serial Bus), NFC (Near Field Communication), Bluetooth and a GPRS (General Packet Radio Service) network.
Pairing between the sender unit [101] and the receiver unit [102] is particularly required to exchange the communication methodology and network settings such as IP address, port number, etc. In a typical scenario, a receiver unit [102] waits for a sender unit [101] to request a connection via any one of the aforementioned pairing channels. A receiver unit [102] can be connected with 'n' number of sender units at the same instant and, similarly, a sender unit can be simultaneously connected via said pairing channels to 'n' number of receiver units.
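The pairing exchange described above can be sketched as follows. This is a minimal illustration only: the class, field and method names are assumptions for exposition, not drawn from the specification, and a real pairing channel (USB, NFC, Bluetooth, GPRS) is replaced by a direct method call.

```python
class ReceiverUnit:
    """Sketch of a receiver unit that waits for pairing requests."""

    def __init__(self, ip, port):
        # Network settings exchanged during pairing (IP address, port number)
        self.ip = ip
        self.port = port
        self.paired_senders = []  # a receiver may pair with 'n' sender units

    def accept_pairing(self, sender_id, channel):
        # 'channel' names the pairing channel used, e.g. "USB", "NFC",
        # "Bluetooth" or "GPRS"; reply with the settings for the data channel.
        self.paired_senders.append((sender_id, channel))
        return {"ip": self.ip, "port": self.port}


class SenderUnit:
    """Sketch of a sender unit (handheld device) initiating pairing."""

    def __init__(self, sender_id):
        self.sender_id = sender_id
        self.connections = []  # a sender may pair with 'n' receiver units

    def pair_with(self, receiver, channel="Bluetooth"):
        settings = receiver.accept_pairing(self.sender_id, channel)
        self.connections.append(settings)
        return settings
```

In this sketch a sender calling `pair_with` receives the receiver's network settings, mirroring the exchange of IP address and port number noted above.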
- Communication module [104]: Allows the establishment of a data socket connection between the sender unit [101] and the receiver unit [102]. It allows continuous sending of the sensor inputs/recognized gestures for which the receiver unit [102] has registered.
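The selective forwarding performed over that data connection can be sketched as below, with a plain list standing in for the data socket; the sensor names and the `stream_registered` helper are illustrative assumptions, not part of the specification.

```python
def stream_registered(readings, registered_sensors):
    """Yield only those sensor readings the receiver unit has registered for.

    'readings' is a stream of (sensor_name, value) pairs from the sender;
    'registered_sensors' is the subset the receiver requested at pairing time.
    """
    for sensor, value in readings:
        if sensor in registered_sensors:
            yield (sensor, value)


# Hypothetical stream of raw sensor readings on the sender unit.
readings = [
    ("accelerometer", (0.1, 9.8, 0.0)),
    ("light", 420),
    ("gyroscope", (0.0, 0.2, 0.0)),
]

# Only accelerometer and gyroscope data reach the receiver.
sent = list(stream_registered(readings, {"accelerometer", "gyroscope"}))
```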

- Gesture recognizer module [105]: Resident at both the sender unit [101] and the receiver unit [102]; configured to receive sensor inputs from the sender unit [101], to recognize a registered gesture, and/or to receive an already recognized gesture from the sender unit [101]. The received sensor input (i.e. received gesture/registered gesture) is then sent to the gesture controller module [107].
The receiver unit [102] is configured to receive gesture information in two ways:
- as raw sensor data from the sender unit [101];
- as processed sensor data (a recognized gesture) from the sender unit [101].
Where raw sensor data is received, the gesture recognizer [105] residing on the receiver unit [102] processes it and interprets it as a meaningful gesture.
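The two input paths can be sketched in one recognizer function: an already recognized gesture (processed data) passes through, while raw accelerometer samples are reduced to a gesture here. The threshold, axis convention and gesture names ("move-left", "move-right") are illustrative assumptions.

```python
def recognize(sensor_input):
    """Interpret sensor input as a gesture, per the two paths above.

    Accepts either processed data (a gesture name already recognized on
    the sender unit) or raw accelerometer samples as (x, y, z) tuples.
    """
    if isinstance(sensor_input, str):
        # Processed sensor data: gesture was recognized on the sender unit.
        return sensor_input
    # Raw sensor data: crudely sum x-axis acceleration to infer net motion.
    net_x = sum(x for x, y, z in sensor_input)
    if net_x > 1.0:          # hypothetical threshold for a rightward sweep
        return "move-right"
    if net_x < -1.0:         # hypothetical threshold for a leftward sweep
        return "move-left"
    return None              # no registered gesture detected
```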
- Gesture controller module [107]: Resident at the receiver unit [102]; configured to perform the programmed action based on the inputs received from the gesture recognizer module [105].
The system of the present invention is configured to provide a framework that triggers predetermined events based on the received sensor inputs/gestures. Active device applications are adapted to use this framework to get notified of these events.
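The event framework just described can be sketched as a small subscribe/notify controller; the class and method names are assumptions for illustration only.

```python
class GestureController:
    """Sketch of the gesture controller's event framework: applications
    subscribe callbacks for gesture events, and the controller fires the
    programmed actions when the recognizer reports a gesture."""

    def __init__(self):
        self._handlers = {}  # gesture name -> list of subscribed callbacks

    def on(self, gesture, callback):
        """An application registers to be notified of a gesture event."""
        self._handlers.setdefault(gesture, []).append(callback)

    def handle(self, gesture):
        """Trigger every callback registered for the recognized gesture."""
        for callback in self._handlers.get(gesture, []):
            callback(gesture)
```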
- Gesture services registry [109]: Resident at the receiver unit [102]; maintains a set of services that request (or intend to request) gesture inputs. Any application that needs to be controlled by interpreting sender unit [101] (mobile) gestures creates a separate service entry in the registry. Each service entry so created indicates the gestures handled by that application and the corresponding action to be taken for each such gesture.

In an exemplary embodiment, a PowerPoint application may have a service entry in the gesture services registry [109]. In such a scenario, it may register for the mobile (sender unit [101]) move-left and move-right gestures. For example, the action for "move-right" may be to scroll to the next slide, and for "move-left" to scroll to the previous slide.
In like manner, different applications at the receiver unit [102] may register with the gesture services registry [109] to recognize different "mobile gestures" and perform predefined operations.
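The PowerPoint example above can be sketched as one service entry in the registry. The dictionary layout and the action names are assumptions; the specification only says that each entry maps handled gestures to corresponding actions.

```python
# Hypothetical in-memory layout of the gesture services registry [109]:
# one entry per registered service, each mapping gestures to actions.
gesture_service_registry = {
    "powerpoint": {
        "move-right": "next_slide",      # scroll to the next slide
        "move-left": "previous_slide",   # scroll to the previous slide
    },
}


def action_for(service, gesture):
    """Look up the programmed action a service registered for a gesture."""
    return gesture_service_registry.get(service, {}).get(gesture)
```

Other applications would add their own entries in the same way, each declaring only the gestures it handles.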
The platform sensor Application Program Interface (API) [105] resident on the sender unit [101] provides an interface between the underlying sensor devices and the gesture recognizer [105]. It is also configured to provide information on the available set of sensors and the type of output values that a particular sensor is expected to give.
Referring now to Figure 2, the different embodiments of the receiver unit [102] illustrated in Figure 1 are explained.
In an exemplary embodiment, a receiver unit [102] can be classified as either an active device [202] or a passive device [203]. In a typical embodiment, an active device [202] could be a desktop computer [204], a laptop [205] or a tablet [206].
Within the realms of the present invention, active devices [202] are configured to
- execute a gesture recognizer module, as illustrated in Figure 1; and
- keep a session alive with the sender unit [101].
On the contrary, passive devices [203] do not have a gesture recognizer module and do not keep a session alive with the sender device [101]. Examples of passive devices [203] include, but are not limited to, a printer [209], a FAX machine [210], a projector device [211], etc.
Passive devices [203] merely rely on the pairing module and the communication module to perform their designated functions.
In essence, a passive device [203] is configured to "passively" respond to the sender unit's gestures. E.g., upon detecting a gesture input on the sender unit [101], the sender unit can print documents to a printer, which does nothing else but its assigned dedicated function. Similarly, where the passive device [203] (receiver unit) is a projector device [211], the sender unit can simply project the desired data via said projector device [211] by performing a desired gesture. As such, where the receiver unit is a passive device [203], only predefined, limited and dedicated functions can be performed at the receiver unit. A receiver unit classified as a passive device [203] may not register customized applications for receiving gesture inputs and performing corresponding functions. Hence, a receiver unit acting as a passive device [203] does not require a dedicated gesture recognizer module or a dedicated connection/session with the sender unit.
Still referring to Figure 2, various pairing channels [207] and communication channels [208] that can be established between sender units [201] and receiver units [202] and [203] are illustrated.

Referring now to Figure 3, the steps of the present invention are explained.
Initially, the sender unit [301] and the receiver unit [302] are paired [303] and a dedicated logical connection [304] is established between them. This is followed by sharing [305] of the available gestures/sensor list by the sender unit [301], and by the receiver unit [302] responding with the list of requested sensors [306].
Thereafter, a set of gestures is performed at the sender unit and continuously sent by the sender unit towards the receiver unit [307]. Based on the received gesture/sensor values, the receiver unit performs the corresponding programmed action [308].
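The negotiation and session steps of Figure 3 can be sketched end to end with two plain functions; all function names, gesture names and action names here are illustrative assumptions.

```python
def negotiate(supported_gestures, receiver_interest):
    """Steps [305]-[306]: the sender shares its supported gesture list and
    the receiver replies with the subset it requires as sensor data."""
    return [g for g in supported_gestures if g in receiver_interest]


def session(performed, requested, actions):
    """Steps [307]-[308]: the sender continuously forwards performed
    gestures; the receiver performs the programmed action for each one
    it requested and has an action registered for."""
    return [actions[g] for g in performed if g in requested and g in actions]


# Hypothetical run-through of the sequence:
requested = negotiate(["shake", "rotate", "multi-touch"], {"shake", "rotate"})
performed_actions = session(
    ["rotate", "shake", "multi-touch"],   # gestures performed at the sender
    requested,
    {"rotate": "rotate_3d_figure", "shake": "scroll_slides"},
)
```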
Figure 4 illustrates an exemplary set of gestures performed at the sender unit and the corresponding functions performed at the receiver unit. A user [400] holding a mobile device may rotate [402] the sender unit/mobile device, which may cause a three-dimensional figure at the receiver unit [401] to rotate in response [403].
Similarly, a "shaking" gesture [404] at the sender unit may cause a slide show application at the receiver unit to scroll [405], and a multi-touch operation/gesture [406] at the sender unit may cause a musical keyboard application [407] at the receiver unit to operate.
A person of ordinary skill in the art would appreciate that various other gestures, not necessarily illustrated or described herein, may be performed at the sender unit to perform different functions at the receiver unit.

The present invention is not intended to be restricted to any particular form or arrangement, or any specific embodiment, or any specific use, disclosed herein, since the same may be modified in various particulars or relations without departing from the spirit or scope of the claimed invention herein shown and described of which the apparatus or method shown is intended only for illustration and disclosure of an operative embodiment and not to show all of the various forms or modifications in which this invention might be embodied or operated.

We Claim
1. A system for controlling remote applications, said system comprising:
- a sender unit configured to send data representing a set of gestures supported by said sender unit and to selectively forward predetermined sensor data representing a set of gestures performed by a handheld device;
- a receiver unit coupled with the sender unit and configured to register a pre-identified set of remote applications residing at said receiver unit to receive actionable input, to respond to the sender unit with a subset of said supported gestures as required sensor data, to receive and interpret said sensor data as actionable input, and to control said registered remote applications based on said interpretation of sensor data.
2. A system for controlling remote applications as claimed in claim 1, wherein said sender unit comprises a gesture recognizer module configured to recognize and forward a set of gestures supported by the sender unit as sensor data.
3. A system for controlling remote applications as claimed in claim 1, wherein the receiver unit is configured to recognize said received sensor data as an actionable input and to notify preregistered applications.
4. A system for controlling remote applications as claimed in claim 1, wherein said sender unit and receiver unit are paired to exchange said sensor data.
5. A system for controlling remote applications as claimed in claim 1, wherein said sensor data comprises:
- raw sensor data; and
- processed sensor data.

6. A system for controlling remote applications as claimed in claims 1 and 5, wherein the system comprises:
- a gesture controller configured to perform predetermined actions based on said received sensor data as an actionable input; and
- a gesture recognizer operable at said sender unit and receiver unit and configured to recognize gesture(s) from raw sensor data.

7. A system for controlling remote applications as claimed in claim 1, wherein said receiver unit is configured to host a registry, said registry comprising a set of services corresponding to applications that are registered to receive notifications for actionable inputs.
8. A system for controlling remote applications as claimed in claim 1, wherein the sender unit comprises handheld devices and the receiver unit comprises non-mobile devices.
9. A system for controlling remote applications as claimed in claims 1 and 10, wherein the receiver unit comprises:
- an active device configured to provide two-way communication, to communicate and to keep the pairing alive with the sending unit; or
- a passive device configured to provide one-way communication to receive sensor data.

10. A system for controlling remote applications as claimed in claim 1, wherein a plurality of sender units are paired with a plurality of receiver units.
11. A method for controlling remote applications, said method comprising the steps of:
- registering said remote applications residing at the receiver unit for receiving predetermined sensor data as an actionable input;
- pairing a sender unit with a receiver unit for establishing a communication channel thereon;
- sending a set of data representing a set of supported gestures by the sender unit towards the receiver unit;
- responding with a subset of said supported gestures as required sensor data towards the sender unit by said receiver unit;
- sending predetermined sensor data representing a set of gestures performed by a handheld device by the sender unit towards said receiver unit;
- receiving and interpreting said sensor data for controlling said remote applications residing at the receiver unit based on said interpretation of sensor data, by said receiver unit.
12. A method for controlling remote applications as claimed in claim 11, wherein the communication channel established after pairing comprises one of a USB (Universal Serial Bus), NFC (Near Field Communication), Bluetooth or GPRS (General Packet Radio Service) network.
13. A method for controlling remote applications as claimed in claim 11, wherein the step of sending a set of data representing a set of supported gestures by the sender unit towards the receiver unit is performed by a gesture recognizer module residing at the sender unit.

14. A method for controlling remote applications as claimed in claim 11, wherein the receiver unit is configured for recognizing said received sensor data as an actionable input and for notifying preregistered applications.
15. A method for controlling remote applications as claimed in claim 11, wherein said sensor data comprises:
- raw sensor data; and
- processed sensor data.
16. A method for controlling remote applications as claimed in claims 11 and 14, wherein a gesture controller module is configured for performing predetermined actions based on said received sensor data as an actionable input, and a gesture recognizer, operable at said sender unit and receiver unit, is configured for recognizing gesture(s) from raw sensor data.
17. A method for controlling remote applications as claimed in claim 11, wherein said receiver unit is configured for hosting a registry, said registry comprising a set of services corresponding to applications that are registered for receiving notifications for actionable inputs.
18. A method for controlling remote applications as claimed in claim 11, wherein the sender unit comprises handheld devices and the receiver unit comprises non-mobile devices.
19. A method for controlling remote applications as claimed in claim 11, wherein the receiver unit comprises active device(s) configured for communicating and keeping the pairing alive with the sending unit, and passive device(s) configured for providing one-way communication for receiving sensor data.

20. A method for controlling remote applications as claimed in claim 11, wherein pairing comprises establishing a plurality of communication channels with a plurality of receiver units, by a plurality of sender units.
Dated this 24th day of May 2011
Of Anand and Anand, Advocates Agents for the Applicants

Documents

Application Documents

# Name Date
1 1759-CHE-2011 FORM-9 01-06-2011.pdf 2011-06-01
2 1759-CHE-2011 FORM-18 01-06-2011.pdf 2011-06-01
3 Form-3.pdf 2011-09-03
4 Form-1.pdf 2011-09-03
5 1759-CHE-2011 CORRESPONDENCE OTHERS 14-12-2011.pdf 2011-12-14
6 1759-CHE-2011 FORM-1 14-12-2011.pdf 2011-12-14
7 1759-CHE-2011 POWER OF ATTORNEY 14-12-2011.pdf 2011-12-14
8 1759-CHE-2011 CORRESPONDENCE OTHERS 28-12-2011.pdf 2011-12-28
9 1759-CHE-2011 OTHER PATENT DOCUMENT 28-12-2011.pdf 2011-12-28
10 1759-CHE-2011-FER.pdf 2017-05-12
11 1759-CHE-2011-AbandonedLetter.pdf 2017-12-08

Search Strategy

1 PatseerSearchStrategy_12-05-2017.pdf