Abstract: SYSTEM AND METHOD FOR PROVIDING AND CONTROLLING VISUAL DISPLAY USEFUL FOR REMOTE CONTROL OF A VARIETY OF DEVICES

An illustrative example system designed according to an embodiment of this invention includes a processor configured to receive an indication from a detector regarding a detected user gesture and to select a surface to be used as a display surface for a visual interface. The processor is configured to communicate with or drive a display generator that is configured to display at least one image of the visual interface on the selected surface. The visual interface is configured to facilitate the user controlling or obtaining information regarding at least one device that is distinct from the processor and the display generator.
1. Technical Field
[0001] The subject matter of this document pertains to visual displays useful for facilitating remotely controlling a variety of devices. More particularly, and without limitation, the subject matter of this document pertains to selecting a surface for displaying a visual interface and controlling a source of the display.
2. Background
[0002] It is becoming increasingly possible to communicate with a variety of devices using wired links or wireless communication protocols. Additionally, more devices are provided with controllers that allow for automated or semi-automated operation. These and other developments have contributed to the development of the "Internet of Things" (IoT). It is conceivable to control a variety of devices within a home or business environment remotely based on communications over the IoT.
[0003] One way in which individual devices may be remotely controlled is through a software application on a smartphone or similar device. Such an application is typically dedicated to controlling one particular device or a particular brand of device. One challenge associated with expanding an IoT is that as the number of participating devices increases, the required number of applications similarly increases. The scalability of this approach is limited by the likely inconvenience to an individual to learn and then utilize multiple applications. Additionally, the number of individual applications required will tend to burden the IoT approach with overhead rather than introduce additional convenience or efficiency for the individual user.
[0004] Another issue with using applications on a smartphone or similar device is that the user often will have to divert attention between the smartphone and the actual device the user intends to control. The associated inconvenience to an individual may outweigh any benefit of having remote control capability.
[0005] One other drawback to an approach based on individual applications for controlling individual devices is that the individual will require immediate access to the smartphone or other device used for control. Studies have shown that many people do not carry their mobile device with them at all times, and it may prove more convenient to manually control a particular device rather than retrieve a mobile device for remote control.
SUMMARY
[0006] An illustrative example system designed according to an embodiment of this invention includes a processor configured to receive an indication from a detector regarding a detected user gesture and to select a surface to be used as a display surface for a visual interface. The processor is configured to communicate with or drive a display generator that is configured to display at least one image of the visual interface on the selected surface. The visual interface is configured to facilitate the user controlling or obtaining information regarding at least one device that is distinct from the processor and the display generator.
[0007] In an example system having one or more features of the system of the previous paragraph, the processor is configured to identify at least one candidate surface near the user, determine at least one surface characteristic of the candidate surface, and determine whether the candidate surface is useful as the selected surface.
[0008] In an example system having one or more features of the system of either of the previous paragraphs, the display generator is configured to direct the at least one displayed image toward the selected surface, and adjust at least one feature of the displayed image based on at least one of a surface characteristic of the selected surface or a relationship between the display generator and the selected surface.
[0009] An example system having one or more features of the system of any of the previous paragraphs includes a camera situated relative to the display generator for providing information regarding the relationship between the display generator and the selected surface.
[0010] In an example system having one or more features of the system of any of the previous paragraphs, the detector is part of the system and the detector provides an indication of at least one subsequent user gesture, the processor communicates with the device to be controlled based on the subsequent user gesture, and the display generator alters the displayed image of the visual interface based on at least the subsequent user gesture.
[0011] An example system having one or more features of the system of any of the previous paragraphs includes a communication network that links the processor with a plurality of devices that can be controlled based on user gestures.
[0012] In an example system having one or more features of the system of any of the previous paragraphs, the detector is part of the system, is configured to be worn by the user, and the detector comprises a plurality of sensors for detecting a plurality of types of movement that may be included as part of the user gesture.
[0013] An example system having one or more features of the system of any of the previous paragraphs includes a user feedback portion that provides an indication to the user that the user gesture has been detected.
[0014] In an example system having one or more features of the system of any of the previous paragraphs, the processor is configured to recognize a user gesture as an initiation gesture, provide a signal to indicate to the user that the initiation gesture has been received, and determine whether the user repeats the initiation gesture as a prerequisite to selecting the surface for the display surface.
[0015] In an example system having one or more features of the system of any of the previous paragraphs, the detector is part of the system and the detector comprises an identifier that indicates an identity of the detector.
[0016] An illustrative method designed according to an embodiment of this invention includes selecting a surface to be used as a display surface for a visual interface based at least in part on a detected user gesture, and displaying at least one image of the visual interface on the selected surface. The visual interface is configured to facilitate a user controlling or obtaining information regarding at least one device.
[0017] An example method having one or more features of the method of the previous paragraph includes identifying at least one candidate surface near the user, determining at least one surface characteristic of the candidate surface, and determining whether the candidate surface is useful as the selected surface.
[0018] An example method having one or more features of the method of any of the previous paragraphs includes directing the at least one displayed image toward the selected surface, and adjusting at least one feature of the displayed image based on at least one of a surface characteristic of the selected surface or a relationship between a source of the displayed image and the selected surface.
[0019] An example method having one or more features of the method of any of the previous paragraphs includes obtaining information regarding the relationship between the display generator and the selected surface from a camera.
[0020] An example method having one or more features of the method of any of the previous paragraphs includes detecting at least one subsequent user gesture, communicating with the device to be controlled based on the subsequent user gesture, and altering the displayed image of the visual interface based on at least the subsequent user gesture.
[0021] An example method having one or more features of the method of any of the previous paragraphs includes communicating over a communication network with a plurality of devices that can be controlled based on user gestures.
[0022] An example method having one or more features of the method of any of the previous paragraphs includes detecting the user gesture using a detector configured to be worn by the user, and detecting a plurality of types of movement that may be included as part of the user gesture.
[0023] An example method having one or more features of the method of any of the previous paragraphs includes providing an indication to the user that the user gesture has been detected.
[0001] An example method having one or more features of the method of any of the previous paragraphs includes recognizing a user gesture as an initiation gesture, indicating to the user that the initiation gesture has been detected, and determining whether the user repeats the initiation gesture as a prerequisite to selecting the surface for the display surface.
[0002] An example method having one or more features of the method of any of the previous paragraphs includes using a detector for detecting the user gesture, and wherein the detector comprises an identifier of the detector.
[0003] The various features and advantages of this invention will become apparent to those skilled in the art from the following detailed description. The drawings that accompany the detailed description can be briefly described as follows.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Figure 1 schematically illustrates selected portions of a system that provides and controls a displayed visual user interface to facilitate a user controlling or obtaining information from at least one device.
[0005] Figure 2 schematically illustrates selected additional detail of selected portions of the example system of Figure 1.
[0006] Figure 3 is a flowchart diagram summarizing an example approach to displaying a visual user interface on a selected surface.
DETAILED DESCRIPTION
[0007] At least one disclosed embodiment of a system and method for displaying and controlling a visual user interface leverages at least one surface of an existing object in an opportunistic manner that facilitates an individual obtaining information from or remotely controlling at least one device. With the disclosed embodiment, it is possible to select a display surface and to control a source of a visual interface display to accommodate any considerations associated with the selected display surface, such as a characteristic of the selected surface or a relationship between the selected surface and the source of the display.
[0008] Figures 1 and 2 schematically show selected portions of a system 20 that facilitates an individual or user 22 controlling at least one of a plurality of devices or obtaining information regarding at least one of the devices. In some embodiments, the system 20 is at least part of an Internet of Things (IoT).
[0009] The example system 20 includes at least one detector 24 that is configured to detect a gesture performed by the user 22. As shown in Figure 2, example detectors may comprise a bracelet 24a or a ring 24b that is configured to be worn by the user 22. The detector 24 provides an indication, such as signaling, of a gesture that may comprise movement of the user's arm, hand or finger. In this example, the detector 24 is worn by the user 22, but other embodiments include other detectors configured to detect a user gesture without being worn by the user 22.
[0010] The example detector 24 includes a plurality of sensors to detect a variety of gestures. Some embodiments include a gyroscope to provide an indication of an orientation of the detector 24 and an accelerometer to provide an indication of motion of the detector 24. With such a detector 24 it is possible to detect a variety of distinct gestures, such as a swipe to the left, a swipe to the right, a swipe up, a swipe down, forward or backward movement, a vertical circle (clockwise or anti-clockwise), a horizontal circle (clockwise or anti-clockwise), a rapid movement or a slow movement. Each gesture, which may be a single motion or a combination of motions, results in a different output from the detector 24.
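The mapping from raw sensor output to a named gesture can be pictured with a minimal sketch. This is an illustration only, not the specification's implementation: the threshold value, axis conventions, and gesture labels are all assumptions.

```python
# Hedged sketch: classify a swipe or push gesture from a single peak
# acceleration sample. Axis conventions (x = lateral, y = vertical,
# z = toward/away from the user) and the threshold are assumed.
def classify_gesture(accel_xyz, threshold=1.5):
    """Return a gesture name when one axis clearly dominates, else None."""
    ax, ay, az = accel_xyz
    if abs(ax) >= threshold and abs(ax) >= max(abs(ay), abs(az)):
        return "swipe_right" if ax > 0 else "swipe_left"
    if abs(ay) >= threshold and abs(ay) >= abs(az):
        return "swipe_up" if ay > 0 else "swipe_down"
    if abs(az) >= threshold:
        return "forward" if az > 0 else "backward"
    return None  # no distinct gesture detected

print(classify_gesture((2.0, 0.3, 0.1)))   # swipe_right
print(classify_gesture((0.1, -2.0, 0.0)))  # swipe_down
```

A practical detector would classify a whole window of gyroscope and accelerometer samples rather than one peak, but the principle of mapping distinct sensor outputs to distinct gestures is the same.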
[0011] Some embodiments of the detector 24 also include a magnetometer and a pressure sensor to provide additional information, such as information regarding how an individual is making a gesture or interacting with a nearby object or surface.
[0012] The example detector 24 includes an identifier that provides an indication of an identity of the detector 24. One example includes a hyperspectral code that is detectable by a camera or video input device. Some examples include a wireless communication interface that includes a signal signature that is useful to identify the detector 24.
[0013] A controller 26 includes at least one processor for processing information from the detector 24. The controller 26 controls operation of a display generator 28, such as a projector, to provide a display of a visual user interface on at least one surface near the user 22. In the example of Figure 1, candidate surfaces for the displayed visual interface include a vertical surface 30 on a refrigerator (or other appliance), a horizontal surface 32 on a table or bench, and a vertical wall surface 34.
[0014] In this example, the system 20 includes a camera 36 for gathering information regarding the environment within which the user 22 is situated when the visual interface is desired by the user. The controller 26 determines which surfaces are nearby the user 22 and selects at least one of the surfaces as a display surface. The controller 26 directs the display generator 28 to display an appropriate visual interface on the selected surface 30, 32 or 34. The controller 26 also controls the content of the visual interface based on one or more user gestures detected by the detector 24, which indicate the user's interaction with or use of the visual interface.
[0015] In the example of Figure 2, the controller 26 includes an input module that is configured to receive communications or indications from the detector 24 regarding user gestures. A decision engine module 42 is configured to determine what gesture has been detected and to respond accordingly. For example, one or more gestures may be predetermined to indicate a desire to initiate a session using a visual interface, and other gestures may be used to indicate the device the user 22 desires to access or control. Still other gestures are useful for indicating commands to control a chosen device. A rendering engine module 44 provides commands or information to the display generator 28 for controlling the content and appearance of the visual interface display on the selected surface 30, 32, or 34. A semantic recognition module 46 receives information from the camera 36 for identifying candidate surfaces for the visual interface display. A data and control module 48 facilitates communications over at least one communication link 50 between the controller 26 and a plurality of devices 52. In some examples, the communication link 50 comprises a bus line while in others it includes wireless communication links. In the example of Figure 2, the system 20 provides and controls a visual interface display that facilitates interaction with or control over a variety of devices 52, such as household appliances or entertainment devices.
[0016] Each of the modules of the controller 26 may be realized as hardware, firmware, software or a combination of them in at least one processor. The schematic division of the illustrated modules is for discussion purposes only, and those skilled in the art will realize that there are a variety of ways to realize the functionality of the controller 26 using one or more dedicated processors or portions of computing devices.
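One way to picture the flow among the modules described above is the short sketch below. The specification leaves the realization open, so the class name, method names, and gesture-to-action table are illustrative assumptions.

```python
# Hedged sketch of the controller's module flow: an indication from the
# input module is interpreted by a decision-engine lookup and forwarded
# to a rendering-engine callback. Names and signatures are illustrative.
class Controller:
    def __init__(self, gesture_actions, render):
        self.gesture_actions = gesture_actions  # decision-engine table
        self.render = render                    # rendering-engine callback

    def on_gesture(self, gesture):
        """Input module -> decision engine -> rendering engine."""
        action = self.gesture_actions.get(gesture)
        if action is not None:
            self.render(action)  # update the displayed visual interface
        return action

commands = []
ctrl = Controller({"swipe_left": "show_previous_device"}, commands.append)
ctrl.on_gesture("swipe_left")
print(commands)  # ['show_previous_device']
```

In a fuller realization the callback would drive the display generator 28 and the data and control module 48 would forward device commands over the communication link 50.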
[0017] The controller 26 may be situated in a housing with the projector 28 and camera 36 or situated in a separate location. In the example of Figures 1 and 2, at least the projector 28 and the camera 36 are mounted on a ceiling of the space within which the user 22 desires to have access to the IoT associated with the plurality of devices 52. The projector 28 in some embodiments is steerable with a horizontal range of 360° and a vertical range of 90°. The camera 36 can be similarly steerable or may include a fisheye lens to provide a view of at least a substantial portion of the space where the user 22 may be located when desiring access to the IoT.
[0018] While a ceiling-mounted projector is an illustrated example of the display generator 28, other embodiments are possible. For example, the display generator may be a portable device, such as a smartphone that has projector capabilities or another portable projection device. In some examples, the display generator 28 may be worn by the user or carried by the user. The controller 26 is suitably programmed to take into account any features of the display generator that may have an effect on the visibility or usability of the generated display. Some example controllers 26 may be configured to control a variety of display generators 28, depending on the components used with a particular embodiment.
[0019] Figure 3 is a flowchart diagram 60 summarizing an example approach according to an embodiment of this invention. The detector 24 detects a user gesture at 62. The controller 26 determines at 64 whether the detected user gesture is a known or predefined initiation gesture that indicates a user desire to initiate a visual interface display to facilitate a session using the IoT. If not, the controller 26 waits for another gesture recognition event.
[0020] If the detected gesture is an initiation gesture, the controller 26 communicates with the detector 24 (or another device) to provide an indication at 66 to the user 22 that the gesture was recognized or received. In some examples, the detector 24 is equipped with a vibration device that provides tactile feedback to the user. In this example, the user 22 is required to repeat the initiation gesture within a preset period, which begins once the initial initiation gesture is received by the controller 26, to confirm that the detected gesture was intentional. At 68 the controller 26 determines whether the user 22 repeats the initiation gesture based on information from the detector 24. If not, the controller 26 returns to a wait mode until another gesture is detected.
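The confirm-by-repetition logic at 66 and 68 can be sketched as follows. The gesture label and the length of the preset period are assumptions for illustration; the specification does not fix either value.

```python
# Hedged sketch of steps 66-68: the initiation gesture must be repeated
# within a preset period to confirm it was intentional. The window
# length and the "initiate" label are illustrative assumptions.
def confirm_initiation(events, initiation="initiate", window=3.0):
    """events: chronological (timestamp_seconds, gesture) pairs.
    Return True once the initiation gesture is repeated in time."""
    first = None
    for t, g in events:
        if g != initiation:
            continue
        if first is None:
            first = t            # feedback to the user would occur here (66)
        elif t - first <= window:
            return True          # confirmed (68): proceed to select a surface
        else:
            first = t            # too late: treat as a fresh first attempt
    return False                 # wait mode: no confirmed initiation

print(confirm_initiation([(0.0, "initiate"), (1.5, "initiate")]))  # True
print(confirm_initiation([(0.0, "initiate"), (9.0, "initiate")]))  # False
```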
[0021] If the user 22 confirmed a desire to initiate the session by repeating the initiation gesture, the controller 26 proceeds to identify candidate surfaces, such as the surfaces 30, 32 and 34, near the user 22. The controller 26 uses information gathered by the camera 36 regarding the contents of the room or area where the user is located to identify the candidate surfaces. Some example embodiments include known visual information processing techniques for recognizing different candidate surfaces.
[0022] Depending on any of a variety of predetermined factors, such as proximity to the user 22 or surface contour or sheen, the controller 26 selects a candidate surface as the display surface at 72. At 74 the controller 26 directs the projector 28 to generate a display and aim it at the selected display surface so the visual user interface appears on the selected surface at 76. A steerable projector 28 allows for positioning the display on particular portions of the selected surface to increase the visibility or accessibility of the visual interface.
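Selection among candidate surfaces at 72 could, for instance, weigh the factors named above. The weights, field names, and 0-to-1 scales below are assumptions for illustration, not values from this document.

```python
# Hedged sketch of step 72: score candidate surfaces on proximity,
# contour, and sheen, and pick the lowest-cost one. Weights and the
# 0..1 contour/sheen scales are illustrative assumptions.
def select_surface(candidates, w_dist=1.0, w_contour=0.5, w_sheen=0.25):
    def cost(c):
        return (w_dist * c["distance_m"]
                + w_contour * c["contour"]   # 0 = flat
                + w_sheen * c["sheen"])      # 0 = matte
    return min(candidates, key=cost)["name"]

candidates = [
    {"name": "refrigerator_door", "distance_m": 2.0, "contour": 0.1, "sheen": 0.8},
    {"name": "table_top", "distance_m": 0.5, "contour": 0.0, "sheen": 0.2},
    {"name": "wall", "distance_m": 3.0, "contour": 0.0, "sheen": 0.1},
]
print(select_surface(candidates))  # table_top
```

A weighted-cost formulation makes it easy to add further predetermined factors, such as ambient light or time of day, as extra terms.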
[0023] Given that the selected surface may have a variety of surface characteristics, the illustrated example includes the ability to adjust the display at 78 to accommodate any surface characteristics that may impact the quality of the user experience when viewing the visual interface. For example, the contour, reflectiveness or roughness of the surface may require an enhanced contrast or brightness of the display to realize a pleasing appearance. Additionally, ambient light conditions may require an adjustment to one or more features of the displayed visual interface to facilitate easy viewing by the user. Another feature of the disclosed example is that it is capable of adjusting a keystone or other perspective aspect of the display to account for a spatial relationship between the projector 28 and the selected display surface. Any or all such adjustments can be made based on a determination by the controller regarding the selected surface and the arrangement of the components of the system 20 relative to the location of the selected surface. In some examples, the controller 26 is programmed to use one or more known techniques to determine how to adjust the displayed visual interface at 78.
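The brightness and ambient-light portion of the adjustment at 78 can be sketched numerically (keystone correction would additionally warp the image geometry). The compensation formula and its constants are assumptions for illustration, not taken from this document.

```python
# Hedged sketch of part of step 78: boost brightness for rough surfaces
# and bright rooms, clamped to the projector's range. The formula and
# constants are illustrative assumptions.
def adjust_brightness(base, roughness, ambient_lux):
    """base and result in 0..1; roughness in 0..1; ambient light in lux."""
    boost = 0.3 * roughness + min(ambient_lux / 1000.0, 0.4)
    return min(1.0, base + boost)

print(round(adjust_brightness(0.5, roughness=0.5, ambient_lux=200), 2))  # 0.85
```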
[0024] Assuming the user 22 makes additional gestures indicating user interaction with the displayed visual interface, which may be determined based on information from at least one of the detector 24 or the camera 36, the controller 26 controls the display at 80 to provide visible information to the user regarding the user's intended interaction with the selected device or devices 52.
[0025] One feature of the disclosed example is that the controller 26 may choose a different display surface depending on at least one factor, such as the location of the user 22, the device 52 the user desires to access or control, the time of day, ambient light conditions, and the orientation or contour of a candidate surface. The selected display surface will not always be the same over time as the user intends to take advantage of the IoT on different days or for different purposes. Additionally, the displayed image of the visual interface may be different at different times or for different purposes. The illustrated system is versatile and customizable by a user to meet a variety of needs or preferences.
[0026] Once a user completes a session by making a session-close gesture or letting a predetermined amount of time pass between recognizable gestures, the controller 26 commands the display generator to cease displaying the visual interface until a subsequent initiation gesture is detected.
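The session-close rule just described can be sketched as follows; the 30-second idle timeout and the "close" gesture label are assumptions for illustration.

```python
# Hedged sketch: a session ends on an explicit close gesture or when
# the gap between recognized gestures exceeds a predetermined timeout.
# The timeout value and gesture label are illustrative assumptions.
def session_open(events, close_gesture="close", idle_timeout=30.0):
    """events: chronological (timestamp_seconds, gesture) pairs recorded
    after initiation. Return False once the session should end."""
    last_t = None
    for t, g in events:
        if g == close_gesture:
            return False         # explicit session-close gesture
        if last_t is not None and t - last_t > idle_timeout:
            return False         # too long between recognizable gestures
        last_t = t
    return True

print(session_open([(0.0, "swipe_up"), (40.0, "swipe_down")]))  # False
print(session_open([(0.0, "swipe_up"), (5.0, "swipe_down")]))   # True
```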
[0027] While various features and aspects are described above in connection with one or more particular embodiments, those features and aspects are not necessarily exclusive to the corresponding embodiment. The disclosed features and aspects may be combined in other ways than those specifically mentioned above. In other words, any feature of one embodiment may be included with another embodiment or substituted for a feature of another embodiment.
[0028] The preceding description is exemplary rather than limiting in nature. Variations and modifications to the disclosed examples may become apparent to those skilled in the art that do not necessarily depart from the essence of this invention. The scope of legal protection given to this invention can only be determined by studying the following claims.
We claim:
1. A system, comprising:
a processor configured to
receive an indication from a detector regarding a detected user gesture,
select a surface to be used as a display surface for a visual interface, and
generate at least one command for displaying at least one image of the visual interface on the selected surface, the visual interface being configured to facilitate the user controlling or obtaining information regarding at least one device that is distinct from the processor and the display generator.
2. The system of claim 1, wherein the processor is configured to
identify at least one candidate surface near the user;
determine at least one surface characteristic of the candidate surface; and
determine whether the candidate surface is useful as the selected surface.
3. The system of claim 1, comprising a display generator and wherein the display
generator is configured to
direct the at least one displayed image toward the selected surface; and
adjust at least one feature of the displayed image based on at least one of
a surface characteristic of the selected surface and
a relationship between the display generator and the selected surface.
4. The system of claim 3, comprising
a camera situated relative to the display generator for providing information regarding the relationship between the display generator and the selected surface.
5. The system of claim 1, comprising a detector configured to detect the user
gesture and wherein
the detector provides an indication of at least one subsequent user gesture;
the processor communicates with the device to be controlled based on the subsequent user gesture; and
the processor generates at least one command for altering the displayed image
of the visual interface based on at least the subsequent user gesture.
6. The system of claim 1, comprising a communication network that links the processor with a plurality of devices that can be controlled based on user gestures.
7. The system of claim 1, comprising a detector configured to detect the user
gesture and wherein
the detector is configured to be worn by the user; and
the detector comprises a plurality of sensors for detecting a plurality of types
of movement that may be included as part of the user gesture.
8. The system of claim 1, comprising a user feedback portion that provides an
indication to the user that the user gesture has been detected.
9. The system of claim 1, wherein the processor is configured to
recognize a user gesture as an initiation gesture;
provide a signal to indicate to the user that the initiation gesture has been received; and
determine whether the user repeats the initiation gesture as a prerequisite to selecting the surface for the display surface.
10. The system of claim 1, comprising a detector configured to detect the user gesture and wherein the detector comprises an identifier that indicates an identity of the detector to the processor.
11. A method, comprising the steps of:
selecting a surface to be used as a display surface for a visual interface based
at least in part on a detected user gesture; and
generating at least one command for displaying at least one image of the visual interface on the selected surface, the visual interface being configured to facilitate a user controlling or obtaining information regarding at least one device.
12. The method of claim 11, comprising
identifying at least one candidate surface near the user;
determining at least one surface characteristic of the candidate surface; and
determining whether the candidate surface is useful as the selected surface.
13. The method of claim 11, comprising
directing the at least one displayed image toward the selected surface; and
adjusting at least one feature of the displayed image based on at least one of
a surface characteristic of the selected surface and
a relationship between a source of the displayed image and the selected
surface.
14. The method of claim 13, comprising
obtaining information regarding the relationship between a display generator
and the selected surface from a camera.
15. The method of claim 11, comprising
detecting at least one subsequent user gesture;
communicating with the device to be controlled based on the subsequent user
gesture; and
altering the displayed image of the visual interface based on at least the
subsequent user gesture.
16. The method of claim 11, comprising communicating over a communication
network with a plurality of devices that can be controlled based on user gestures.
17. The method of claim 11, comprising
detecting the user gesture using a detector configured to be worn by the user;
and
detecting a plurality of types of movement that may be included as part of the
user gesture.
18. The method of claim 11, comprising providing an indication to the user that
the user gesture has been detected.
19. The method of claim 11, comprising
recognizing a user gesture as an initiation gesture;
indicating to the user that the initiation gesture has been detected; and
determining whether the user repeats the initiation gesture as a prerequisite to
selecting the surface for the display surface.
20. The method of claim 11, comprising
using a detector for detecting the user gesture and
wherein the detector comprises an identifier of the detector.
Dated this 20th day of November 2013
JAYA r ANDEYA
INIPA-1345
AGENT FOR THE APPLICANT
To
The Controller of Patents
The Patents Office at New Delhi