Abstract: A gesture controlled display system includes a display screen configured to display a plurality of display images, a sensor configured to generate hand tracking data in response to a user hand motion in proximity to the display screen, a database storing a plurality of gestures, each gesture associated with a display screen action, and at least one processor coupled with the display system, the sensor, the database, and a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to receive the hand tracking data from the sensor, interpret the hand tracking data to identify a gesture made by the user hand, and modify the display screen according to the display screen action associated with the gesture made by the user.
Description: SMART GESTURE CONTROLLED DISPLAY AND MENU-LESS INTERACTION
BACKGROUND
[0001] The inventive concepts disclosed herein relate generally to the field of display systems. More particularly, embodiments of the inventive concepts disclosed herein relate to interactive one-touch or no-touch gesture controlled display systems.
[0002] Aircraft display systems, such as cockpit display systems and in-flight entertainment display systems, provide multiple ways to interact with displayed information. Most require a pilot, aircraft crew member, or passenger to use buttons, touch screens, keypads, and cursors to interact with the system. Many cockpit display systems and in-flight entertainment systems use menu-based applications that require a user to click through several layers of menus to carry out a desired function. For example, to reach a particular menu, users sometimes have to tap the screen multiple times (e.g., once on each of several menus to get to a particular sub-menu). As another example, for a passenger to play music, movies, or a TV channel on some cabin entertainment systems, the passenger has to first unlock the display screen and then manually navigate through different menu options every time.
[0003] Cockpit displays provide pilots and aircraft crew members with information related to navigation of the aircraft as well as aircraft performance information and critical flight data. For example, to safely fly an aircraft, the pilot and flight crew need to know certain information, such as current airspeed, altitude, and heading. However, to access some information, pilots and crew members must navigate through several layers of menus and screens. For example, to access a display screen showing the nearest airports, a pilot or crew member may have to conduct a three-step process and navigate through the following menus: MENU, MAP, AIRPORTS/NDBS/NAVIDS. Such searching and navigation of menus may be distracting and take valuable time away from flying the aircraft.
SUMMARY
[0004] In one aspect, the inventive concepts disclosed herein are directed to a gesture controlled display system. The gesture controlled display system includes a display screen, a sensor, a database, and at least one processor coupled with a non-transitory processor-readable medium storing processor-executable code. The display screen is configured to display a plurality of display images. The sensor is configured to generate hand tracking data in response to a user hand motion in proximity to the display screen. The database stores a plurality of gestures and each gesture is associated with a display screen action. The code causes the at least one processor to receive the hand tracking data from the sensor, interpret the hand tracking data to identify a gesture made by the user hand, and modify the display screen according to the display screen action associated with the gesture made by the user.
[0005] In a further aspect, the inventive concepts disclosed herein are directed to a method of using gestures to control a display. The method includes receiving hand tracking data from a sensor configured to generate hand tracking data in response to a user hand motion in proximity to a display screen, and interpreting the hand tracking data to identify a gesture made by the user hand in a database storing a plurality of gestures. Each gesture is associated with a display screen action. The method further includes modifying content displayed on the display screen according to the display screen action associated with the gesture made by the user.
[0006] In a further aspect, the inventive concepts disclosed herein are directed to a gesture controlled display system. The gesture controlled display system includes at least one processor coupled with a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to receive hand tracking data from a plurality of sensors configured to generate hand tracking data in response to a user hand motion in proximity to a display screen, interpret the hand tracking data to identify a gesture made by the user hand, the gesture made by the user hand corresponding to one of a plurality of gestures stored in a database storing a plurality of gestures and where each gesture is associated with a display screen action, and modify the display screen according to the display screen action associated with the gesture made by the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Implementations of the inventive concepts disclosed herein may be better understood when consideration is given to the following detailed description thereof. Such description makes reference to the included drawings, which are not necessarily to scale, and in which some features may be exaggerated and some features may be omitted or may be represented schematically in the interest of clarity. Like reference numerals in the drawings may represent and refer to the same or similar element, feature, or function. In the drawings:
[0008] FIG. 1 is a schematic illustration of an exemplary embodiment of an aircraft control center or cockpit according to the inventive concepts disclosed herein;
[0009] FIG. 2 is a block diagram of a gesture controlled display system according to the inventive concepts disclosed herein;
[0010] FIG. 3 is a block diagram of an exemplary embodiment of a controller of the gesture controlled display system of FIG. 2; and
[0011] FIG. 4 is a diagram of an exemplary embodiment of a method of using gestures to control a display according to the inventive concepts disclosed herein.
DETAILED DESCRIPTION
[0012] Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
[0013] As used herein a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
[0014] Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
[0015] In addition, the terms “a” or “an” are employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and “a” and “an” are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0016] Finally, as used herein any reference to “one embodiment” or “some embodiments” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
[0017] Broadly, embodiments of the inventive concepts disclosed herein are directed to a smart gesture controlled display and menu-less interaction. The inventive concepts disclosed herein can be utilized in a number of control systems for various types of applications, sensing systems, and display systems. While the present disclosure describes systems and methods implementable for an aircraft display system, the inventive concepts disclosed herein may be used in any type of environment (e.g., in another aircraft, a spacecraft, a ground-based vehicle, or in a non-vehicle application such as a ground-based display system, an air traffic control system, a radar system, a virtual display system). While certain examples and embodiments of the inventive concepts disclosed herein are described with respect to a pilot of an aircraft, it will be appreciated that users other than a pilot may use and benefit from the inventive concepts disclosed herein with respect to other vehicles and objects.
[0018] Referring now to FIG. 1, a schematic illustration of an exemplary embodiment of an aircraft control center or cockpit 100 is shown according to the inventive concepts disclosed herein. The aircraft control center 100 may include one or more flight displays 102 and one or more user interface (“UI”) elements 104. The flight displays 102 may be implemented using any of a variety of display technologies, including CRT, LCD, organic LED, dot matrix display, and others. The flight displays 102 may be navigation (“NAV”) displays, primary flight displays, electronic flight bag displays, tablets such as iPad® computers manufactured by Apple, Inc. or other tablet computers, synthetic vision system displays, head up displays (“HUDs”) with or without a projector, wearable displays, watches, Google Glass®, and so on. The flight displays 102 may be used to provide information to the flight crew, thereby increasing the flight crew’s visual range and enhancing their decision-making abilities. The flight displays 102 may be configured to function as, for example, a primary flight display (“PFD”) used to display altitude, airspeed, vertical speed, and navigation and traffic collision avoidance system (“TCAS”) advisories. The flight displays 102 may also be configured to function as, for example, a multi-function display used to display navigation maps, weather radar, electronic charts, TCAS traffic, aircraft maintenance data and electronic checklists, manuals, and procedures. The flight displays 102 may also be configured to function as, for example, an engine indicating and crew-alerting system (“EICAS”) display used to display critical engine and system status data. Other types and functions of the flight displays 102 are contemplated and will be apparent to those skilled in the art. According to various exemplary embodiments, at least one of the flight displays 102 may be configured to provide a rendered display from the systems and methods of the present disclosure.
[0019] In some embodiments, the flight displays 102 may provide an output from an aircraft-based system, a ground-based system, a satellite-based system, or from a system of another aircraft. For example, in one embodiment, the flight displays 102 provide an output from a ground-based weather radar system. In some embodiments, the flight displays 102 provide an output from an aircraft-based weather radar system, LIDAR system, infrared system or other system on the aircraft. For example, the flight displays 102 may include an avionics display, a joint display, an air traffic display, a weather radar map, and a terrain display. The flight displays 102 may include an electronic display or a synthetic vision system (“SVS”). For example, the flight displays 102 may include a display configured to display a two-dimensional (“2-D”) image, a three-dimensional (“3-D”) perspective image of air traffic data, terrain, and/or weather information, or a four-dimensional (“4-D”) display of weather information or forecast information. Other views of air traffic information, terrain, and/or weather information may also be provided (e.g., plan view, horizontal view, and vertical view). The views shown on the flight displays 102 may include monochrome or color graphical representations of the displayed information. Graphical representations of the displayed information may include an indication of altitude of other aircraft, weather conditions, or terrain, or the altitude and/or location of such information relative to the aircraft.
[0020] The UI elements 104 may include, for example, dials, switches, buttons, touch screens, keyboards, a mouse, joysticks, cursor control devices (“CCDs”), or other multi-function key pads certified for use with avionics systems, and so on. The UI elements 104 may be configured to, for example, allow an aircraft crew member to interact with various avionics applications and perform functions such as data entry, manipulation of navigational maps, and moving among and selecting checklist items. For example, the UI elements 104 may be used to adjust features of the flight displays 102, such as contrast, brightness, width, and length. The UI elements 104 may also (or alternatively) be used by an aircraft crew member to interface with or manipulate the displays of the flight displays 102. For example, the UI elements 104 may be used by an aircraft crew member to adjust the brightness, contrast, and information displayed on the flight displays 102. The UI elements 104 may additionally be used to acknowledge or dismiss an indicator provided by the flight displays 102. Further, the UI elements 104 may be used to correct errors on the flight displays 102. Other UI elements 104, such as indicator lights, displays, display elements, and audio alerting devices, may be configured to warn of potentially threatening conditions such as severe weather, terrain, and obstacles.
[0021] Referring now to FIG. 2, a block diagram of a gesture controlled display system 106 is shown according to an exemplary embodiment of the inventive concepts disclosed herein. The gesture controlled display system 106 includes an aircraft control system 110, at least one sensor 112, and an interactive display system 114. The aircraft control system 110 may be a system responsible for general aircraft control and features, and may include any number of aircraft subsystems, controllers, and other components for general aircraft functionality. The aircraft control system 110 includes a gesture database 118 and a controller 120. The interactive display system 114 may be any type of display system and may be or include components of the flight displays 102 of the aircraft cockpit 100. In some embodiments, the interactive display system 114 is a multi-function display.
[0022] The sensor 112 may be a plurality of sensors located in various positions in the cockpit 100 of an aircraft. In some embodiments, sensors are affixed to the interactive display system 114. For example, sensors may be integrated in the interactive display system 114 (e.g., integrated in the frame or other component). In some embodiments, a plurality of sensors 112 are used (e.g., at least two sensors). The sensor 112 may generally be configured to generate hand tracking and finger tracking data in response to a hand motion in proximity to a display screen of the interactive display system 114. The sensors 112 may be one or more of any type of motion capture sensor configured to detect movement of a hand or finger, such as a camera, infrared camera, or the like. The sensor 112 may be configured to detect an orientation of a hand or finger of the pilot 116. In some embodiments, the hand tracking data is indicative of at least one of a location of the hand of the pilot 116 over the display screen, a distance of the hand of the pilot 116 from the display screen, and a speed of the motion of the hand of the pilot 116 in proximity to the display screen. In some embodiments, the hand tracking data includes finger tracking data indicative of at least one of a motion of a finger of the pilot 116, a location of the finger of the pilot 116 in proximity to a display screen of the interactive display system 114, a distance of the finger of the pilot 116 from the display screen, a speed of the motion of the finger of the pilot 116, and a number of fingers extended from the hand of the pilot 116 during the motion. The sensors 112 may provide a sensor input to the controller 120, which may determine one or more properties related to movements made by the tracked hand and fingers. For example, the sensors 112 may track hand and finger movement, and the controller 120 may be configured to determine characteristics of the movement (e.g., the number of fingers extended, a shape or symbol made by the finger, a speed of the movement). The sensors 112 can detect a gesture made proximate the display screen (i.e., no part of the gesturer touching the display screen) and also a gesture made on the display screen itself (i.e., at least a part of the gesturer’s hand or finger touching the display screen). In some embodiments, the sensors 112 are configured to detect a gesture made within a short distance in front of the display screen (e.g., a few centimeters, within 5 centimeters, less than 10 centimeters, about or greater than 10 centimeters). In some embodiments, the sensor 112 senses gestures made at a greater distance from the display screen. For example, the sensor 112 may be configured to detect and sense gestures made by a person sitting or standing near or several feet away from the display screen. For example, the sensors 112 may detect gestures or signs made by a person sitting a few feet away from the display screen (e.g., and making gestures closer to their own body than the display screen).
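By way of illustration, the hand tracking data described above can be thought of as a stream of samples carrying position, distance, speed, and finger-count information. The following is a minimal Python sketch of such a sample; the field names, units, and class name are assumptions for illustration only and are not prescribed by the disclosure.

```python
# Minimal sketch of a hand tracking sample such as the sensor 112 might produce.
# Field names and units are illustrative assumptions, not a defined data format.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class HandTrackingSample:
    position_xy: Tuple[float, float]  # location of the hand over the display screen
    distance_mm: float                # distance of the hand from the display screen
    speed_mm_s: float                 # speed of the hand motion
    fingers_extended: int             # number of fingers extended during the motion
    touching_screen: bool             # True if part of the hand or finger touches the screen


# Example: a one-finger gesture made about 4 cm in front of the screen.
sample = HandTrackingSample((512.0, 300.0), 40.0, 120.0, 1, False)
```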
[0023] The gesture database 118 stores a plurality of gestures and a command associated with each of the plurality of gestures. For example, the gesture database 118 may store a plurality of gestures and to also store a plurality of display screen actions, where each action is associated with one of the gestures. The gestures may include any type of gesture that a person can make with their hand and fingers, such as drawing an alphanumeric symbol (e.g., the letter “M”, the letter “S”, the number “2”), a shape or symbol (e.g., a swipe up, down, left, or right, an arrow, a circle, a triangle), and any combination or plurality of the same. The gesture database 118 associates each stored gesture with a display screen action. The display screen actions may include any type of action or function that the display screen of the interactive display system 114 can execute, such as jumping to a specific menu, page, or screen, displaying particular content, skipping forward or backward a page, zooming in or out on a current display screen, selecting an item on a current display screen, and so on. Each of the display actions may correspond to a particular gesture.
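The gesture database 118 can be illustrated as a simple lookup from stored gestures to display screen actions. The sketch below is hypothetical: the dictionary representation, action identifiers, and lookup function are assumptions, although the example entries (the letter "W" mapped to a "Weight and Fuel" screen, a swipe mapped to a page change) follow examples given elsewhere in this disclosure.

```python
from typing import Optional

# Hypothetical gesture-to-action table standing in for the gesture database 118.
GESTURE_ACTIONS = {
    "W": "show_weight_and_fuel_screen",  # alphanumeric symbol -> specific screen
    "S": "show_synoptics_screen",
    "swipe_left": "next_page",           # directional gesture -> page navigation
    "circle": "zoom_in_current_screen",  # shape gesture -> zoom action
}


def lookup_action(gesture: str) -> Optional[str]:
    """Return the display screen action associated with a recognized gesture, if any."""
    return GESTURE_ACTIONS.get(gesture)
```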
[0024] Referring now to FIG. 3, a block diagram of an exemplary embodiment of the controller 120 of the gesture controlled display system 106 of FIG. 2 is shown according to an exemplary embodiment. The controller 120 includes a processor 122, a communications interface 124, and a memory 126. The memory 126 includes various modules that cause the processor 122 to execute the systems and methods described herein, including a gesture module 128 and a display module 130.
[0025] The processor 122 may be coupled with the memory 126, which may comprise a non-transitory processor-readable medium. The processor 122 may be implemented as a specific purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable electronic processing components. Any controllers and modules described herein may be implemented as a hardware circuit comprising custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components, and may be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like. The memory 126 is one or more devices (e.g., RAM, ROM, flash memory, hard disk storage) for storing data and/or computer code for completing and/or facilitating the various user or client processes, layers, and modules described in the present disclosure. The memory 126 may be or include volatile memory or non-volatile memory and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures of the present disclosure. The memory 126 is communicably connected to the processor 122 and includes computer code or instruction modules for executing one or more processes described herein.
[0026] The communications interface 124 is configured to facilitate communications between various components of the gesture controlled display system 106, such as the aircraft control system 110, the sensors 112, and the interactive display system 114. For example, the communications interface 124 may be configured to receive hand tracking data from the one or more sensors 112, and to communicate the hand tracking data to the controller 120 via a wired or wireless connection. The communications interface 124 may include any type of wired or wireless technology for facilitating communications, including electronic and optical communication protocols.
[0027] The gesture module 128 is configured to receive hand tracking data from the sensors 112 and to cause the processor 122 to interpret the hand tracking data to identify a gesture made by the tracked hand. The gesture module 128 may be configured to interpret the hand tracking data to determine at least one of a location of the user hand over the display screen, a distance of the user hand from the display screen, and a speed of the motion of the user hand. The gesture module 128 may be configured to interpret hand tracking data to determine at least one of a motion of a user finger, a location of the user finger over the display screen, a distance of the user finger from the display screen, a speed of the motion of the user finger, and a number of fingers extended from the user hand during the motion. The gesture module 128 may be configured to determine that the gesture made by the user comprises an alphanumeric symbol, and to match the alphanumeric symbol with a display screen action in the gesture database 118. The gesture module 128 is configured to interpret the hand tracking data to determine an orientation and position of a hand and finger of the pilot 116, to determine when and how the pilot 116 moves his or her hand and finger to determine a specific gesture made by the pilot 116, and to match the specific gesture made by the pilot 116 with one of the gestures stored in the gesture database 118.
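A possible shape of this interpretation step is sketched below: the traced hand or finger path is reduced to a candidate symbol, which is then matched against the gesture database. The classify_stroke() recognizer is a placeholder assumption; the disclosure does not specify a particular recognition algorithm.

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y) location of the hand or finger over the screen


def classify_stroke(path: List[Point]) -> Optional[str]:
    """Placeholder recognizer: a real implementation might compare the traced
    path against stored stroke templates or run a trained classifier."""
    ...


def identify_gesture(path: List[Point], gesture_db: dict) -> Optional[str]:
    """Interpret a traced path and return the associated display screen action."""
    symbol = classify_stroke(path)   # e.g. "W", "S", or "swipe_left"
    if symbol is None:
        return None
    return gesture_db.get(symbol)    # match against the stored gestures
```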
[0028] The display module 130 is configured to cause the processor 122 to display images via the interactive display system 114 and to cause the processor 122 to modify the display screen according to the display screen action associated with the gesture made by the user. The display module 130 may be configured to cause the display screen to change from displaying a first display image to a second display image in response to the gesture made by the user. For example, if the pilot 116 makes a gesture of an alphanumeric symbol, and the gesture module 128 determines that the alphanumeric symbol is a letter and that the letter corresponds with a particular display action as specified in the gesture database 118, then the display module 130 modifies the display screen by changing displayed content to content associated with the alphanumeric symbol. For example, if the gesture module 128 determines that the alphanumeric symbol is the letter “W”, which corresponds with a display action displaying a “Weight and Fuel” display screen as specified in the gesture database 118, then the display module 130 causes the processor 122 to modify the display screen by changing displayed content to display a “Weight and Fuel” display screen. In some embodiments, the display screen action specifies a specific sub-menu of a menu to be displayed and the display module 130 is configured to cause the processor 122 to modify the display screen to display the specific sub-menu without first displaying the menu. For example, if the display screen is currently displaying a “Nearest Airports” display screen and if the “Weight and Fuel” display screen can be accessed by first selecting a “Menu” screen, then an “Aircraft Performance” screen, then the “Weight and Fuel” screen, the pilot can jump straight to the “Weight and Fuel” display screen without having to go through or first access the “Menu” screen and “Aircraft Performance” screen by gesturing the letter “W”. The display module 130 causes the interactive display system 114 to modify a display screen or otherwise cause particular content to be displayed according to the display command. In some embodiments, the display command specifies the display of a particular menu, a command to display a next page or a previous screen, or a command to zoom in or zoom out of a current screen.
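The menu-bypass behavior described above can be sketched as follows: the display screen action names the target sub-menu screen directly, so the display module never renders the intermediate menus. The show_screen(), show_next_page(), and zoom() calls are an assumed display interface for illustration, not an interface defined by the disclosure.

```python
def apply_display_action(action: str, display) -> None:
    """Apply a display screen action produced by a gesture lookup (illustrative only)."""
    if action == "show_weight_and_fuel_screen":
        # Jump straight to the sub-menu screen; the intermediate "Menu" and
        # "Aircraft Performance" screens are never displayed.
        display.show_screen("WEIGHT AND FUEL")
    elif action == "next_page":
        display.show_next_page()
    elif action == "zoom_in_current_screen":
        display.zoom(factor=2.0)
```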
[0029] The gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to be modified based on various gestures. For example, if the gesture module 128 determines that the gesture is the letter “S” which corresponds with a display action displaying a “Synoptics” display screen as specified in the gesture database 118, then the display module 130 causes the processor 122 to modify the display screen by changing displayed content to display a “Synoptics” display screen. For example, if the gesture module 128 determines that the gesture is the letter “P” which corresponds with a display action displaying a “Flight Plan” display screen as specified in the gesture database 118, then the display module 130 causes the processor 122 to modify the display screen by changing displayed content to display a “Flight Plan” display screen. For example, if the gesture module 128 determines that the gesture is the letter “F” which corresponds with a display action displaying a “Flight Management System” display screen as specified in the gesture database 118, then the display module 130 causes the processor 122 to modify the display screen by changing displayed content to display a “Flight Management System” display screen. For example, if the gesture module 128 determines that the gesture is the letter “R” which corresponds with a display action displaying a “Route” display screen as specified in the gesture database 118, then the display module 130 causes the processor 122 to modify the display screen by changing displayed content to display a “Route” display screen. The gesture module 128 and display module 130 may also cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to swap sub-menu pages, zoom in and out on specific portions of displays, view map pages in different views, and change formats of text, characters, and colors, and sizes of images, text, and icons, and so on.
[0030] In addition to cockpit displays and functions, the gesture controlled display system 106 may be configured for any other type of display system, such as an inflight entertainment display system. In such embodiments, the gesture module 128 and display module 130 may similarly cooperate to cause the processor 122 to modify the images and pages displayed by the interactive display system 114 based on a gesture made in proximity to the display screen. The gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to go directly to a music, video, or TV show screen without unlocking or navigating to a particular folder or menu. For example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to jump to a display screen displaying a list of audio songs in response to a person gesturing a letter “S”. In another example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to jump to a display screen displaying a list of movies in response to a person gesturing a letter “M”. In another example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to jump to a video display screen or video player page in response to a person gesturing a letter “V”. In another example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to call for help (e.g., alert a flight attendant or aircraft crewmember that the passenger needs assistance) in response to a person gesturing a letter “C”. In another example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to jump to a display screen displaying a list of books available in response to a person gesturing a letter “B”. In another example, the gesture module 128 and display module 130 may cooperate to cause the processor 122 to cause the display screen of the interactive display system 114 to jump to a game menu display screen or a list of games available to be played in response to a person gesturing a letter “G”. It will be appreciated that other display commands are within the scope of the present disclosure, such as turning on and off the display screen, putting the display screen to sleep, increasing/decreasing an outputted volume, and so on.
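One way to realize the cockpit and entertainment examples above is a per-system lookup table, so that the same gestured letter resolves to different actions depending on the type of display system. The table structure and function below are assumptions for illustration; the individual mappings follow examples given in the preceding paragraphs.

```python
from typing import Optional

# Hypothetical per-system gesture tables; entries follow the examples above.
GESTURE_ACTIONS_BY_SYSTEM = {
    "cockpit": {
        "W": "show_weight_and_fuel_screen",
        "S": "show_synoptics_screen",
        "M": "show_map_screen",
    },
    "entertainment": {
        "S": "show_song_list",
        "M": "show_movie_list",
        "C": "call_flight_attendant",
        "G": "show_game_menu",
    },
}


def resolve_action(system_type: str, symbol: str) -> Optional[str]:
    """Resolve a gestured symbol to an action for the given display system type."""
    return GESTURE_ACTIONS_BY_SYSTEM.get(system_type, {}).get(symbol)
```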
[0031] In some embodiments, a given hand gesture made by the pilot 116 may be interpreted differently based on the phase of flight of the aircraft. For example, the same hand gesture made by the pilot 116 during a takeoff phase of flight may be interpreted differently than the same hand gesture made by the pilot 116 during a descent phase of flight, and the different interpretation may cause different information to be displayed. For example, if the pilot 116 gestures the letter “V” near the display screen, the processor 122 may cause the display screen to display takeoff V-speeds for the aircraft based on the aircraft being in a takeoff or pre-takeoff phase of flight, whereas the processor 122 may cause the display screen to display landing V-speeds for the aircraft based on the aircraft being in a descent phase of flight.
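A hedged sketch of this phase-dependent interpretation is shown below; the phase names and returned action identifiers are illustrative assumptions.

```python
def action_for_v_gesture(flight_phase: str) -> str:
    """Return the display action for a "V" gesture based on the phase of flight."""
    if flight_phase in ("pre-takeoff", "takeoff"):
        return "show_takeoff_v_speeds"   # before/during takeoff: takeoff V-speeds
    if flight_phase == "descent":
        return "show_landing_v_speeds"   # during descent: landing V-speeds
    return "show_v_speeds"               # assumed default for other phases
```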
[0032] Referring now to FIG. 4, an exemplary embodiment of a method of using gestures to control a display according to the inventive concepts disclosed herein may include one or more of the following steps.
[0033] A step (402) may include receiving hand tracking data from a sensor configured to generate hand tracking data in response to a user hand motion in proximity to a display screen. For example, the sensor 112 may detect that a person moved their fingers with respect to the display screen of the interactive display system 114, generate sensor data indicative of the gesture made, and communicate the sensor data to the aircraft control system 110.
[0034] A step (404) may include interpreting the hand tracking data to identify a gesture made by the user hand in a database matching each of a plurality of gestures with a display screen action. The gesture module 128 of the controller 120 of the aircraft control system 110 may interpret the sensor data and determine that the person gestured the letter “M”. The gesture module 128 matches the gestured letter within the gesture database 118 to determine that the letter “M” is associated with displaying a map on the cockpit display. The gesture module 128 alternatively determines that the letter “M” is associated with displaying movies if the display system is an entertainment display system and not a cockpit or navigation display system.
[0035] A step (406) may include modifying content displayed on the display screen according to the display screen action associated with the gesture made by the user. The display module 130 causes the processor 122 to modify the display screen of the interactive display system 114 to jump to a map display screen without first displaying any other menu.
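Taken together, steps 402 through 406 can be sketched end to end as follows, reusing the identify_gesture() and apply_display_action() helpers sketched earlier; the sensor and display interfaces shown are assumptions for illustration.

```python
def handle_gesture(sensor, display, gesture_db: dict) -> None:
    """One pass through the method of FIG. 4 (illustrative sketch only)."""
    path = sensor.read_hand_path()                # step 402: receive hand tracking data
    action = identify_gesture(path, gesture_db)   # step 404: interpret and match in the database
    if action is not None:
        apply_display_action(action, display)     # step 406: modify the displayed content
```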
[0036] As will be appreciated from the above, smart gesture controlled display and menu-less interaction according to embodiments of the inventive concepts disclosed herein may minimize menu-based interactions for accessing various pages and menus of a display system and may also add features for faster navigation of current menu-based display systems, such as swapping sub-menu pages, zooming in or out on specific portions of a display, changing formats, and so on, all with a single gesture-based command that requires no actual touching of a display screen.
[0037] It is to be understood that embodiments of the methods according to the inventive concepts disclosed herein may include one or more of the steps described herein. Further, such steps may be carried out in any desired order and two or more of the steps may be carried out simultaneously with one another. Two or more of the steps disclosed herein may be combined in a single step, and in some embodiments, one or more of the steps may be carried out as two or more sub-steps. Further, other steps or sub-steps may be carried out in addition to, or as substitutes to one or more of the steps disclosed herein.
[0038] From the above description, it is clear that the inventive concepts disclosed herein are well adapted to carry out the objects and to attain the advantages mentioned herein as well as those inherent in the inventive concepts disclosed herein. While presently preferred embodiments of the inventive concepts disclosed herein have been described for purposes of this disclosure, it will be understood that numerous changes may be made which will readily suggest themselves to those skilled in the art and which are accomplished within the broad scope and coverage of the inventive concepts disclosed and claimed herein.
Claims:1. A gesture controlled display system, comprising:
a display screen configured to display a plurality of display images;
a sensor configured to generate hand tracking data in response to a user hand motion in proximity to the display screen;
a database storing a plurality of gestures, each gesture associated with a display screen action; and
at least one processor coupled with the display screen, the sensor, the database, and a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to:
receive the hand tracking data from the sensor;
interpret the hand tracking data to identify a gesture made by the user hand; and
modify the display screen according to the display screen action associated with the gesture made by the user.
2. The system of claim 1, wherein modifying the display screen causes the display screen to change from displaying a first display image to a second display image in response to the gesture made by the user.
3. The system of claim 1, wherein the gesture made by the user comprises an alphanumeric symbol, and wherein modifying the display screen comprises changing displayed content to content associated with the alphanumeric symbol.
4. The system of claim 1, wherein the display screen action specifies a specific sub-menu of a menu to be displayed, and wherein modifying the display screen comprises displaying the specific sub-menu without first displaying the menu.
5. The system of claim 1, wherein the hand tracking data is indicative of at least one of a location of the user hand over the display screen, a distance of the user hand from the display screen, and a speed of the motion of the user hand.
6. The system of claim 1, wherein the hand tracking data comprises finger tracking data indicative of at least one of a motion of a user finger, a location of the user finger over the display screen, a distance of the user finger from the display screen, a speed of the motion of the user finger, and a number of fingers extended from the user hand during the motion.
7. The system of claim 1, wherein the display action comprises a command to display a particular menu or screen.
8. The system of claim 1, wherein the hand tracking data indicates that the user hand is proximate to a point on the display screen, and wherein the display action is based on the point that the hand is proximate to on the display screen.
9. A method of using gestures to control a display, the method comprising:
receiving hand tracking data from a sensor configured to generate hand tracking data in response to a user hand motion in proximity to a display screen;
interpreting the hand tracking data to identify a gesture made by the user hand in a database storing a plurality of gestures, each gesture associated with a display screen action; and
modifying content displayed on the display screen according to the display screen action associated with the gesture made by the user.
10. The method of claim 9, wherein modifying the display screen causes the display screen to change from displaying a first display image to a second display image in response to the gesture made by the user.
11. The method of claim 9, wherein the gesture made by the user comprises an alphanumeric symbol, and wherein modifying the display screen comprises changing displayed content to content associated with the alphanumeric symbol.
12. The method of claim 9, wherein the display screen action specifies a specific sub-menu of a menu to be displayed, and wherein modifying the display screen comprises displaying the specific sub-menu without first displaying the menu.
13. The method of claim 9, wherein the hand tracking data is indicative of at least one of a location of the user hand over the display screen, a distance of the user hand from the display screen, and a speed of the motion of the user hand.
14. The method of claim 9, wherein the display action comprises a command to display a particular menu or screen.
15. A gesture controlled display system, comprising:
at least one processor coupled with a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to:
receive hand tracking data from a plurality of sensors configured to generate hand tracking data in response to a user hand motion in proximity to a display screen;
interpret the hand tracking data to identify a gesture made by the user hand, the gesture made by the user hand corresponding to one of a plurality of gestures stored in a database storing a plurality of gestures, each gesture associated with a display screen action; and
modify the display screen according to the display screen action associated with the gesture made by the user.
16. The system of claim 15, wherein modifying the display screen causes the display screen to change from displaying a first display image to a second display image in response to the gesture made by the user.
17. The system of claim 15, wherein the gesture made by the user comprises an alphanumeric symbol, and wherein modifying the display screen comprises changing displayed content to content associated with the alphanumeric symbol.
18. The system of claim 15, wherein the display screen action specifies a specific sub-menu of a menu to be displayed, and wherein modifying the display screen comprises displaying the specific sub-menu without first displaying the menu.
19. The system of claim 15, wherein the hand tracking data is indicative of at least one of a location of the user hand over the display screen, a distance of the user hand from the display screen, and a speed of the motion of the user hand.
20. The system of claim 15, wherein the hand tracking data comprises finger tracking data indicative of at least one of a motion of a user finger, a location of the user finger over the display screen, a distance of the user finger from the display screen, a speed of the motion of the user finger, and a number of fingers extended from the user hand during the motion.