
Touchscreen Zoom Control Display System

Abstract: A touch display control system includes a display system configured to display a plurality of display images, a sensor configured to generate finger tracking data indicative of a distance between a user finger and a display screen of the display system and a locus on the display screen that the user finger is aimed, and at least one processor coupled with a non-transitory processor-readable medium storing processor-executable code for causing the at least one processor to receive the finger tracking data from the sensor, and modify magnification of an active input area of the display screen surrounding the locus in response to the distance being less than a threshold.


Patent Information

Application #
201611033783
Filing Date
03 October 2016
Publication Number
14/2018
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
patents@remfry.com
Parent Application
Patent Number
Legal Status
Grant Date
2023-12-11
Renewal Date

Applicants

ROCKWELL COLLINS, INC.
400 Collins Road NE, M/S 124-323, Cedar Rapids, Iowa 52498, U.S.A.

Inventors

1. KUMAR, Alok
MIG-220, P.C.Colony, Doctor's Colony, Kankarbagh, Patna-800020, Bihar, India

Specification

TOUCHSCREEN ZOOM CONTROL DISPLAY SYSTEM
BACKGROUND
[0001] The inventive concepts disclosed herein relate generally to the field of display systems.
More particularly, embodiments of the inventive concepts disclosed herein relate to motion
controlled touchscreen display systems.
[0002] Aircraft display systems, such as cockpit display systems and in-flight entertainment
display systems, provide multiple ways to interact with displayed information. Some display
systems require a pilot, aircraft crew member, or passenger to use touchscreens to interact with
the system, which operate by locating the coordinates of a user's touch within the display area,
thereby allowing the user to interact directly with what is being displayed rather than indirectly
using a mouse or keyboard. Oftentimes, when operating touchscreen devices in a turbulent or
otherwise bumpy environment, users provide undesired inputs when a finger mistakenly touches
the display screen at an unintended location. Such undesired inputs cause errors or enter
incorrect information into the system, creating issues that an aircraft crew member or pilot must
later remedy. Accordingly,
pilots typically require more time to perform tasks on a touchscreen device when traveling
through turbulent environments. To reduce input errors, current touchscreen solutions are limited
to very large active input areas, which decrease the amount of information that can be displayed
on a display screen.
SUMMARY
[0003] In one aspect, the inventive concepts disclosed herein are directed to a touch display
control system. The touch display control system includes a display system, a sensor, and at least
one processor coupled with a non-transitory processor-readable medium storing processor-executable
code. The display system is configured to display a plurality of display images. The
sensor is configured to generate finger tracking data indicative of a distance between a user
finger and a display screen of the display system and a locus on the display screen that the user
finger is aimed. The processor-executable code is configured to cause the at least one processor
to receive the finger tracking data from the sensor, and modify magnification of an active input
area of the display screen surrounding the locus in response to the distance being less than a
threshold.
[0004] In a further aspect, the inventive concepts disclosed herein are directed to a method for
controlling a touch display screen. The method includes receiving finger tracking data from a
sensor, the finger tracking data indicative of a distance between a user finger and a display
screen of a display system and a locus on the display screen that the user finger is aimed, and
modifying the magnification of an active input area of the display screen surrounding the locus
in response to the distance being less than a threshold.
[0005] In a further aspect, the inventive concepts disclosed herein are directed to a touch
display control system. The touch display control system includes at least one processor coupled
with a non-transitory processor-readable medium storing processor-executable code for causing
the at least one processor to modify magnification of an active input area of a display screen
surrounding a locus on the display screen that a user finger is aimed in response to a distance
between the user finger and the locus of the display screen being less than a threshold.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] Implementations of the inventive concepts disclosed herein may be better understood
when consideration is given to the following detailed description thereof. Such description
makes reference to the included drawings, which are not necessarily to scale, and in which some
features may be exaggerated and some features may be omitted or may be represented
schematically in the interest of clarity. Like reference numerals in the drawings may represent
and refer to the same or similar element, feature, or function. In the drawings:
[0007] FIG. 1 is a schematic illustration of an exemplary embodiment of an aircraft control
center or cockpit according to the inventive concepts disclosed herein;
[0008] FIG. 2 is a block diagram of a touchscreen zoom control system according to the
inventive concepts disclosed herein;
[0009] FIG. 3 is a block diagram of an exemplary embodiment of a controller of the
touchscreen zoom control system of FIG. 2;
[0010] FIG. 4A is an illustration of an exemplary embodiment of a display with a user finger
approaching a point on the display according to the inventive concepts disclosed herein;
[0011] FIG. 4B is an illustration of an exemplary embodiment of the display of FIG. 4A with
the user finger approaching or touching the point on the display according to the inventive
concepts disclosed herein;
[0012] FIG. 4C is an illustration of an exemplary embodiment of the display of FIGS. 4A and
4B after the user finger has withdrawn from or touched the point on the display according to the
inventive concepts disclosed herein;
[0013] FIG. 5A is an illustration of an exemplary embodiment of another display with a user
finger approaching a point on the display according to the inventive concepts disclosed herein;
[0014] FIG. 5B is an illustration of an exemplary embodiment of the display of FIG. 5A with
the user finger approaching the point on the display according to the inventive concepts disclosed
herein;
[0015] FIG. 6A is an illustration of an exemplary embodiment of another display prior to a user
finger approaching a point on the display according to the inventive concepts disclosed herein;
[0016] FIG. 6B is an illustration of an exemplary embodiment of another display with a user
finger approaching a point on the display according to the inventive concepts disclosed herein;
and
[0017] FIG. 7 is a diagram of an exemplary embodiment of a method for controlling a
touchscreen display according to the inventive concepts disclosed herein.
DETAILED DESCRIPTION
[0018] Before explaining at least one embodiment of the inventive concepts disclosed herein in
detail, it is to be understood that the inventive concepts are not limited in their application to the
details of construction and the arrangement of the components or steps or methodologies set
forth in the following description or illustrated in the drawings. In the following detailed
description of embodiments of the instant inventive concepts, numerous specific details are set
forth in order to provide a more thorough understanding of the inventive concepts. However, it
will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that
the inventive concepts disclosed herein may be practiced without these specific details. In other
instances, well-known features may not be described in detail to avoid unnecessarily
complicating the instant disclosure. The inventive concepts disclosed herein are capable of other
embodiments or of being practiced or carried out in various ways. Also, it is to be understood
that the phraseology and terminology employed herein is for the purpose of description and
should not be regarded as limiting.
[0019] As used herein a letter following a reference numeral is intended to reference an
embodiment of the feature or element that may be similar, but not necessarily identical, to a
previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b).
Such shorthand notations are used for purposes of convenience only, and should not be construed
to limit the inventive concepts disclosed herein in any way unless expressly stated to the
contrary.
[0020] Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to
an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is
true (or present) and B is false (or not present), A is false (or not present) and B is true (or
present), and both A and B are true (or present).
[0021] In addition, use of "a" or "an" is employed to describe elements and components
of embodiments of the instant inventive concepts. This is done merely for convenience and to
give a general sense of the inventive concepts, and "a" and "an" are intended to include one or at
least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0022] Finally, as used herein any reference to "one embodiment" or "some embodiments"
means that a particular element, feature, structure, or characteristic described in connection with
the embodiment is included in at least one embodiment of the inventive concepts disclosed
herein. The appearances of the phrase "in some embodiments" in various places in the
specification are not necessarily all referring to the same embodiment, and embodiments of the
inventive concepts disclosed may include one or more of the features expressly described or
inherently present herein, or any combination or sub-combination of two or more such features,
along with any other features which may not necessarily be expressly described or inherently
present in the instant disclosure.
[0023] Broadly, embodiments of the inventive concepts disclosed herein are directed to a
touchscreen zoom control display system. The inventive concepts disclosed herein can be
utilized in a number of control systems for various types of applications, sensing systems, and
display systems. While the present disclosure describes systems and methods implementable for
a touchscreen display for a pilot of an aircraft, the inventive concepts disclosed herein may be
used in any type of environment (e.g., in another aircraft, a spacecraft, a ground-based vehicle, or
in a non-vehicle application such as a ground-based display system, an air traffic control system,
a radar system, a virtual display system). While certain examples and embodiments of the
inventive concepts disclosed herein are described with respect to a pilot of an aircraft, it will be
appreciated that users other than a pilot may use and benefit from the inventive concepts
disclosed herein with respect to other vehicles and objects.
[0024] Referring now to FIG. 1, a schematic illustration of an exemplary embodiment of an
aircraft control center or cockpit 100 is shown according to the inventive concepts disclosed
herein. The aircraft control center 100 may include one or more flight displays 102 and one or
more user interface ("UI") elements 104. The flight displays 102 may be implemented using any
of a variety of display technologies, including CRT, LCD, organic LED, dot matrix display, and
others. The flight displays 102 may be navigation ("NAV") displays, primary flight displays,
electronic flight bag displays, tablets such as iPad® computers manufactured by Apple, Inc. or
tablet computers, synthetic vision system displays, head up displays ("HUDs") with or without a
projector, wearable displays, watches, Google Glass® and so on. The flight displays 102 may be
used to provide information to the flight crew, thereby increasing the flight crew's visual range
and enhancing their decision-making abilities. The flight displays 102 may be configured to
function as, for example, a primary flight display ("PFD") used to display altitude, airspeed,
vertical speed, and navigation and traffic collision avoidance system ("TCAS") advisories. The
flight displays 102 may also be configured to function as, for example, a multi-function display
used to display navigation maps, weather radar, electronic charts, TCAS traffic, aircraft
maintenance data and electronic checklists, manuals, and procedures. The flight displays 102
may also be configured to function as, for example, an engine indicating and crew-alerting
system ("EICAS") display used to display critical engine and system status data. Other types and
functions of the flight displays 102 are contemplated and will be apparent to those skilled in the
art. According to various exemplary embodiments, at least one of the flight displays 102 may be
configured to provide a rendered display from the systems and methods of the present disclosure.
[0025] In some embodiments, the flight displays 102 may provide an output from an aircraft-based
system, a ground-based system, a satellite-based system, or from a system of another
aircraft. For example, in one embodiment, the flight displays 102 provide an output from a
ground-based weather radar system. In some embodiments, the flight displays 102 provide an
output from an aircraft-based weather radar system, LIDAR system, infrared system, or other
system on the aircraft. For example, the flight displays 102 may include an avionics display, a
joint display, an air traffic display, a weather radar map, and a terrain display. The flight displays
102 may include an electronic display or a synthetic vision system ("SVS"). For example, the
flight displays 102 may include a display configured to display a two-dimensional ("2-D")
image, a three-dimensional ("3-D") perspective image of air traffic data, terrain, and/or weather
information, or a four-dimensional ("4-D") display of weather information or forecast
information. Other views of air traffic information, terrain, and/or weather information may also
be provided (e.g., plan view, horizontal view, and vertical view). The views shown on the flight
displays 102 may include monochrome or color graphical representations of the displayed
information. Graphical representations of the displayed information may include an indication of
altitude of other aircraft, weather conditions, or terrain, or the altitude and/or location of such
information relative to the aircraft.
[0026] The UI elements 104 may include, for example, dials, switches, buttons, touch screens,
keyboards, a mouse, joysticks, cursor control devices ("CCDs") or other multi-function key pads
certified for use with avionics systems, and so on. The UI elements 104 may be configured to,
for example, allow an aircraft crew member to interact with various avionics applications and
perform functions such as data entry, manipulation of navigational maps, and moving among and
selecting checklist items. For example, the UI elements 104 may be used to adjust features of the
flight displays 102, such as contrast, brightness, width, and length. The UI elements 104 may
also (or alternatively) be used by an aircraft crew member to interface with or manipulate the
displays of the flight displays 102. For example, the UI elements 104 may be used by an aircraft
crew member to adjust the brightness, contrast, and information displayed on the flight displays
102. The UI elements 104 may additionally be used to acknowledge or dismiss an indicator
provided by the flight displays 102. Further, the UI elements 104 may be used to correct errors
on the flight displays 102. Other UI elements 104, such as indicator lights, displays, display
elements, and audio alerting devices, may be configured to warn of potentially threatening
conditions such as severe weather, terrain, or obstacles.
[0027] Referring now to FIG. 2, a block diagram of a touchscreen zoom control system 106 is
shown according to an exemplary embodiment of the inventive concepts disclosed herein. The
touchscreen zoom control system 106 includes an aircraft control system 110, at least one sensor
112, and an interactive display system 114. The aircraft control system 110 may be a system
responsible for general aircraft control and features, and may include any number of aircraft
subsystems, controllers, and other components for general aircraft functionality. The aircraft
control system 110 includes a controller 120.
[0028] The sensor 112 may be a plurality of sensors 112 located in various positions in the
cockpit 100 of an aircraft. In some embodiments, sensors 112 are affixed to the interactive
display system 114. For example, sensors 112 may be integrated in the interactive display system
114 (e.g., integrated in the frame or other component). In some embodiments, the sensor 112 is
configured to generate finger tracking data indicative of a distance between a user finger and a
display screen of the interactive display system 114. In some embodiments, the sensor 112 is
configured to generate finger tracking data indicative of a locus on the display screen of the
interactive display system 114 that the user finger is aimed. In some embodiments, the finger
tracking data is indicative of an x, y, z coordinate of the user finger with respect to a plurality of
points on the display screen.
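By way of a non-limiting illustration (not part of the original disclosure), the finger tracking data described above could be modeled as a per-sample record from which both the finger-to-screen distance and the aimed-at locus are derived. The Python sketch below assumes the display screen lies in the z = 0 plane and uses hypothetical names.

from dataclasses import dataclass

@dataclass
class FingerSample:
    # One sensor sample: fingertip position in screen-relative coordinates,
    # with the display screen assumed to lie in the z = 0 plane.
    x: float          # horizontal position over the screen plane
    y: float          # vertical position over the screen plane
    z: float          # perpendicular distance from the screen
    timestamp: float  # sample time in seconds

    def distance_to_screen(self) -> float:
        # With the screen in the z = 0 plane, the distance is the z coordinate.
        return self.z

    def locus(self) -> tuple:
        # The point on the screen the finger is aimed at, taken here as the
        # perpendicular projection of the fingertip onto the screen plane.
        return (self.x, self.y)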
[0029] The sensors 112 may generally be configured to generate finger tracking data in
response to a hand or finger motion in proximity of a display screen of the interactive display
system 114. The sensors 112 may be one or more of any type of motion capture sensor
configured to detect movement of a finger, such as a camera, infrared camera, or the like. The
sensor 112 may be configured to detect an orientation of a finger of the pilot 116. In some
embodiments, the finger tracking data is indicative of at least one of a location of the finger of
the pilot 116 over the display screen, a distance of the finger of the pilot 116 from the display
screen, a speed of the motion of the finger of the pilot 116 in proximity to the display screen, and
a number of fingers extended from the hand of the pilot 116 during the motion. The sensors 112
may provide a sensor input to the controller 120, which may determine one or more properties
related to movements made by the tracked fingers. For example, the sensors 112 may track
finger movement, and the controller 120 may be configured to determine characteristics of the
movement (e.g., the number of fingers extended, a locus on the display screen that the finger is
aimed, a speed of the finger movement with respect to the display screen).
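Continuing that purely illustrative sketch, the rate of movement the controller 120 may derive from consecutive samples could be computed as a simple finite difference:

def approach_rate(prev: FingerSample, curr: FingerSample) -> float:
    # Rate of change of the finger-to-screen distance between two samples;
    # negative while the finger approaches the screen.
    dt = curr.timestamp - prev.timestamp
    if dt <= 0:
        return 0.0
    return (curr.distance_to_screen() - prev.distance_to_screen()) / dt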
[0030] The sensor 112 can be any type of sensor configured to generate or detect a user finger
within a range of a display screen. For example, in some embodiments, the sensor 112 can be
configured to detect a user finger within a few centimeters of the display screen, a few inches of
the display screen, up to within about a meter or half of a meter of the display screen. In some
embodiments, the sensor 112 is at least one of an infrared camera, two monochromatic infrared
cameras, three infrared LEDs, at least two global shutter image sensors, and a leap motion
sensor. In some embodiments, the sensor 112 is a USB peripheral device. In some embodiments,
the sensor 112 comprises two IR cameras and three infrared LEDs configured to capture depth
information regarding the user finger. In some embodiments, the sensor 112 is configured to
detect the user finger within a hemispheric area surrounding the display screen. The three
infrared LEDs may be configured to generate a 3D pattern of dots of infrared light and the two
IR cameras may be configured to generate frames of reflected data (e.g., about 200 frames per
second of reflected data). In some embodiments, the sensor 112 is configured to track finger
movements at a rate of at least 300 frames per second. In some embodiments, the sensors 112 are
configured to track fingers within a spatial precision of about 0.01 mm. In some embodiments,
the sensors 112 are oriented 90° with respect to the display screen, though it will be appreciated
that the sensors 112 may be oriented at any angle with respect to the display screen. In some
embodiments, the number of sensors 112 corresponds to the number of display screens. For
example, in an interactive display system 114 having four display screens, the sensors 112 may
comprise four sensors, one for each of the four display screens.
[0031] The interactive display system 114 may be any type of display system and may be or
include components of the flight displays 102 of the aircraft cockpit 100. The interactive display
system 114 is configured to display a plurality of display images. In some embodiments, the
interactive display system 114 is a multi-function display. In some embodiments, the interactive
display system 114 is an ATC Flight Deck 202X. In some embodiments, the at least one sensor
112 is integrated in the interactive display system 114. In some embodiments, the displayed
images comprise any number of active, inactive, or deactivated (e.g., once active but no longer
active) input areas. The active input areas may be any part of a displayed image that is meant to
be interactive (e.g., receiving a touch input from a user that the touchscreen zoom control system
106 recognizes, such as pressing a "Map" button to display a map). However, it will be
appreciated that the interactive display system 114 may be configured to receive inputs using
systems and methods that do not require a user to physically touch the display screen (e.g., by
bringing their hand or a finger near the display screen, using voice commands). Active input
areas may comprise at least one of an icon, a key, a button, a text field, and an interactive image.
In some embodiments, an active input area may be deactivated such that no input is received by
the system if the user touches the deactivated input area. In some embodiments, an active input
area is enlarged as the user moves their finger closer to the display screen. In addition to cockpit
displays and functions, the touchscreen zoom control system 106 may be configured for any
other type of display system, such as an in-flight entertainment display system.
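As a further non-limiting sketch, the active, inactive, and deactivated input areas described above could be modeled as rectangles carrying a state and a current magnification factor (hypothetical names, continuing the illustrative Python above):

from dataclasses import dataclass
from enum import Enum

class AreaState(Enum):
    ACTIVE = 1       # accepts touch input
    INACTIVE = 2     # displayed but never interactive
    DEACTIVATED = 3  # once active, temporarily not accepting input

@dataclass
class InputArea:
    label: str          # e.g., a key legend such as "D" or "Map"
    x: float            # left edge in screen units
    y: float            # top edge in screen units
    width: float
    height: float
    state: AreaState = AreaState.ACTIVE
    scale: float = 1.0  # current magnification factor (1.0 = default size)

    def contains(self, px: float, py: float) -> bool:
        # Rectangular hit test for a point on the display screen.
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)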
[0032] Referring now to FIG. 3, a block diagram of an exemplary embodiment of the controller
120 of the touchscreen zoom control system 106 of FIG. 2 is shown according to an exemplary
embodiment. The controller 120 includes a processor 122, a communications interface 124, and a
memory 126. The memory 126 includes various modules that cause the processor 122 to execute
the systems and methods described herein, including a touch proximity module 128 and a display
module 130.
[0033] The processor 122 may be coupled with the memory 126, which may comprise a non-transitory
processor-readable medium. The processor 122 may be implemented as a specific
purpose processor, an application specific integrated circuit (ASIC), one or more field
programmable gate arrays (FPGAs), a group of processing components, or other suitable
electronic processing components. Any controllers and modules described herein may be
implemented as a hardware circuit comprising custom VLSI circuits or gate arrays, off-the-shelf
semiconductors such as logic chips, transistors, or other discrete components, and may be
implemented in programmable hardware devices such as field programmable gate arrays,
programmable array logic, programmable logic devices or the like. The memory 126 is one or
more devices (e.g., RAM, ROM, flash memory, hard disk storage) for storing data and/or
computer code for completing and/or facilitating the various user or client processes, layers, and
modules described in the present disclosure. The memory 126 may be or include volatile
memory or non-volatile memory and may include database components, object code
components, script components, or any other type of information structure for supporting the
various activities and information structures of the present disclosure. The memory 126 is
communicably connected to the processor 122 and includes computer code or instruction
modules for executing one or more processes described herein.
[0034] The communications interface 124 is configured to facilitate communications between
various components of the touchscreen zoom control system 106, such as the aircraft control
system 110, the sensors 112, and the interactive display system 114. For example, the
communications interface 124 may be configured to receive hand tracking data from the one or
more sensors 112, and to communicate the hand tracking data to the controller 120 via a wired or
wireless connection. The communications interface 124 may include any type of wired or
wireless technology for facilitating communications, including electronic and optical
communication protocols.
[0035] The touch proximity module 128 is configured to cause the processor 122 to receive the
finger tracking data from the sensor and determine a distance that the user finger is away from
the display screen and a locus on the display screen that the user finger is aimed based on the
finger tracking data. The touch proximity module 128 is further configured to cause the
processor 122 to determine a rate of movement of the user finger with respect to the display
screen based on the finger tracking data. The touch proximity module 128 is further configured
to cause the processor 122 to determine whether the distance that the finger is away from the
locus of the display screen is greater than or less than a threshold. For example, in one
embodiment the processor 122 is configured to determine that the finger is less than 5 cm from a
point on the display screen that the finger is aimed. In some embodiments, the touch proximity
module 128 is configured to cause the processor 122 to determine an x, y, z coordinate of the user
finger with respect to a plurality of points on the display screen based on the finger tracking data.
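Illustratively, the threshold test and locus lookup performed by the touch proximity module 128 might reduce to the following sketch (units assumed to be centimeters, reusing the hypothetical helpers above):

def crossed_threshold(sample: FingerSample, threshold_cm: float = 5.0) -> bool:
    # True when the fingertip is closer to the screen than the threshold,
    # e.g., the 5 cm example given above.
    return sample.distance_to_screen() < threshold_cm

def aimed_area(sample, areas):
    # Return the active input area, if any, containing the aimed-at locus.
    px, py = sample.locus()
    for area in areas:
        if area.state is AreaState.ACTIVE and area.contains(px, py):
            return area
    return None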
[0036] In some embodiments, the touch proximity module 128 is configured to cause the
processor 122 to interpret the finger tracking data to determine an orientation of a finger of the
pilot 116 with respect to a particular active input area. In some embodiments, the touch
proximity module 128 is configured to cause the processor 122 to determine a gesture made by
the user based on the finger tracking data. In some embodiments, the touch proximity module
128 is configured to cause the processor 122 to determine a number of fingers aimed at at least
one point on the display screen based on the finger tracking data. For example, in some
embodiments, the processor 122 is configured to determine that a first user finger is aimed at a
first point on a first display screen and that a second user finger is aimed at a second point on a
second display screen.
[0037] The display module 130 is configured to cause the processor 122 to display images via
the interactive display system 114 and to cause the processor 122 to modify magnification of an
active input area of the display screen surrounding the locus in response to the distance being
less than a threshold. The display module 130 is further configured to cause the processor 122 to
modify the magnification by increasing the magnification in response to a decrease of the
distance or by decreasing the magnification in response to an increase of the distance. In some
embodiments, the processor 122 is configured to increase or decrease the magnification at
substantially the same rate as a rate of movement of the user finger when the user finger moves
closer to or further away from the display screen. In this way, the user finger can act as a zoom
controller where bringing the finger closer to the display screen causes magnification of an area
to be increased and where bringing the finger further away from the display screen causes
magnification of the area to decrease (e.g., back to a default size or default shape). In some
embodiments, the processor 122 is configured to decrease the magnification of the active input
area in response to an increase in the distance after the magnification is first increased in
response to a decrease in the distance. The shape of the area magnified (e.g., the active input
area, the active input area including a surrounding area) can be any shape (e.g., square,
rectangular, triangular, circular).
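One purely illustrative way to realize this behavior of the display module 130 is to make the zoom factor a pure function of the sensed distance, so that magnification rises and falls as the finger advances and withdraws (the threshold and maximum zoom values below are assumptions):

def zoom_factor(distance_cm, threshold_cm=5.0, max_zoom=3.0):
    # 1.0 (default size) at or beyond the threshold, rising linearly to
    # max_zoom as the finger reaches the screen; withdrawing the finger
    # retraces the same curve back toward the default size.
    if distance_cm >= threshold_cm:
        return 1.0
    return 1.0 + (max_zoom - 1.0) * (1.0 - distance_cm / threshold_cm)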
[0038] In some embodiments, the processor 122 is configured to deactivate a second active
input area when modifying the magnification of the first active input area. In some embodiments,
the processor 122 is configured to deactivate all other active input areas when modifying the
magnification of the first active input area. Deactivating an active input area may include
modifying, concealing, or not displaying at least part of the once active input area to deactivate
the input area. The magnification of the active input area, deactivation of an input area, or other
modification of the display screen is reversible. For example, a first display state of the display
screen may occur prior to modification of the magnification and a second display state of the
display screen may occur after modification of the magnification, and the processor 122 may be
configured to return the display screen to the first display state in response to receiving a user
input based on the user finger touching the display screen. In some embodiments, the active
input area is enlarged as the user finger is moved closer to the display screen such that any touch
on the display screen registers as the intended input. The deactivated area may include other
portions of the display that the user finger is not aimed at, such as other keys, other icons, and
other images.
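The deactivation and its reversal could be sketched as follows (again illustrative only): the first display state is recorded before other areas are deactivated, so that it can be restored after the user touches the screen or withdraws the finger.

def magnify_and_deactivate(areas, target, factor):
    # Record the first display state, magnify the target area, and
    # deactivate every other active input area.
    prior = [(area, area.state, area.scale) for area in areas]
    for area in areas:
        if area is target:
            area.scale = factor
        elif area.state is AreaState.ACTIVE:
            area.state = AreaState.DEACTIVATED
    return prior

def restore_display_state(prior):
    # Return the display screen to the recorded first display state.
    for area, state, scale in prior:
        area.state = state
        area.scale = scale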
[0039] Referring now to FIGS. 4A-4C, illustrations of exemplary embodiments of a display
400 and user finger 406 interacting with the display 400 are shown according to the inventive
concepts disclosed herein. FIG. 4A is an illustration of an exemplary embodiment of a display
400 with a user finger approaching a point on the display 400, FIG. 4B is an illustration of an
exemplary embodiment of the display 400 of FIG. 4A with the user finger 406 approaching or
touching the point on the display 400, and FIG. 4C is an illustration of an exemplary
embodiment of the display 400 of FIGS. 4A and 4B after the user finger 406 has withdrawn from
or touched the point on the display 400.
[0040] As shown in FIGS. 4A-4C, the display 400 may include a plurality of portions or areas,
including a first area 402 and a second area 404. The first area 402 and the second area 404 may
correspond to active input areas. For example, the first area 402 may correspond to a specific
button, icon, image, or text box. In one embodiment, as shown in FIGS. 4A-4C, the first area 402
corresponds to a keyboard key for the letter "D" and the second area 404 corresponds to the
remainder of the keyboard (e.g., all keys other than the key for the letter "D" and the remainder
of the keyboard area). In FIG. 4A, as the user finger 406 begins to approach the first input area
402, the processor 122 determines that the user finger 406 is aimed at the first input area 402. In
FIG. 4B, as the user finger 406 crosses a threshold distance to the display screen or the first input
area 402, the processor 122 causes the first input area 402 to magnify (e.g., zoom up), while at
the same time, in some embodiments, the second input area 404 is deactivated. As shown in FIG.
4B, the first input area 402 is enlarged such that the user is able to recognize that the letter
"D" is going to be inputted should they continue to move the user finger 406 toward the display
screen to cause the input to be registered (e.g., by moving closer to the display screen or by
physically touching the display screen). The first input area 402 is enlarged enough such that if
the user finger 406 continues to approach the screen but not the letter "D", the letter "D" is still
registered as the intended input. In other words, as the user finger 406 moves toward the screen,
the area that the user finger 406 is aimed at is enlarged and signifies to the user what the input
will be should the user continue to move the user finger 406 toward the display screen to touch
the display screen. For example, the first input area 402 may be enlarged wider than the user
finger 406 (e.g., two times wider, three times wider, four times wider). In some embodiments,
the first input area 402 is enlarged to cover the entirety of a portion of a display screen (e.g., the
entire keyboard area or the entire display screen itself). At this point the user can validate
whether the letter "D" is their intended input, and if so they can continue to move their finger
closer to the display screen and touch the display screen to input the letter "D", but if the letter
"D" is not the intended input of the user, the user can withdraw the user finger 406, at which
time the display 400 is configured to return to a default state as shown in FIG. 4C. In some
embodiments, when the user finger 406 moves within the threshold distance from the display
screen and the active input area is magnified, no matter where on the display screen the user
touches, the input is the same as the input corresponding to the active input area. For example, if
the user finger 406 is aimed at the letter "J" when the user finger 406 crosses the threshold
distance to the display screen, so long as the user finger 406 continues to move toward the
display screen or at least not further away from the screen than the threshold distance, when the
user touches the display screen the input is registered as the letter "J" even if the user did not
actually touch the active input area comprising the letter "J" (e.g., even if the user actually
touches the letter "G" or another part of the display screen). In some embodiments, providing an
input via the display 400 (e.g., by touching any area of the display 400) causes the approached
first input area 402 of the display 400 to magnify, but the surrounding second input area 404 is
not completely deactivated or obscured. For example, the surrounding second input area 404
may continue to be visible to the user but the user will not be able to register any input by
touching the surrounding second input area 404 until the magnified first input area 402 is
released (e.g., returned to a normal magnification, de-magnified).
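The locked-input behavior described above (the letter "J" example) might be expressed as the following sketch: once an area is locked because the finger crossed the threshold while aimed at it, any subsequent touch resolves to that area's input regardless of where the touch lands.

def resolve_touch(locked_area, touch_x, touch_y, areas):
    # If an area was locked when the finger crossed the threshold, every
    # touch registers as that area's input; otherwise fall back to an
    # ordinary hit test over the active areas.
    if locked_area is not None:
        return locked_area.label
    for area in areas:
        if area.state is AreaState.ACTIVE and area.contains(touch_x, touch_y):
            return area.label
    return None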
[0041] Referring now to FIGS. 5A-5B and 6A-6B, illustrations of exemplary embodiments of
additional displays 500, 600, 601 are shown according to the inventive concepts disclosed herein.
FIG. 5A is an illustration of an exemplary embodiment of a display 500 with a user finger
approaching a point on the display 500. FIG. 5B is an illustration of an exemplary embodiment
of the display 500 of FIG. 5A with a user finger 506 approaching the point on the display 500.
FIG. 6A is an illustration of an exemplary embodiment of another display 600 prior to a user
finger 606 approaching a point on the display. FIG. 6B is an illustration of an exemplary
embodiment of another display 601 with a user finger 606 approaching a point 602 on the
display 601.
[0042] As shown in FIG. 5A, the user finger 506 is aimed or pointed at the first input area 502
corresponding with a "Map" touch input icon. The point on the display 500 that the user finger
506 is aimed at is indicated via a displayed cursor 508 such that the user can guide their finger
with reference to the cursor 508 to accurately approach the first input area 502, for example,
instead of accidentally touching the second input area 504 surrounding the first input area 502
(e.g., any other part of the display 500). As shown in FIG. 5B, as the user finger 506 approaches
the display 500, the first input area 502 is edited to indicate that the user finger 506 is locked in
on the first input area 502. For example, the first input area 502 is shown to be highlighted via an
indicator 510; however, it will be appreciated that the first input area 502 may be highlighted or
otherwise distinguished from the second input area 504 in other ways, such as being magnified.
In this way, the display 500 indicates that the user finger 506 will select the first input area 502 if
the user finger 506 continues to approach the display 500 without hiding or otherwise obscuring
the second input area 504 so that the user can continue to see the second input area 504 as the
user enters an input at the first input area 502. In some embodiments, after the user finger 506
withdraws from the display 500 or touches the display 500 at the first input area 502, the edited
effect of the first input area 502 is released (e.g., the indicator 510 is removed, magnification
returns to normal).
[0043] As shown in FIG. 6A, a display screen 600 depicting a head up display of a scene is
shown. The touchscreen zoom control system 106 may be configured to receive inputs via the
head up display screen 600. As shown in FIG. 6B, the touchscreen zoom control system 106 may
be configured to detect a user finger 606 being aimed at a first input area 602 (e.g., an input
button on a display 601, a point on the display 601, a portion of the display 601) of the display
601. In this embodiment, the user finger 606 is able to manipulate the display 601 by interacting
with the first input area 602 (e.g., via touch). In this embodiment, as the user finger 606
approaches the display 601, the point that the user finger 606 is aimed at may be locked as the
only possible input area should the user finger 606 continue to move toward the display 601
while at the same time the rest of the display 601 (e.g., the second input area 604) is not obscured
or hidden. For example, when the user finger 606 approaches the display 601, the first input area
602 may be magnified, highlighted, overlaid with a cursor to depict the exact point at which the
user finger 606 is aimed, or indicated via an arrow or other symbology, and only once the user
finger 606 actually touches the first input area 602 or withdraws from the display 601 is the
display effect (e.g., the magnification, highlight, overlaid cursor) released and removed.
[0044] Referring now to FIG. 7, an exemplary embodiment of a method for controlling a touch
display screen according to the inventive concepts disclosed herein may include one or more of
the following steps.
[0045] A step (702) may include receiving finger tracking data from a sensor. The finger
tracking data is indicative of a distance between a user finger and a display screen of a display
system and a locus on the display screen that the user finger is aimed. For example, the finger
tracking data may be indicative of a distance between the user finger and the display screen of
the interactive display system 114 and a locus on the display screen of the interactive display
system 114 that the finger is aimed. In some embodiments, the finger tracking data is indicative
of a rate of movement of the user finger with respect to the display screen. In some
embodiments, the finger tracking data is indicative of the finger touching the display screen (e.g.,
a user input). In some embodiments, the user input is inputted without the finger actually
touching the display screen (e.g., the finger being close to the display screen, focusing on a locus
of the display screen, or via an input gesture).
[0046] A step (704) may include modifying magnification of an active input area of the display
screen surrounding the locus in response to the distance being less than a threshold. In some
embodiments, modifying the magnification comprises increasing the magnification in response
to a decrease of the distance. For example, if the finger is moved closer to the locus of the
display screen, the area surrounding the locus may be magnified (e.g.,
zoomed in, blown up, comprise a greater number of pixels than prior to magnification, comprise
a greater percentage or portion of the display screen than prior to magnification). In some
embodiments, increasing the magnification comprises increasing the magnification at
substantially the same rate as the rate of movement of the user finger when the user finger moves
toward the display screen. For example, the portion of the display screen being magnified may
increase at a rate that corresponds to the rate of movement of the user finger (e.g., the screen
enlarges at a quicker rate in response to the user finger approaching the locus at a quicker pace).
[0047] In some embodiments, modifying the magnification comprises decreasing the
magnification in response to an increase of the distance. In some embodiments, the processor
122 causes the magnification of the area to decrease after the magnification is first increased in
response to a decrease in the distance. In some embodiments, the active input area comprises a
first active input area and the processor 122 deactivates a second active input area when
modifying the magnification of the first active input area. For example, the processor 122 may
deactivate the second active input area by at least one of modifying, concealing, or not
displaying at least part of the second image portion. The processor 122 may deactivate the
second active input area as shown in FIG. 4B. In some embodiments, the processor 122 causes
the display screen to return to a display state occurring prior to modification of the magnification
in response to receiving a user input based on the user finger touching the display screen.
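Combining the hypothetical helpers above, one iteration of the method of FIG. 7 might look like the following sketch, with step (702) consuming a finger sample and step (704) adjusting magnification:

def control_step(sample, areas, locked_area, saved_state):
    # Step (702): receive finger tracking data from the sensor.
    distance = sample.distance_to_screen()
    if crossed_threshold(sample):
        # Step (704): modify magnification of the active input area
        # surrounding the locus while the distance is below the threshold.
        if locked_area is None:
            locked_area = aimed_area(sample, areas)
            if locked_area is not None:
                saved_state = magnify_and_deactivate(
                    areas, locked_area, zoom_factor(distance))
        else:
            locked_area.scale = zoom_factor(distance)
    elif locked_area is not None:
        # Finger withdrew past the threshold: return to the first state.
        restore_display_state(saved_state)
        locked_area, saved_state = None, None
    return locked_area, saved_state

A touch event arriving while an area is locked would then be resolved with resolve_touch before the saved state is restored.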
[0048] As will be appreciated from the above, touchscreen zoom control display systems
according to embodiments of the inventive concepts disclosed herein may simplify and reduce
pilot workload when operating touchscreen display systems by providing displayed information
that is easier to navigate, enabling display screens and displayed imagery sizes to be reduced
while reducing input error rates, and reducing pilot head down scan time when interacting with a
touchscreen display.
[0049] It is to be understood that embodiments of the methods according to the inventive
concepts disclosed herein may include one or more of the steps described herein. Further, such
steps may be carried out in any desired order and two or more of the steps may be carried out
simultaneously with one another. Two or more of the steps disclosed herein may be combined in
a single step, and in some embodiments, one or more of the steps may be carried out as two or
more sub-steps. Further, other steps or sub-steps may be carried out in addition to, or as
substitutes to one or more of the steps disclosed herein.
[0050] From the above description, it is clear that the inventive concepts disclosed herein are
well adapted to carry out the objects and to attain the advantages mentioned herein as well as
those inherent in the inventive concepts disclosed herein. While presently preferred embodiments
of the inventive concepts disclosed herein have been described for purposes of this disclosure, it
will be understood that numerous changes may be made which will readily suggest themselves to
those skilled in the art and which are accomplished within the broad scope and coverage of the
inventive concepts disclosed and claimed herein.

WHAT IS CLAIMED IS:
1. A touch display control system, comprising:
a display system configured to display a plurality of display images;
a sensor configured to generate finger tracking data indicative of a distance
between a user finger and a display screen of the display system and a locus on the display
screen that the user finger is aimed;
at least one processor coupled with a non-transitory processor-readable medium
storing processor-executable code for causing the at least one processor to:
receive the finger tracking data from the sensor; and
modify magnification of an active input area of the display screen
surrounding the locus in response to the distance being less than a threshold.
2. The system of claim 1, wherein modifying the magnification comprises increasing
the magnification in response to a decrease of the distance.
3. The system of claim 2, wherein the finger tracking data is indicative of a rate of
movement of the user finger with respect to the display screen, and wherein increasing the
magnification comprises increasing the magnification at substantially the same rate as the rate of
movement of the user finger when the user finger moves toward the display screen.
4. The system of claim 1, wherein modifying the magnification comprises
decreasing the magnification in response to an increase of the distance, the processor-executable
code further causing the at least one processor to decrease the magnification after the
magnification is first increased in response to a decrease in the distance.
5. The system of claim 1, wherein the active input area comprises a first active input
area, the processor-executable code further causing the at least one processor to deactivate a
second active input area when modifying the magnification of the first active input area.
6. The system of claim 5, wherein the first active input area comprises a first image
portion and the second active input area comprises a second image portion, and wherein
deactivating the second active input area comprises at least one of modifying, concealing, or not
displaying at least part of the second image portion.
7. The system of claim 1, wherein the display screen comprises a first display state
occurring prior to modification of the magnification and a second display state occurring during
modification of the magnification, and wherein the processor-executable code further causes the
at least one processor to return the display screen to the first display state in response to
receiving a user input based on the user finger touching the display screen.
8. The system of claim 1, wherein the active input area comprises at least one of an
icon, a key, a button, a text field, and an interactive image.
9. The system of claim 1, wherein the sensor comprises at least one of an infrared
camera, two monochromatic infrared cameras, three infrared LEDs, at least two global shutter
image sensors, and a leap motion sensor.
10. The system of claim 1, wherein the sensor is configured to detect the user finger
within a hemispheric area surrounding the display screen, and wherein the sensor is configured
to track finger movements at a rate of at least 300 frames per second.
11. A method for controlling a touch display screen, the method comprising:
receiving finger tracking data from a sensor, the finger tracking data indicative of
a distance between a user finger and a display screen of a display system and a locus on the
display screen that the user finger is aimed; and
modifying magnification of an active input area of the display screen surrounding
the locus in response to the distance being less than a threshold.
12. The method of claim 11, wherein modifying the magnification comprises
increasing the magnification in response to a decrease of the distance.
13. The method of claim 12, wherein the finger tracking data is indicative of a rate of
movement of the user finger with respect to the display screen, and wherein increasing the
magnification comprises increasing the magnification at substantially the same rate as the rate of
movement of the user finger when the user finger moves toward the display screen.
14. The method of claim 11, wherein modifying the magnification comprises
decreasing the magnification in response to an increase of the distance, and further comprising
decreasing the magnification after the magnification is first increased in response to a decrease in
the distance.
15. The method of claim 11, wherein the active input area comprises a first active
input area, and further comprising deactivating a second active input area when modifying the
magnification of the first active input area.
16. The method of claim 15, wherein the first active input area comprises a first
image portion and the second active input area comprises a second image portion, and wherein
deactivating the second active input area comprises at least one of modifying, concealing, or not
displaying at least part of the second image portion.
17. The method of claim 11, wherein the display screen comprises a first display state
occurring prior to modification of the magnification and a second display state occurring during
modification of the magnification, and further comprising returning the display screen to the first
display state in response to receiving a user input based on the user finger touching the display
screen.
18. A touch display control system, comprising:
at least one processor coupled with a non-transitory processor-readable medium
storing processor-executable code for causing the at least one processor to modify magnification
of an active input area of a display screen surrounding a locus on the display screen that a user
finger is aimed in response to a distance between the user finger and the locus of the display
screen being less than a threshold.
19. The system of claim 18, wherein the display screen is configured to increase and
decrease magnification of areas of a displayed image, and wherein modifying the magnification
of the active input area comprises increasing the magnification of the active input area in
response to a decrease of the distance.
20. The system of claim 19, wherein the active input area comprises a first active
input area, further comprising a second active input area, the first active input area comprising a
first image portion and the second active input area comprising a second image portion, the
processor-executable code further configured to cause the at least one processor to deactivate the
second active input area when modifying the magnification of the first active input area by at
least one of modifying, concealing, or not displaying at least part of the second image portion.

Documents

Application Documents

# Name Date
1 Power of Attorney [03-10-2016(online)].pdf 2016-10-03
2 Form 5 [03-10-2016(online)].pdf 2016-10-03
3 Form 3 [03-10-2016(online)].pdf 2016-10-03
4 Drawing [03-10-2016(online)].pdf 2016-10-03
5 Description(Complete) [03-10-2016(online)].pdf 2016-10-03
6 abstact.jpg 2016-12-30
7 Other Patent Document [08-03-2017(online)].pdf 2017-03-08
8 201611033783-OTHERS-090317.pdf 2017-03-14
9 201611033783-Correspondence-090317.pdf 2017-03-14
10 201611033783-FORM 18 [17-04-2018(online)].pdf 2018-04-17
11 201611033783-FORM 4(ii) [26-03-2021(online)].pdf 2021-03-26
12 201611033783-OTHERS [25-06-2021(online)].pdf 2021-06-25
13 201611033783-FORM-26 [25-06-2021(online)].pdf 2021-06-25
14 201611033783-FER_SER_REPLY [25-06-2021(online)].pdf 2021-06-25
15 201611033783-COMPLETE SPECIFICATION [25-06-2021(online)].pdf 2021-06-25
16 201611033783-CLAIMS [25-06-2021(online)].pdf 2021-06-25
17 201611033783-ABSTRACT [25-06-2021(online)].pdf 2021-06-25
18 201611033783-FER.pdf 2021-10-17
19 201611033783-Retyped Pages under Rule 14(1) [11-12-2023(online)].pdf 2023-12-11
20 201611033783-PatentCertificate11-12-2023.pdf 2023-12-11
21 201611033783-IntimationOfGrant11-12-2023.pdf 2023-12-11
22 201611033783-2. Marked Copy under Rule 14(2) [11-12-2023(online)].pdf 2023-12-11

Search Strategy

1 Search201611033783E_27-08-2020.pdf

ERegister / Renewals

3rd: 13 Feb 2024 (From 03/10/2018 To 03/10/2019)
4th: 13 Feb 2024 (From 03/10/2019 To 03/10/2020)
5th: 13 Feb 2024 (From 03/10/2020 To 03/10/2021)
6th: 13 Feb 2024 (From 03/10/2021 To 03/10/2022)
7th: 13 Feb 2024 (From 03/10/2022 To 03/10/2023)
8th: 13 Feb 2024 (From 03/10/2023 To 03/10/2024)
9th: 30 Sep 2024 (From 03/10/2024 To 03/10/2025)
10th: 25 Sep 2025 (From 03/10/2025 To 03/10/2026)