
User Interface For Touch Devices

Abstract: Methods and devices for dynamic reconfiguration of a user interface on a touch device (100) are described. The touch device (100) includes a touch-screen (108) to receive a user swipe input (202) from a user. Thereafter, the touch device (100) determines a user-touchable area based on the user swipe input (202). Based on a reconfiguration setting, the user interface is reconfigured on the touch-screen (108) within the user-touchable area.


Patent Information

Application #: 166/DEL/2014
Filing Date: 20 January 2014
Publication Number: 30/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Grant Date: 2023-01-31

Applicants

SAMSUNG INDIA ELECTRONICS PVT. LTD.,
Logix Cyber Park Tower C 8th to 10th Floor, Tower D Ground to 10th Floor, Plot No. C 28-29, Sector -62, Noida Uttar Pradesh 201301

Inventors

1. AGRAWAL, Pulkit
3/141E Vidhya Nagar Phase 2, Ramghat Road, Aligarh, Uttar Pradesh 202001
2. MALIK, Lovlesh
164, Jeevan Nagar Sonipat Haryana
3. SHARMA, Tarun
10, Mall Road, Near Custom House, Amritsar Punjab 143001

Specification

TECHNICAL FIELD
[0001] The present subject matter relates to touch devices and, particularly but not
exclusively, to methods and systems for reconfiguring the user interface of touch devices.
BACKGROUND
[0002] Nowadays, touch devices have increasingly become popular in consumer
electronics, such as mobile communication devices, computing devices, global positioning
system (GPS) navigation units, digital video recorders, and other handheld devices. The
touch devices generally include a user interface to facilitate user interactions with
application programs running on the touch devices. The user interface facilitates the user
interactions by simultaneously displaying a number of user interface (UI) elements to a user
and receiving user input through, for example, the user’s finger(s) or a stylus. The UI
elements are generally preconfigured and evenly disposed on the entire touch-screen of the
touch devices by the manufacturers. However, with such preconfigured positioning of the UI
elements, it is inconvenient for the users to interact with the UI elements positioned beyond
the reach of the user’s hand.
BRIEF DESCRIPTION OF THE FIGURES
[0003] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in
which the reference number first appears. The same numbers are used throughout the figures
to reference like features and components. Some embodiments of systems and/or methods in
accordance with embodiments of the present subject matter are now described, by way of
example only, and with reference to the accompanying figures, in which:
[0004] Fig. 1 illustrates a touch device, according to an embodiment of the present
subject matter.
[0005] Fig. 2 illustrates an exemplary user swipe input received on the touch device,
according to an embodiment of the present subject matter.
[0006] Fig. 3 illustrates an exemplary implementation of partial reconfiguration of
user interface on the touch device, according to an embodiment of the present subject
matter.
[0007] Fig. 4 illustrates an exemplary implementation of complete reconfiguration of
user interface on the touch device, according to an embodiment of the present subject
matter.
[0008] Fig. 5 illustrates a method for dynamic reconfiguration of user interface on
the touch device, according to an embodiment of the present subject matter.
[0009] Fig. 6 illustrates a method for dynamic reconfiguration of user interface
based on direction of the user swipe input, according to an embodiment of the present
subject matter.
[0010] It should be appreciated by those skilled in the art that any block diagrams
herein represent conceptual views of illustrative systems embodying the principles of the
present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams,
state transition diagrams, pseudo code, and the like, represent various processes which may
be substantially represented in computer readable medium and so executed by a computer or
processor, whether or not such computer or processor is explicitly shown.
DESCRIPTION OF EMBODIMENTS
[0011] The present subject matter relates to systems and methods for dynamic
reconfiguration of user interface in touch devices. The methods can be implemented in
various touch devices, such as mobile phones, hand-held devices, tablets, netbooks, laptops
or other portable computers, personal digital assistants (PDAs), notebooks and other devices
that implement a touch-screen or touch-panel.
[0012] Typically, a touch device provides various functionalities, for example,
accessing and displaying websites, sending and receiving e-mails, taking and displaying
photographs and videos, playing music and other forms of audio, etc. These, and numerous
other functionalities, are generally performed by execution of an application on selection of
the application’s icon present on the touch device’s user interface. With increasing demands
from users for better interaction capabilities and additional functionalities, the touch devices
are nowadays configured with touch user interfaces having larger sizes, sometimes even
larger than 5 inches.
[0013] The touch device configured with larger size touch user interface, as
displayed on a touch-screen, commonly has user interface (UI) elements arranged on the
entire touch-screen of the touch device. However, the UI elements cannot be scaled and/or
positioned as per a user’s desire, which can otherwise help to influence user interactions
with the touch device. In addition to that, the touch device does not have the capability to
reconfigure the UI elements. The UI elements are generally preconfigured and evenly
positioned on the entire touch-screen of the touch devices by the manufacturers. This often
gives rise to a situation in which a few UI elements may be preconfigured beyond a single
hand operational capability of the user. Thus, the touch device configured with larger size
touch user interface is often operated using both hands.
[0014] The subject matter disclosed herein is directed towards systems and methods
for reconfiguring user interface on touch devices, for example, for performing single hand
operation. In one example, a user defines an area on a touch-screen of a touch device within
the reach of a user’s hand, and the user interface is dynamically configured so that the UI
elements are positioned in the reach of the user’s hand. In an example, the user’s hand
includes, without any limitation, user’s fingers, user’s thumb or other input devices, such as
stylus held by the user.
[0015] Further, the description hereinafter of the present subject matter includes
various specific details to assist in understanding, but these are to be regarded as merely
exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes
and modifications of the embodiments described herein can be made without departing from
the scope and spirit of the present subject matter. In addition, descriptions of well-known
functions and constructions may be omitted for clarity and conciseness.
[0016] Yet further, the reconfiguration capability of the present subject matter can be
provided as an app that can be downloaded from a computer-readable medium and installed
on the touch device.
[0017] According to an exemplary embodiment of the present subject matter,
systems and methods for dynamic reconfiguration of a user interface on a touch device are
described herein. The present subject matter facilitates a user to communicate with the touch
device and register the extent of his reach on a touch-screen of the touch device by
providing a user swipe input on the touch-screen. In accordance with the present subject
matter, the touch-screen utilizes its multi-touch capabilities to receive the user swipe input,
and thus does not require any additional hardware, such as specialized sensors.
[0018] In an example, the touch-screen of the touch device may receive the user swipe
input when the user swipes a user input means, for example, a user finger, user thumb or user
stylus, from a first edge of the touch-screen to a second edge of the touch-screen tracing a
swipe boundary on the touch-screen. In an example, the first edge and the second edge can
be either adjacent sides or oppositely lying sides.
[0019] In another alternative example, the touch-screen of the touch device may
receive the user swipe input that may not be touching any edge of the touch-screen. In such
example, the user may trace the swipe boundary by the user input means from a point
nearest to the first edge of the touch-screen to a point nearest to a second edge of the
touch-screen. Then, the touch device may connect the point nearest to the first edge or the
second edge to the respective nearest edge.
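By way of illustration only, the following Kotlin sketch shows one plausible way of connecting such an off-edge endpoint to its nearest edge; the names Point and snapToNearestEdge, and the screen dimensions used in the example, are assumptions of this sketch and do not appear in the specification.

    data class Point(val x: Float, val y: Float)

    // Extend an endpoint of the traced boundary to whichever screen edge is
    // closest, as described above for swipes that stop short of the edges.
    fun snapToNearestEdge(p: Point, width: Float, height: Float): Point {
        // Distances to the left, right, top and bottom edges, in that order.
        val d = listOf(p.x, width - p.x, p.y, height - p.y)
        return when (d.indices.minByOrNull { d[it] }!!) {
            0 -> Point(0f, p.y)         // left edge is nearest
            1 -> Point(width, p.y)      // right edge is nearest
            2 -> Point(p.x, 0f)         // top edge is nearest
            else -> Point(p.x, height)  // bottom edge is nearest
        }
    }

    fun main() {
        println(snapToNearestEdge(Point(20f, 900f), 1080f, 1920f))   // Point(x=0.0, y=900.0)
        println(snapToNearestEdge(Point(1060f, 400f), 1080f, 1920f)) // Point(x=1080.0, y=400.0)
    }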
[0020] In yet another alternative example, the touch-device may include a
reconfiguring mechanism to receive the user swipe input by prompting the user to touch a
soft-button on the touch-screen for automatically tracing the swipe boundary. Such
automatic tracing of the swipe boundary is performed based on a swipe history maintained over a
pre-defined period of time by the reconfiguring module. Thus, when a user is prompted by
the reconfiguring module for the first time, the reconfiguring mechanism may not
automatically trace the swipe boundary as it has nothing stored or maintained as the swipe
history. Thereafter, in an example, the reconfiguration mechanism may automatically trace
the swipe boundary based on a mean value of the previous traces stored in the swipe history.
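A minimal sketch, assuming each stored trace has been resampled to the same number of points before entering the history, of how such a mean boundary could be derived; meanBoundary and the history representation are illustrative inventions, not terms of the specification.

    data class Point(val x: Float, val y: Float)

    // Average the stored traces point-by-point. Returns null when the history
    // is empty, matching the first-prompt case described above.
    fun meanBoundary(history: List<List<Point>>): List<Point>? {
        if (history.isEmpty()) return null // nothing stored yet: no auto-trace
        val n = history.first().size
        return List(n) { i ->
            Point(
                history.map { it[i].x }.average().toFloat(),
                history.map { it[i].y }.average().toFloat()
            )
        }
    }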
[0021] Further, based on the received user swipe input, the touch device determines
a user-touchable area. In an example, the user-touchable area can be either a user-defined
swipe boundary area or a user-defined enclosed area enclosed by the swipe boundary and
sides of the touch-screen.
[0022] In an example, the user-defined swipe boundary area is not confined to an
actual area touched by the user input means. Specifically, when the user input means touches
a specific part of the touch-screen, the touch device determines whether the user has slid
or dragged the user input means, for example, from right to left or from left to right,
estimates a swipe boundary area based on a specific touched area on the touch-screen, and
determines the estimated swipe boundary area as the user-defined swipe boundary area.
[0023] In an alternative example, the user-defined enclosed area is an area enclosed
between the first edge of the touch-screen, the second edge of the touch-screen, and the
swipe boundary traced by the user swipe input.
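For illustration, the enclosed area of such a boundary can be computed by closing the traced polyline through the corner shared by the two edges and applying the shoelace formula; this Kotlin sketch, including enclosedArea and the example coordinates, is an assumption of this presentation rather than the method the specification mandates.

    data class Point(val x: Float, val y: Float)

    // Close the boundary through the corner shared by the first and second
    // edges (e.g. the bottom-right corner), then apply the shoelace formula.
    fun enclosedArea(boundary: List<Point>, sharedCorner: Point): Float {
        val poly = boundary + sharedCorner
        var twice = 0f
        for (i in poly.indices) {
            val a = poly[i]
            val b = poly[(i + 1) % poly.size]
            twice += a.x * b.y - b.x * a.y
        }
        return kotlin.math.abs(twice) / 2f
    }

    fun main() {
        // A straight swipe from the bottom edge to the right edge of a
        // 1080 x 1920 screen encloses a triangle with the bottom-right corner.
        val swipe = listOf(Point(480f, 1920f), Point(1080f, 1200f))
        println(enclosedArea(swipe, Point(1080f, 1920f))) // 216000.0 px^2
    }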
[0024] Thereafter, the touch device dynamically reconfigures the user
interface of the touch device within the user-touchable area based on the reconfiguration setting.
[0025] Such reconfiguration of the user interface ensures a single handed operation
of the touch device by reconfiguring the user interface within the user-touchable area.
Hereinafter, the term ‘reconfiguration or reconfiguring’ may include, without any limitation,
the context of restructuring, rendering, rearranging, readjusting, or repositioning.
[0026] Further, in an example, based on the reconfiguration setting, the
reconfiguration of the user interface can be categorized into two categories, namely partial
reconfiguration and complete reconfiguration. In said example, the reconfiguration setting
may be a predefined reconfiguration setting or may be set by the user.
[0027] In the partial reconfiguration, user interface (UI) elements lying within the
user-touchable area retain their positions on the current UI element screen, while the UI
elements lying outside the user-touchable area are reconfigured within the user-touchable
area on a next UI element screen. This results in an increase in the number of UI element
screens.
[0028] In the complete reconfiguration, the size of all the UI elements is decreased
or optimized to accommodate all the UI elements within the user-touchable area on the current
UI element screen. Thus, in the complete reconfiguration, the number of UI element screens
is not increased, as no UI element is reconfigured on a next UI element screen.
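The two categories, and their opposite effects on the number of UI element screens, might be modelled as follows; the enum ReconfigurationSetting and the function screensAfter are hypothetical names introduced only for this sketch.

    enum class ReconfigurationSetting { PARTIAL, COMPLETE }

    // Screen count after reconfiguration, following the two paragraphs above.
    fun screensAfter(setting: ReconfigurationSetting, screensBefore: Int): Int =
        when (setting) {
            // Partial: out-of-reach elements overflow onto an added screen.
            ReconfigurationSetting.PARTIAL -> screensBefore + 1
            // Complete: elements are scaled down instead; count is unchanged.
            ReconfigurationSetting.COMPLETE -> screensBefore
        }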
[0029] In addition to the above listed partial reconfiguration and complete
reconfiguration, many more configuration techniques can be implemented, while at the same
time allowing a single handed operation by reconfiguring distant user interface (UI)
elements within the reach of the user’s hand to ease the interaction with those distant UI
elements.
[0030] Thus, the exemplary embodiment of the present subject matter may provide
methods and systems for reconfiguring user interface in a user-touchable area by adjusting
the positions, intervals, and layout of the UI elements so that a user may conveniently
manipulate the touch device with a single hand.
[0031] It should be noted that the description merely illustrates the principles of the
present subject matter. It will thus be appreciated that various arrangements may also be
employed that, although not explicitly described herein, embody the principles of the present
subject matter and are included within its spirit and scope. Furthermore, all examples recited
herein are principally intended expressly to be only for explanation purposes to aid the
reader in understanding the principles of the present subject matter, and are to be construed
as being without limitation to such specifically recited examples and conditions. Moreover,
all statements herein reciting principles, aspects, and embodiments of the present subject
matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
The manner in which the methods shall be implemented on various systems has been
explained in detail with respect to Figs. 1-6. While aspects of described systems and
methods can be implemented in any number of different computing devices and/or
configurations, the embodiments are described in the context of the following system(s).
[0032] Fig. 1 illustrates exemplary components of a touch device 100, in accordance
with an embodiment of the present subject matter. In one embodiment, the touch device 100
facilitates a user to provide a user swipe input for reconfiguration of user interface (UI) on
the touch device 100. The touch device 100 may be implemented as various computing
devices, such as but not limited to a mobile phone, a smart phone, a personal digital assistant
(PDA), a digital diary, a tablet, a net-book, a laptop computer, and the like. In one
implementation, the touch device 100 includes one or more processor(s) 102, I/O
interface(s) 104, and a memory 106 coupled to the processor(s) 102. The processor(s) 102
may be implemented as one or more microprocessors, microcomputers, microcontrollers,
digital signal processors, central processing units, state machines, logic circuitries, and/or
any devices that manipulate signals based on operational instructions. Among other
capabilities, the processor(s) 102 is configured to fetch and execute computer-readable
instructions stored in the memory 106.
[0033] The I/O interface(s) 104 may include a variety of software and hardware
interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, and
an external memory. Further, the I/O interfaces 104 may facilitate multiple communications
within a wide variety of protocol types including operating system to application
communication, inter-process communication, etc.
[0034] The memory 106 can include any computer-readable medium known in the
art including, for example, volatile memory, such as static random access memory (SRAM)
and dynamic random access memory (DRAM), and/or non-volatile memory, such as read
only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical
disks, and magnetic tapes.
[0035] Further, the touch device 100 may include a touch-screen 108. The touch-screen
108 is operable to display images in response to a video signal and is also operable to
output a touch signal that indicates a position, on the touch-screen 108, which is touched by
a user. In an example, the touch signal is generated in response to contact or proximity of a
portion of the user’s hand, for example, user’s thumb or user’s finger, with respect to the
touch-screen 108. In another example, the touch signal can also be generated in response to
contact or proximity of an implement, such as a stylus.
[0036] The touch-screen 108 can be implemented using any one of a number of
well-known technologies that are suitable for performing the functions described herein with
respect to the present subject matter. Any suitable technology now known or later devised
can be employed to implement the touch-screen 108. Exemplary technologies that can be
employed to implement the touch-screen 108 include resistive touch sensing, surface
acoustic wave touch sensing, capacitive touch sensing, and other suitable technologies.
[0037] In an example, the touch-screen 108 can be positioned on top of a display
unit having a user interface. The touch-screen 108 is substantially transparent such that the
display on the display unit is visible through the touch-screen 108.
[0038] Further, in accordance with the present subject matter, the touch-screen 108
and the display unit are sized complementary to one another. The touch-screen 108 can be
approximately of the same size as the display unit, and is positioned with respect to the
display unit such that a touchable area of the touch-screen 108 and a viewable area of the
display unit are substantially coextensive. In accordance with the present subject matter, the
touch-screen 108 can be a capacitive touch-screen. Other technologies can be employed, as
previously noted. In accordance with the present subject matter, the display unit is a liquid
crystal display that is operable to output a touch signal in response to a user’s touch on the
touch-screen.
[0039] Further, in an example, the touch-screen of the present exemplary
embodiment may have a relatively large screen size, compared to a related-art touch-screen.
As long as a touch-screen includes the user-untouchable area, i.e., an area untouchable
and/or unreachable, by the user input means according to a user’s reach or an area above
which the user input means cannot be placed, the present exemplary embodiment is
applicable to the touch-screen.
[0040] Further, the touch device 100 may include module(s) 110 and data 112. The
modules 110 and the data 112 may be coupled to the processor(s) 102. The modules 110,
amongst other things, include routines, programs, objects, components, data structures, etc.,
which perform particular tasks or implement particular abstract data types. The modules 110
may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or
any other device or component that manipulate signals based on operational instructions. In
another aspect of the present subject matter, the modules 110 may be computer-readable
instructions which, when executed by a processor/processing unit, perform any of the
described functionalities. The machine-readable instructions may be stored on an electronic
memory device, hard disk, optical disk or other machine-readable storage medium or
non-transitory medium. In one implementation, the computer-readable instructions can also be
downloaded to a storage medium via a network connection.
[0041] In an implementation, the module(s) 110 includes a surface area processor
114, a reconfiguration controller 116, including a partial reconfiguration controller 118 and
a complete reconfiguration controller 120, and other module(s) 122. The other module(s)
122 may include programs or coded instructions that supplement applications or functions
performed by the touch device 100.
[0042] Further, the data 112, amongst other things, may serve as a repository for
storing data that is processed, received, or generated as a result of the execution of one or
more modules in the module(s) 110. Although the data 112 is shown internal to the touch
device 100, it may be understood that the data 112 can reside in an external repository (not
shown in the figure), which may be coupled to the touch device 100. The touch device 100
may communicate with the external repository through the I/O interface(s) 104 to obtain
information from the data 112.
[0043] In operation, the processor(s) 102 is operable to display a user interface, in
preconfigured or predefined mode, on the touch-screen 108 of the touch device 100. The
user interface facilitates a user to interact with user interface (UI) elements to execute
application programs installed on the touch device 100. In an example, the user can interact
with the UI elements presented on the user interface by performing a “tap” operation. The
“tap” operation on the touch device 100 is a form of gesture. The touch device 100
commonly supports a variety of gesture-based user commands, such as swipe gesture, pinch
gesture and spread gesture, to interact with the UI elements presented on the user interface.
However, in a situation when the user is holding the touch device 100 from its corner and
wants to perform a single hand operation, the user may not be able to interact with a few UI
elements positioned away from the reach of the user.
[0044] In accordance with the present subject matter, the touch device 100 may
include a UI reconfiguration mode to allow the user to switch-on and switch-off the
reconfiguration of the UI based on user’s reach of hand or thumb or finger. In an example,
when a user wants to reconfigure the UI within the reach of the user’s hand, the user may activate
the UI reconfiguration mode. Once the UI reconfiguration mode is activated, the touch
device 100 prompts the user to provide a user swipe input to reconfigure the existing UI. In
response to the prompt, the user provides the user swipe input on the touch-screen 108. In an
example, the user swipe input is then utilized by the touch device 100 to register the extent
of the user’s reach on the touch-screen 108.
[0045] Further, in accordance with the present subject matter, the touch-screen 108
utilizes its multi-touch capabilities to receive the user swipe input, and thus does not require
any additional hardware, such as specialized sensors, to be integrated in the existing
touch-screen 108. In other words, the touch-screen 108 having multi-touch capabilities can receive
the user swipe input when the user keeps the maximum area of the user input means in contact with
the touch-screen 108 while providing the user swipe input. In an example, the user input
means may include user thumb, user finger, or a user stylus.
[0046] Further, the present subject matter is not limited thereto, and the user input
means may be any suitable and/or similar input means, such as any finger of a user and a
stylus. It is to be understood that the user input means is not limited to a user’s hand in the
present subject matter.
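One conceivable way of distinguishing such a flat-contact, edge-to-edge swipe from an ordinary swipe is sketched below; the TouchSample type, its contactMajor field, and both thresholds are assumptions of this illustration rather than anything the specification defines.

    data class TouchSample(val x: Float, val y: Float, val contactMajor: Float)

    // Assumed classifier: a reconfiguration swipe shows a wide contact patch
    // (flat thumb or finger) and starts and ends on, or near, a screen edge.
    fun isReconfigurationSwipe(
        samples: List<TouchSample>,
        width: Float,
        height: Float,
        minContactPx: Float = 40f,  // hypothetical contact-width threshold
        edgeMarginPx: Float = 12f   // hypothetical edge tolerance
    ): Boolean {
        if (samples.size < 2) return false
        fun nearEdge(s: TouchSample) =
            s.x <= edgeMarginPx || s.x >= width - edgeMarginPx ||
            s.y <= edgeMarginPx || s.y >= height - edgeMarginPx
        val wideContact = samples.map { it.contactMajor }.average() >= minContactPx
        return wideContact && nearEdge(samples.first()) && nearEdge(samples.last())
    }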
[0047] Fig. 2 illustrates an exemplary user swipe input 202 received on the
touch-screen 108 of the touch device 100, according to an embodiment of the present subject
matter. In an example, the user swipe input 202 may be received when the user swipes the
user input means, for example, user thumb or user finger or user stylus, from a first edge
204-1 of the touch-screen 108 to a second edge 204-2 of the touch-screen 108 tracing a
swipe boundary on the touch-screen 108.
[0048] In another alternative example, the touch-screen 108 of the touch device 100
may receive the user swipe input 202 that may not be touching any edge of the touch-screen
108. In such example, the user may trace the swipe boundary by the user input means from a
point nearest to the first edge 204-1 of the touch-screen 108 to a point nearest to a second
edge 204-2 of the touch-screen 108. Then, the touch device 100 may connect the point
nearest to the first edge 204-1 or the second edge 204-2 to the respective nearest edge.
[0049] In yet another alternative example, the touch-device 100 may include a
reconfiguring mechanism to receive the user swipe input 202 by prompting the user to touch
a soft-button on the touch-screen 108 for automatically tracing the swipe boundary. Such
automatic tracing of the swipe boundary is performed based on a swipe history maintained over a
pre-defined period of time by the reconfiguring module. Thus, when a user is prompted by
the reconfiguring module for the first time, the reconfiguring mechanism may not
automatically trace the swipe boundary as it has nothing stored or maintained as the swipe
history.
[0050] Further, in an implementation shown in Fig. 2, the first edge 204-1 is
represented as a bottom edge and the second edge 204-2 is represented as a side edge.
However, in an alternative example and without any limitation, the first edge 204-1 can be
any side edge and the second edge 204-2 can be a bottom or top edge.
[0051] In an alternative implementation, instead of the first edge 204-1 and the
second edge 204-2 being adjacent edges as represented in Fig. 2, the first edge 204-1 and the
second edge 204-2 can be oppositely lying edges. For example, the first edge 204-1 can be
one side edge and the second edge 204-2 can be another side edge or a corner point. In the
present alternative example, the side edges can be longitudinal edges or horizontal edges.
[0052] In yet another implementation and without any limitation, for right-handed
users, the first edge 204-1 can be the bottom edge and the second edge 204-2 can be a right
edge. Similarly, for left-handed users, the first edge 204-1 can be the bottom edge and the
second edge 204-2 can be a left edge.
[0053] Further, in an example, the user swipe input 202, received in accordance with
the present subject matter, can easily be distinguished from normal user swipe input by two
identifications. Firstly, a large portion of the user input means, for example, the user
thumb or finger, would be in contact with the touch-screen 108. Secondly, the user swipe input 202 is
performed from the first edge 204-1 to the second edge 204-2 of the touch-screen 108, or
vice versa. That is, the user swipe input 202 connects the first edge 204-1 of the
touch-screen 108 with the second edge 204-2 of the touch-screen 108. It will be understood that
other identifications, such as the reconfiguration mode being in active mode, can also be
used.
[0054] Yet further, in an example, as can be seen in Fig. 2, the user swipe input 202,
received in accordance with the present subject matter, defines a user-touchable area. In an
example, the user-touchable area can be either a user-defined swipe boundary area or a
user-defined enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen
108.
[0055] In an example, the user-defined swipe boundary area is not confined to an
actual area touched by the user input means. Specifically, when the user input means touches
a specific part of the touch-screen 108, the touch device 100 determines whether the user has
slid or dragged the user input means, for example, from right to left or from left to right,
estimates a swipe boundary area based on a specific touched area on the touch-screen 108,
and determines the estimated swipe boundary area as the user-defined swipe boundary area.
[0056] In an alternative example, the user-defined enclosed area 206 is an area
enclosed on the touch-screen 108 within the first edge 204-1 of the touch-screen 108, the
second edge 204-2 of the touch-screen 108, and the swipe boundary traced by the user swipe
input 202.
[0057] In another alternative example, when the user swipe input 202 connects the
two side edges, the user-defined swipe boundary area or a user-defined enclosed area 206
can be enclosed between two side edges, one bottom edge, and the user swipe input 202.
[0058] Now, once the user swipe input 202 is received, the surface area processor
114 determines a value of the user-touchable area and compares the determined value of the
user-touchable area with a predefined threshold area. In an example, the predefined
threshold area is defined based on an average length of a human thumb or a human finger or
a stylus. Based on the comparison, in case the value of the user-touchable area is determined
to be below the predefined threshold area, the surface area processor 114 may prompt the user to
again provide the user swipe input 202.
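A hedged sketch of this threshold comparison follows; the 60 mm average thumb length and the quarter-disc reach model are assumptions chosen for illustration, not figures taken from the specification.

    import kotlin.math.PI

    // Hypothetical threshold derived from an average thumb length, modelling
    // the reachable region as a quarter disc swept by the thumb.
    fun minTouchableAreaPx(dpi: Float, thumbLengthMm: Float = 60f): Float {
        val rPx = thumbLengthMm / 25.4f * dpi   // thumb length in pixels
        return (PI.toFloat() * rPx * rPx) / 4f  // quarter-disc area
    }

    // True when the measured area falls below the threshold, i.e. when the
    // user should be prompted to provide the swipe input again.
    fun shouldReprompt(measuredAreaPx: Float, dpi: Float): Boolean =
        measuredAreaPx < minTouchableAreaPx(dpi)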
[0059] Thereafter, once the surface area processor 114 confirms that the value of the
user-touchable area is above the predefined threshold area, the reconfiguration controller
116 makes a decision on what type of reconfiguration of the UI elements is to be executed.
The decision depends on the reconfiguration setting for the user interface of the touch device
100. In an example, the touch device 100 may include a user-definable reconfiguration
setting that enables the user to define the reconfiguration setting for the user interface under
two categories, namely partial reconfiguration and complete reconfiguration.
[0060] In accordance with an exemplary implementation, the user may define the
configuration of the UI based on the direction of the user swipe input 202. For example, the
user can define the user-definable reconfiguration setting such that the partial reconfiguration is
performed when the touch-screen 108 receives the user swipe input 202 in an upward direction
from the first edge 204-1 of the touch-screen 108 to the second edge 204-2 of the
touch-screen 108. Similarly, the user can define the user-definable reconfiguration setting such that the
complete reconfiguration is performed when the touch-screen 108 receives the user swipe
input 202 in a downward direction from the second edge 204-2 of the touch-screen 108 to the
first edge 204-1 of the touch-screen 108.
[0061] In accordance with an alternative implementation, the user can define the
user-definable reconfiguration setting such that the partial reconfiguration is performed when the
touch-screen 108 receives the user swipe input 202 in a downward direction from the second
edge 204-2 of the touch-screen 108 to the first edge 204-1 of the touch-screen 108.
Similarly, the user can define the user-definable reconfiguration setting such that the complete
reconfiguration is performed when the touch-screen 108 receives the user swipe input 202 in
an upward direction from the first edge 204-1 of the touch-screen 108 to the second edge 204-2
of the touch-screen 108.
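The direction-dependent, user-invertible setting described in the two preceding paragraphs could be modelled as below; DirectionSetting is a hypothetical name, and the sketch assumes screen y-coordinates grow downwards, so an upward swipe ends at a smaller y than it starts.

    enum class Category { PARTIAL, COMPLETE }

    // Default mapping matches the first example above; passing
    // Category.COMPLETE for onUpward inverts it, as in the alternative.
    class DirectionSetting(private val onUpward: Category = Category.PARTIAL) {
        fun categoryFor(startY: Float, endY: Float): Category {
            val upward = endY < startY
            return when {
                upward -> onUpward
                onUpward == Category.PARTIAL -> Category.COMPLETE
                else -> Category.PARTIAL
            }
        }
    }

    fun main() {
        val setting = DirectionSetting()
        println(setting.categoryFor(startY = 1800f, endY = 300f)) // PARTIAL (upward)
        println(setting.categoryFor(startY = 300f, endY = 1800f)) // COMPLETE (downward)
    }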
[0062] In yet another implementation, the user can receive a prompt on providing the
user swipe input and in response to the prompt can select whether a partial reconfiguration
or a complete reconfiguration is to be done.
[0063] Further, in an exemplary embodiment shown in Fig. 3, in case the
reconfiguration controller 116 makes a decision to perform the partial reconfiguration of the
user interface based on the reconfiguration setting, the partial reconfiguration controller 118
is invoked to perform the partial reconfiguration of the user interface within the user-defined
enclosed area 206 enclosed by the user swipe input 202. Thereafter, the partial
reconfiguration controller 118 retains the positions of user interface (UI) elements lying
within the user-defined enclosed area 206 on a current UI element screen, while reconfiguring
positions of UI elements lying outside the user-defined enclosed area 206 onto a next UI
element screen within the user-defined enclosed area 206.
[0064] For example, as can be seen in right side of Fig. 3, UI elements, such as
calculator, voice recorder, phone, contacts, messaging, internet, ChatON, Samsung apps,
Samsung Link, WatchON, and Video, lying within the user-defined enclosed area 206 retain
their positions, while the UI elements, such as clock, S Planner, Camera, Gallery, Settings,
Email, Samsung Hub, and Music, lying outside the user-defined enclosed area 206 are
reconfigured or moved on to next UI element screen within the user-defined enclosed area
206. Thus, in the partial reconfiguration, the number of UI element screens containing the UI
elements may increase. However, in the partial reconfiguration, the size of the UI elements
is not scaled down to adjust into the user-defined enclosed area 206.
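A minimal sketch of this partial reconfiguration step, assuming a simple containment predicate decides which elements lie inside the enclosed area; UiElement and partialReconfigure are hypothetical names of this sketch only.

    data class UiElement(val name: String, val x: Float, val y: Float)

    // Elements inside the enclosed area keep their positions on the current
    // screen; the rest are carried over, unscaled, to a next UI element screen.
    fun partialReconfigure(
        current: List<UiElement>,
        isInsideArea: (UiElement) -> Boolean
    ): Pair<List<UiElement>, List<UiElement>> {
        val (retained, overflow) = current.partition(isInsideArea)
        return retained to overflow // overflow is laid out on the next screen
    }

    fun main() {
        val icons = listOf(
            UiElement("Phone", 100f, 1700f), UiElement("Clock", 100f, 200f)
        )
        val (kept, moved) = partialReconfigure(icons) { it.y > 1000f }
        println(kept.map { it.name })  // [Phone] stays on the current screen
        println(moved.map { it.name }) // [Clock] moves to the next screen
    }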
[0065] Yet further, in another exemplary embodiment shown in Fig. 4, in case the
reconfiguration controller 116 makes a decision to perform the complete reconfiguration of
the user interface based on the reconfiguration setting, the complete reconfiguration
controller 120 is invoked to perform the complete reconfiguration of the user interface
within the user-defined enclosed area 206. Thereafter, the complete reconfiguration
controller 120 optimizes or scales down the size of all user interface (UI) elements to
accommodate within the user-defined enclosed area 206 on a current UI element screen. The
optimized or scaled down UI elements are then reconfigured or shrunk within the
user-defined enclosed area 206.
[0066] For example, as can be seen in the right side of Fig. 4, the size of all the UI elements
is scaled down to adjust all the UI elements within the user-defined enclosed area 206
enclosed on the touch-screen 108. Thus, in the complete reconfiguration, the number of UI
element screens on the touch device 100 is not increased, as no UI element is reconfigured
or moved to a next UI element screen. However, in the complete reconfiguration, the visibility
of the elements is affected due to scaling down of the size of all the UI elements.
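For the complete reconfiguration, a single uniform scale factor that fits the whole original grid into the bounding box of the enclosed area could be computed as follows; completeScaleFactor and the example dimensions are assumptions of this sketch.

    // Uniform factor fitting the full screen grid into the enclosed area's
    // bounding box while preserving the icons' aspect ratio.
    fun completeScaleFactor(
        screenW: Float, screenH: Float,
        areaW: Float, areaH: Float
    ): Float = minOf(areaW / screenW, areaH / screenH)

    fun main() {
        // E.g. a 1080 x 1920 grid shrunk into a 760 x 1150 reachable region.
        println(completeScaleFactor(1080f, 1920f, 760f, 1150f)) // ≈ 0.599
    }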
[0067] The reconfiguration of the user interface is performed by using
technologies known in the art to a skilled person. Such technologies may divide an existing
display area for the reconfigured user interface into a plurality of sub-areas and calculate
the coordinates of each sub-area. Thereafter, a mapping relationship between the coordinates
of the actual display area and the coordinates of the reconfigured display sub-area is
determined, so as to display the reconfigured user interface. However, those of ordinary skill
in the art will recognize that various changes and modifications of the embodiments
described herein can be made without departing from the scope and spirit of the present
subject matter. In addition to that, descriptions of well-known functions and constructions
for reconfiguration of the user interface are omitted in the description provided herein for
clarity and conciseness.
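One way such a coordinate mapping between the actual display area and a reconfigured sub-area might look is sketched below; Rect, mapToOriginal, and the example rectangles are illustrative assumptions, not the specific technique the specification relies on.

    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

    // Normalise the touch point within the reconfigured sub-area, then rescale
    // it into the original full-screen layout so existing hit-testing applies.
    fun mapToOriginal(x: Float, y: Float, sub: Rect, full: Rect): Pair<Float, Float> {
        val fx = (x - sub.left) / (sub.right - sub.left)
        val fy = (y - sub.top) / (sub.bottom - sub.top)
        return Pair(
            full.left + fx * (full.right - full.left),
            full.top + fy * (full.bottom - full.top)
        )
    }

    fun main() {
        val sub = Rect(0f, 770f, 760f, 1920f)  // reconfigured sub-area
        val full = Rect(0f, 0f, 1080f, 1920f)  // original display area
        println(mapToOriginal(380f, 1345f, sub, full)) // (540.0, 960.0): screen centre
    }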
[0068] While the present subject matter has been shown and described with
reference to certain exemplary embodiments thereof, it will be understood by those skilled
in the art that various changes in form and details may be made therein without departing
from the scope of the present subject matter as described herein.
[0069] Further, as mentioned above, in addition to the partial reconfiguration and the
complete reconfiguration, any other reconfiguration technique can be implemented to
restructure distant user interface (UI) elements within the user-touchable area, which is
defined within the reach of the user’s hand to facilitate a single handed operation.
[0070] Thus, by implementing the above mentioned reconfiguration techniques, the
present subject matter provides convenience to a user for interacting with distant UI
elements even when the distant UI elements are positioned beyond a single hand operational
capability of the user. The present subject matter facilitates the mentioned convenience by
dynamic reconfiguration of the user interface within the user-touchable area computed
based on the user swipe input 202. Such reconfiguration of the user interface ensures that all
the UI elements on the user interface are within the reach of the user during a single hand
operation.
[0071] Further, the present subject matter is implemented on existing touch-screen
computing devices, and thus does not require any additional hardware.
[0072] Moreover, as can be seen in Fig. 3 and Fig. 4, a portion outside the
user-touchable area or user-defined enclosed area 206 of the reconfigured user interface is left
unutilized. The said portion outside the user-touchable area or user-defined enclosed area
206 can be used to preview images, videos, contacts, grids of files/folders, or other
previewable files or items. The setting for the said portion can be made through the user-definable
reconfiguration setting of the touch device 100.
[0073] In an example, the reconfigured user interface may include soft-keys
representing the functionality of the hard-keys of the touch device 100. This ensures that the
user may not have to stretch his hand to reach the hard-keys provided on the top of the touch
device 100.
[0074] The operation of touch device 100 is further explained in conjunction with
Fig. 5 and Fig. 6. Fig. 5 and Fig. 6 illustrate methods 500 and 600 for reconfiguration of
user interface on a touch device 100. The order in which the methods 500 and 600 are
described is not intended to be construed as a limitation, and any number of the described
method blocks can be combined in any order to implement the methods, or alternative
methods. Additionally, individual blocks may be deleted from the methods without
departing from the spirit and scope of the subject matter described herein.
[0075] The methods may be described in the general context of computer executable
instructions. Generally, computer executable instructions can include routines, programs,
objects, components, data structures, procedures, modules, functions, etc., that perform
particular functions or implement particular abstract data types. The methods may also be
practiced in a distributed computing environment where functions are performed by remote
processing devices that are linked through a communications network. In a distributed
computing environment, computer executable instructions may be located in both local and
remote computer storage media, including memory storage devices.
[0076] A person skilled in the art will readily recognize that steps of the methods
500 and 600 can be performed by programmed computers and computing devices. Herein,
some embodiments are also intended to cover program storage devices, for example, digital
data storage media, which are machine or computer readable and encode machine-executable
or computer-executable programs of instructions, where said instructions
perform some or all of the steps of the described method. The program storage devices may
be, for example, digital memories, magnetic storage media, such as magnetic disks and
magnetic tapes, hard drives, or optically readable digital data storage media. The
embodiments are also intended to cover both communication network and computing
devices configured to perform said steps of the exemplary method.
[0077] Referring to Fig. 5, at block 502, a user swipe input 202 is received from a
user on a touch-screen 108. In an example, the touch-screen 108 of the touch device 100
may receive the user swipe input 202 when the user swipes the user input means, for example,
user thumb or user finger or user stylus, from a first edge 204-1 of the touch-screen 108 to a
second edge 204-2 of the touch-screen 108 tracing a swipe boundary on the touch-screen
108.
[0078] At block 504, based on the received user swipe input 202, the surface area
processor 114 of the touch device 100 determines a user-touchable area. In an example, the
user-touchable area can be either a user-defined swipe boundary area or a user-defined
enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen.
[0079] At block 506, a reconfiguration controller 116 reconfigures the user interface
present on the touch-screen 108 within the user-touchable area based on the reconfiguration
setting. Such reconfiguration of the user interface ensures a single handed operation of the
touch device 100 by positioning all the user interface (UI) elements within the user-defined
enclosed area 206 of the user.
[0080] The operation of reconfiguration of the user interface is further explained in
detail in conjunction with Fig. 6. Fig. 6 describes the method 600 for reconfiguration of the
user interface on the touch device 100, in accordance with one implementation of the present
subject matter.
[0081] At block 602, a user swipe input 202 is received from a user on a
touch-screen 108. In an example, the touch-screen 108 of the touch device 100 may receive the
user swipe input 202 when the user swipes the user input means, for example, user thumb or
user stylus, from a first edge 204-1 of the touch-screen 108 to a second edge 204-2 of the
touch-screen 108.
[0082] At block 604, based on the received user swipe input 202, the surface area
processor 114 of the touch device 100 determines a user-touchable area. In an example, the
user-touchable area can be either a user-defined swipe boundary area or a user-defined
enclosed area 206 enclosed by the swipe boundary and sides of the touch-screen.
[0083] At block 606, based on a reconfiguration setting, a reconfiguration controller
116 makes a decision on what type of reconfiguration of the user interface is to be executed.
For example, based on the reconfiguration setting, a partial reconfiguration would be
performed when the user swipe input 202 is provided in an upward direction, while a
complete reconfiguration would be performed when the user swipe input 202 is provided in a
downward direction, or vice versa.
[0084] Thus, in an example, the reconfiguration of the user interface can be
categorized into two categories, namely the partial reconfiguration and the complete
reconfiguration, based on the direction of the user swipe input 202. For example, the partial
reconfiguration is performed when the reconfiguration controller 116 detects the user swipe
input 202 in a direction from the first edge 204-1 of the touch-screen 108 to the second edge
204-2 of the touch-screen 108. Similarly, the complete reconfiguration is performed when
the reconfiguration controller 116 detects the user swipe input 202 in a direction from the
second edge 204-2 of the touch-screen 108 to the first edge 204-1 of the touch-screen 108.
[0085] In an exemplary embodiment, in case the reconfiguration controller 116
detects that the partial reconfiguration is to be performed, the reconfiguration controller 116
invokes the partial reconfiguration controller 118 to perform the partial reconfiguration of
the user interface within the user-defined enclosed area 206 enclosed by the user swipe input
202.
[0086] At block 608, the partial reconfiguration controller 118 retains positions of
UI elements lying within the user-defined enclosed area 206 on a current UI element screen.
That is, the UI elements lying within the user-defined enclosed area 206 are left unchanged
on the touch-screen 108 of the touch device 100.
[0087] At block 610, the partial reconfiguration controller 118 reconfigures positions
of UI elements lying outside the user-defined enclosed area 206 onto a next UI element
screen. That is, the UI elements lying outside the user-defined enclosed area 206 are
reconfigured or moved within the user-defined enclosed area 206 on the next UI element
screen.
[0088] At block 612, once the partial reconfiguration is performed, the reconfigured
user interface is outputted on a display unit of the touch device 100.
[0089] In another exemplary embodiment, in case the reconfiguration controller 116
detects that the complete reconfiguration is to be performed, the reconfiguration controller 116
invokes the complete reconfiguration controller 120 to perform the complete reconfiguration
of the user interface within the user-defined enclosed area 206 enclosed by the user swipe
input 202.
[0090] At block 614, the complete reconfiguration controller 120 optimizes or scales
down the size of all UI elements in such a way that the optimized or scaled down UI
elements may be accommodated within the user-defined enclosed area 206 on a current UI
element screen.
[0091] At block 616, once the size of all the UI elements is optimized or scaled down, the
optimized or scaled down UI elements are reconfigured or shrunk within the user-defined
enclosed area 206 on the current UI element screen.
[0092] At block 612, once the complete reconfiguration is performed, the reconfigured
user interface is outputted on the display unit of the touch device 100.
[0093] Thus, by implementing the reconfiguration techniques mentioned in the
present subject matter, user interface or all user interface elements are positioned within a
user-defined enclosed area 206 or a user-touchable area lying within the reach of a user, so
as to facilitate a single handed operation of the touch device 100.
[0094] As is apparent from the above description of the present subject matter, since
a user-touchable area is set in a display area of the touch-screen and UI elements are
reconfigured in the user-touchable area by adjusting the positions and sizes of the UI
elements in the touch device, a user experience is enhanced. Furthermore, such
reconfiguration of the UI elements may utilize fewer computing resources as compared to
related-art touch devices, as the reconfigured UI elements utilize a partial
area of the touch-screen as a user interface.
[0095] Although embodiments for methods and systems for the present subject
matter have been described in a language specific to structural features and/or methods, it is
to be understood that the present subject matter is not necessarily limited to the specific
features or methods described. Rather, the specific features and methods are disclosed as
exemplary embodiments for the present subject matter.

I/We claim:
1. A method for reconfiguring user interface (UI) on a touch device (100), the method
comprising:
receiving a user swipe input (202) from a user on a touch-screen (108) of the
touch device (100);
determining a user-touchable area on the touch-screen (108) based on the
user swipe input (202); and
reconfiguring the UI on the touch-screen (108) within the user-touchable area
based on a reconfiguration setting.
2. The method as claimed in claim 1, wherein the user-touchable area is one of a
user-defined swipe boundary area and a user-defined enclosed area (206).
3. The method as claimed in claim 1, wherein the receiving comprises one of:
tracing a swipe boundary by a user input means from a first edge (204-1) of
the touch-screen (108) to a second edge (204-2) of the touch-screen (108);
tracing a swipe boundary by the user input means from a point nearest to the
first edge (204-1) of the touch-screen (108) to a point nearest to a second edge (204-2)
of the touch-screen (108); and
tracing a swipe boundary by touching a soft-button provided on the
touch-screen (108) using the user input means.
4. The method as claimed in claim 3, wherein the tracing the swipe boundary by
touching the soft-button comprises tracing the swipe boundary based on a mean value
of previous swipe boundaries traced by touching the soft-button.
5. The method as claimed in claim 4, wherein the previous swipe boundaries are stored
as a swipe history in the touch device (100).
6. The method as claimed in claim 3, wherein the first edge (204-1) and the second
edge (204-2) are adjacent sides.
7. The method as claimed in claim 3, wherein the first edge (204-1) and the second
edge (204-2) are oppositely lying sides.
8. The method as claimed in claim 3, wherein the user input means include at least one
of a user finger, a user thumb, and a stylus.
9. The method as claimed in claim 1, wherein based on the reconfiguration setting, the
reconfiguring comprises:
retaining positions of UI elements lying within a user-touchable area on a
current UI element screen; and
reconfiguring positions of UI elements lying outside the user-touchable area
on a next UI element screen within the user-touchable area (206).
10. The method as claimed in claim 1, wherein based on the reconfiguration setting, the
10 reconfiguring comprises:
optimizing size of the UI elements to accommodate within the user-touchable
area on a current UI element screen, and
reconfiguring positions of all the optimized UI elements within the
user-touchable area on the current UI element screen.
11. The method as claimed in claim 1, further comprising prompting the user to again
provide the user swipe input (202) when the user-touchable area is determined to be
below a predefined threshold area of the touch-screen (108).
12. The method as claimed in claim 1, wherein after reconfiguring, the method
comprises previewing at least one item in a portion, outside the user-touchable area,
of the touch-screen (108).
13. The method as claimed in claim 1, further comprising representing hard-keys of the
touch device (100) as soft-keys in the reconfigured UI.
14. A touch device (100) comprising:
a processor (102);
a touch-screen (108), coupled to the processor (102), to receive a user swipe
input (202) from a user;
a surface area processor (114), coupled to the processor (102), to determine a
user-touchable area based on the user swipe input (202); and
a reconfiguration controller (116), coupled to the processor (102), to
reconfigure a user interface (UI) on the touch-screen (108) within the user-touchable
area based on a reconfiguration setting.
15. The touch device (100) as claimed in claim 14, wherein the user-touchable area is
one of a user-defined swipe boundary area and a user-defined enclosed area (206).
16. The touch device (100) as claimed in claim 14, wherein the touch-screen (108)
receives the user swipe input (202) by one of:
tracing a swipe boundary on the touch-screen (108) using a user input means
from a first edge (204-1) of the touch-screen (108) to a second edge (204-2) of the
touch-screen (108);
tracing a swipe boundary by a user input means from a point nearest to the
first edge (204-1) of the touch-screen (108) to a point nearest to a second edge (204-2)
of the touch-screen (108); and
tracing a swipe boundary by touching a soft-button provided on the
touch-screen (108) using the user input means.
17. The touch device (100) as claimed in claim 16, wherein the touch device (100)
comprises a reconfiguration mechanism that traces the swipe boundary based on
mean value of previous swipe boundaries traced by touching the soft-button.
18. The touch device (100) as claimed in claim 17, wherein the previous swipe
boundaries are stored as a swipe history in the touch device (100).
19. The touch device (100) as claimed in claim 16, wherein the first edge (204-1) and the
second edge (204-2) are adjacent sides.
20. The touch device (100) as claimed in claim 16, wherein the first edge (204-1) and the
second edge (204-2) are oppositely lying sides.
21. The touch device (100) as claimed in claim 16, wherein the user input means
comprises at least one of a user finger, a user thumb, and a stylus.
22. The touch device (100) as claimed in claim 14, wherein the touch device (100)
comprises a partial reconfiguration controller (118) to:
retain positions of UI elements lying within a user-defined enclosed area
(206) on a current UI element screen, and
reconfigure positions of UI elements lying outside the user-defined enclosed
area (206) on a next UI element screen in the user-defined enclosed area (206).
23. The touch device (100) as claimed in claim 14, wherein the touch device (100)
comprises a complete reconfiguration controller (120) to:
optimize size of all UI elements to accommodate within the user-defined
enclosed area (206) on a current UI element screen, and
reconfigure positions of all the UI elements within the user-defined enclosed
area (206) on the current UI element screen.
24. The touch device (100) as claimed in claim 14, wherein the surface area processor
(114) prompts the user to again provide the user swipe input (202) when the
user-touchable area is determined to be below a predefined threshold area of the
touch-screen (108).
25. The touch device (100) as claimed in claim 14, wherein the reconfiguration controller
(116) previews at least one item in a portion, outside the user-touchable area, of the
touch-screen (108).
26. The touch device (100) as claimed in claim 14, wherein the reconfigured user interface
comprises soft-keys representing the functionality of the hard-keys of the touch
device (100).
27. A non-transitory computer-readable medium having a set of computer readable
instructions that, when executed, cause a processor (102) to:
receive a user swipe input (202) from a user on a touch-screen (108) of a
touch device (100);
determine a user-touchable area on the touch-screen (108) based on the
user swipe input (202); and
reconfigure a user interface on the touch-screen (108) within the user-touchable
area based on a reconfiguration setting.
Dated 20 January 2014
JAYA PANDEYA
IN/PA-1345
Agent for the Applicant
15 To,
The Controller of Patents
The Patent Office at New Delhi

Documents

Application Documents

# Name Date
1 166-del-2014-GPA-(28-01-2014).pdf 2014-01-28
2 166-del-2014-Correspondence-Others-(28-01-2014).pdf 2014-01-28
3 166-del-2014-Correspondence-Others-(04-02-2014).pdf 2014-02-04
4 SPEC IN.pdf 2014-02-05
5 GPOA.pdf 2014-02-05
6 FORM 5.pdf 2014-02-05
7 FORM 3.pdf 2014-02-05
8 FIG IN.pdf 2014-02-05
9 166-DEL-2014-Request For Certified Copy-Online(22-12-2014).pdf 2014-12-22
10 PD011885IN-SC.pdf 2014-12-23
11 166-DEL-2014-RELEVANT DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
12 166-DEL-2014-Changing Name-Nationality-Address For Service [08-05-2018(online)].pdf 2018-05-08
13 166-DEL-2014-AMENDED DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
14 166-DEL-2014-FER.pdf 2019-02-28
15 166-DEL-2014-OTHERS [21-08-2019(online)].pdf 2019-08-21
16 166-DEL-2014-FER_SER_REPLY [21-08-2019(online)].pdf 2019-08-21
17 166-DEL-2014-DRAWING [21-08-2019(online)].pdf 2019-08-21
18 166-DEL-2014-CLAIMS [21-08-2019(online)].pdf 2019-08-21
19 166-DEL-2014-PA [18-09-2019(online)].pdf 2019-09-18
20 166-DEL-2014-ASSIGNMENT DOCUMENTS [18-09-2019(online)].pdf 2019-09-18
21 166-DEL-2014-8(i)-Substitution-Change Of Applicant - Form 6 [18-09-2019(online)].pdf 2019-09-18
22 166-DEL-2014-OTHERS-101019.pdf 2019-10-14
23 166-DEL-2014-Correspondence-101019.pdf 2019-10-14
24 166-DEL-2014-US(14)-HearingNotice-(HearingDate-05-09-2022).pdf 2022-08-19
25 166-DEL-2014-FORM-26 [02-09-2022(online)].pdf 2022-09-02
26 166-DEL-2014-Correspondence to notify the Controller [02-09-2022(online)].pdf 2022-09-02
27 166-DEL-2014-Written submissions and relevant documents [15-09-2022(online)].pdf 2022-09-15
28 166-DEL-2014-PatentCertificate31-01-2023.pdf 2023-01-31
29 166-DEL-2014-IntimationOfGrant31-01-2023.pdf 2023-01-31

Search Strategy

1 search_strategy_28-02-2019.pdf

ERegister / Renewals

3rd: 18 Apr 2023 (From 20/01/2016 - To 20/01/2017)
4th: 18 Apr 2023 (From 20/01/2017 - To 20/01/2018)
5th: 18 Apr 2023 (From 20/01/2018 - To 20/01/2019)
6th: 18 Apr 2023 (From 20/01/2019 - To 20/01/2020)
7th: 18 Apr 2023 (From 20/01/2020 - To 20/01/2021)
8th: 18 Apr 2023 (From 20/01/2021 - To 20/01/2022)
9th: 18 Apr 2023 (From 20/01/2022 - To 20/01/2023)
10th: 18 Apr 2023 (From 20/01/2023 - To 20/01/2024)
11th: 12 Dec 2023 (From 20/01/2024 - To 20/01/2025)
12th: 14 Nov 2024 (From 20/01/2025 - To 20/01/2026)
13th: 23 Oct 2025 (From 20/01/2026 - To 20/01/2027)