System And Method For Performing Tangential Swipe For Devices With Curved Interfaces

Abstract: Method and system for interaction with a curved display on a User Equipment (UE). The UE, upon receiving a touch input, identifies, based on pre-configured data, the touch input as one of a surface swipe or a tangential swipe. Further, the UE determines at least one action to be performed corresponding to the identified touch input, and then triggers the determined action.

Patent Information

Application #:
Filing Date: 30 March 2015
Publication Number: 42/2016
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Status:
Email: patent@bananaip.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-01-10
Renewal Date:

Applicants

SAMSUNG R&D Institute India - Bangalore Private Limited
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore-560037, India

Inventors

1. Samudrala Nagaraju
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore-560037, India
2. Rames Palanisamy
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore-560037, India
3. Raghu Vallikkat Thazhathethil
# 2870, Orion Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bangalore-560037, India

Specification

TECHNICAL FIELD
[001] The present invention relates to the field of electronic devices with a curved touch screen display and more particularly to user interaction with electronic devices with a curved touch screen display.

BACKGROUND
[002] Devices such as, but not limited to, mobile phones, tablet computers, and computers have become an essential part of our day-to-day lives. With the evolution of technology, these devices are now capable of doing much more than what their predecessors were designed for, and they have become compact and sophisticated. Device manufacturers, well aware that new features attract customers, are in a constant race with their competitors to come up with better technologies and conquer the market.
[003] If we analyze the progress in this domain over the past few years, it is quite clear that device manufacturers are constantly trying to improve the way users interact with their devices. Physical keyboards have been replaced with virtual keyboards, i.e. touch screens, and users are now able to provide touch inputs to control various functionalities of the device. Some devices are also capable of supporting gesture-based control of functionalities.

[004] Now, the display screens are undergoing a major transformation. Mobile phones with curved edge screens have been introduced to the market. Such mobile phones can be configured to trigger different actions in response to touch inputs given on the flat surface and on the curved edge of the screen. Existing mobile phones with flat touch screens are capable of collecting touch inputs on the touch screen and identifying the action to be triggered in response to the collected touch input. However, with the existing technologies, the mobile device may not be able to collect and process touch inputs for the flat surface screen and the curved edge screen separately, which in turn results in a poor user experience.

OBJECT OF INVENTION
[005] An object of the embodiments herein is to allow interaction with a curved surface display.
[006] Another object of the embodiments herein is to enable the curved display to differentiate between a surface swipe and a tangential swipe at the same point.

SUMMARY
[007] In view of the foregoing, an embodiment herein provides a method for touch interaction with a curved display. In this method, a User Equipment (UE) with which the curved display is associated receives a touch input on the curved display. The UE identifies the touch input as at least one of a tangential swipe and a surface swipe. The UE further triggers at least one action corresponding to the identified touch input.
[008] Embodiments herein further disclose a device for allowing touch interaction through a curved display on the device. The device receives a touch input on the curved display, and then identifies the touch input as at least one of a tangential swipe and a surface swipe. The device further triggers at least one action corresponding to the identified touch input.
[009] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.

BRIEF DESCRIPTION OF THE FIGURES
[0010] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0011] FIGS. 1a and 1b illustrate user interaction with curved display on a User Equipment (UE), as disclosed in the embodiments herein;
[0012] FIG. 2 is a flow diagram that depicts steps involved in the process of triggering an action corresponding to a touch input, as disclosed in the embodiments herein;
[0013] FIG. 3 is a flow diagram that depicts steps involved in the process of differentiating between a surface swipe and a tangential swipe, by the UE, as disclosed in the embodiments herein; and
[0014] FIG. 4 illustrates example implementations of the touch interaction with curved displays, as disclosed in the embodiments herein.


DETAILED DESCRIPTION OF EMBODIMENTS
[0015] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0016] The embodiments herein disclose a mechanism for touch interaction with a User Equipment (UE) having curved display. Referring now to the drawings, and more particularly to FIGS. 1 through 4, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0017] FIGS. 1a and 1b illustrate user interaction with a curved display on a User Equipment (UE), as disclosed in the embodiments herein. The UE 101 can be any device with a curved display that provides a touch interface for the user to communicate with the UE 101. For example, the UE 101 can be a mobile phone, a tablet PC, a laptop, or a television with a curved display, wherein the curved display may comprise a primary screen and at least one secondary screen. The UE 101 can possess a single edge display (as in FIG. 1a) or a dual edge display (as in FIG. 1b). The UE 101 can be configured to receive a touch input from a user and identify, based on at least one real-time characteristic pertaining to at least one interaction parameter associated with the received touch input, the type of touch as one of a tangential curved swipe, a surface curved swipe, a multiple curved swipe, or an alternate curved swipe interaction. In an embodiment, the UE 101 can be further configured to differentiate between the tangential swipe and the surface swipe, even though certain interaction parameters can be common between the surface and tangential swipes.
[0018] The UE 101 can be further configured to determine at least one action to be triggered based on the identified touch input. In an embodiment, the action that needs to be triggered corresponding to an identified touch input is pre-defined and pre-configured with the UE 101.
[0019] FIG. 2 is a flow diagram that depicts steps involved in the process of triggering an action corresponding to a touch input, as disclosed in the embodiments herein. Initially, the UE 101 receives (202) a touch input on the display screen, wherein the touch input may be received on the primary or the secondary (if present) display screen. The UE 101 further captures real-time information pertaining to all interaction parameters associated with the received touch input. Further, based on the real-time characteristics of the interaction parameters, at least one event, and data stored in the reference database, the UE 101 identifies (204) the type of touch input, primarily as one of a tangential curved swipe or a surface curved swipe. Here, the term 'characteristic' may refer to the value of the interaction parameter being considered. For example, if the interaction parameter being considered is 'Swipe speed', then the swipe speed measured in real time is termed the characteristic of the interaction parameter 'Swipe speed'.
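As an illustration of this capture step, the following is a minimal Kotlin sketch (all names are hypothetical and not taken from the specification) that derives a couple of real-time characteristics, such as swipe speed and the start/end regions, from raw touch samples:

```kotlin
// Hypothetical sketch: deriving real-time characteristics (steps 202/204) from raw
// touch samples. None of these names come from the specification.

import kotlin.math.sqrt

// One raw sample of the touch input as reported by the touch sensor.
data class SwipeSample(
    val x: Float,
    val y: Float,
    val timestampMs: Long,
    val onCurvedRegion: Boolean
)

// Real-time characteristics of a few interaction parameters.
data class SwipeCharacteristics(
    val startOnCurvedRegion: Boolean,
    val endOnCurvedRegion: Boolean,
    val swipeSpeedPxPerMs: Float
)

fun deriveCharacteristics(samples: List<SwipeSample>): SwipeCharacteristics {
    require(samples.size >= 2) { "A swipe needs at least two samples" }
    // Path length accumulated segment by segment.
    val pathLength = samples.zipWithNext { a, b ->
        val dx = b.x - a.x
        val dy = b.y - a.y
        sqrt(dx * dx + dy * dy)
    }.sum()
    val elapsedMs = (samples.last().timestampMs - samples.first().timestampMs).coerceAtLeast(1L)
    return SwipeCharacteristics(
        startOnCurvedRegion = samples.first().onCurvedRegion,
        endOnCurvedRegion = samples.last().onCurvedRegion,
        swipeSpeedPxPerMs = pathLength / elapsedMs
    )
}
```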
[0020] Further, for the identified touch type, the UE 101 may further try to determine variations such as, but not limited to, a multiple curved swipe and an alternate curved swipe interaction. In various embodiments, data such as, but not limited to, the interaction parameters, the characteristics of different interaction parameters, the conditions (in terms of characteristics) that connect one or a combination of interaction parameters with a particular type of swipe, and the action(s) to be triggered corresponding to different surface/tangential swipes are pre-defined and stored in the reference database.
[0021] After identifying the type of touch input received, the UE 101, by referring to the data in the reference database, determines at least one action to be triggered in response to the identified touch input. The reference database possesses information pertaining to at least one action that needs to be triggered corresponding to different types of swipes. For example, based on the different interaction parameters defined for the surface curved swipe, and on the different characteristics of each of the interaction parameters defined, certain actions can be configured. Similarly, different actions can be configured for the tangential curved swipe.
[0022] If the touch input is identified as tangential curved swipe, then the UE 101 identifies the action(s) to be triggered, out of the actions configured for the tangential curved swipe, and triggers (208) the action. If the touch input is identified as surface curved swipe, then the UE 101 identifies the action(s) to be triggered, out of the actions configured for the surface curved swipe, and triggers (210) the action. The various actions in method 200 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
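The dispatch described in steps 208/210 could, for instance, look like the following minimal Kotlin sketch (hypothetical names; the pre-configured actions merely stand in for whatever the reference database maps each swipe type to):

```kotlin
// Hypothetical sketch of steps 208/210: trigger the action pre-configured for the
// identified swipe type. The println calls stand in for real pre-configured actions.

enum class SwipeType { SURFACE_CURVED_SWIPE, TANGENTIAL_CURVED_SWIPE }

val configuredActions: Map<SwipeType, () -> Unit> = mapOf(
    SwipeType.SURFACE_CURVED_SWIPE to { println("surface curved swipe action") },
    SwipeType.TANGENTIAL_CURVED_SWIPE to { println("tangential curved swipe action") }
)

fun triggerAction(identified: SwipeType) {
    // Step 208 (tangential) or 210 (surface): run whichever action is configured.
    configuredActions[identified]?.invoke()
}
```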
[0023] FIG. 3 is a flow diagram that depicts steps involved in the process of differentiating between a surface swipe and a tangential swipe, by the UE, as disclosed in the embodiments herein. Upon receiving a touch input, the UE 101 collects (302) real-time information pertaining to characteristics of various interaction parameters associated with the touch input. A few examples of interaction parameters and their definitions are given below (a data-structure sketch of these parameters follows the list):
• Swipe path: identifies the swipe path at a curved region
• Swipe speed: identifies the swipe speed at a curved region
• Swipe start: identifies the start of a swipe at a curved region
• Swipe end: identifies the end of a swipe at a curved region
• Touch area: identifies the extent of touch at the curved region for providing the touch input
• Number of curved regions: defines single, dual, and multi-surface tangential swipe
• Number of fingers on each curved region: refers to the number of fingers on each curved region
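As referenced above, one possible way to gather these interaction parameters into a single value for comparison against the reference database is shown in the following hypothetical Kotlin sketch (names are illustrative, not from the specification):

```kotlin
// Hypothetical grouping of the interaction parameters listed above.
data class InteractionParameters(
    val swipePath: List<Pair<Float, Float>>,  // swipe path at the curved region
    val swipeSpeedPxPerMs: Float,             // swipe speed at the curved region
    val swipeStartOnCurvedRegion: Boolean,    // start of swipe at a curved region
    val swipeEndOnCurvedRegion: Boolean,      // end of swipe at a curved region
    val touchAreaPx: Float,                   // extent of touch at the curved region
    val curvedRegionCount: Int,               // single, dual, or multi-surface swipe
    val fingersPerCurvedRegion: List<Int>     // fingers on each curved region
)
```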
[0024] Similarly, certain 'events' can also be defined, based on which the interaction parameters can be analyzed to identify the type of touch input. Examples of events are:
• Curved swipe towards: this event occurs if the curved swipe starts from the primary (i.e. normal) screen and moves towards the secondary (i.e. curved) screen.
• Curved swipe away: this event occurs if the curved swipe starts from the secondary (i.e. curved) screen and moves towards the primary (i.e. normal) screen.
[0025] One or more of the interaction parameters mentioned above, along with any other similar parameter, can be used to define different types of touch interactions, and corresponding actions to be triggered. For example, if the swipe start and swipe end are on the curved screen i.e. on the secondary screen, then the touch is recognized as a ‘Tangential curved swipe’.
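A minimal Kotlin sketch of this example rule, together with the two events defined above, might look as follows (hypothetical names; the actual conditions stored in the reference database may involve more parameters):

```kotlin
// Hypothetical sketch: classify a swipe from its start/end screens, and derive the
// 'curved swipe towards' / 'curved swipe away' events from the swipe direction.

enum class SwipeType { TANGENTIAL_CURVED_SWIPE, SURFACE_CURVED_SWIPE }
enum class SwipeEvent { CURVED_SWIPE_TOWARDS, CURVED_SWIPE_AWAY, NONE }

data class SwipeEndpoints(val startOnSecondaryScreen: Boolean, val endOnSecondaryScreen: Boolean)

fun classify(endpoints: SwipeEndpoints): Pair<SwipeType, SwipeEvent> {
    // Example rule from the text: start and end both on the secondary (curved) screen
    // means a tangential curved swipe.
    val type = if (endpoints.startOnSecondaryScreen && endpoints.endOnSecondaryScreen)
        SwipeType.TANGENTIAL_CURVED_SWIPE
    else
        SwipeType.SURFACE_CURVED_SWIPE
    val event = when {
        !endpoints.startOnSecondaryScreen && endpoints.endOnSecondaryScreen ->
            SwipeEvent.CURVED_SWIPE_TOWARDS  // primary screen towards the curved screen
        endpoints.startOnSecondaryScreen && !endpoints.endOnSecondaryScreen ->
            SwipeEvent.CURVED_SWIPE_AWAY     // curved screen towards the primary screen
        else -> SwipeEvent.NONE
    }
    return type to event
}
```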
[0026] Upon collecting the real-time characteristic information, and the event information if any, pertaining to at least one interaction parameter, the UE compares (304) the collected data with the reference database. If only one interaction parameter is associated with the touch input, then the characteristics and events, if any, corresponding to that interaction parameter are compared with the data in the reference database. If more than one interaction parameter is associated with the detected touch input, then a combination of the characteristics and events of the associated interaction parameters is compared with the reference database.
[0027] By comparing the collected data with the reference database, the UE identifies (306) a match for the collected data in the reference database. If a match is found, then the UE 101 identifies (308) the touch type mapped against the identified match, as the type of the received touch input. For example, if the match identified in the reference database is associated with Tangential curved swipe, then the received touch input is identified as ‘Tangential curved swipe’.
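One way to picture steps 304–308 is as a list of pre-defined entries, each pairing a condition over the collected characteristics with a swipe type. The following Kotlin sketch is hypothetical and only illustrates the lookup, not the actual contents of the reference database:

```kotlin
// Hypothetical sketch of the reference-database lookup (steps 304-308).

data class CollectedCharacteristics(
    val startOnCurved: Boolean,
    val endOnCurved: Boolean,
    val fingerCount: Int
)

// Each entry relates a condition over one or more interaction parameters to a swipe type.
data class ReferenceEntry(
    val matches: (CollectedCharacteristics) -> Boolean,
    val swipeType: String
)

val referenceDatabase = listOf(
    ReferenceEntry({ c -> c.startOnCurved && c.endOnCurved }, "Tangential curved swipe"),
    ReferenceEntry({ c -> !(c.startOnCurved && c.endOnCurved) }, "Surface curved swipe")
)

// Steps 306/308: the swipe type mapped against the first matching entry is returned.
fun identifySwipeType(collected: CollectedCharacteristics): String? =
    referenceDatabase.firstOrNull { it.matches(collected) }?.swipeType
```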
[0028] Consider an example implementation scenario in which the UE 101 is configured to differentiate between the surface swipe and the tangential swipe. In this case, the responsibility of differentiating between the tangential swipe and the surface swipe is assigned to a Graphical User Interface (GUI) layer of the UE 101. The GUI layer comprises four sub-layers, namely an application layer, a widgets and layouts layer, a graphics framework, and a driver layer. Further, each sub-layer is configured to identify certain interaction parameters as 'primary parameters' and others as 'secondary parameters'. An example of the sub-layer-wise distribution of the interaction parameters is provided in Table 1.
Device Graphics Layer | Primary/Base Parameters | Secondary/Additional Parameters
Application | Keypad, camera, etc. applications – registered event | Keypad, camera, etc. applications – registered event
Widgets & Layouts | Events Type: Tangential Swipe/Surface Swipe; Optional widget Id: Start, End | Swipe – number of swipes on each screen
Graphics Framework | Swipe – start, end, speed, area | Swipe – number of regions, number of screens
Driver Layer | Touch points – X, Y, touch area | Screen number touched, number of curved regions, number of touches on each curved region

Table 1
[0029] The primary and secondary parameters are then collectively analyzed, initially to determine the type of touch input, and then to identify the action to be triggered in response to the determined touch input.
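The sub-layer split in Table 1 and the pooling described in paragraph [0029] could be represented, purely as an illustrative and hypothetical Kotlin sketch, like this:

```kotlin
// Hypothetical sketch of Table 1: each GUI sub-layer contributes primary and
// secondary parameters, which are pooled before classification ([0029]).

data class LayerParameters(
    val layer: String,
    val primary: List<String>,
    val secondary: List<String>
)

val guiLayerParameters = listOf(
    LayerParameters(
        "Driver Layer",
        primary = listOf("touch point X", "touch point Y", "touch area"),
        secondary = listOf("screen number touched", "number of curved regions",
            "number of touches on each curved region")
    ),
    LayerParameters(
        "Graphics Framework",
        primary = listOf("swipe start", "swipe end", "swipe speed", "swipe area"),
        secondary = listOf("number of regions", "number of screens")
    ),
    LayerParameters(
        "Widgets & Layouts",
        primary = listOf("event type: tangential/surface swipe", "optional widget id: start, end"),
        secondary = listOf("number of swipes on each screen")
    ),
    LayerParameters(
        "Application",
        primary = listOf("registered events from keypad, camera, etc. applications"),
        secondary = listOf("registered events from keypad, camera, etc. applications")
    )
)

// All parameters are analysed collectively to determine the touch type and the action.
fun collectAllParameters(): List<String> =
    guiLayerParameters.flatMap { it.primary + it.secondary }
```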
[0030] The various actions in method 300 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 3 may be omitted.
[0031] FIG. 4 illustrates example implementations of the touch interaction with curved displays, as disclosed in the embodiments herein. A wearable device with a curved surface is depicted in this figure. Different actions are configured for combinations of interaction parameters and the different events corresponding to each of the interaction parameters. For example, if the touch input received is a surface swipe and the event is 'swipe down', then action1 is triggered, wherein action1 is pre-configured. Similarly, if, for the surface swipe, the event detected is 'swipe up', then action3 is triggered, as pre-configured. Likewise, if the touch input received is a tangential swipe and the event is 'swipe down', then action2 is triggered, wherein action2 is pre-configured for this combination of touch input and event. If, for the tangential swipe, the event detected is 'swipe up', then action4 is triggered, as pre-configured.
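This example amounts to a small lookup table from (swipe type, event) pairs to pre-configured actions. A hypothetical Kotlin sketch (the action names follow the placeholders used above):

```kotlin
// Hypothetical sketch of the FIG. 4 mapping: each (swipe type, event) pair is
// pre-configured with one of action1..action4.

enum class SwipeType { SURFACE_SWIPE, TANGENTIAL_SWIPE }
enum class SwipeEvent { SWIPE_DOWN, SWIPE_UP }

val actionTable: Map<Pair<SwipeType, SwipeEvent>, String> = mapOf(
    (SwipeType.SURFACE_SWIPE to SwipeEvent.SWIPE_DOWN) to "action1",
    (SwipeType.TANGENTIAL_SWIPE to SwipeEvent.SWIPE_DOWN) to "action2",
    (SwipeType.SURFACE_SWIPE to SwipeEvent.SWIPE_UP) to "action3",
    (SwipeType.TANGENTIAL_SWIPE to SwipeEvent.SWIPE_UP) to "action4"
)

fun actionFor(type: SwipeType, event: SwipeEvent): String? = actionTable[type to event]
```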
[0032] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in Fig. 1 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0033] The embodiments disclosed herein specify a mechanism for interacting with a curved display surface on a User Equipment (UE). The mechanism allows differentiation between a surface swipe and a tangential swipe, and provides a system thereof. Therefore, it is understood that the scope of protection extends to such a system and, by extension, to a computer readable means having a message therein, said computer readable means containing program code for implementation of one or more steps of the method, when the program runs on a server, a mobile device, or any suitable programmable device. The method is implemented in a preferred embodiment using the system together with a software program written in, for example, Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL or software modules being executed on at least one hardware device. The hardware device can be any kind of device which can be programmed, including, for example, any kind of computer such as a server or a personal computer, or any combination thereof, for example one processor and two FPGAs. The device may also include means which could be, for example, hardware means such as an ASIC, or a combination of hardware and software means, such as an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means or at least one hardware-cum-software means. The method embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software. Alternatively, the embodiments may be implemented on different hardware devices, for example using a plurality of CPUs.
[0034] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the claims as described herein.

CLAIMS
What is claimed is:
1) A method for touch interaction with a curved display, said method comprising:
receiving a touch input on said curved display, by a User Equipment (UE);
identifying said touch input as at least one of a tangential swipe and a surface swipe, by said UE; and
triggering at least one action corresponding to said identified touch input, by said UE.
2) The method as claimed in claim 1, wherein identifying said touch input as one of said tangential swipe and surface swipe further comprises:
collecting real-time characteristic of at least one interaction parameter pertaining to said touch input, by said UE;
comparing said real time characteristic of said at least one interaction parameter with a reference database, by said UE, wherein said reference database comprises information corresponding to at least one characteristic of said at least one interaction parameter, wherein said characteristic relates said at least one interaction parameter to at least one of said tangential swipe and surface swipe;
identifying a match for said characteristic of said interaction parameter, in said reference database, by said UE;
identifying said touch input as a tangential swipe if said identified match in said reference database corresponds to a tangential swipe, by said UE; and
identifying said touch input as a surface swipe if said identified match in said reference database corresponds to a surface swipe, by said UE.
3) The method as claimed in claim 2, wherein said reference database comprises information that relates a combination of real-time characteristics of a plurality of interaction parameters with at least one of said tangential swipe and surface swipe.
4) The method as claimed in claim 1, wherein said at least one action corresponding to said touch input is pre-configured.
5) The method as claimed in claim 1, wherein said interaction parameter is at least one of a swipe path, swipe speed, swipe start point, swipe end point, touch area, number of curved regions, and number of fingers on each curved region.
6) A device for allowing touch interaction through a curved display on said device, said device configured for:
receiving a touch input on said curved display;
identifying said touch input as at least one of a tangential swipe and a surface swipe; and
triggering at least one action corresponding to said identified touch input.
7) The device as claimed in claim 6, wherein said device is configured to identify said touch input as one of said tangential and surface swipe by:
collecting real-time characteristic of at least one interaction parameter pertaining to said touch input;
comparing said real time characteristic of said at least one interaction parameter with a reference database, wherein said reference database comprises information corresponding to at least one characteristic of said at least one interaction parameter, wherein said characteristic relates said at least one interaction parameter to at least one of said tangential swipe and surface swipe;
identifying a match for said characteristic of said interaction parameter, in said reference database;
identifying said touch input as a tangential swipe if said identified match in said reference database corresponds to a tangential swipe; and
identifying said touch input as a surface swipe if said identified match in said reference database corresponds to a surface swipe.
8) The device as claimed in claim 7, wherein said device is configured to store, in said reference database, information that relates a combination of real-time characteristics of a plurality of interaction parameters with at least one of said tangential swipe and surface swipe.
9) The device as claimed in claim 6, wherein said device is configured to provide at least one option to pre-configure said at least one action corresponding to said touch input.
10) The device as claimed in claim 6, wherein said device is configured to consider at least one of a swipe path, swipe speed, swipe start point, swipe end point, touch area, number of curved regions, and number of fingers on each curved region, as said interaction parameter.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 Form 2.pdf 2015-04-13
2 FORM 3.pdf 2015-04-13
3 Form 5.pdf 2015-04-13
4 Drawings.pdf 2015-04-13
5 Description(Complete) [19-10-2015(online)].pdf 2015-10-19
6 Drawing [19-10-2015(online)].pdf 2015-10-19
7 1659-CHE-2015-Form 1-110915.pdf 2015-11-23
8 1659-CHE-2015-Power of Attorney-110915.pdf 2015-11-23
9 1659-CHE-2015-Correspondence-110915.pdf 2015-11-23
10 1659-CHE-2015-FORM-26 [15-03-2018(online)].pdf 2018-03-15
11 1659-CHE-2015-FORM-26 [16-03-2018(online)].pdf 2018-03-16
12 1659-CHE-2015-FER.pdf 2019-08-13
13 1659-CHE-2015-FER_SER_REPLY [10-02-2020(online)].pdf 2020-02-10
14 1659-CHE-2015-COMPLETE SPECIFICATION [10-02-2020(online)].pdf 2020-02-10
15 1659-CHE-2015-CLAIMS [10-02-2020(online)].pdf 2020-02-10
16 1659-CHE-2015-ABSTRACT [10-02-2020(online)].pdf 2020-02-10
17 1659-CHE-2015-CORRESPONDENCE [10-02-2020(online)].pdf 2020-02-10
18 1659-CHE-2015-OTHERS [10-02-2020(online)].pdf 2020-02-10
19 1659-CHE-2015-US(14)-HearingNotice-(HearingDate-21-12-2023).pdf 2023-11-23
20 1659-CHE-2015-FORM-26 [08-12-2023(online)].pdf 2023-12-08
21 1659-CHE-2015-Correspondence to notify the Controller [08-12-2023(online)].pdf 2023-12-08
22 1659-CHE-2015-Annexure [08-12-2023(online)].pdf 2023-12-08
23 1659-CHE-2015-FORM-26 [20-12-2023(online)].pdf 2023-12-20
24 1659-CHE-2015-Written submissions and relevant documents [05-01-2024(online)].pdf 2024-01-05
25 1659-CHE-2015-RELEVANT DOCUMENTS [05-01-2024(online)].pdf 2024-01-05
26 1659-CHE-2015-PETITION UNDER RULE 137 [05-01-2024(online)].pdf 2024-01-05
27 1659-CHE-2015-Annexure [05-01-2024(online)].pdf 2024-01-05
28 1659-CHE-2015-Annexure [05-01-2024(online)]-1.pdf 2024-01-05
29 1659-CHE-2015-PatentCertificate10-01-2024.pdf 2024-01-10
30 1659-CHE-2015-IntimationOfGrant10-01-2024.pdf 2024-01-10

Search Strategy

1 search_23-07-2019.pdf

ERegister / Renewals

3rd: 08 Apr 2024 (from 30/03/2017 to 30/03/2018)
4th: 08 Apr 2024 (from 30/03/2018 to 30/03/2019)
5th: 08 Apr 2024 (from 30/03/2019 to 30/03/2020)
6th: 08 Apr 2024 (from 30/03/2020 to 30/03/2021)
7th: 08 Apr 2024 (from 30/03/2021 to 30/03/2022)
8th: 08 Apr 2024 (from 30/03/2022 to 30/03/2023)
9th: 08 Apr 2024 (from 30/03/2023 to 30/03/2024)
10th: 08 Apr 2024 (from 30/03/2024 to 30/03/2025)
11th: 27 Mar 2025 (from 30/03/2025 to 30/03/2026)