Abstract: The invention relates to manipulating two applications simultaneously using multi-touch. In one embodiment, a method implemented on a touchscreen device for manipulating two applications simultaneously comprises: receiving a first touch input during display of a first application, the first touch input being received for at least a threshold time period; receiving a second touch input pertaining to a second application during display of the first application, the second touch input being received after the threshold time period while the first touch input is still being received; and displaying, in response to the second touch input, an overlay user interface pertaining to the second application over the display of the first application. Fig.1
CLAIMS
We claim:
1. A method implemented on a touchscreen electronic device for manipulating two applications simultaneously, the method comprising:
receiving a first touch input during display of a first application, the first touch input being received for at least a threshold time period;
receiving a second touch input pertaining to a second application during display of the first application, the second touch input being received after the threshold time period while the first touch input is still being received; and
displaying, in response to the second touch input, an overlay user interface pertaining to the second application over the display of the first application.
2. The method as claimed in claim 1, further comprising:
pairing the first application and the second application together in advance; and
storing pairing information in settings of the touchscreen electronic device.
3. The method as claimed in claim 2, wherein the pairing is done automatically by the touchscreen electronic device based on application usage timings, application usage frequency, or common data between the first application and the second application.
4. The method as claimed in claim 2, wherein the pairing is done based on one or more context specific user inputs received through a user interface unit.
5. The method as claimed in claim 1, further comprising:
inactivating the predefined action pertaining to the second application in the first application upon release of the first touch input.
6. The method as claimed in claim 1, wherein the second touch input is in the form of a gesture.
7. The method as claimed in claim 1, wherein the second touch input and the predefined action pertain to one or more applications other than the second application.
8. A method implemented on a touchscreen electronic device for manipulating two applications simultaneously, the method comprising:
receiving a first touch input during display of a first application, the first touch input being received for at least a threshold time period;
generating, in response to the first touch input, an overlay user interface over the display of the first application; and
receiving a second touch input pertaining to selection of a second application in the overlay user interface, the second touch input being received after the threshold time period while the first touch input is still being received.
9. The method as claimed in claim 8, wherein the overlay user interface comprises a home screen or an application menu.
10. The method as claimed in claim 8 further comprising:
performing, in response to the second touch input, a predefined action pertaining to the second application in the overlay user interface.
11. The method as claimed in claim 8 further comprising:
closing the overlay user interface upon release of the first touch input.
12. The method as claimed in claim 8, wherein generating the overlay user interface comprises receiving a user selection on an option to generate the overlay user interface from amongst a plurality of options.
13. The method as claimed in claim 8, wherein generating the overlay user interface comprises directly generating the overlay user interface without requiring any user selection.
14. A method implemented on a touchscreen device for manipulating two applications simultaneously, the method comprising:
receiving a first touch input on a first application icon, the first touch input being received for at least a threshold time period;
receiving a second touch input on a second application icon, the second touch input being received after the threshold time period while the first touch input is still being received; and
generating, in response to the second touch input, an overlay user interface having a plurality of items based on common data between the first application and the second application.
15. The method as claimed in claim 14, wherein the plurality of items comprises one or more of most recently used items, most frequently used items, recommended items, top categories/sub-categories of items, or predefined items.
16. The method as claimed in claim 14, further comprising:
pairing the first application and the second application together in advance; and
storing pairing information in settings of the touchscreen electronic device.
17. The method as claimed in claim 14, further comprising:
navigating through a plurality of screens to locate the second application icon when the second application icon is not present on a same screen as the first application icon, wherein the navigating is performed in the background while the first touch input holds the first application icon.
18. The method as claimed in claim 14, further comprising:
receiving a third touch input in the overlay user interface; and
performing a predefined action in response to the third touch input.
19. A touchscreen electronic device having a plurality of applications installed therein, the touchscreen electronic device comprising:
a touch-sensitive display comprising a touch-sensing mechanism configured to receive:
a first touch input during display of a first application, the first touch input being received for at least a threshold time period, and
a second touch input pertaining to a second application during display of the first application, the second touch input being received after the threshold time period while the first touch input is still being received; and
a controller in communication with the touch-sensitive display, wherein the controller is configured to:
display, in response to the second touch input, an overlay user interface pertaining to the second application over the display of the first application.
20. The device as claimed in claim 19, wherein the controller is further configured to:
pair the first application and the second application together in advance; and
store pairing information in settings of the touchscreen electronic device.
21. The device as claimed in claim 20, wherein the pairing is done automatically by the touchscreen electronic device based on application usage timings, application usage frequency, or common data between the first application and the second application.
22. The device as claimed in claim 20, wherein the pairing is done based on one or more context specific user inputs received through a user interface unit.
23. The device as claimed in claim 19, wherein the controller is further configured to:
inactivate the predefined action pertaining to the second application in the first application upon release of the first touch input.
24. The device as claimed in claim 19, wherein the second touch input is in the form of a gesture.
25. The device as claimed in claim 19, wherein the second touch input and the predefined action pertain to one or more applications other than the second application.
26. A touchscreen electronic device having a plurality of applications installed therein, the touchscreen electronic device comprising:
a touch-sensitive display comprising a touch-sensing mechanism configured to receive:
a first touch input during display of a first application, the first touch input being received for at least a threshold time period, and
a second touch input pertaining to selection of a second application in an overlay user interface, the second touch input being received after the threshold time period while the first touch input is still being received; and
a controller in communication with the touch-sensitive display, wherein the controller is configured to:
generate, in response to the first touch input, the overlay user interface over the display of the first application.
27. The device as claimed in claim 26, wherein the overlay user interface comprises a home screen or an application menu.
28. The device as claimed in claim 26, wherein the controller is further configured to:
perform, in response to the second touch input, a predefined action pertaining to the second application in the overlay user interface.
29. The device as claimed in claim 26, wherein the controller is further configured to:
close the overlay user interface upon release of the first touch input.
30. The device as claimed in claim 26, wherein generating the overlay user interface comprises receiving a user selection on an option to generate the overlay user interface from amongst a plurality of options.
31. The device as claimed in claim 26, wherein generating the overlay user interface comprises directly generating the overlay user interface without requiring any user selection.
32. A touchscreen electronic device having a plurality of applications installed therein, the touchscreen electronic device comprising:
a touch-sensitive display comprising a touch-sensing mechanism configured to receive:
a first touch input on a first application icon, the first touch input being received for at least a threshold time period, and
a second touch input on a second application icon, the second touch input being received after the threshold time period while the first touch input is still being received; and
a controller in communication with the touch-sensitive display, wherein the controller is configured to:
generate, in response to the second touch input, an overlay user interface having a plurality of items based on common data between the first application and the second application.
33. The device as claimed in claim 32, wherein the plurality of items comprises one or more of most recently used items, most frequently used items, recommended items, top categories/sub-categories of items, or predefined items.
34. The device as claimed in claim 32, wherein the controller is further configured to:
pair the first application and the second application together in advance; and
store pairing information in settings of the touchscreen electronic device.
35. The device as claimed in claim 32, wherein the touch-sensing mechanism is further configured to allow navigating through a plurality of screens to locate the second application icon when the second application icon is not present on a same screen as the first application icon, wherein the navigating is performed in the background while the first touch input holds the first application icon.
36. The device as claimed in claim 32, wherein the touch-sensing mechanism is further configured to receive a third touch input in the overlay user interface, and wherein the controller is configured to perform a predefined action in response to the third touch input.
DESCRIPTION
TECHNICAL FIELD
The invention generally relates to quick access of information and functionalities in touchscreen based electronic devices. More particularly, the invention relates to quickly manipulating two applications simultaneously on a touchscreen electronic device.
BACKGROUND
In the current scenario, most individuals use touchscreen devices that are bigger in size and require two-handed handling and interaction. While everyone wants to get things done quickly, such two-handed operation of touchscreen devices imposes a time constraint. Accordingly, shortcut icons or widgets are very popular for accessing information or invoking certain functionality. Additionally, interfaces for listing favourite, predefined, or recent items, such as calls, messages, chats, etc., are equally popular. A few other solutions known in this art are listed in the subsequent paragraphs.
One prior art describes gesture-based invocation of certain actions on a touch-sensitive device even when the screen is locked. The gesture provides a trigger to either unlock the device or perform other actions associated with that particular gesture. It also describes physically distinguishable areas on the screen for performing the gesture and the associated trigger.
Another prior art describes the display of two or more running applications side by side on the viewable area of the screen of a touch-enabled electronic device. These multiple running applications are presented in such a way that one of the running applications is always active for receiving user input for any trigger, such as a pinch to zoom or an exit from split-screen mode. If a user wants to switch between the running applications, the user can press and hold another application to activate it for receiving user inputs.
Another prior art describes context-based gesture input for an application or trigger. The gestures are then segmented for various actions; for example, when a gesture is performed, it is first checked against a context, and if it is not in the context for an intended trigger, the gesture does not provide any action. If the gesture is recognized, its segment has the flexibility of input-type determination based on the segments accessed using that particular gesture.
Another prior art describes managing multiple running applications on an electronic device, where the electronic device can display a single application at one time and can present multiple applications in an application view. This is performed in order to make the device efficient in terms of energy consumption and to improve the cognitive responses of the user.
Another prior art describes video playback triggers using gestures received from users’ fingers as an alternative to soft buttons in most media players, such as QuickTime, RealPlayer, etc. Sometimes these soft buttons disappear to provide a larger viewing area and reappear when needed for a user-intended operation, either forward or backward. That solution presents a method for touch-imparted triggers as an alternative to soft buttons. In response to detecting one of a plurality of unique touch gestures, routines of that solution impart a video media control function, the video media control function including one of play, pause, rewind, fast-forward, volume up, volume down, track forward, track backward, slow motion, and frame advance.
Another prior art describes a gesture parser which holds an identification route for every gesture a user makes, which is then matched to an established game profile. Once identified, these gestures are sent back to game applications, where the touch gesture is recognized and executed as a mouse click or keystroke associated with a computer OS or a computer game.
Despite the aforesaid teachings, there is still a need to provide improvements in this area of technology.
SUMMARY OF THE INVENTION
In accordance with the purposes of the invention, the present invention as embodied and broadly described herein provides threshold-based multi-touch in which a first touch has to be held for a threshold limit, and for as long as the user is interested in performing additional activities with a second touch-based input or gesture. Once the first touch is released, these additional activities are no longer available for manipulation by the second touch input or gesture. In this way, the second touch input or gesture no longer produces anomalous actions or undesired results. Further, the correct sequence of touch operations has to be performed to trigger the additional activities. Such an interface can be useful for people using touchscreen devices who are sometimes tired of browsing in a single application at one time or who want to use some other functionality pertaining to information handling or another application. For this purpose, a pair of applications can be accessed at a single time with this unique first-touch-threshold-based interface for customized information handling, using multi-touch in touch-enabled devices. This first touch threshold on the first application and the subsequent second touch on the second application may be defined by the pair sequence at the start of setting up the multi-touch interface.
Further, the invention provides uninterrupted video playback while accessing information related to other applications in said first-touch-threshold-based interface. This part of the invention solves the problem of pausing a running video and going back using the back button on smartphones to access other applications, such as the video gallery, messages, IMs, or any other application for that matter. In this way, users can access the video gallery while the current video keeps playing uninterrupted, or look through the message box and even reply while enjoying the video screen. The invention therefore solves problems known in the art by providing uninterrupted running of a single application while handling useful exchange of information with another application(s) using said first-touch-threshold-based interface for multi-touch in touch-enabled devices. In this way, the user can also handle and process large amounts of information while interacting with the pair of applications at a time.
These and other features and advantages will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
To further clarify advantages and features of the invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings, in which:
Figure 1 illustrates an exemplary method implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention.
Figure 2 illustrates another exemplary method implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention.
Figure 3 illustrates another exemplary method implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention.
Figure 4 illustrates an exemplary touchscreen electronic device for implementing a method for manipulating two applications simultaneously, in accordance with an embodiment of the invention.
Figure 5 illustrates exemplary application pairs for which multi-touch is enabled, in accordance with an embodiment of the invention.
Figure 6 illustrates exemplary individual applications for which multi-touch can be enabled, in accordance with an embodiment of the invention.
Figure 7 illustrates an exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 8 illustrates more exemplary application pairs, in accordance with an embodiment of the invention.
Figure 9 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 10 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 11 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 12 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 13 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 14 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 15 illustrates another exemplary application pair and corresponding preview window, in accordance with an embodiment of the invention.
Figure 16 illustrates an exemplary use case for invoking video gallery while playing a video, in accordance with an embodiment of the invention.
Figure 17 illustrates an exemplary use case for previewing a video selected from invoked video gallery in a preview window, in accordance with an embodiment of the invention.
Figure 18 illustrates an exemplary use case for invoking multiple applications, such as Phone, Contacts, and Messages at a time while playing a video, in accordance with an embodiment of the invention.
Figure 19 illustrates exemplary applications invoked while playing a video, in accordance with an embodiment of the invention.
Figure 20 illustrates an exemplary use case for previewing information from one of the invoked applications in a preview window, in accordance with an embodiment of the invention.
Figure 21 illustrates an exemplary use case for searching information originating from one of the invoked applications in a search window, in accordance with an embodiment of the invention.
Figure 22 illustrates an exemplary use case for invoking a specific functionality of another application while playing a video, in accordance with an embodiment of the invention.
Figure 23 illustrates an exemplary use case for invoking an application menu while playing a video, in accordance with an embodiment of the invention.
Figure 24 illustrates an exemplary use case for invoking a home screen while playing a video, in accordance with an embodiment of the invention.
Figure 25 illustrates an exemplary flow chart for implementing certain methods on a touchscreen electronic device, in accordance with an embodiment of the invention.
It may be noted that to the extent possible, like reference numerals have been used to represent like elements in the drawings. Further, those of ordinary skill in the art will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help to improve understanding of aspects of the invention. Furthermore, the one or more elements may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
DETAILED DESCRIPTION
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof. Throughout the patent specification, a convention employed is that in the appended drawings, like numerals denote like components.
Reference throughout this specification to “an embodiment”, “another embodiment” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures proceeded by "comprises" does not, without more constraints, preclude the existence of other devices or other sub-systems.
Various embodiments of the invention will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates an exemplary method (100) implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention. In said embodiment, the method (100) comprises: receiving (101) a first touch input during display of a first application, the first touch input being received for at least a threshold time period; receiving (102) a second touch input pertaining to a second application during display of the first application, the second touch input being received after the threshold time period while the first touch input is still being received; and displaying (103), in response to the second touch input, an overlay user interface pertaining to the second application over the display of the first application.
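By way of a purely illustrative, non-limiting example, the following Kotlin sketch models the touch sequence of the method (100): a first touch held for at least a threshold time period arms the multi-touch mode, and a second touch received while the first touch is still held causes an overlay pertaining to the paired second application to be displayed. The event type, the 500 ms threshold, and the application names are hypothetical and are not part of the claimed embodiments.

// Hypothetical, self-contained sketch of the method (100); types and values are illustrative only.
data class TouchEvent(val pointerId: Int, val isDown: Boolean, val timeMs: Long)

class MultiTouchOverlayController(
    private val thresholdMs: Long = 500,                 // assumed threshold time period
    private val showOverlay: (secondApp: String) -> Unit // e.g. render the overlay UI for the paired app
) {
    private var firstDownTime: Long? = null

    fun onTouch(event: TouchEvent, pairedApp: String) {
        when {
            // First touch input during display of the first application.
            event.isDown && firstDownTime == null -> firstDownTime = event.timeMs

            // Second touch input: only honoured after the threshold, while the first touch is still held.
            event.isDown && firstDownTime != null -> {
                val heldFor = event.timeMs - firstDownTime!!
                if (heldFor >= thresholdMs) showOverlay(pairedApp)
            }

            // Releasing the first touch disarms the multi-touch mode.
            !event.isDown && event.pointerId == 0 -> firstDownTime = null
        }
    }
}

fun main() {
    val controller = MultiTouchOverlayController { app -> println("Overlay for $app displayed") }
    controller.onTouch(TouchEvent(pointerId = 0, isDown = true, timeMs = 0), "Chaton")   // first touch down
    controller.onTouch(TouchEvent(pointerId = 1, isDown = true, timeMs = 650), "Chaton") // second touch after threshold -> overlay
}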
In a further embodiment, the method (100) may comprise: pairing (104) the first application and the second application together in advance; and storing (105) pairing information in settings of the touchscreen electronic device.
In a further embodiment, the pairing may be done automatically by the touchscreen electronic device based on application usage timings, application usage frequency, or common data between the first application and the second application.
In a further embodiment, the pairing may be done based on one or more context specific user inputs received through a user interface unit.
In a further embodiment, the method (100) may comprise: inactivating (106) the predefined action pertaining to the second application in the first application upon release of the first touch input.
In a further embodiment, the second touch input may be in the form of a gesture.
In a further embodiment, the second touch input and the predefined action pertain to one or more applications other than the second application, for example, in the case of a home screen or a menu of applications.
Figure 2 illustrates another exemplary method (200) implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention. In said embodiment, the method (200) comprises: receiving (201) a first touch input during display of a first application, the first touch input being received for at least a threshold time period; generating (202), in response to the first touch input, an overlay user interface over the display of the first application; and receiving (203) a second touch input pertaining to selection of a second application in the overlay user interface, the second touch input being received after the threshold time period while the first touch input is still being received.
In a further embodiment, the overlay user interface comprises a home screen or an application menu.
In a further embodiment, the method (200) may comprise: performing (204), in response to the second touch input, a predefined action pertaining to the second application in the overlay user interface.
In a further embodiment, the method (200) may comprise: closing (205) the overlay user interface upon release of the first touch input.
In a further embodiment, generating the overlay user interface comprises receiving a user selection on an option to generate the overlay user interface from amongst a plurality of options.
In a further embodiment, generating the overlay user interface comprises directly generating the overlay user interface without requiring any user selection.
Figure 3 illustrates another exemplary method (300) implemented on a touchscreen electronic device for manipulating two applications simultaneously, in accordance with an embodiment of the invention. In said embodiment, the method (300) comprises: receiving (301) a first touch input on a first application icon, the first touch input being received for at least a threshold time period; receiving (302) a second touch input on a second application icon, the second touch input being received after the threshold time period while the first touch input is still being received; and generating (303), in response to the second touch input, an overlay user interface having a plurality of items based on common data between the first application and the second application.
In a further embodiment, the plurality of items comprises one or more of most recently used items, most frequently used items, recommended items, top categories/sub-categories of items, or predefined items.
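As a hypothetical, non-limiting illustration of how such a plurality of items could be assembled from data common to the first application and the second application, the Kotlin sketch below merges the most recently used and most frequently used items into a single overlay list; the item model and field names are invented for the example.

// Hypothetical sketch: deriving overlay items from data common to two paired applications.
data class Item(val name: String, val lastUsedMs: Long, val useCount: Int)

fun overlayItems(commonData: List<Item>, limit: Int = 10): List<Item> {
    val mostRecent = commonData.sortedByDescending { it.lastUsedMs }.take(limit)    // most recently used items
    val mostFrequent = commonData.sortedByDescending { it.useCount }.take(limit)    // most frequently used items
    // Merge the two views, most recent first, without duplicates.
    return (mostRecent + mostFrequent).distinctBy { it.name }.take(limit)
}

fun main() {
    val shared = listOf(
        Item("holiday.jpg", lastUsedMs = 1_000, useCount = 3),
        Item("invoice.pdf", lastUsedMs = 5_000, useCount = 1),
        Item("song.mp3", lastUsedMs = 2_000, useCount = 9)
    )
    println(overlayItems(shared, limit = 2)) // recent and frequent items merged into one overlay list
}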
In a further embodiment, the method (300) may comprise: pairing (304) the first application and the second application together in advance; and storing (305) pairing information in settings of the touchscreen electronic device.
In a further embodiment, the method (300) may comprise: navigating (306) through a plurality of screens to locate the second application icon when the second application icon is not present on the same screen as the first application icon, wherein the navigating may be performed in the background while the first touch input holds the first application icon.
In a further embodiment, the method (300) may comprise: receiving (307) a third touch input in the overlay user interface; and performing (308) a predefined action in response to the third touch input.
Figure 4 illustrates an exemplary touchscreen electronic device (400), hereinafter referred to as the device (400), for implementing a method (100, 200, 300) for manipulating two applications simultaneously, in accordance with an embodiment of the invention. The device (400) generally has a plurality of applications installed therein. Further, the device comprises: a touch-sensitive display (401) serving as both an input unit and an output unit, a controller (402), and a memory (403) to store data.
In one embodiment, the device (400) comprises: a touch-sensitive display (401) comprising a touch-sensing mechanism configured to receive: a first touch input during display of a first application, the first touch input being received for at least a threshold time period, and a second touch input pertaining to a second application during display of the first application, the second touch input being received after the threshold time period while the first touch input is still being received; and a controller (402) in communication with the touch-sensitive display (401), wherein the controller (402) is configured to: display, in response to the second touch input, an overlay user interface pertaining to the second application over the display of the first application.
In a further embodiment, the controller (402) may be configured to: pair the first application and the second application together in advance; and store pairing information in settings of the device (400).
In a further embodiment, the pairing may be done automatically by the device (400) based on application usage timings, application usage frequency, or common data between the first application and the second application.
In a further embodiment, the pairing may be done based on one or more context specific user inputs received through a user interface unit.
In a further embodiment, the controller (402) may be configured to: inactivate the predefined action pertaining to the second application in the first application upon release of the first touch input.
In a further embodiment, the second touch input may be in the form of a gesture.
In a further embodiment, the second touch input and the predefined action pertain to one or more applications other than the second application.
In one embodiment, the device (400) comprises: a touch-sensitive display (401) comprising a touch-sensing mechanism configured to receive: a first touch input during display of a first application, the first touch input being received for at least a threshold time period, and a second touch input pertaining to selection of a second application in an overlay user interface, the second touch input being received after the threshold time period while the first touch input is still being received; and a controller (402) in communication with the touch-sensitive display (401), wherein the controller (402) is configured to: generate, in response to the first touch input, the overlay user interface over the display of the first application.
In a further embodiment, the overlay user interface comprises a home screen or an application menu.
In a further embodiment, the controller (402) may be configured to: perform, in response to the second touch input, a predefined action pertaining to the second application in the overlay user interface.
In a further embodiment, the controller (402) may be configured to: close the overlay user interface upon release of the first touch input.
In a further embodiment, generating the overlay user interface comprises receiving a user selection on an option to generate the overlay user interface from amongst a plurality of options.
In a further embodiment, generating the overlay user interface comprises directly generating the overlay user interface without requiring any user selection.
In one embodiment, the device (400) comprises: a touch-sensitive display (401) comprising a touch-sensing mechanism configured to receive: a first touch input on a first application icon, the first touch input being received for at least a threshold time period, and a second touch input on a second application icon, the second touch input being received after the threshold time period while the first touch input is still being received; and a controller (402) in communication with the touch-sensitive display (401), wherein the controller (402) is configured to: generate, in response to the second touch input, an overlay user interface having a plurality of items based on common data between the first application and the second application.
In a further embodiment, the plurality of items comprises one or more of most recently used items, most frequently used items, recommended items, top categories/sub-categories of items, or predefined items.
In a further embodiment, the controller (402) may be configured to: pair the first application and the second application together in advance; and store pairing information in settings of the device (400).
In a further embodiment, the touch-sensing mechanism may be further configured to allow navigating through a plurality of screens to locate the second application icon when the second application icon is not present on the same screen as the first application icon, wherein the navigating may be performed in the background while the first touch input holds the first application icon.
In a further embodiment, the touch-sensing mechanism may be further configured to receive a third touch input in the overlay user interface, and the controller (402) may be configured to perform a predefined action in response to the third touch input.
In a further embodiment, the device may comprise a memory (403) to store data.
Before any multi-touch action is performed, users have an option to activate multi-touch using a settings option and can select the pairs of applications with which they want to use multi-touch. Within a pair there can be a step-based priority; for example, Contacts first has to be touched, and only then would multi-touch be active for Chaton. If the user touches and holds Chaton and then Contacts, the multi-touch would not give any results. The first touch can have a time threshold which becomes operative once the multi-touch option is activated. Once the threshold for the first touch is reached, the second touch can have a meaningful designated action associated with the corresponding gesture.
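A non-limiting Kotlin sketch of such directional, step-based pairing is given below: pairs stored in the settings are ordered, so holding Contacts and then touching Chaton activates multi-touch, whereas the reverse order gives no result. The class and method names are hypothetical.

// Hypothetical sketch of ordered application pairs stored in the settings.
// A pair (first, second) is directional: holding "second" first does not activate multi-touch.
class MultiTouchSettings {
    private val pairs = mutableSetOf<Pair<String, String>>()

    fun pair(first: String, second: String) { pairs += first to second }   // store pairing information

    fun isValidSequence(heldApp: String, tappedApp: String): Boolean =
        (heldApp to tappedApp) in pairs
}

fun main() {
    val settings = MultiTouchSettings()
    settings.pair("Contacts", "Chaton")
    println(settings.isValidSequence("Contacts", "Chaton")) // true  -> multi-touch active
    println(settings.isValidSequence("Chaton", "Contacts")) // false -> no result, per the step-based priority
}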
First of all, a user can set up the multi-touch as shown in Figures 5 and 6. More specifically, Figure 6 illustrates exemplary individual applications, such as Chaton, Files, Gallery, Wi-Fi, Bluetooth, and Mobile Internet, for which multi-touch can be enabled; while Figure 5 illustrates setting up multi-touch for paired applications, such as Contacts with Chaton, Files with Chaton, Search with Storage, Contacts with Messages, Camera with Gallery, and Search with Mail. The pairing may be depicted through GUI arrows, which may also indicate a hierarchy in the application pair.
The multi-touch setting can be done similarly to any other setting on hand-held devices or smartphones. This particular multi-touch setting binds application pairs to be associated with the multi-touch capability. There can be specificity about choosing these application pairs, and this specificity guides the user in selecting pairs of applications. These application pairs may be dependent in a hierarchical fashion. For example, the hierarchy may be defined such that only when the application associated with the threshold-guided first touch input is touched, and the threshold is reached, will a touch on the consecutive application of the designated pair invoke the preview of the data associated with it.
Figure 7 illustrates the setting up of the multi-touch pairs. As shown, Files is paired with Chaton to see recent files shared via Bluetooth and Wi-Fi Direct in a preview window (701). Here, the files shared the maximum number of times can be shown first. Figure 8 illustrates that frequently used applications, such as Chaton, SNS, and Search, can be automatically paired up and activated with relevant or predefined applications, such as Contacts, Gallery, and Mail respectively, in order to remove the burden of multi-touch settings from the user.
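The automatic pairing of Figure 8 could, purely as an illustration, be approximated by ranking applications by usage frequency and attaching a predefined companion application to each of the top-ranked ones, as in the hypothetical Kotlin sketch below; the usage counts and the companion mapping are invented for the example.

// Hypothetical sketch of automatic pairing: the most frequently used applications are paired
// with predefined companion applications, sparing the user manual multi-touch settings.
fun autoPair(
    usageCounts: Map<String, Int>,     // e.g. launches per application
    companions: Map<String, String>,   // predefined companion per application
    topN: Int = 3
): List<Pair<String, String>> =
    usageCounts.entries
        .sortedByDescending { it.value }
        .take(topN)
        .mapNotNull { (app, _) -> companions[app]?.let { app to it } }

fun main() {
    val pairs = autoPair(
        usageCounts = mapOf("Chaton" to 120, "SNS" to 90, "Search" to 75, "Clock" to 5),
        companions = mapOf("Chaton" to "Contacts", "SNS" to "Gallery", "Search" to "Mail")
    )
    println(pairs) // [(Chaton, Contacts), (SNS, Gallery), (Search, Mail)]
}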
Figure 9 illustrates a use case (1_1) where the first touch and hold is on Contacts using, say, a first finger/thumb during two-handed phone operation. This would activate multi-touch, and the second touch using a second finger/thumb on Chaton would open a “preview window” (901) that is intelligently customized in the space available on the device screen. The preview window (901) can consist of information such as the 10 most frequently contacted IDs, the 5 most recently contacted IDs, the most recent conversation with specific contacts, any new conversation from the contact list, frequently contacted contacts, and so on.
Figure 10 illustrates a similar use case (1_1) as that of the previous figure, but with an application pair of Contacts and Email. In all the use cases, the user may provide a first touch and hold input on a first application icon on a current screen, and may navigate through a plurality of screens to locate the second application icon when it is not present on the current screen. This navigation may be displayed in the background while the first touch input holds the first application icon over all of the plurality of screens. For example, the first touch and hold input will freeze the Contacts icon on the current screen, while the user may swipe through the plurality of screens using the second touch in order to locate the Email icon.
Figure 11 illustrates another use case (2_1), where the application pair is Gallery and Social Networking Service (SNS). A first touch and hold on Gallery opens a preview window (901) containing information or thumbnails for the 10 most recent pictures shared, the 10 most recently messaged contacts, the 5 most recent friend requests, and so on. A button (111), for instance a red circular button, may be shown in the preview window, which can be pressed using a second touch to view more information or pictures. Alternatively, a user may provide a swipe input in the preview window using the second touch to see more and more information or pictures.
Figure 12 illustrates a similar use case (2_1) as that of the previous figure, but with an application pair of Gallery and Chaton. Here, the preview window (901) contains information such as the 10 most recently reviewed pictures, the 10 most recently messaged contacts, and so on. As shown, this preview window (901) can also be provided with said button (111) to view more information. In fact, said button may be provided in any preview window where there is scope and a requirement to see more information.
Figure 13 illustrates another use case (3_1), where the application pair is Gallery and Camera. A first touch and hold on Gallery opens a preview window (901) containing pictures, for example, the 10 most recent pictures taken, the 5 categories having the most pictures in the gallery, and so on.
Figure 14 illustrates another use case (4_1), where the application pair is Search and Storage. A first touch and hold on the Search application opens a preview window (901) containing information, such as the most recent file exchanges. The user may interact with the preview window using a second touch as long as the first touch is still held.
Figure 15 illustrates a similar use case (4_1) as that of the previous figure, but with an application pair of Search and Samsung Map. A first touch and hold on the Search application opens a preview window (901) containing information, such as the most recently visited places and/or the duration of stay. The user may interact with the preview window using a second touch as long as the first touch is still held. In this way, multiple applications can be checked for updates and notifications without going to each application individually; for example, just by holding on to the Search icon, one can share or upload in Storage, do hangout chats, check the recent Samsung Map, check out locations or nearby landmark places on Samsung Map, or see recent mails.
Accordingly, this multi-touch interface can be very handy for exploring other features and applications over the display of a running application without disrupting the running application, such as a media player playing a video. Suppose that during a video play, a user needs to access the video gallery or a picture gallery, or wants to look for a contact to make a phone call. Conventionally, the user has to press the back button and exit the video play to do so. The invention, on the other hand, enables the user to remain on the video screen; it does not interrupt or pause the video and still allows invoking certain functionalities using multi-touch control. Further, these multi-touch functionalities may be automatically customized to invoke actions upon receiving pre-designated touch or gesture inputs. Figure 16 illustrates a use case (1_2), where a video gallery is overlaid on the running video screen by providing the second touch in the form of a gesture input matching, say, the letters ‘VG’. As shown, the user may provide the first touch and hold input using a first finger or thumb. While the first touch is unreleased, the user may provide a second touch matching said predefined gesture ‘VG’ to open the video gallery in an overlay window over the current video screen, without interrupting the video play.
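As a hypothetical, non-limiting illustration of how a recognized second-touch gesture label such as ‘VG’ might be mapped to an overlay action while the threshold-held first touch remains active, consider the Kotlin sketch below; the stroke classification itself is outside the sketch, and the labels and actions are illustrative only.

// Hypothetical sketch: mapping a recognized second-touch gesture label to an overlay action.
// Stroke classification is out of scope; the label ("VG", "Vh", "Vm") is assumed to be already recognized.
class GestureActions(private val actions: Map<String, () -> Unit>) {
    fun onGesture(label: String, firstTouchHeld: Boolean) {
        if (!firstTouchHeld) return    // gestures only count while the first touch is still held
        actions[label]?.invoke()       // unknown gestures produce no action
    }
}

fun main() {
    val gestures = GestureActions(mapOf(
        "VG" to { println("Overlay: video gallery") },
        "Vh" to { println("Overlay: favourite applications (Phone, Messages, Contacts)") }
    ))
    gestures.onGesture("VG", firstTouchHeld = true)   // opens the video gallery overlay
    gestures.onGesture("VG", firstTouchHeld = false)  // ignored: the first touch has been released
}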
In continuation of the previous figure, Figure 17 illustrates a preview window where a video selected from the video gallery (171) can be previewed over the main video without interrupting the main video. As shown, the user may swipe the selected video into the preview window (172) to see it in picture-in-picture (PIP) fashion. If the user develops interest in the selected video after seeing its preview in the preview window (172), the user may provide a third input in the preview window (172) to perform a predefined action. For example, the user may provide a single or double tap with a finger/thumb to open the selected video in full screen. Otherwise, the selected video can be watched later in full screen, depending on the user's interest.
On a similar note, Figure 18 illustrates another use case (2_2), where the user wants to access the Contacts, Messages, or Phone menu to perform some action. The user can do that using a simple continuous gesture, say, matching the letters ‘Vh’, while holding on to the threshold-based first touch. By performing this simple continuous gesture with the second independent touch while holding the first touch on the video screen with the first finger/thumb, favourite applications or last used applications (190), such as Phone, Messages, and Contacts, are overlaid over the playing video screen as shown in Figure 19. Depending on the user's interest, the overlaid applications can be accessed in a preview window without disrupting the playing video. Figure 20 illustrates such a preview window containing a preview (206) of the contacts when the user is holding on to the first threshold-based touch and using the second touch gesture input to activate the preview of contacts. Similarly, Figure 21 illustrates another version of the use case (2_2) where said threshold-based first touch and simultaneous second touch bring up a search window (210) for the contacts, hence enabling users to search for the contacts they are interested in. Figure 22 illustrates an alternative to bringing up the contacts on the running video, where the user holds on to the first touch for a threshold limit and allows the second touch to initiate a customized gesture, say the letters ‘Vm’, which is associated with the functionalities of the video screen and the initial letter ‘m’ of the contact. This directly brings up a customized window (220) with the particular contact(s). If the user wants to make a call directly from the video screen, the video may automatically go into pause mode once the call is connected and resume running upon release of the call.
Figure 23 illustrates a use case where, during a video play, a first touch and hold input activates an overlay preview window of the application menu (230) of the device (400). The overlay preview window is displayed over the video playing screen in a PIP fashion, without interrupting the video play. In this way, the user may access, through a second touch input, any application over the playing video screen, and not just a paired application.
Figure 24 illustrates a use case where, during a video play, a first touch and hold input activates an overlay preview window of a home screen (240). The overlay preview window is displayed over the video playing screen in a PIP fashion, without interrupting the video play. This can be a handy feature for devices having a big screen, such as 10 inches or more, as the user may easily access any feature or application present on the device (400), albeit in a preview window smaller than the full-screen view, but importantly without interrupting the video play.
Figure 25 illustrates a common flow chart (250) for several methods implemented on the device (400). Once the device (400) is started (251), the touch mechanism (401) waits to receive (252) a first touch. Once the first touch is received, it waits for the threshold limit of the first touch to be reached (253). Once the threshold limit is reached, it waits to receive (254) the second touch input. If the second touch input is received, it then checks (255) whether the multi-touch pair is valid or not. If the multi-touch pair is valid, a function corresponding to the multi-touch pair is triggered (256). Further, it keeps checking whether the first touch input is released (257) at any time during the operation. If yes, then said function is inactivated and the display returns (258) to the original screen.
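Purely as a non-limiting illustration, the flow chart (250) of Figure 25 may be modelled as a small state machine, as in the Kotlin sketch below; the state names, class names, and printed messages are hypothetical, while the numbers in the comments mirror steps 252 to 258 of the flow chart.

// Hypothetical sketch of the flow chart (250) as a state machine; types are invented for illustration.
enum class State { WAIT_FIRST_TOUCH, WAIT_THRESHOLD, WAIT_SECOND_TOUCH, TRIGGERED }

class MultiTouchFlow(private val isValidPair: (String, String) -> Boolean) {
    var state = State.WAIT_FIRST_TOUCH
        private set

    fun firstTouchDown() {                                  // 252: first touch received
        if (state == State.WAIT_FIRST_TOUCH) state = State.WAIT_THRESHOLD
    }

    fun thresholdReached() {                                // 253: threshold limit of first touch reached
        if (state == State.WAIT_THRESHOLD) state = State.WAIT_SECOND_TOUCH
    }

    fun secondTouch(heldApp: String, tappedApp: String) {   // 254: second touch received
        if (state == State.WAIT_SECOND_TOUCH && isValidPair(heldApp, tappedApp)) {   // 255: pair valid?
            state = State.TRIGGERED                         // 256: trigger the corresponding function
            println("Function for pair ($heldApp, $tappedApp) triggered")
        }
    }

    fun firstTouchReleased() {                              // 257: first touch released
        state = State.WAIT_FIRST_TOUCH                      // 258: inactivate and return to original screen
        println("Function inactivated; display returns to the original screen")
    }
}

fun main() {
    val flow = MultiTouchFlow { held, tapped -> held == "Contacts" && tapped == "Chaton" }
    flow.firstTouchDown()
    flow.thresholdReached()
    flow.secondTouch("Contacts", "Chaton")  // valid pair -> triggered
    flow.firstTouchReleased()               // back to the original screen
}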
Clearly, the present invention: reduces the time users take in browsing through applications to access information, allows access to multiple applications at the same time on the same screen, allows uninterrupted media play at the same time, allows access to preferred communication media on the same screen as well as when invoked, and allows linking or sharing of different media or applications at the same time. Although the present invention is described in the context of touchscreen devices, those skilled in the art will appreciate that the present invention may be implemented for gesture-based systems. For example, a user in front of a gaming console may provide a first gesture and hold input with one hand, while providing a second independent gesture input with the other hand to access some information or invoke certain functionality. While certain presently preferred embodiments of the invention have been illustrated and described herein, it is to be understood that the invention is not limited thereto, but may be otherwise variously embodied and practiced within the scope of the following claims.