Abstract: SYSTEM AND METHOD OF PROVIDING USER INTERACTION WITH A SOFT KEYPAD BASED DEVICE The present invention describes a system and method of providing user interaction with a soft keypad based device. The system comprises a bezel touch sensing unit for providing at least one user input, a touch screen based display unit for selecting one or more items of information displayed on receiving the at least one user input, and a processing unit coupled to the bezel touch sensing unit and the touch screen based display unit for processing one or more received signals. Figure 1
CLAIMS:
1. A system for providing user interaction with a soft keypad based device comprises:
a bezel touch sensing unit for providing at least one user input;
a touch screen based display unit for selecting one or more items of information displayed on receiving the at least one user input; and
a processing unit coupled to the bezel touch sensing unit and the touch screen based display unit for processing one or more signals received from the bezel touch sensing unit and the touch screen based display unit.
2. The system as claimed in claim 1, wherein the bezel touch sensing unit comprises one or more sensors configured on a bezel area of the device.
3. The system as claimed in claim 1, wherein the display unit comprises one or more sensors configured with a display screen of the device.
4. The system as claimed in claim 1, wherein the bezel touch sensing unit is configured for providing a scrolling feature, the scrolling feature on the bezel being provided to the user for obtaining at least one of a text and graphic representation.
5. The system as claimed in claim 4, wherein the text comprises at least one of a character and a word.
6. The system as claimed in claim 4, wherein the graphic representation comprises at least one of a symbol and a picture.
7. The system as claimed in claim 1, wherein the processing unit is configured to map the scroll position to a particular key in one or more pre-defined languages.
8. The system as claimed in claim 1, wherein the length of a scrolling area on the bezel of the device is configurable for providing the scrolling feature.
9. The system as claimed in claim 1, wherein the processing unit is configured for providing the user at least one option to choose at least one of a text and graphic representation being displayed on the screen.
10. The system as claimed in claim 1, wherein the processing unit is configured for providing the user interaction in one or more languages.
11. The system as claimed in claim 1, wherein the input received from one or more sensors of a touch screen is provided to a user interface in order to register for the key events.
12. The system as claimed in claim 1, wherein the input received from one or more sensors of a touch screen in relation to an edit text is registered for the key events.
13. A method of providing user interaction with a soft keypad based device, the method comprises:
scrolling a scrolling area of a bezel of the device for providing at least one user input;
sending at least one position index to a mapping module in a processing unit, based on the at least one user input;
displaying at least one of a text and graphic representation on a touch screen, based on the at least one position index;
selecting at least one of a text and graphic representation displayed on the touch screen by tapping; and
processing at least one of a text and graphic representation in the processing unit.
14. The method as claimed in claim 13, wherein the text comprises at least one of a character and a word.
15. The method as claimed in claim 13, wherein the graphic representation comprises at least one of a symbol and a picture.
16. The method as claimed in claim 13, wherein the input is provided by scrolling on a bezel area of the device to obtain at least one of a text and graphic representation, required by the user.
17. The method as claimed in claim 13, wherein the processing comprises registering at least one of a selected text and graphic representation in a user interface.
18. The method as claimed in claim 17, wherein the at least one of a text and graphic representation is selected by tapping the touch screen.
19. The method as claimed in claim 13, wherein an input received from one or more sensors of a touch screen is provided to a user interface in order to register for the key events.
FIELD OF THE INVENTION
The present invention generally relates to communication devices, and more particularly to a system and method of providing user interaction with a soft keypad based device.
BACKGROUND OF THE INVENTION
Integration of mobile phone features into wearable devices, especially watches, is a common concept in the communication scenario. Such integration typically provides the main functions of communication, such as voice calls, sending messages, and voice chat, supported by a wireless headset and other peripheral devices. As mobile phones become more and more sophisticated and compact, the entire miniaturization of the mobile phone makes it feasible to build a wrist-watch hand phone that is worn on the hand to directly receive and transmit telephone information. Conventional smart watches are operated through a keypad or a touch screen.
The screen in smart watches is confined. Typical smart watches are designed such that the keypad is embedded in the touch screen, and this results in user inconvenience in readability, selection, etc. Therefore, the input methods for smart watches and other small-screen devices are less convenient for users. In the case of a soft keypad, the visibility of the Graphical User Interface (GUI) is hindered whenever a key input has to be made. In the case of a hard keypad, even though no screen space is consumed, there is no flexibility in the user input method when compared to a soft keypad, such as switching between a QWERTY, 3x3, or language-specific keypad, or giving more word/alphabet suggestions on a long press of a key. In the case of voice key input, pronunciation may differ from person to person, and there are no fully efficient solutions available for voice based key/data input. In the case of a draw pad, the user writes the character on a screen and the character is taken by the respective GUI element, but the problem still exists: writing consumes screen space, and the writing style may differ from person to person.
Therefore, there is a need for a system and method for providing user interaction with a soft keypad based device that overcomes the above-mentioned limitations and effectively overcomes the space limitation.
SUMMARY
An objective of the present invention is to provide a system and method for providing user interaction with a soft keypad based device. An embodiment of the present invention describes a system for providing user interaction with a soft keypad based device. The system comprises a bezel touch sensing unit for providing at least one user input, a touch screen based display unit for selecting one or more items of information displayed on receiving the at least one user input, and a processing unit coupled to the bezel touch sensing unit and the touch screen based display unit for processing one or more received signals.
Another embodiment of the present invention describes a method of providing user interaction with a soft keypad based device. The method comprises scrolling a scrolling area of a bezel of the device for providing at least one user input, sending at least one position index to a mapping module in a processing unit based on the at least one user input, displaying at least one of a text and graphic representation on a touch screen based on the at least one position index, selecting at least one of a text and graphic representation displayed on the touch screen by tapping, and processing at least one of a text and graphic representation in the processing unit.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The aforementioned aspects and other features of the present invention will be explained in the following description, taken in conjunction with the accompanying drawings, wherein:
Figure 1 illustrates a block diagram of a system for providing user interaction with a soft keypad based device according to one embodiment of present invention.
Figure 2 illustrates a schematic representation of a smart watch according to one embodiment of present invention.
Figure 3 illustrates a schematic representation of the operation of a smart watch according to one embodiment of present invention.
Figure 4 illustrates a display representation in a smart watch according to one exemplary embodiment of present invention.
Figure 5 illustrates key pad operation of a smart watch according to one exemplary embodiment of present invention.
Figure 6 illustrates a schematic representation of vertical movement of the keypad of a smart device according to one exemplary embodiment of present invention.
Figure 7 illustrates a schematic representation of horizontal movement of the keypad of a smart device according to one exemplary embodiment of present invention.
Figure 8 illustrates a flow diagram of a method of providing user interaction with a soft keypad based device according to one embodiment of present invention.
Figure 9 illustrates a flow diagram of a method of providing user interaction with a soft keypad based device according to another embodiment of present invention.
DETAILED DESCRIPTION OF THE INVENTION
The embodiments of the present invention will now be described in detail with reference to the accompanying drawings. However, the present invention is not limited to the embodiments. The present invention can be modified in various forms. Thus, the embodiments of the present invention are only provided to explain more clearly the present invention to the ordinarily skilled in the art of the present invention. In the accompanying drawings, like reference numerals are used to indicate like components.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include operatively connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Figure 1 illustrates a block diagram of a system for providing user interaction with a soft keypad based device according to one embodiment of present invention. The system 100 comprises a bezel touch sensing unit 101, a touch screen based display unit 102 and a processing unit 103. The bezel touch sensing unit comprises one or more sensors configured on a bezel area of the device. The touch sensing units are inbuilt in the bezel area of the watch. The bezel touch sensing unit 101 is configured for providing at least one user input to the processing unit 103. The user input is provided by scrolling the bezel area of the watch inbuilt with one or more sensors. The touch sensing unit provides one or more signals of the one or more sensors to the processing unit. The processing unit processes the signals, maps the scroll position to a particular key in one or more pre-defined languages, and provides the same to the touch screen based display unit 102. The touch screen based display unit 102 comprises one or more sensors. The touch screen based display unit 102 displays one or more characters and/or words on receiving one or more signals from the processing unit and enables the user to select one or more characters and/or words by tapping a screen of the display unit 102. The input received from the one or more sensors of the touch screen based display unit 102 is provided to a user interface in order to register for the key events. In one embodiment, the bezel of the smart watch includes one or more hardkeys.
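The mapping described above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the class name `KeyMapper`, the layout dictionary, and the wrap-around behaviour are all assumptions made for the example.

```python
# Hypothetical sketch: map a bezel scroll position index to a key in a
# pre-defined language layout. All names and layout contents are
# illustrative assumptions, not taken from the specification.
class KeyMapper:
    def __init__(self, layouts):
        # layouts: dict mapping a language name to an ordered list of keys
        self.layouts = layouts

    def map_position(self, position_index, language="en"):
        keys = self.layouts[language]
        # Wrap around so continuous bezel scrolling cycles through the layout.
        return keys[position_index % len(keys)]

layouts = {"en": [chr(c) for c in range(ord("A"), ord("Z") + 1)]}
mapper = KeyMapper(layouts)
```

Supporting a further language would only require registering another ordered key list under a new language name.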
Figure 2 illustrates a schematic representation of a smart watch 200 according to one embodiment of present invention. According to this embodiment, a smart watch is the soft keypad based device, which includes a bezel 201 and a touch screen 202. The bezel of the smart watch holds the touch screen 202. The bezel 201 is configured for providing a scrolling feature. The scrolling feature on the bezel is provided to the user for obtaining the required character. A set of keys is provided on the bezel based on a predefined configuration or the requirement of the user. For instance, consider a 3x3 keypad implemented in the bezel. The user provides an input to the bezel 201 by scrolling and/or touching the bezel area. The input is processed and mapped with one or more of a character and word for display as a floating button on the touch screen 202 of the smart watch. The user selects the one or more of a character and word to register the key event. Based on the key event, the device either displays the character or performs a predefined activity. In one embodiment, the touch screen displays at least one of a number, symbol, picture, and graphic representation.
In another embodiment, when a key event is registered, it is not necessary that the associated key event displays a character on the screen; some pre-defined activity may be performed instead, such as resuming an application module or closing an application module. In such cases, the action is performed directly based on the key event rather than providing an interface to the user. In one exemplary case, if prediction is ON in the settings, the registered key event can be used to obtain the required predicted keywords and then display the corresponding words to the user. So, based on the key event, either a character is displayed or some activity is performed, as mentioned above.
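The two-way dispatch above (display a character, or perform a predefined activity) can be illustrated as follows. The function and the `activities` table are hypothetical names chosen for the sketch; the specification does not prescribe this structure.

```python
# Hedged sketch of key event dispatch: a registered key event either
# appends a character to the edit text or triggers a predefined
# activity (e.g. closing an application module). Names are assumptions.
def dispatch_key_event(event, edit_text, activities):
    # activities: dict of key event name -> callable side effect
    if event in activities:
        activities[event]()        # perform the predefined activity
        return edit_text           # nothing new is displayed
    return edit_text + event       # otherwise display the character

log = []
activities = {"CLOSE_APP": lambda: log.append("closed")}
```

With this split, adding a new predefined activity is a table entry rather than a change to the dispatch logic.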
Figure 3 illustrates a schematic representation of the operation of a smart watch according to one embodiment of present invention. The embodiment depicts that the bezel is scrolled using a user’s finger to display a character as a floating button on a touch screen of the watch. The character displayed as a floating button is selected by either tapping or scrolling the touch screen. Further, the selected character is displayed on the touch screen. According to one embodiment of present invention, the tapping of the floating button provides a key input to a specific user interface element; that element needs to be registered for the key events, such as an Edit text.
In one embodiment, the touch screen displays at least one of a text and graphic representation. Here, the text comprises at least one of a character and a word. The graphic representation comprises at least one of a symbol and a picture.
Figure 4 illustrates a display representation in a smart watch according to one exemplary embodiment of present invention. On scrolling the bezel, a character “R” is displayed along with three words: “Rome”, “Ring”, and “Rail”. These three words are displayed based on the configuration of the watch and when the prediction feature is ON in the smart watch. The user can select either the character or the words by tapping the touch screen. The user also has the option to scroll the touch screen in order to obtain further predicted words and then select the required word by tapping over the area of the screen displaying the desired word. The at least one of a character and word is selected by tapping the touch screen with a finger. The selected at least one of the character and word is then displayed on the touch screen.
The style of showing predictions may vary and depends upon the configuration of the smart watch. As shown in Figure 4, the predictions are shown in a dialler model, where the next predictions are shown when the dial is moved in the clockwise and anticlockwise directions. The movement of the finger for selecting predictions is depicted in Figure 5.
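The paging behaviour of the dialler-style prediction display can be sketched as below. The word list and the fixed window of three predictions mirror the “R”/“Rome”/“Ring”/“Rail” example of Figure 4, but the function name and data structure are assumptions for illustration only.

```python
# Sketch of dialler-model prediction paging: given the scrolled
# character, show a fixed window of predicted words; clockwise or
# anticlockwise movement pages forward or backward through further
# predictions. The word list is an illustrative assumption.
PREDICTIONS = {"R": ["Rome", "Ring", "Rail", "Rain", "Rose", "Rust"]}

def predicted_window(char, page=0, window=3):
    words = PREDICTIONS.get(char, [])
    start = page * window
    return words[start:start + window]
```

Paging past the available predictions simply yields an empty window, so the display logic needs no separate end-of-list handling.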
Figure 6 illustrates a schematic representation of vertical swiping on a bezel of the soft keypad based device according to one exemplary embodiment. The bezel is scrolled vertically to obtain and display an additional character. If the bezel is scrolled up, the next character is displayed. If the bezel is scrolled down, the previous character is displayed.
Figure 7 illustrates a schematic representation of horizontal swiping on a touch screen of the soft keypad based device according to another exemplary embodiment of present invention. The touch screen is swiped horizontally to obtain and display an additional character. If the touch screen is swiped towards the right, the previous character is displayed. If the touch screen is swiped towards the left, the next character is displayed.
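The directional navigation of Figures 6 and 7 amounts to stepping forward or backward through an ordered character sequence; a minimal sketch, under the assumption of a wrap-around A–Z sequence and hypothetical gesture names, is:

```python
# Sketch of the gesture-to-direction mapping described for Figures 6
# and 7: scroll-up and swipe-left advance to the next character,
# scroll-down and swipe-right return to the previous one. Gesture
# names and the wrap-around alphabet are assumptions.
ALPHABET = [chr(c) for c in range(ord("A"), ord("Z") + 1)]

def step(index, gesture):
    delta = {"scroll_up": 1, "swipe_left": 1,
             "scroll_down": -1, "swipe_right": -1}[gesture]
    # Modulo keeps the index inside the sequence, so scrolling down
    # from "A" wraps to "Z".
    return (index + delta) % len(ALPHABET)
```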
Figure 8 illustrates a flow diagram of a method of providing user interaction with a soft keypad based device according to one embodiment of present invention. Input is provided to a bezel input module by scrolling and/or touching the bezel area of the device with a first finger of the user. The bezel input module provides, to a mapping module, a position index corresponding to the bezel area scrolled or touched by the user. The mapping module provides a corresponding at least one of a character and word to a key input module, which is displayed as a floating button on a touch screen of the device. The user then provides another input by at least one of scrolling and touching the floating button with another finger. The display module either registers the key event or displays the character.
Figure 9 illustrates a flow diagram of a method of providing user interaction with a soft keypad based device according to another embodiment of present invention. At step 901, a scrolling area of a bezel of the device is scrolled for providing at least one user input. The input is provided by scrolling on the bezel area of the device to obtain one of a character and word according to the requirement of the user. In one embodiment of present invention, an input received from one or more sensors of a touch screen is provided to a user interface in order to register for the key events.
At step 902, at least one position index is sent to a mapping module in a processing unit, based on the at least one user input. At step 903, a character is displayed on a touch screen based on the at least one position index. At step 904, at least one of a character and word displayed on the touch screen is selected by tapping. At step 905, one or more selected characters are processed in the processing unit. According to one aspect of present invention, the processing of characters comprises registering one of a selected character and a selected word in a user interface.
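The steps above can be condensed into a single end-to-end sketch. The function name, the layout list, and the edit-text buffer are all assumed for illustration; the sketch only mirrors the flow of steps 901–905, not any particular implementation.

```python
# End-to-end sketch of steps 901-905: a bezel scroll yields a position
# index (901-902), the mapping module turns it into a character that
# is displayed (903), tapping selects it (904), and the processing
# unit registers it in a user-interface buffer (905). Names assumed.
def run_steps_901_to_905(position_index, layout, tap_selected=True):
    char = layout[position_index % len(layout)]   # 902-903: map and display
    buffer = ""
    if tap_selected:                              # 904: select by tapping
        buffer += char                            # 905: register in the UI
    return char, buffer

layout = [chr(c) for c in range(ord("A"), ord("Z") + 1)]
```

If the user never taps, the character is displayed but nothing is registered, matching the distinction between display and registration drawn above.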
Although the method and system of the invention have been described in connection with the embodiments of the present invention illustrated in the accompanying drawings, the invention is not limited thereto. It will be apparent to those skilled in the art that various substitutions, modifications and changes may be made thereto without departing from the scope and spirit of the invention.
| # | Name | Date |
|---|---|---|
| 1 | 2749-CHE-2014-IntimationOfGrant10-05-2023.pdf | 2023-05-10 |
| 2 | POA_Samsung R&D Institute India-new.pdf | 2014-06-09 |
| 3 | 2013_MSSG_1174__Form 5.pdf | 2014-06-09 |
| 4 | 2749-CHE-2014-PatentCertificate10-05-2023.pdf | 2023-05-10 |
| 5 | 2749-CHE-2014-Response to office action [09-05-2023(online)].pdf | 2023-05-09 |
| 6 | 2013_MSSG_1174_Drawings_3 June 2014.pdf | 2014-06-09 |
| 7 | 2749-CHE-2014-Written submissions and relevant documents [15-02-2022(online)].pdf | 2022-02-15 |
| 8 | 2013_MSSG_1174_CS_3 June 2014.pdf | 2014-06-09 |
| 9 | abstract 2749-CHE-2014.jpg | 2015-02-04 |
| 10 | 2749-CHE-2014-Correspondence to notify the Controller [01-02-2022(online)].pdf | 2022-02-01 |
| 11 | 2749-CHE-2014-FORM-26 [01-02-2022(online)].pdf | 2022-02-01 |
| 12 | 2749-CHE-2014-FER.pdf | 2019-06-28 |
| 13 | 2749-CHE-2014-US(14)-HearingNotice-(HearingDate-03-02-2022).pdf | 2022-01-10 |
| 14 | 2749-CHE-2014-RELEVANT DOCUMENTS [17-07-2019(online)].pdf | 2019-07-17 |
| 15 | 2749-CHE-2014-MARKED COPIES OF AMENDEMENTS [17-07-2019(online)].pdf | 2019-07-17 |
| 16 | 2749-CHE-2014-ABSTRACT [02-01-2020(online)].pdf | 2020-01-02 |
| 17 | 2749-CHE-2014-CLAIMS [02-01-2020(online)].pdf | 2020-01-02 |
| 18 | 2749-CHE-2014-FORM 13 [17-07-2019(online)].pdf | 2019-07-17 |
| 19 | 2749-CHE-2014-COMPLETE SPECIFICATION [02-01-2020(online)].pdf | 2020-01-02 |
| 20 | 2749-CHE-2014-FORM 4(ii) [30-12-2019(online)].pdf | 2019-12-30 |
| 21 | 2749-CHE-2014-DRAWING [02-01-2020(online)].pdf | 2020-01-02 |
| 22 | 2749-CHE-2014-PETITION UNDER RULE 137 [02-01-2020(online)].pdf | 2020-01-02 |
| 23 | 2749-CHE-2014-FER_SER_REPLY [02-01-2020(online)].pdf | 2020-01-02 |
| 24 | 2749-CHE-2014-OTHERS [02-01-2020(online)].pdf | 2020-01-02 |