
Method And System For Providing A Soft Keyboard For A Computing Device

Abstract: The present invention provides a method and an apparatus (200). The apparatus comprises components to execute the method, the method comprising: rendering “N” character groups at a display device (202), each group comprising at least one character and optionally at least one sub-group; receiving a selection of a group; optionally hiding the non-selected groups; rendering characters or sub-groups forming part of the selected group at the display device (202), if it is determined that the selected group comprises a plurality of characters or at least one sub-group; further receiving a selection among the characters or sub-groups forming part of the selected group; rendering a selected character in a pre-determined area of the display device (202), if said selection further received by said input device pertains to a character; and rendering the “N” character groups at a remaining space of the display device (202).


Patent Information

Application #
Filing Date
29 June 2015
Publication Number
54/2016
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
mail@lexorbis.com
Parent Application
Patent Number
Legal Status
Grant Date
2022-04-04
Renewal Date

Applicants

Samsung India Electronics Pvt. Ltd.
Logix Cyber Park, Plot No. C 28-29, Tower D - Ground to 10th Floor, Tower C - 7th to 10th Floor, Sector-62, Noida – 201301, Uttar Pradesh, India

Inventors

1. PANDEY, Ankit Kumar
B-3/9, IGIMS Campus, Sheikhpura,Patna-800014, Bihar, India
2. AGRAWAL, Rishabh
Narendra Transport, Modipara, Sambalpur, Odisha-768002, India
3. BAJAJ, Jatin
F-7/47, Kashmir Avenue, Amritsar, Punjab-143001, India
4. MALLAR, Pranav
#44, 13 Cross, Vyalikaval, Malleswaram, Bangalore-560003, Karnataka, India
5. SINGH, Ritesh
Flat No. 17, Gokul Apartments, Off Karve Road, Vishnu Nagar, Dombivli West, Maharashtra – 421202, India
6. GUNASEELA BOOPATHY, Senthil Raja
868C, Thukkapet, Chengam, Thiruvannamalai (Dt), Tamil Nadu- 606709, India
7. AGARWAL, Dipanshu
9/148 Bagh Muzaffar Khan, Agra, Uttar Pradesh, India

Specification

FIELD OF THE INVENTION:

The present invention relates to a mechanism for interacting with computing devices and, in particular, to launching and operating keyboard-based user interfaces for computing devices.

BACKGROUND OF THE INVENTION:

With the advent of computing devices such as tablets, smart-phones and laptops, as well as assistive devices such as smart watches linked with mobile phones, the keyboard is no longer restricted to being a hardware device or an assembly of externally mounted control keys. For quite some time, keyboard functionality has been implemented through a graphical user interface known as a soft keyboard. Accordingly, the screen space is no longer restricted to displaying content; it also allows the user to operate the overall device.

As a result, soft keyboards or ‘keyboard interfaces’ have to occupy a significant portion of the screen space while being rendered, so as to retain the control inputs and features of a universal hardware-based keyboard. Accordingly, while operating the keyboard interface, it is common yet problematic to see the display obscured to a large extent. This problem is compounded further in the case of mobile phones and smart watches, which have a substantially smaller screen area than tablets and laptops. Accordingly, a user may have to continuously switch between the actual display and the keyboard interface, in case the keyboard interface is required to be operated based on directions provided within the rendered display (e.g. a website). This proves highly time-consuming in the long run, and a user is quite prone to fatigue.

Moreover, owing to the screen-size constraints of such devices, the inter-key distance within the keyboard interface is also significantly smaller than in a conventional hardware keyboard, and is at a bare minimum in the case of touch-screen based mobile phones and smart watches. Accordingly, the user has to be extra cautious while typing and may be compelled every now and then to correct mistyped words, owing to an incorrect key being selected very often. Despite ample caution exercised by the user while operating the keyboard interface, there remains a high probability of incorrect key selection or mistyped words/phrases. Moreover, closely packed keys in the keyboard interface are especially problematic for users with dexterity issues, such as those suffering from Parkinson’s disease.

Accordingly, there remains a long-felt need to provide a user with a substantial display of background information in a computing device, while the user operates a keyboard interface.

Yet another need of the hour is to increase the inter-key distance within the keyboard interface so as to minimize the occurrence of mistyped words.

OBJECT OF THE INVENTION:

Thus, it is an object of the present invention to provide a method and system for providing a keyboard interface that permits a substantial display of background information while the keyboard interface is rendered on the computing device.

It is another object of the present invention to increase the inter-key distance within the keyboard interface to minimize the occurrences of incorrect key selection or mistyped words.

SUMMARY OF THE INVENTION:

Accordingly, the present invention provides a method of rendering, on a display device, “N” character groups, each group comprising at least one character and optionally at least one sub-group. The method comprises
(a) receiving a selection of a group;
(b) optionally, hiding the non-selected groups;
(c) rendering, on the display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group;
(d) receiving a selection among the characters or sub-groups forming part of the selected group;
(e) rendering a selected character in a pre-determined area of the display device, if the selection in step (d) pertains to a character; and
(f) rendering, on a remaining space of the display device, the “N” character groups.

The present invention also provides an apparatus comprising a display device for rendering “N” character groups, each group comprising at least one character and optionally at least one sub-group. An input device is provided for receiving a selection of a group. A processor in operational interaction with said display device is configured to optionally hide the non-selected groups, and to render, on said display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group. The input device further receives a selection among the characters or sub-groups forming part of the selected group. Said processor, in operational interaction with said display device, renders a selected character in a pre-determined area of the display device, if said selection further received by said input device pertains to a character. Thereafter, said processor causes rendering of the “N” character groups at a remaining space of the display device.

To further clarify advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF FIGURES:

These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:

Figure 1 shows a flow chart corresponding to a first embodiment of the invention;
Figure 2 shows a detailed internal construction of the apparatus in accordance with the first embodiment of the present invention;
Figure 3 shows a detailed internal construction of the apparatus as described in Fig. 2;
Figure 4 shows an exemplary initial configuration of a keyboard interface, in accordance with the present invention;
Figure 5 illustrates an exemplary mechanism for operating upon an alphabet based keyboard interface;
Figure 6 illustrates different exemplary views of the keyboard interface;
Figure 7 illustrates an exemplary mechanism for switching between sub-views of the numeral based keyboard interface;
Figure 8 illustrates an exemplary mechanism for operating upon a special-character based keyboard interface;
Figure 9 illustrates an exemplary mechanism of operation to switch among various exemplary views of the keyboard interface;
Figure 10 illustrates an exemplary mechanism for operating upon the keyboard interface without contacting the screen surface;
Figure 11 illustrates exemplary shortcuts as applicable for discharging various functions of the keyboard interface;
Figure 12 illustrates an exemplary grouping of characters within the keyboard interface;
Figure 13 illustrates another exemplary grouping of characters within the keyboard interface;
Figure 14 illustrates a control flow for accomplishing the exemplary grouping as attained in Fig. 13; and
Figure 15 illustrates an exemplary sequence of operations upon the keyboard interface based on the grouping of characters as described in Fig. 13.

Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION:

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.

It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
[…]

Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.

Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.

Now referring to Figure 1, it can be seen that the present invention provides a method comprising:

(a) rendering (step 102), on a display device, “N” character groups, each group comprising at least one character and optionally at least one sub-group;
(b) receiving (step 104) a selection of a group;
(c) optionally, hiding (step 106) the non-selected groups;
(d) rendering (step 108), on the display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group;
(e) receiving (step 110) a selection among the characters or sub-groups forming part of the selected group;
(f) rendering (step 112) a selected character in a pre-determined area of the display device, if the selection in step (e) pertains to a character; and
(g) rendering (step 114), on a remaining space of the display device, the “N” character groups.
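Purely as an illustration (not the claimed implementation), the control flow of steps (a)-(g) can be sketched in Python. The names `select_character`, `pick_group` and `pick_member` are hypothetical; the two callables stand in for the selections received via the input device:

```python
def select_character(groups, pick_group, pick_member):
    """Resolve one character from "N" character groups (steps (a)-(g) sketch).

    groups: a list of groups; each group is a list of characters and/or
    nested lists standing for sub-groups.
    """
    selected = pick_group(groups)      # step (b): selection of a group
    while isinstance(selected, list):
        if len(selected) == 1:
            selected = selected[0]     # single-member group: resolve directly
            continue
        # steps (d)-(e): render the members, then receive a further selection
        selected = pick_member(selected)
    return selected                    # step (f): the character to be typed

# Hypothetical usage: three groups, the middle one holding a single character
groups = [["A", "B", "C"], ["D"], [["E", "F"], ["G", "H"]]]
char = select_character(groups, lambda gs: gs[1], lambda ms: ms[0])
```

Selecting the nested third group would recurse through its sub-groups until a single character is reached, mirroring the sub-group embodiment above.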

In an embodiment of the present invention, said at least one character is: an alphabet, a special character, a numeral, a control input, or a combination thereof.

In another embodiment of the present invention, said rendering of at least one of said “N” character groups, said characters or said sub-groups comprises rendering said “N” character groups, said characters or said sub-groups at a location that is dynamically determined.

In another embodiment of the present invention, the location thus dynamically determined is valid for a pre-determined time period.

In another embodiment of the present invention, said rendering further comprises varying, based on a time period, at least one non-dominant character of a character group or character sub-group.

In another embodiment of the present invention, said rendering further comprises varying at least a transparency level of the character group, the character or the character sub-group.

In another embodiment of the present invention, said optional hiding comprises hiding said non-selected groups in case said selected group comprises a plurality of characters or at least one sub-group.

In another embodiment of the present invention, said pre-determined area denotes a typing field within the screen area.

In another embodiment of the present invention, said remaining space denotes a screen area excluding the typing field.

In another embodiment of the present invention, the selection of the groups or the character is received through a user input provided by at least one of:
a pre-defined touch gesture,
a pre-defined tapping action,
a pre-defined gesture executed without contacting a screen surface, and
a pre-defined limb movement.

In another embodiment of the present invention, the present invention further comprises:
receiving a user input towards selecting a screen view comprising said ‘N’ character groups, out of a plurality of screen views that comprise different character groups.

In another embodiment of the present invention, if the selection in step (b) pertains to a group comprising a single character, the method directly proceeds to step (f) of rendering the selected character in a pre-determined area of the display device.

In another embodiment of the present invention, if the selection in step (e) pertains to a sub-group, the method further comprises:
(a) rendering, on the display device, characters forming part of the selected sub-group; and
(b) receiving a selection among the characters forming part of the selected sub-group.

In another embodiment of the present invention, each of said ‘N’ groups is based on at least one of:
(a) a lexicographical sequence; or
(b) a frequency of usage.
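By way of a hedged sketch, either grouping criterion might be realized as follows (Python; the function name `make_groups` and the particular frequency ranking are illustrative assumptions, not part of the specification):

```python
from string import ascii_uppercase

def make_groups(chars, n):
    """Split an ordered character sequence into at most N roughly equal groups."""
    size = -(-len(chars) // n)  # ceiling division: characters per group
    return [list(chars[i:i + size]) for i in range(0, len(chars), size)]

# (a) lexicographical sequence: A-Z split into, e.g., six groups
lexi_groups = make_groups(ascii_uppercase, 6)

# (b) frequency of usage: order by an assumed usage ranking before splitting
assumed_frequency_order = "ETAOINSHRDLCUMWFGYPBVKJXQZ"  # illustrative ranking
freq_groups = make_groups(assumed_frequency_order, 6)
```

Under criterion (b), the most frequently used characters land in the first groups, so commonly typed letters are reachable with fewer selections.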

Referring to Figure 2, the present invention also provides an apparatus (200) comprising,
a display device (202) for rendering “N” character groups, each group comprising at least one character and optionally at least one sub-group;
an input device (204) for receiving a selection of a group;
a processor (206) in operational interaction with said display device (202) configured to:
optionally, hide the non-selected groups; and
render, on the display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group;
wherein, said input device (204) further receives a selection among the characters or sub-groups forming part of the selected group;
wherein said processor (206) in operational interaction with said display device (202) is further configured to:
render a selected character in a pre-determined area of the display device, if said selection further received by said input device pertains to a character; and
further render, on a remaining space of the display device, the “N” character groups.

Referring to Figure 3, yet another typical hardware configuration of the apparatus 200 in the form of a computer system 200 is shown. The computer system 200 can include a set of instructions that can be executed to cause the computer system 200 to perform any one or more of the methods disclosed. The computer system 200 may operate as a standalone device or may be connected, e.g., using a network, to other computer systems or peripheral devices.

In a networked deployment, the computer system 200 may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 200 can also be implemented as or incorporated into various devices, such as a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop computer, a desktop computer, a communications device, a wireless telephone, a land-line telephone, a web appliance, a network router, switch or bridge, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while a single computer system 200 is illustrated, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.

The computer system 200 may include a processor 302 that is equivalent to the processor 206 discussed above. The processor 302 may be, e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor 302 may be a component in a variety of systems. For example, the processor 302 may be part of a standard personal computer or a workstation. The processor 302 may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processor 302 may implement a software program, such as code generated manually (i.e., programmed).

The computer system 200 may include a memory 304 that can communicate via a bus 308. The memory 304 may be a main memory, a static memory, or a dynamic memory. The memory 304 may include, but is not limited to, computer-readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. In one example, the memory 304 includes a cache or random access memory for the processor 302. In alternative examples, the memory 304 is separate from the processor 302, such as a cache memory of a processor, the system memory, or other memory. The memory 304 may be an external storage device or database for storing data. Examples include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card, memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device operative to store data. The memory 304 is operable to store instructions executable by the processor 302. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor 302 executing the instructions stored in the memory 304. The functions, acts or tasks are independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing and the like.

As shown, the computer system 200 may or may not further include a display unit 310, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later developed display device for outputting determined information. The display 310 may act as an interface for the user to see the functioning of the processor 302, or specifically as an interface with the software stored in the memory 304 or in the drive unit 316.

Additionally, the computer system 200 may include an input device 312 configured to allow a user to interact with any of the components of system 200. The input device 312 may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the computer system 200.

The computer system 200 may also include a disk or optical drive unit 316. The disk drive unit 316 may include a computer-readable medium 322 in which one or more sets of instructions 324, e.g. software, can be embedded. Further, the instructions 324 may embody one or more of the methods or logic as described. In a particular example, the instructions 324 may reside completely, or at least partially, within the memory 304 or within the processor 302 during execution by the computer system 200. The memory 304 and the processor 302 also may include computer-readable media as discussed above.

The present invention contemplates a computer-readable medium that includes instructions 324 or receives and executes instructions 324 responsive to a propagated signal so that a device connected to a network 326 can communicate voice, video, audio, images or any other data over the network 326. Further, the instructions 324 may be transmitted or received over the network 326 via a communication port or interface 320 or using a bus 308. The communication port or interface 320 may be a part of the processor 302 or may be a separate component. The communication port 320 may be created in software or may be a physical connection in hardware. The communication port 320 may be configured to connect with a network 326, external media, the display 310, or any other components in system 200 or combinations thereof. The connection with the network 326 may be a physical connection, such as a wired Ethernet connection or may be established wirelessly as discussed later. Likewise, the additional connections with other components of the system 200 may be physical connections or may be established wirelessly. The network 326 may alternatively be directly connected to the bus 308.

The network 326 may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, 802.1Q or WiMax network. Further, the network 326 may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to TCP/IP based networking protocols.

In an alternative example, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement various parts of the system 200.

Applications that may include the systems can broadly include a variety of electronic and computer systems. One or more examples described may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.

The system described may be implemented by software programs executable by a computer system. Further, in a non-limiting example, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement various parts of the system.

The system is not limited to operation with any particular standards and protocols. For example, standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) may be used. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed are considered equivalents thereof.

In the following paragraphs, a detailed description of an exemplary implementation of various features of the invention, and of a corresponding control flow associated with such features, is provided. It should however be understood that every implementation of the claimed method and apparatus need not follow the exemplary features and steps described in the following paragraphs. Thus, the scope of the claims is intended to be restricted only on the basis of the claims and their equivalents, and not on the basis of the examples provided herein below.

EXEMPLARY IMPLEMENTATION OF THE PRESENT INVENTION

The keyboard interface as provided by the present subject matter is a scattered interface, i.e. a set of groups (sub-user interfaces) that are dispersed across the screen area and also movable across the screen area, so as not to obscure a particular portion of the screen area continuously. However, despite movement, the relative orientation of the groups with respect to each other may remain the same. Moreover, such movability of groups across the screen area ensures that a particular screen area is not blocked continuously by the keyboard interface.

The path of movement, or the path to be traversed by the movable groups within the screen area, may be dynamically determined. In other words, the positions across the screen area where the groups shall be oriented for a pre-determined time period, before shifting to a next location, may be dynamically determined. Examples of factors employed for such dynamic determination may be the amount of background information currently displayed, the size of the text field where the characters are to be typed, the number of currently displayed groups, etc. Accordingly, the orientation of each group at a particular location in the screen area is transient, i.e. valid only for a pre-determined time period.

Furthermore, an initial position of any group, a rate of traversing the screen area, the size of each group to be displayed, the path and other displayable properties associated with each group are chosen such that the groups always maintain a pre-determined inter-group distance, both horizontally and vertically.
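One possible way to honour such a pre-determined inter-group distance is to lay the groups on a grid whose cell pitch equals the minimum spacing. The Python sketch below is an assumption for illustration only; the function name and grid strategy are not taken from the specification:

```python
def layout_positions(n_groups, area_w, area_h, min_dx, min_dy):
    """Place N groups so adjacent groups are at least min_dx apart
    horizontally and min_dy apart vertically (grid-based sketch)."""
    cols = max(1, area_w // min_dx)   # groups per row at the given pitch
    positions = []
    for i in range(n_groups):
        row, col = divmod(i, cols)
        positions.append((col * min_dx, row * min_dy))
    if positions and positions[-1][1] > area_h:
        raise ValueError("screen area too small for the requested spacing")
    return positions
```

A dynamic implementation could recompute this layout each time the available (non-typing-field) screen area changes.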

Furthermore, a transparency level associated with each group is variable between prescribed maximum and minimum values through a user-defined control. Alternatively, such variation may also happen automatically. Accordingly, even when a group obscures a particular portion of the screen, the background information otherwise hidden by that group may still be visible.
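As a minimal sketch of clamping to the prescribed bounds (the numeric limits below are assumed, not taken from the specification):

```python
MIN_ALPHA, MAX_ALPHA = 0.2, 0.9  # assumed prescribed minimum/maximum opacity

def set_group_alpha(requested):
    """Clamp a user- or system-requested transparency level to the bounds,
    so a group never becomes fully opaque or fully invisible."""
    return max(MIN_ALPHA, min(MAX_ALPHA, requested))
```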

Referring to Fig. 4, an exemplary initial configuration of the keyboard interface 402 is shown. Such an initial configuration corresponds to the state of the keyboard interface soon upon its invocation through a user interaction, exemplarily a finger 404 driven swiping action performed over a text typing field within the screen area 400. However, other modes of user interaction are also possible, such as a pre-defined tapping action, a pre-defined gesture executed without contacting the screen surface, or a pre-defined limb movement such as waving the hand in the air. In a further example, the contactless pre-defined gesture may include an eye-gaze based gesture, a voice based gesture, an air blow, etc.

In an example, the keyboard interface 402 is denoted by ‘N’ character groups, wherein each group represents a set of alphabets in alphabetical order. E.g. A-C-E denotes a character group. Moreover, the centre or non-dominant character in each group may change alphabetically. For example, C may change to B and then to D after brief intervals of time, to lend more accessibility to each group. The criterion for defining a particular character as dominant or non-dominant may be the order of alphabets most frequently typed or accessed by a user.
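The time-varying centre character described above can be sketched as follows (Python; it assumes the rotating candidates are the letters strictly between the two dominant characters, which is one reading of the A-C-E example, and the function name is hypothetical):

```python
def centre_character(dominants, tick):
    """Return the non-dominant centre of a group such as A-?-E at a given
    time step; e.g. the centre cycles through B, C, D for ("A", "E")."""
    lo, hi = ord(dominants[0]), ord(dominants[1])
    candidates = [chr(c) for c in range(lo + 1, hi)]
    return candidates[tick % len(candidates)]
```

Calling this with an increasing `tick` at brief intervals reproduces the C-to-B-to-D style rotation mentioned above.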

Figure 5 illustrates an exemplary mechanism of operation upon the alphabet based keyboard interface 402 of Fig. 4. A selection may be performed over a particular group via the finger 404 driven selection or other forms of user interaction, as already discussed in respect of Fig. 4.

Further, upon a group being selected within the screen area (say Screen 1), say A-C-E, the group is further splintered into its constituent characters A, B, C, D and E. Accordingly, the display changes to a state (say Screen 2) where only the splintered characters are shown and the rest of the groups are no longer shown.

Upon selection of a character as per Step 2, the display further changes its state (say Screen 3) such that the selected character gets typed within the text field. Further, all of the initial groups that were displayed at Screen 1 are re-displayed at Screen 3, though in a limited screen area 400 as compared to Screen 1. In other words, while the display space of the groups in Screen 1 was almost the entire screen area 400, the display space for the groups in Screen 3 does not include the text field (or typing field) and is accordingly curtailed.

Further, any selection of a group in Screen 3 leads to the same process flow as depicted from Screen 1 to Screen 3. Moreover, depending upon how much text has been typed within the typing field, the display area for the groups keeps on shrinking.

Figure 6 represents different views of the keyboard interface 402 for displaying the groups of numerals 402-2 and special characters or symbols 402-4 that act as the scattered keyboard interface. As shown in the figure, while the numerals may be displayed singly, there being only ten characters (0 to 9), the special characters or symbols may be grouped like alphabets, based on a universal order of representation of symbols or special characters on keyboards.

Figure 7 illustrates an exemplary mechanism for switching between different sub-views (1 and 2) of the numeral based keyboard interface 402-2. As evident from the figure, the numerals may be dispersed across two screen views, e.g. sub-view 1 having the numerals from 0 to 4, while sub-view 2 has the numerals from 5 to 9. Accordingly, either through horizontal scrolling or actuating an ellipsis symbol, either of the two sub-views may be selected. The user interaction required for such horizontal scrolling or actuating the ellipsis symbol may correspond to any one of the examples as depicted under Fig. 4.

Further, though not depicted, selection of any singular numeral leads to its typing within the text field, while the display of the numerals is shifted towards the remaining screen area 400 that excludes the typing field.

Figure 8 illustrates an exemplary mechanism for operating upon the groups of the special characters within the keyboard interface 402-4. As evident from the figure, the operation upon the groups of special characters for choosing a single special character and thereafter typing it within the text field follows the same sequence as described in Fig. 5 in respect of the operation over the alphabet based keyboard interface. Overall, a final selection of any special character leads to its typing within the text field, while the display of the groups of special characters is shifted towards the remaining screen area 400 that excludes the typing field.

Fig. 9 illustrates an exemplary mechanism to switch among various views of the keyboard interface. As aforesaid, such views denote alphabet based keyboard interface 402, numeral based keyboard interface 402-2 and special character based keyboard interface 402-4. In other words, the present figure describes selection of either of the aforesaid types of keyboard interfaces. Once any keyboard interface has been selected, the mechanism to operate upon the same ensues.

Now, to switch between the different views (402, 402-2, 402-4), the user may be provided with various options of executing a user interaction as described under Fig. 4. For example, a user may switch from the alphabet view to the numeral view either by a touch gesture or intuitively by simply rotating the hand in the air (contactless gesture), the same being represented in the present figure. In another example, a dedicated graphical button may be provided for switching among views.

Figure 10 illustrates an exemplary mechanism for operating upon the keyboard interface 402 without contacting or touching the screen area 400. As shown in the figure, a particular group may be selected through a touch free gesture, e.g. by rotating the hand in the air. However, other contactless gestures like eye gaze, vocal gesture, or blowing of air upon the screen may also be contemplated to perform the ‘touch-free’ selection of a group.

Further, it may be understood that the selection of a particular group may be done by initially bringing the same into ‘focus’. In other words, for a group to get selected, its prior highlighting is necessary to enable its selection. The same is evident from the figures as well, wherein the group “abe” is under focus (highlighted boundaries) prior to getting splintered into a, b, c, d, e. Such focus may be brought upon the groups automatically; for example, each of the currently displayed groups gets highlighted after fixed intervals of time. In another example, such automatic shifting of focus may be overridden by user interaction, wherein the user instead shifts the focus by a touch-free gesture. However, such user action may also be performed through touch gestures.
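The automatic, interval-based shifting of focus among the displayed groups can be sketched as below. The function name and signature are assumptions for illustration only.

```python
def focus_at(groups, elapsed, interval):
    """Return the group that is highlighted ('in focus') at a given
    elapsed time, when focus advances automatically to the next group
    after each fixed interval and wraps around."""
    return groups[(elapsed // interval) % len(groups)]
```

For example, with an interval of 2 time units and groups `["ABE", "FJN", "KOS"]`, focus moves from "ABE" to "FJN" after 2 units, and back to "ABE" after a full cycle.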

Figure 11 illustrates exemplary shortcuts as applicable for operating upon the keyboard interfaces 402, 402-2, 402-4. Such shortcuts may be provided to perform a keyboard function or action quickly, without actuating individual keys within the keyboard. For example, rather than actuating a space bar within the keyboard, one may use an intuitive touch gesture such as “dragging the finger in a forward direction on the screen” to request space for typing an additional word. Alternatively, the user may “drag the thumb and an adjacent finger” in opposite directions over the screen area 400 to increase the space between any two written words, rather than the conventional practice of bringing the cursor between the two words and thereafter pressing a space key.

Likewise, there may be many other shortcuts that can be conceived to perform actions without requiring any key actuation from the keyboard interface 402, 402-2, 402-4. In an example, various keyboard functions that can be executed using these gestures may be divided into three types of gestures as follows:

The touch gesture as represented in Fig. 11(a) may be used as a shortcut for performing various keyboard functions such as space, enter, shift, etc., and other functionalities (like keyboard view switching as in Fig. 9). New gestures can be added or redefined anytime through modification of settings.

As represented in Fig. 11(b), a number of continuous taps may be made as a gesture to execute the aforesaid shortcuts. In an example, characters or words frequently used by a particular user may be assigned a certain number of taps (for example, the word “yes” may be directly typed upon tapping the screen thrice). Moreover, the time interval between the tapping actions can be decided by the user to execute a shortcut. For example, taps made thrice with a time interval of 1 sec may lead to switching from one screen view to another.

As represented in Fig. 11(c), a contactless gesture such as a ‘hand waving in the air’ gesture can be used to switch the focus (or highlighting) between character groups and further select a character out of the focussed group. Moreover, a particular screen view (e.g. alphabets, numeric, symbols) can also be selected through such a contactless gesture.
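The three gesture families of Fig. 11 can be modelled as lookup tables mapped to keyboard actions. The gesture and action names below are hypothetical placeholders, not identifiers from the specification.

```python
# Hypothetical mappings for the three gesture families of Fig. 11.
TOUCH_GESTURES = {"drag_forward": "space", "swipe_up": "enter"}   # Fig. 11(a)
TAP_SHORTCUTS = {3: "yes"}       # Fig. 11(b): three taps types the word "yes"
CONTACTLESS = {"hand_wave": "shift_focus"}                        # Fig. 11(c)

def dispatch(kind, value):
    """Resolve a gesture event to its assigned keyboard action, if any.
    Returns None when no shortcut is assigned to the gesture."""
    table = {"touch": TOUCH_GESTURES,
             "tap": TAP_SHORTCUTS,
             "contactless": CONTACTLESS}[kind]
    return table.get(value)
```

Since the tables are plain dictionaries, new gestures can be added or redefined at any time, matching the configurable-settings behaviour described for Fig. 11(a).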

The forthcoming description describes the basis of creation of the character groups of alphabets, numerals and special characters as described so far. In an example, two types of groupings have been described, either of which may be adopted by the user to formulate a type of keyboard interface. However, the creation of character groups as described in the present subject matter shall not be construed as limiting the claimed subject matter, as the claimed subject matter is compatible with other types of character group formations that are visualizable in respect of the keyboard interface.

In addition, though the aforesaid keyboard interfaces have been described in terms of alphabets, numerals and symbols, the keyboard interface may also include character groups created with respect to control inputs such as F1, F2, Shift, Ctrl, Alt Gr etc. Further, the keyboard interface may also be construed to cover character groups that are a combination of alphabets, numerals, symbols and control inputs.

Also, though the forthcoming description has exemplarily highlighted groupings of characters (i.e. alphabets and numerals) in respect of the English language, the forthcoming description shall be strictly construed as acting as a mere example and definitely not limiting the scope of the present invention. This is because the forth-described rationale for grouping the characters is equally and comprehensively applicable to any known language or scheme of numerals/special characters (symbols) followed worldwide. Accordingly, the present keyboard may be understood as being multi-lingual in nature.

For example, the forth-described rationale for character grouping in Figs. 12-15 shall be equally applicable to indigenous languages like Hindi, Punjabi, Urdu, Tamil, Sanskrit or international languages like German, French, Hebrew, Arabic, Spanish etc. Accordingly, while grouping the characters of a particular language, the lexicography associated with that specific language may be useful for designing the keyboard. It is understood that such lexicography may be referred to for sequencing in terms of alphabets, numerals, symbols and characters as relevant to a particular language.

Similarly, the forth-described character groupings may also cover character groupings in respect of more than one language. For example, characters of English and Hindi may be represented simultaneously within the keyboard interface to provide various options to the user.

Figure 12 illustrates an exemplary grouping of characters within the keyboard interface 402, 402-2, 402-4. The grouping specific to the present Fig. 12 may be created through following steps:

a) Alphabets (1-26) are grouped lexicographically, with five or six elements in each group.
b) Likewise, numerals are dispersed across two screen views as shown in Fig. 7.
c) The special characters (symbols) are grouped based on usage pattern i.e. in an order of increased usage.
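Step a) above amounts to splitting the lexicographically ordered alphabet into near-equal groups. The following is a minimal sketch under that assumption; the function name and the choice of five groups are illustrative.

```python
import string

def lexicographic_groups(chars, n_groups):
    """Split a lexicographically ordered character set into n_groups
    groups of near-equal size (five or six letters each for A-Z),
    preserving the alphabetical order within and across groups."""
    base, extra = divmod(len(chars), n_groups)
    groups, start = [], 0
    for i in range(n_groups):
        size = base + (1 if i < extra else 0)   # earlier groups absorb the remainder
        groups.append(chars[start:start + size])
        start += size
    return groups
```

Applied to the English alphabet, `lexicographic_groups(string.ascii_uppercase, 5)` yields one group of six letters and four groups of five: `ABCDEF`, `GHIJK`, `LMNOP`, `QRSTU`, `VWXYZ`.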

The groups of alphabets and special characters may be referred to as a primary level, while a single character depiction amounts to a secondary level. Accordingly, as the numerals are depicted individually and not in groups, the depiction of numerals never denotes the primary level.

Further, various types of the aforesaid touch screen gestures or contactless gestures may be dedicated to switch from the primary level to the secondary level and vice versa. For example, a swipe upwards gesture leads to moving to a secondary level from a primary level, and a swipe downwards causes the reverse.

Figure 13 illustrates another exemplary grouping of characters within the keyboard interface 402, 402-2, 402-4. As per the criteria described in Fig. 12, following steps may be performed:

a) As a first step, alphabets are placed on alphabetical line, i.e. alphabets arranged lexicographically.
b) Thereafter, most frequently used alphabets from the arrangement in step a) are chosen based on a standard usage of the device or based on a user profile (in case of multiple users of the device).
c) Based on the shortlisted alphabets in step b), the alphabet groups or primary level groups are formed (e.g. each containing 3 to 6 alphabets). Yet, the alphabetical or lexicographical order in such groups is retained due to step a). The primary level groups in the present scenario may also include individual alphabets, if they have been very frequently accessed historically.

Each secondary level may be composed of sub-groups of characters (each containing 3 to 4 alphabets) or individual characters. Likewise, the third level is represented mostly by individual characters and further sub-groups (if any). Any further level is represented almost entirely by individual characters.

Similarly, upon following the afore-recited steps a-b-c, numerals and special characters are arranged based on a numerical order or a lexicographical order (universally applicable order), shortlisted based on the user profile or device usage, and then grouped so as to create a hierarchy of primary levels, secondary levels and so on.

d) In order to traverse through the aforesaid hierarchical structure of different levels, a user may be provided special shortcuts (as described under Fig. 11) to select a particular character. For example, the user may simply ‘zoom in’ on a particular primary level to see the structure comprising the corresponding second level and third level. In another example, the user may simply tap a particular primary level thrice or four times to select either the leftmost or the rightmost character in the group. The information regarding shortcuts may be retrieved upon actuating a “Help” based button provided alongside the displayed groups.

The control flow depicted in Figure 14 represents a diagrammatic and summarized description of the steps a, b, c and d of Fig. 13.
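Steps a) to c) of Fig. 13 can be sketched as a short frequency-based grouping routine. This is an assumed implementation: the shortlisting rule (keeping roughly the most-used half) and the names below are illustrative choices, not dictated by the specification.

```python
def frequency_groups(freq, group_size=3):
    """Sketch of Fig. 13 steps a-c: arrange alphabets lexicographically,
    shortlist the most frequently used ones, then form primary-level
    groups while retaining the alphabetical order.
    `freq` maps character -> historical usage count."""
    # step a) lexicographical arrangement
    ordered = sorted(freq)
    # step b) shortlist the most frequently used alphabets
    # (here: keep characters whose count reaches the median-rank count)
    threshold = sorted(freq.values(), reverse=True)[len(freq) // 2]
    shortlisted = [c for c in ordered if freq[c] >= threshold]
    # step c) chunk the shortlist into groups, keeping lexicographic order
    return [shortlisted[i:i + group_size]
            for i in range(0, len(shortlisted), group_size)]
```

For instance, with usage counts `{"A": 10, "B": 1, "C": 8, "D": 2, "E": 9, "F": 7}`, the rarely used B and D are dropped and the result is the alphabetically ordered groups `["A", "C", "E"]` and `["F"]`.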

Further, Figure 15 illustrates an exemplary sequence of user operations upon the keyboard interface comprising the grouping of characters as described in Fig. 13. Such user operations may be executed through touch-gestures or contactless gestures.

The screen areas represented from (a) till (c) represent the selection of a particular group or sub-group by the touch gesture mechanism so as to traverse the hierarchy downwards. For example, within the screen (a), the primary level group FGN may be touched or actuated to split into the sub-group IJN or individual characters that together constitute the secondary level as represented by the screen (b). It may be understood that a touch gesture applied to the individual characters within the secondary level leads to typing of the corresponding character. However, in case the touch gesture is applied to the sub-group IJN, as shown in screen (b), individual characters I, J, K may be displayed alongside a further sub-group LMN as a third level shown in the screen (c). Although it is not shown in the present figures, a further selection of LMN as the sub-group leads to display of individual characters L, M, N as a part of a fourth level.

Further, in order to traverse the hierarchy upwards, a ‘swipe down’ action may be pursued to reach back to the screen (a) from the screen (c). The swipe down may be performed at the screen (c) (i.e. actuating the sub-group ‘LMN’) to reach the screen (b). Thereafter, the swipe down may be further performed at the screen (b) (i.e. actuating the sub-group ‘IJN’) to reach the screen (a).
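The downward and upward traversal of Fig. 15 behaves like a stack of hierarchy levels: selecting a sub-group pushes a level, a swipe down pops one. The class below is a sketch under the assumption that sub-groups are nested lists; all names are illustrative.

```python
class GroupNavigator:
    """Sketch of the Fig. 15 traversal: selecting a sub-group descends a
    level (screen a -> b -> c); a 'swipe down' gesture ascends a level."""

    def __init__(self, root):
        self.stack = [root]          # path of levels currently entered

    def select(self, index):
        node = self.stack[-1][index]
        if isinstance(node, list):   # a sub-group: descend to the next level
            self.stack.append(node)
            return None
        return node                  # an individual character: it gets typed

    def swipe_down(self):
        if len(self.stack) > 1:      # ascend one level, back towards screen (a)
            self.stack.pop()
```

For example, starting from a primary group modelled as `["F", "G", "H", ["I", "J", "K", ["L", "M", "N"]]]`, two selections descend to the third level, where picking "J" types it; two swipe-down gestures return to screen (a).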

Likewise, such traversal (upwards or downwards) is also executable through contactless gestures depicted as aforesaid.

In view of the description of above-described embodiments and examples, it may be understood that the keyboard interface 402, 402-2, 402-4 or the soft keyboard as referred by the present invention provides an ease of typing or using the keys. It is also expandable to encompass all linguistic characters, numbers and special characters as known in the realm of a keyboard. The grouping of characters increases a rate of identification and selection of the characters from the soft keyboard, and accordingly an overall rate of typing.

Further, much of the keyboard interface's functionality is also attainable through the shortcuts so as to bypass such types of keyboard usage that require extensive key actuation in a sequence.

Accordingly, an enhanced ease of typing facilitated by the present keyboard interface, coupled with assistance lent by the shortcuts (in terms of substituting certain types of keyboard usage) cumulatively leads to a substantially high typing speed and a high rate of accessibility from the keyboard interface.

Moreover, the operation of the keyboard interface as enabled through various types of gestures (touch based and contactless) enables its workability in respect of both touch-screen based devices and devices operable through contactless gestures.

In addition, owing to being present in the fragmented form, a screen space occupied by the soft keyboard is substantially lesser than the conventional keyboards. Further, a controllable transparency of the soft keyboard provides a view of the background information that is otherwise meant to be hidden by the soft keyboard. Still further, as the fragments (or the groups) of the keyboard are movable throughout the screen, no screen area remains obscured continuously.

Accordingly, the present invention is better suited to devices having compact touch sensitive screen areas, as it provides a substantially better visibility of the background information and efficient typing since individual keys are sufficiently spaced. Moreover, the invention is also operable within devices having a large touch sensitive surface or a large screen area.

While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.

The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.

Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

WE CLAIM:

1. A method comprising:
(a) rendering (step 102), on a display device, “N” character groups, each group comprising at least one character and optionally at least one sub-group;
(b) receiving (step 104) a selection of a group;
(c) optionally, hiding (step 106) the non-selected groups;
(d) rendering (step 108), on the display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group;
(e) receiving (step 110) a selection among the characters or sub-groups forming part of the selected group;
(f) rendering (step 112) a selected character in a pre-determined area of the display device, if the selection in step (e) pertains to a character; and
(g) rendering (step 114), on a remaining space of the display device, the “N” character groups.

2. The method as claimed in claim 1, wherein said at least one character is: an alphabet, a special character, a numeral, a control input, or a combination thereof.

3. The method as claimed in claim 1, wherein said rendering of at least one of said “N” character groups or said characters or said sub-groups, comprises rendering the said “N” character groups or said characters or said sub-groups at a location that is dynamically determined.

4. The method as claimed in claim 3, wherein the location thus dynamically determined is valid for a pre-determined amount of time period.

5. The method as claimed in claim 1, wherein said rendering further comprises varying based on a time period, at least one non-dominant character of a character group or character sub-group.

6. The method as claimed in claim 1, wherein said rendering further comprises varying at least a transparency level of the character group, the character or the character sub-group.

7. The method as claimed in claim 1, wherein said optionally hiding comprises hiding said non-selected groups in case said selected group comprises a plurality of characters or at least one sub-group.

8. The method as claimed in claim 1, wherein said pre-determined area denotes a typing field within the screen area.

9. The method as claimed in claim 1, wherein said remaining space denotes a screen area excluding the typing field.

10. The method as claimed in claim 1, wherein said receipt of the selection of the groups or the character is gathered through a user input provided by at least one of:
a pre-defined touch gesture,
a pre-defined tapping action,
a pre-defined gesture executed without contacting a screen surface, and
a pre-defined limb movement.

11. The method as claimed in claim 1, further comprising, receiving a user input towards selecting a screen view comprising said ‘N’ character groups, out of a plurality of screen views that comprise different character groups.

12. The method as claimed in claim 1, wherein if the selection in step (b) pertains to a group comprising a single character, the method directly proceeds to step (f) of rendering the selected character in a pre-determined area of the display device.

13. The method as claimed in claim 1, wherein if the selection in step (e) pertains to a sub-group, the method further comprises:
(c) rendering, on the display device, characters forming part of the selected sub-group; and
(d) receiving a selection among the characters forming part of the selected sub-group.

14. The method as claimed in claim 1, wherein each of said ‘N’ groups is based on at least one of:
(a) a lexicographical sequence; or
(b) a frequency of usage.

15. An apparatus comprising:
a display device for rendering “N” character groups, each group comprising at least one character and optionally at least one sub-group;
an input device for receiving a selection of a group;
a processor in operational interaction with said display device configured to:
optionally, hide the non-selected groups; and
render, on the display device, characters or sub-groups forming part of the selected group, if it is determined that the selected group comprises a plurality of characters or at least one sub-group;
wherein, said input device further receives a selection among the characters or sub-groups forming part of the selected group;
wherein said processor in operational interaction with said display device is further configured to:
render a selected character in a pre-determined area of the display device, if said selection further received by said input device pertains to a character; and
further render, on a remaining space of the display device, the “N” character groups.

Documents

Application Documents

# Name Date
1 Specification.pdf 2015-06-30
2 FORM 5.pdf 2015-06-30
3 FORM 3.pdf 2015-06-30
4 Form 26..pdf 2015-06-30
5 Drawings.pdf 2015-06-30
6 1946-del-2015-Form-1-(06-07-2015).pdf 2015-07-06
7 1946-del-2015-Correspondence Others-(06-07-2015).pdf 2015-07-06
8 1946-DEL-2015-PA [18-09-2019(online)].pdf 2019-09-18
9 1946-DEL-2015-ASSIGNMENT DOCUMENTS [18-09-2019(online)].pdf 2019-09-18
10 1946-DEL-2015-8(i)-Substitution-Change Of Applicant - Form 6 [18-09-2019(online)].pdf 2019-09-18
11 1946-DEL-2015-Correspondence-101019.pdf 2019-10-14
12 1946-DEL-2015-OTHERS-101019.pdf 2019-10-14
13 1946-DEL-2015-FER.pdf 2019-12-26
14 1946-DEL-2015-CLAIMS [21-05-2020(online)].pdf 2020-05-21
15 1946-DEL-2015-COMPLETE SPECIFICATION [21-05-2020(online)].pdf 2020-05-21
16 1946-DEL-2015-FER_SER_REPLY [21-05-2020(online)].pdf 2020-05-21
17 1946-DEL-2015-OTHERS [21-05-2020(online)].pdf 2020-05-21
18 1946-DEL-2015-PatentCertificate04-04-2022.pdf 2022-04-04
19 1946-DEL-2015-IntimationOfGrant04-04-2022.pdf 2022-04-04

Search Strategy

1 2019-12-0410-25-01_04-12-2019.pdf

ERegister / Renewals

3rd: 07 Jun 2022 (29/06/2017 to 29/06/2018)
4th: 07 Jun 2022 (29/06/2018 to 29/06/2019)
5th: 07 Jun 2022 (29/06/2019 to 29/06/2020)
6th: 07 Jun 2022 (29/06/2020 to 29/06/2021)
7th: 07 Jun 2022 (29/06/2021 to 29/06/2022)
8th: 07 Jun 2022 (29/06/2022 to 29/06/2023)
9th: 26 Jun 2023 (29/06/2023 to 29/06/2024)
10th: 26 Jun 2024 (29/06/2024 to 29/06/2025)
11th: 28 Jun 2025 (29/06/2025 to 29/06/2026)