Abstract: Embodiments of the disclosure relate to a method for initiating an event using a touch screen device. The method includes touching one of a plurality of contexts being displayed on the touch screen using an input means to display one or more variants of the touched context over a predetermined-shaped icon around the touched context. The method also includes dragging the input means over the predetermined-shaped icon to select one of the displayed variants of the context. The method further includes releasing the input means over the selected variant to initiate the event. The system herein is an electronic visual display device that includes a processor responsible for initiating an event using a touch-drag-release mechanism. Fig. 2
TECHNICAL FIELD
Embodiments of the present disclosure relate to an electronic visual display. More particularly, the embodiments relate to a convenient user input mechanism in touch screen devices.
BACKGROUND
Mobile computing devices are commonly utilized to provide users with a means to communicate and stay "connected" while moving from one place to another. The technology of such mobile computing devices has advanced to the extent where data regarding any desired content is readily available. Such information exchange can occur by way of a user entering information (e.g., text, visual, audio, and so on) into the display area of a user device and interacting with the device utilizing that display area.
The footprint of computing devices has become smaller and smaller to allow the devices to be easily carried. Accordingly, the weight of the devices has also reduced considerably. This size reduction has resulted in a corresponding reduction in the size of the display area or screen. Thus, when a user attempts to navigate through various directories, applications, files or other functions, all the information that the user might need to navigate might not be displayed on the display area at the same time. This makes the user scroll or move through various display pages to achieve the desired result. Excess scrolling or movement on the display page that is not intuitively related to hand movement causes stress on the user. In addition, performing some functions can be cumbersome and might not allow a user to quickly and easily interact with the device.
In light of the foregoing discussion, there is a need for a method and system to solve the above-mentioned problems.
SUMMARY
The shortcomings of the prior art are overcome and additional advantages are provided through the provision of a method and system as described in the description.
The present disclosure addresses the limitations of existing techniques by providing an improved user input mechanism that reduces the number of screen touch-releases on touch screen devices.
In one embodiment, the input mechanism as disclosed in the present disclosure provides a user interface for a touch screen device that minimizes the number of finger or stylus movements required to initiate a predetermined action.
In a further embodiment, the input mechanism as disclosed in the present disclosure minimizes the user's stress by intuitively relating hand movement with the tasks to be performed.
Additional features and advantages are realized through various techniques provided in the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered as part of the claimed disclosure.
In one embodiment, the disclosure provides a method for initiating an event using a touch screen device. The method includes touching one of a plurality of contexts being displayed on the touch screen using an input means to display one or more variants of the touched context over a predetermined-shaped icon around the touched context. The method further includes dragging the input means over the predetermined-shaped icon to select one of the displayed variants of the context and releasing the input means over the selected variant to initiate the event.
In one embodiment, the disclosure also provides a device for initiating an event, including a touch screen to receive a touch of an input means on one of a plurality of contexts being displayed on the screen. The device also includes a processor which is responsive to instructions by way of touch, for displaying one or more variants of the touched context over a predetermined-shaped icon around the touched context. The processor is also responsive to dragging of the input means over the predetermined-shaped icon to select one of the displayed variants of the context. The processor is further responsive to initiating the event after the input means is released over the selected variant.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The novel features and characteristics of the disclosure are set forth in the appended claims. The embodiments of the disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings. One or more embodiments are now described, by way of example only, with reference to the accompanying drawings, wherein like reference numerals represent like elements and in which:
Fig. 1a is a block diagram of an electronic visual display device having a touch screen interface, in accordance with one embodiment.
Fig. 1b is a block diagram of a device having a touch screen and a processor for initiating an event, in accordance with one embodiment.
Fig. 2 is a flowchart illustrating a method for initiating an event using the touch screen interface of the device, in accordance with an exemplary embodiment.
Fig. 3a shows an exemplary user interface (English language) embodied in a touch screen device which diagrammatically illustrates the process flow depicted in the flowchart of fig. 2, in accordance with an exemplary embodiment.
Fig. 3b shows an exemplary user interface (Korean language) embodied in a touch screen device which diagrammatically illustrates the process flow depicted in the flowchart of fig. 2, in accordance with an exemplary embodiment.
Fig. 3c shows an exemplary user interface (Hindi language) embodied in a touch screen device which diagrammatically illustrates the process flow depicted in the flowchart of fig. 2, in accordance with an exemplary embodiment.
Figs. 4a-4d show an exemplary user interface of a touch screen device illustrating the launch of various settings dialogs from the home screen, in accordance with an exemplary embodiment.
Figs. 5a-5c show an exemplary user interface wherein various icons of the indicator bar are displayed in a magnified view using an improved input mechanism, in accordance with an exemplary embodiment.
Figs. 6a to 6m show exemplary user interfaces illustrating text selection operations performed using an improved input mechanism, in accordance with an exemplary embodiment.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. Additional features and advantages of the disclosure will be described hereinafter which form the subject of the claims of the disclosure. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the disclosure as set forth in the appended claims. The novel features which are believed to be characteristic of the disclosure, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
Exemplary embodiments of the present disclosure provide a method and device for initiating an event including, but not limited to, text selection, menu selection, inserting a character on the screen, inserting a symbol on the screen, initiating audio using audio applications and initiating video using video applications.
Fig. 1a is a block diagram of an electronic visual display device 101, in accordance with one embodiment. Examples of the electronic visual display device include, but are not limited to, mobile phones, ATM machines, televisions, Personal Digital Assistants (PDAs), computers and monitors, point-of-sale terminals, car navigation systems, medical monitors, industrial control panels and any other processing device having a touch screen panel. Examples of the input device or input means 102 include, but are not limited to, a stylus, a finger, a pen-shaped pointing device, and any other device that can be used to provide input through a touch screen interface. The electronic visual display device 101 also includes a display device 103 or display interface. An example of the display device 103 includes, but is not limited to, a touch screen enabled liquid crystal display embodied in the visual display device 101 for receiving input instructions from the user. The input instructions received from the user are transmitted to a processor 104 which is in communication with the display device 103 through a network. The network may comprise a public network (e.g., the Internet), a private network (e.g., a local area network (LAN)), or any combination thereof (e.g., a virtual private network, or a LAN connected to the Internet). Furthermore, the network need not be a wired network only, and may comprise wireless network elements as known in the art.
In an exemplary embodiment, the processor 104 and the display device 103 are configured in a single device and are connected through a communication interface. The touch screen interface receives a touch of an input means on one of a plurality of contexts being displayed on the screen. An input request received at the display device 103 is transmitted to the processor 104 for further processing. The action to be performed by the processor 104 depends on the nature of the input request received through the display device 103. In the present disclosure, the processor 104 is responsive to the touch by way of the instruction for displaying one or more variants of the touched context over a predetermined-shaped icon around the touched context. The variants are logically related to the contexts. As the predetermined-shaped icon appears on the screen, the user drags his finger or stylus over the predetermined-shaped icon to select one of the displayed variants of the context. The touching and dragging actions are performed without lifting the input means from the predetermined-shaped icon. Here, the processor 104 is responsible for determining whether the input means is released over the predetermined-shaped icon or not. The user selects the required option from the available
variants of the context displayed on the screen and releases the input means over the selected variant to initiate an event. The type of event to be performed depends on the type of the input request received.
In some embodiments, the event performed by the processor 104 may also be performed by one or more units coupled to the processor 104. The one or more units may be internally or externally coupled to the processor 104.
The input device 102 includes various touch sensitive keys for communicating information or an input request to the processor 104. The information communicated to the processor 104 can be the information required for initiating an event within the visual display device 101 or with external devices connected to the visual display device 101. The information can be communicated to the processor 104 from a machine-readable medium. The term machine-readable medium can be defined as a medium providing data to a machine to enable the machine to perform a specific function. The machine-readable medium can be a storage medium, such as a storage unit. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into the machine.
The machine-readable medium can also include contacts, connectivity applications, email applications, messenger applications, online links, download links, and installation links providing the information to the processor 104.
In one exemplary embodiment, initiating an event, by way of example and not limitation, for an email application, a first context might be to "Activate a menu". Touching this menu activation displays possible variants of the context over the predetermined-shaped icon around the touched context, i.e. around the activate menu. The possible variants might be "compose an e-mail", "go to inbox", "Reply to email" or "forward an e-mail". Now, the user drags the input means over one of the variants to select the desired option. Selection of one of the variants can bring up a subsequent recursive predetermined-shaped icon adjacent to the predetermined-shaped icon. For example, if the user selects "Reply to email" then possible sub-variants of "Reply to email" are displayed over the subsequent predetermined-shaped icon. One of the possible sub-variants can be "Reply all". Upon dragging the input means over the "Reply all" sub-variant, the user releases the input means over the selected variant or sub-variant, and this will activate an e-mail application and the user can proceed accordingly. Thus, in this example the user can reply to all or a subset of recipients of an email. The aforesaid event can be performed in a minimal amount of time. Thus, with the disclosed embodiments, the speed of performing menu operations can increase and the error rate can decrease.
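The recursive variant selection described above can be sketched as a tree of contexts, variants and sub-variants; a leaf is an event to initiate, a nested entry opens a subsequent icon. This is a minimal illustration, not the disclosed implementation, and the menu structure and function names below are assumptions:

```python
# Sketch of the recursive variant structure from the email example.
# A dict maps each context to its variants; a nested dict marks a
# variant that opens a subsequent (recursive) icon of sub-variants.
EMAIL_MENU = {
    "Activate a menu": {
        "compose an e-mail": None,
        "go to inbox": None,
        "Reply to email": {          # opens a subsequent icon
            "Reply all": None,
        },
        "forward an e-mail": None,
    },
}

def resolve(menu, path):
    """Walk a drag path (context, variant, sub-variant, ...) and
    return the final selection to initiate on release."""
    node = menu
    for step in path:
        node = node[step]
        if node is None:             # leaf variant: event to initiate
            return step
    return None                      # released over a non-leaf: no event

# Dragging to "Reply to email", then "Reply all", and releasing:
event = resolve(EMAIL_MENU, ["Activate a menu", "Reply to email", "Reply all"])
```

Releasing over a non-leaf entry returns no event, matching the behaviour where only a final variant or sub-variant initiates the action.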
Various events are initiated using the instant "touch-drag-release" technology. The events include, but are not limited to, text selection, inserting a character on the screen, inserting a symbol on the screen and menu selection, or any combination thereof. Only a limited set of events and applications is listed in this disclosure, for demonstration purposes only; this should not be construed as a limitation.
Fig. 1b is a block diagram of a device 101 having a touch screen 103 and a processor 104 for initiating an event, in accordance with one embodiment. The device 101 includes a touch screen 103 to receive a touch of an input means on one of a plurality of contexts being displayed on the screen. The device 101 also includes a processor 104 which is in communication with the touch screen device for receiving instructions from the touch screen 103. As soon as a touch of an input means is received on the touch screen 103, it is communicated to the processor 104 for displaying one or more variants of the touched context over a predetermined-shaped icon around the touched context. Now, the user drags the input means over the predetermined-shaped icon to select one of the displayed variants of the context. Here, the processor 104 is responsible for determining whether the input means is released over the predetermined-shaped icon or not. The user selects the required option from the available variants of the context displayed on the screen 103 and releases the input means over the selected variant to initiate an event. The type of event to be performed depends on the type of the input request received.
FIG. 2 is a flowchart illustrating a method for initiating an event using the improved input mechanism. The instant input method or mechanism of the touch screen device works on the touch-drag-release concept.
At step 201, the user touches one of the plurality of contexts displayed on the touch screen 103 of the device. The user can select any one of the contexts displayed on the screen 103 using any input means such as a stylus or a finger. As a result of the touch, a predetermined-shaped icon gets displayed around the touched area. The predetermined-shaped icon houses all possible variants of the contexts. In other words, when the user
touches a given context, the processor 104 provides a predetermined-shaped icon, for example a lens-shaped icon, around the touched area. This results in the display of hidden options that get refracted through the predetermined-shaped icon. The options over the predetermined-shaped icon surface could be an image, text or icons. In one exemplary embodiment, the predetermined-shaped icon disappears once the touch is released. This may result in closure of the application. The shape of the predetermined-shaped icon can be customized and includes, but is not limited to, an oval shape, circle shape, parabolic shape, cylindrical shape, cone shape or any other geometrical shape. Further, other properties of the predetermined-shaped icon are also pre-configurable. The properties of the predetermined-shaped icon include, but are not limited to, the number of options or variants to be displayed over the predetermined-shaped icon, the shape of the icon, size, position, colour, texture, and selection appearance. It is possible for a person skilled in the art to make various changes to the properties of the predetermined-shaped icon. However, such changes would not lead to any significant change or improvement over the instant technology.
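The pre-configurable icon properties listed above can be modelled as a simple configuration object. This is only a sketch; every field name, default value and the example shape whitelist below are assumptions for illustration (the disclosure allows any geometrical shape):

```python
from dataclasses import dataclass

# Minimal sketch of the pre-configurable properties of the
# predetermined-shaped icon; field names and defaults are assumptions.
@dataclass
class IconConfig:
    shape: str = "lens"            # lens, oval, circle, parabolic, ...
    max_variants: int = 6          # variants shown over the icon
    size_px: int = 120
    position: str = "around-touch"
    colour: str = "#ffffff"
    texture: str = "flat"
    selection_appearance: str = "highlight"

    def validate(self):
        # Example whitelist only; the disclosure permits other shapes.
        allowed = {"lens", "oval", "circle", "parabolic",
                   "cylindrical", "cone"}
        return self.shape in allowed and self.max_variants > 0

cfg = IconConfig(shape="circle", max_variants=4)
```

Customizing the icon is then a matter of constructing a different `IconConfig` instance, which leaves the touch-drag-release behaviour itself unchanged.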
At step 202, the user drags the input means, such as a stylus or a finger, over the displayed predetermined-shaped icon to select one of the displayed variants of the context. In one embodiment, the user does not release his finger or stylus after touching the context at step 201. Release of the stylus or finger over the touched context would result in closure of the application. Thus, touching and dragging are performed continuously. In another embodiment, dragging the input means over the variants of the context displayed over the predetermined-shaped icon leads to display of a subsequent predetermined-shaped icon adjacent to the earlier predetermined-shaped icon. Here, the processor 104 makes sure that the subsequent predetermined-shaped icon is displayed at an appropriate place on the screen 103. The direction of the drag decides which operation is to be performed and is intuitively related to the action to be performed.
At step 203, the user releases the input means, such as his finger or the stylus, over the selected variant or sub-variant of the context. This user action triggers the processor 104 of the device 101 to initiate an event for further processing.
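Steps 201 to 203 can be sketched as a small state machine driven by touch, drag and release events. The class and method names below are assumptions for illustration, not the disclosed implementation:

```python
# Minimal state machine for the touch-drag-release mechanism of
# Fig. 2 (steps 201-203). Handler names are assumptions.
class TouchDragRelease:
    def __init__(self):
        self.state = "idle"
        self.variants = []
        self.hovered = None

    def touch(self, context, variants):
        # Step 201: touching a context displays the icon's variants.
        self.state = "icon-shown"
        self.variants = list(variants)
        return self.variants

    def drag(self, variant):
        # Step 202: dragging, without lifting, hovers over a variant.
        if self.state == "icon-shown" and variant in self.variants:
            self.hovered = variant

    def release(self):
        # Step 203: releasing over a variant initiates that event;
        # releasing anywhere else closes the icon with no event.
        event, self.hovered = self.hovered, None
        self.state = "idle"
        return event

ui = TouchDragRelease()
ui.touch("settings", ["connections", "sound", "clock"])
ui.drag("connections")
initiated = ui.release()   # the "connections" event is initiated
```

Because `drag` never leaves the `icon-shown` state, the sketch also captures the requirement that touching and dragging happen continuously, with release as the only terminating action.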
The method for initiating an event is well explained in conjunction with Figs. 3 to 6. Now, the technology of the instant disclosure is explained with the help of examples.
However, such examples should not be construed as a limitation on the scope of the instant technology.
Referring now to Fig. 3a, it shows an exemplary user interface in the English language, embodied in a touch screen device 101, that illustrates the process flow depicted in the flowchart of fig. 2. As shown in figure 3a, the user touches the context displayed on the screen. In this example the context is the alphanumeric keys displayed on the touch screen interface 103 of the mobile device. The figure shows the series of steps involved in typing the word "Congratulations" by selecting each letter of the word using the touch-drag-release mechanism. Here, the user touches numeric key 2 and the processor 104 displays the lens-shaped icon around the numeric key 2. The possible variants or hidden options are displayed over the lens-shaped icon. In this example, the small letters a, b and c, and the capital letters A, B, and C are displayed over the lens-shaped icon. Now, the user drags his finger over the capital letter C and releases his finger. This action of the user triggers the processor 104 to enter the letter C on the editor screen displayed on the device. In the same way the user enters the letters o, n, g, r, a, t, u, l, a, t, i, o, n and s on the editor screen to create the word "Congratulations".
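The keypad behaviour above can be sketched as a mapping from each numeric key to the variants refracted through the lens-shaped icon. Only key 2 is taken from the example; the other entries follow the conventional phone-keypad layout and, like the function name, are assumptions:

```python
# Each numeric key maps to the variants displayed on the lens-shaped
# icon. Key 2 is from the example above; keys 3 and 4 are assumed to
# follow the usual phone-keypad layout.
KEY_VARIANTS = {
    "2": ["a", "b", "c", "A", "B", "C"],
    "3": ["d", "e", "f", "D", "E", "F"],
    "4": ["g", "h", "i", "G", "H", "I"],
}

def enter_letter(editor, key, variant):
    """Touch `key`, drag to `variant` on the lens, release: the
    selected letter is appended to the editor text."""
    if variant not in KEY_VARIANTS.get(key, []):
        raise ValueError("variant not on the lens for this key")
    return editor + variant

text = enter_letter("", "2", "C")   # touch key 2, drag to C, release
```

Each letter of "Congratulations" is entered with a single touch-drag-release, rather than the repeated taps of a multi-tap keypad.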
Fig. 3b shows an exemplary user interface (non-English language) embodied in a touch screen device which diagrammatically illustrates the process flow depicted in the flowchart of fig. 2, in accordance with an exemplary embodiment. The non-English languages include, but are not limited to, Korean, Hindi, Thai, Arabic, Russian, and Turkish. Figure 3b shows an input panel in the Korean language displayed on the screen 103. Words in the Korean language are written using combinations of consonants and vowels. The various combinations of consonants and vowels are depicted in fig. 3b. The touch of a user on a consonant sends an instruction to the processor 104 to display the possible vowels which are logically related to the touched consonant. The processor displays the predetermined-shaped icon around the touched consonant, which houses the logically related vowels required to frame a word in the Korean language. In one embodiment, dragging the input means over one of the vowels displayed on the predetermined-shaped icon leads to display of consonants over a subsequent predetermined-shaped icon displayed adjacent to the earlier predetermined-shaped icon. However, the processor 104 is responsible here for identifying whether the subsequent predetermined-shaped icon needs to be displayed or not. In one exemplary embodiment, the processor 104 displays the
subsequent predetermined-shaped icon if the word to be framed has a combination of consonant, vowel and consonant. Otherwise, the processor may not display the subsequent predetermined-shaped icon. The direction of the drag decides which vowel is to be selected and is intuitively related to the word to be written. Here, one can see that a subsequent predetermined-shaped icon, for example a subsequent lens-shaped icon, is displayed adjacent to the lens-shaped icon for selecting vowels associated with the selected consonant.
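The consonant-vowel(-consonant) combination described above can be illustrated with the standard Unicode Hangul composition formula. The sketch below uses that published formula and the standard jamo indices; it is an illustration of how the selected jamo combine into a syllable, not part of the disclosure:

```python
# Unicode composes a Hangul syllable from a leading-consonant index
# (0-18), a vowel index (0-20) and an optional trailing-consonant
# index (0 = none, up to 27):
#   code point = 0xAC00 + (lead * 21 + vowel) * 28 + tail
def compose_hangul(lead, vowel, tail=0):
    return chr(0xAC00 + (lead * 21 + vowel) * 28 + tail)

# Lead index 0 is the consonant "g" and vowel index 0 is "a":
# selecting them via touch-drag-release produces the first Hangul
# syllable (U+AC00).
syllable = compose_hangul(0, 0)
```

A consonant-vowel-consonant word supplies a non-zero trailing index, which is exactly the case where the subsequent predetermined-shaped icon is displayed.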
Fig. 3c shows an exemplary user interface in the Hindi language embodied in a touch screen device which diagrammatically illustrates the process flow depicted in the flowchart of fig. 2, in accordance with an exemplary embodiment. In the Hindi language a vowel is pronounced alone, but a vowel sign (matra in Hindi) is pronounced together with a consonant. Each vowel is represented by its sign (matra). As shown in figure 3c, the user touches a Hindi consonant displayed on the screen. The touch of the user on the Hindi consonant leads to display of the possible matras (vowel signs) logically related to the selected consonant on a predetermined-shaped icon. Now, the user drags the input means over the desired matra and releases the input means. This action triggers the processor 104 to input the required letter (akshar in Hindi) on the display screen.
Figs. 4a to 4d show an exemplary user interface of a touch screen device illustrating the launch of various settings dialogs from the home screen, in accordance with an exemplary embodiment. In this example, the user touches the settings menu option from the main menu of the mobile device (fig. 4a). This displays the possible variants of the settings menu option over the lens-shaped icon. The possible variants, for example, could be personal settings, system settings, connections, sound and clock (fig. 4b). The user has the option to select any one of the displayed variants by dragging over the lens-shaped icon. In this case the user selects the connection variant or connection menu displayed over the lens-shaped icon (fig. 4c). Selection of the connection menu sends an instruction to the processor 104 to display sub-variants or a sub-menu over a subsequent lens-shaped icon adjacent to the earlier lens-shaped icon. However, the position and shape of the subsequent lens-shaped icon can be varied, if required. The sub-variants of the connection could be a mobile network option, a Wi-Fi option and a Bluetooth option (fig. 4d). Once the user drags his finger over the Wi-Fi option and releases over the selected option, the processor 104 initiates the Wi-Fi connection with other devices. In the same manner, the user can initiate the Bluetooth function or mobile network function using the touch-drag-release mechanism.
Figs. 5a-5c show an exemplary user interface wherein various icons of the indicator bar are displayed in a magnified view using an improved input mechanism, in accordance with an exemplary embodiment. This example shows an alternative way in which the user can initiate the connection functionality, for example the Wi-Fi connectivity option, using the shortcut menu provided on the title bar of the device (fig. 5a). The title bar can be located at any predetermined area of the device. The user touches the predetermined area of the device; for example, in this case the user touches the title bar provided over the touch screen of the device (fig. 5b). The touch of the user sends a request to the processor 104 for displaying a lens-shaped icon at a predetermined position on the device. The variants provided on the title bar are displayed in a magnified view over the lens (fig. 5c). This enables the user to clearly see the available options over the title bar and select the required option to initiate the event. The magnified display of the variants reduces the risk of excess scrolling and the cumbersome process involved in initiating an action. In an exemplary embodiment, Wi-Fi connectivity and Bluetooth connectivity are logically related to a connectivity application. Thus, the Wi-Fi connectivity and Bluetooth connectivity variants are logically related to the Connectivity Application, which acts as the context for said variants.
Figs. 6a-6m show an exemplary user interface illustrating text selection operations performed using an improved input mechanism, in accordance with an exemplary embodiment. Fig. 6a shows a portion of text displayed on the screen editor. Now, if the user wants to select and copy a part of the displayed text, the user touches the start point of the selection. A lens-shaped icon appears above the touch point. Further, the lens-shaped icon gives a magnified view of the touched area. It also shows the exact location of the cursor (fig. 6b). Now, the user moves the finger in either the left or right direction to set the cursor position at the desired place (fig. 6c). As the finger moves in the left or the right direction, the lens-shaped icon also moves horizontally along with the finger. The user drags the finger upward to touch the "text selector icon" on the lens (fig. 6d). Here, the lens-shaped icon does not move when the user drags his finger upward. After the user touches the text selector icon for a predetermined amount of time, for example 300 ms, text selection mode is enabled (fig. 6e).
In one exemplary embodiment, the colour of the cursor changes to a predetermined colour to indicate the activation of text selection mode. At this stage, the user can release the finger and touch at the last point of the selection (fig. 6f) or can drag the finger to continue with the selection (fig. 6g). As the user touches the last point of the desired selection, the lens-shaped icon appears with the selector icon. Now the text gets selected from the point where the text selector icon was enabled to the last touch point (fig. 6h). At this point two more icons, for example copy and cut, appear when the user does not move the finger for a predetermined amount of time, for example more than 500 ms (fig. 6i). However, the predetermined amount of time can be pre-configured based on the requirement of the application or event to be initiated. Now, the user copies the selected text by dragging the finger over the copy menu displayed over the lens (fig. 6j). The selected text gets copied to the clipboard after the finger is released (fig. 6k). To disable the text selection mode, the user needs to drag the finger to the text selector icon and hold it at the text selector icon for a predetermined amount of time, for example 300 ms (figs. 6l and 6m).
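The dwell-time behaviour above (300 ms on the text selector icon to toggle selection mode, more than 500 ms of stillness for the copy/cut icons) can be sketched as a pair of pre-configurable threshold checks. The function names and constants below are assumptions:

```python
# Dwell-time thresholds from the example above; both are stated to
# be pre-configurable, so they are plain module-level constants here.
SELECT_TOGGLE_MS = 300   # hold on the text selector icon
SHOW_ACTIONS_MS = 500    # finger held still before copy/cut appear

def selection_mode_toggled(hold_ms, threshold=SELECT_TOGGLE_MS):
    # Holding the selector icon for at least 300 ms toggles
    # text selection mode on (fig. 6e) or off (figs. 6l-6m).
    return hold_ms >= threshold

def action_icons_shown(idle_ms, threshold=SHOW_ACTIONS_MS):
    # The copy and cut icons appear only after the finger has been
    # still for more than 500 ms (fig. 6i).
    return idle_ms > threshold
```

Passing a different `threshold` models the pre-configuration of the dwell time per application or event.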
Various embodiments of the present disclosure provide a method and device for initiating an event.
In one embodiment, the present disclosure reduces the number of screen touch-release actions and minimizes finger movement to random touch points, thus saving a considerable amount of time.
In one more embodiment, the technique disclosed in the present disclosure reduces excess scrolling or movement on the display page, thereby reducing hand movement and thus the stress on the user.
In another embodiment, the technique disclosed in the present disclosure is user friendly and more intuitive.
In yet another embodiment, the technique disclosed in the present disclosure supports different languages.
In still another embodiment, the technique of the present disclosure can be utilised in various fields including, but not limited to, mobile phones, ATM machines, televisions, Personal Digital Assistants (PDAs), computers and monitors, point-of-sale terminals, car navigation systems, medical monitors, industrial control panels and any other processing device having a touch screen panel.
The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations can be made without departing from its spirit and scope, as will be apparent to those skilled in the art. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims. The present disclosure is to be limited only by the terms of the appended claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
We claim:
1. A method for initiating an event using a touch screen device, said method comprising acts of:
touching one of a plurality of contexts being displayed on the touch screen using an input means to display one or more variants of the touched context over a predetermined-shaped icon around the touched context;
dragging the input means over the predetermined-shaped icon to select one of the displayed variants of the context; and
releasing the input means over the selected variant to initiate the event.
2. The method as claimed in claim 1, wherein the touch screen device is selected from a group comprising a mobile phone, an Automated Teller Machine (ATM), a television, a Personal Digital Assistant (PDA), computers and monitors, point-of-sale terminals, car navigation systems, medical monitors, industrial control panels and any other processing device having a touch screen panel.
3. The method as claimed in claim 1, wherein the direction of the drag is intuitively related to an operation to be performed.
4. The method as claimed in claim 1, wherein the event is selected from a group comprising text selection, inserting a character on the screen, inserting a symbol on the screen and menu selection.
5. The method as claimed in claim 1, wherein the context is selected from a group comprising an image, text and an icon.
6. The method as claimed in claim 1, wherein the input means is selected from a group comprising a stylus, a finger of a user, a pen-shaped pointing device and any other device that can be used to provide input through a touch screen interface.
7. The method as claimed in claim 1, wherein the variants are logically related to the contexts.
8. The method as claimed in claim 1, wherein the selection of the one of the variants of the context leads to display of one or more sub-variants over a recursive predetermined-shaped icon adjacent to the previous predetermined-shaped icon.
9. The method as claimed in claim 1, wherein the shape of the predetermined-shaped icon is selected from a group comprising a lens shape, an oval shape, a circle shape, a parabolic shape, a cylindrical shape, a cone shape and any other geometrical shape.
10. The method as claimed in claim 1, wherein properties of the predetermined-shaped icon are configurable and are selected from a group comprising size, texture, shape, position and colour, or any combination thereof.
11. The method as claimed in claim 1, wherein the touching of the context in a predetermined area of the device leads to a magnified display of all its variants.
12. A device (101) for initiating an event comprising:
a touch screen to receive a touch of an input means on one of a plurality of contexts being displayed on the screen; and
a processor (104) responsive to the touch as an instruction for:
displaying one or more variants of the touched context over a predetermined-shaped icon around the touched context;
selecting, upon a drag of the input means over the predetermined-shaped icon, one of the displayed variants of the context; and
initiating the event upon release of the input means over the selected variant.
13. The device as claimed in claim 12, wherein properties of the predetermined-shaped icon are configurable and are selected from a group comprising size, texture, shape, position and colour, or any combination thereof.
14. The device as claimed in claim 12, wherein the input means is selected from a group comprising a stylus, a finger of a user, a pen-shaped pointing device and any other device used to provide input through a touch screen interface.
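The touch-drag-release mechanism recited in the claims above can be illustrated, purely for exposition and not as part of the claimed subject matter, with a minimal state-machine sketch. All names here (`TouchMenu`, `on_touch`, `on_drag`, `on_release`) are hypothetical and do not appear in the specification:

```python
# Hypothetical sketch of the touch-drag-release selection mechanism.
# Names and structure are illustrative assumptions, not part of the
# specification.

class TouchMenu:
    """Displays variants of a touched context on a predetermined-shaped
    icon and initiates an event on release over a selected variant."""

    def __init__(self, variants_of):
        # variants_of: maps a context (e.g. a key or icon id) to its variants
        self.variants_of = variants_of
        self.active_variants = []
        self.highlighted = None

    def on_touch(self, context):
        # Step 1: touching a context displays its variants around it.
        self.active_variants = self.variants_of.get(context, [])
        self.highlighted = None
        return self.active_variants

    def on_drag(self, index):
        # Step 2: dragging over the icon highlights one displayed variant.
        if 0 <= index < len(self.active_variants):
            self.highlighted = self.active_variants[index]
        return self.highlighted

    def on_release(self):
        # Step 3: releasing over the selected variant initiates the event.
        event = self.highlighted
        self.active_variants, self.highlighted = [], None
        return event  # e.g. the character to insert on the screen


menu = TouchMenu({"a": ["a", "à", "á", "â"]})
menu.on_touch("a")        # variants shown around the touched key
menu.on_drag(1)           # drag selects the variant "à"
print(menu.on_release())  # initiating the event prints "à"
```

If no variant is highlighted when the release occurs, `on_release` returns `None` and no event is initiated, which mirrors lifting the input means outside the predetermined-shaped icon.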
Dated this 24th day of August, 2010
Madhusudan S.T
IN/PA-1297
Of K & S Partners
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 2438-che-2010 form-1 01-02-2011.pdf | 2011-02-01 |
| 2 | 2438-che-2010 correspondence others 01-02-2011.pdf | 2011-02-01 |
| 3 | 2438-CHE-2010 POWER OF ATTORNEY 16-06-2011.pdf | 2011-06-16 |
| 4 | 2438-CHE-2010 CORRESPONDENCE OTHERS 16-06-2011.pdf | 2011-06-16 |
| 5 | Form-1.pdf | 2011-09-04 |
| 6 | Form-3.pdf | 2011-09-04 |
| 7 | Form-5.pdf | 2011-09-04 |
| 8 | Drawings.pdf | 2011-09-04 |
| 9 | abstract2438-che-2010.jpg | 2011-09-04 |
| 10 | 2438-CHE-2010-FER.pdf | 2019-11-15 |
| 11 | SearchStrategyReport2438CHE2010_02-11-2019.pdf | |
| 12 | 2438-CHE-2010-ABSTRACT [14-05-2020(online)].pdf | 2020-05-14 |
| 13 | 2438-CHE-2010-CLAIMS [14-05-2020(online)].pdf | 2020-05-14 |
| 14 | 2438-CHE-2010-COMPLETE SPECIFICATION [14-05-2020(online)].pdf | 2020-05-14 |
| 15 | 2438-CHE-2010-CORRESPONDENCE [14-05-2020(online)].pdf | 2020-05-14 |
| 16 | 2438-CHE-2010-DRAWING [14-05-2020(online)].pdf | 2020-05-14 |
| 17 | 2438-CHE-2010-FER_SER_REPLY [14-05-2020(online)].pdf | 2020-05-14 |
| 18 | 2438-CHE-2010-OTHERS [14-05-2020(online)].pdf | 2020-05-14 |
| 19 | 2438-CHE-2010-US(14)-HearingNotice-(HearingDate-26-12-2022).pdf | 2022-12-02 |
| 20 | 2438-CHE-2010-Correspondence to notify the Controller [11-12-2022(online)].pdf | 2022-12-11 |