Abstract: The invention relates to controlling flow of information using rotation based gestures. According to one embodiment of the invention, a system (200) for controlling flow of information comprises: an input controller (201) to receive a rotation based gesture; a context identifier (205) to identify context of the gesture; a parameter calculator (208) to compute parameters of the gesture; and a gesture controller (202) to control flow of information accordingly. This invention has application in fields such as cloud storage, Big Data, and the Internet of Things, which typically involve large volumes of data. To this end, this invention enables quick retrieval of information in these fields, which is otherwise a very challenging task. Further, high speed networks, such as a 5G network providing 5G tactile internet, may be helpful in real time processing and display of information selected in response to the rotation based gesture.
TECHNICAL FIELD OF INVENTION
The invention generally relates to controlling flow of information. More particularly, the invention relates to a method and a system for controlling flow of information using rotation based gestures.
BACKGROUND OF INVENTION
User interfaces for computing systems continue to develop and improve as the functionality of computing platforms increases. Nowadays, gesture based user interfaces are becoming quite popular as they enable users to make selections and retrieve information in a more natural and intuitive manner. A gesture based user interface is a graphical user interface enabling users to interact with computing systems through a variety of gestures, which may be given as touch or air input. Typical examples of a gesture include single tap, double tap, long press, drag and drop, scroll, flick, fling, pinch, spread, spin, etc. Apart from these, there can be many other complex system-defined or user-defined gestures.
It is observed that there is a lot of potential in rotation based gestures for developing and improving user interfaces for computing systems. One set of known solutions in this regard provides a radial menu of icons and controls the scrolling action in such radial menus using spiral rotation gestures. Another set of known solutions enables single touch zoom using spiral rotation gestures. Alternatively, this type of zooming action can be performed using a screw or helical type gesture in the air in front of computing systems having the necessary hardware and software capabilities to detect such gestures. To this end, there exists a variety of computing systems that can recognize rotation based gestures with lower memory usage, lower processing power, and up to 95% accuracy. Despite these known solutions, there remains considerable scope for improvement in this art.
SUMMARY OF INVENTION
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Embodiments of the present invention describe a method and a system for controlling flow of information.
In one embodiment, a method for controlling flow of information comprises: receiving a rotation based gesture; identifying context of the rotation based gesture in respect of a currently displayed screen; computing at least one parameter pertaining to the rotation based gesture; and controlling flow of information based on the identified context and the at least one computed parameter.
In another embodiment, a system for controlling flow of information comprises: an input controller to receive a rotation based gesture; a context identifier to identify context of the rotation based gesture in respect of a currently displayed screen; a parameter calculator to compute at least one parameter pertaining to the rotation based gesture; and a gesture controller to control flow of information based on the identified context and the at least one computed parameter.
Advantages of this invention include, but are not limited to, user friendliness, ease of use, more options for the user interface, quick operation, etc. This invention enables users to easily select and/or retrieve files from folders and perform many other actions simply by making rotation based gestures, which are smart enough to fetch relevant data and put it in the right place. This invention has application in fields such as cloud storage, Big Data, and the Internet of Things, which typically involve large volumes of data. To this end, this invention enables quick retrieval of information in these fields, which is otherwise a very challenging task. Further, high speed networks, such as a 5G network providing 5G tactile internet, may be helpful in real time processing and display of information selected in response to the rotation based gesture.
The details of one or more embodiments are set forth in the accompanying drawings and description below. Other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that the following detailed description is explanatory only and is not restrictive of the invention as claimed.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
To further clarify the advantages and features of the invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings in which:
Figure 1 illustrates an exemplary method for controlling flow of information, in accordance with an embodiment of the invention.
Figure 2 illustrates an exemplary system for controlling flow of information, in accordance with an embodiment of the invention.
Figure 3 illustrates an exemplary computing system, wherein various embodiments of invention can be implemented.
Figure 4 illustrates a variety of exemplary rotation based gestures: (a) circular, (b) semi-circular, (c) spiral, (d) helical, and (e) conical spiral.
Figures 5(a) to 5(e) illustrate exemplary use cases pertaining to image sharing scenario in a chat application using rotation based gestures.
Figure 6 illustrates an exemplary use case pertaining to sharing files via Bluetooth using rotation based gestures.
Figure 7 illustrates a number of recently or frequently used rotation based gestures, extracted from a gesture database.
Figures 8 and 9 illustrate exemplary use cases pertaining to timeline view of images using rotation based gestures.
Figure 10 illustrates an exemplary use case pertaining to manipulation of multiple images together using rotation based gestures.
Figure 11 illustrates an exemplary use case pertaining to manipulation of an overlay object on an image file using rotation based gestures.
Figure 12 illustrates an exemplary use case pertaining to text highlighting in a browser using rotation based gestures.
It may be noted that, to the extent possible, like reference numerals have been used to represent like elements in the drawings. Further, those of ordinary skill in the art will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily drawn to scale. For example, the dimensions of some of the elements in the drawings may be exaggerated relative to other elements to help improve understanding of aspects of the invention. Furthermore, one or more elements may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof. Throughout the patent specification, a convention employed is that in the appended drawings, like numerals denote like components.
Reference throughout this specification to “an embodiment”, “another embodiment” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems.
Various embodiments of the invention will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates an exemplary method (100) for controlling flow of information, in accordance with an embodiment of the invention. In said embodiment, the method (100) comprises: receiving (101) a rotation based gesture; identifying (102) context of the rotation based gesture in respect of a currently displayed screen; computing (103) at least one parameter pertaining to the rotation based gesture; and controlling (104) flow of information based on the identified context and the at least one computed parameter. In a further embodiment, the at least one parameter comprises one or more of the following parameters of the rotation based gesture: type, priority, speed, direction, length, depth, color, radius, circumference, count of rotations, nucleus, start point, end point, continuity, and discontinuity. In a further embodiment, the method (100) comprises: popping up (105) a preview window while receiving the rotation based gesture, the preview window depicting files selected so far and/or files available for selection with the rotation based gesture. In a further embodiment, the method (100) comprises: progressively displaying (106) a thumbnail of each file getting selected while receiving the rotation based gesture, the thumbnail being displayed on the trails of the rotation based gesture. In a further embodiment, the method (100) comprises: temporarily displaying (107) a plurality of control options for a file currently selected by the rotation based gesture. In a further embodiment, the method (100) comprises: storing (108) the received rotation based gesture in a gesture database. In a further embodiment, the method (100) comprises: providing (109) a number of recent rotation based gestures for user selection. In a further embodiment, the method (100) comprises: providing (110) tactile or haptic feedback while receiving the rotation based gesture.
In a further embodiment, the rotation based gesture is a circular gesture, a semi-circular gesture, a spiral gesture, a conical spiral gesture, or a helical gesture. In a further embodiment, the rotation based gesture is a touch based gesture or an air based gesture. In a further embodiment, the method (100) comprises: performing (111), in response to the rotation based gesture, a predefined action with the controlled flow of information.
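By way of a non-limiting illustration, the above method steps could be sketched as a simple Python pipeline; the function names and callback signatures here are hypothetical and chosen only for readability, not part of any particular implementation.

```python
def handle_rotation_gesture(gesture, screen, identify_context,
                            compute_parameters, control_flow):
    """Illustrative sketch of method (100): receive (101), identify (102),
    compute (103), and control (104)."""
    # Step 102: identify context of the gesture in respect of the
    # currently displayed screen.
    context = identify_context(gesture, screen)
    # Step 103: compute at least one parameter of the gesture.
    parameters = compute_parameters(gesture)
    # Step 104: control flow of information based on the identified
    # context and the computed parameter(s).
    return control_flow(context, parameters)
```

In practice, the callbacks would be supplied by the context identifier, parameter calculator, and gesture controller components described below.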
Figure 2 illustrates an exemplary system (200) for controlling flow of information, in accordance with an embodiment of the invention. In said embodiment, the system (200) comprises: an input controller (201) to receive a rotation based gesture; a context identifier (205) to identify context of the rotation based gesture in respect of a currently displayed screen; a parameter calculator (208) to compute at least one parameter pertaining to the rotation based gesture; and a gesture controller (202) to control flow of information based on the identified context and the at least one computed parameter. In a further embodiment, the at least one parameter comprises one or more of the following parameters of the rotation based gesture: type, priority, speed, direction, length, depth, color, radius, circumference, count of rotations, nucleus, start point, end point, continuity, and discontinuity. In a further embodiment, the system (200) comprises: an output controller (203) to pop up a preview window while receiving the rotation based gesture, the preview window depicting files selected so far and/or files available for selection with the rotation based gesture. In a further embodiment, the system (200) comprises: an output controller (203) to progressively display a thumbnail of each file getting selected while receiving the rotation based gesture, the thumbnail being displayed on the trails of the rotation based gesture. In a further embodiment, the system (200) comprises: an output controller (203) to temporarily display a plurality of control options for a file currently selected by the rotation based gesture. In a further embodiment, the system (200) comprises: a gesture database (207) to store the received rotation based gesture. In a further embodiment, the gesture database (207) is used to provide a number of recent rotation based gestures for user selection.
In a further embodiment, the system (200) comprises: a feedback manager (209) to provide tactile or haptic feedback while receiving the rotation based gesture. In a further embodiment, the rotation based gesture is a circular gesture, a semi-circular gesture, a spiral gesture, a conical spiral gesture, or a helical gesture. In a further embodiment, the rotation based gesture is a touch based gesture or an air based gesture. In a further embodiment, the gesture controller (202) is configured to perform, in response to the rotation based gesture, a predefined action with the controlled flow of information. In a further embodiment, the system (200) comprises: an application/content identifier (204) to identify at least one application or content being displayed on the currently displayed screen. In a further embodiment, the system (200) comprises: a mapping module (206) to map the at least one computed parameter with the flow of information to be controlled.
During the operation of the system (200), the application/content identifier (204) identifies the application/content in each currently displayed screen with which a user can interact through the rotation based gesture, while the context identifier (205) identifies the context in which the user can interact through the rotation based gesture. Once the input controller (201) starts receiving the rotation based gesture from the user, the parameter calculator (208) starts computing various parameters of the rotation based gesture. After that, the mapping module (206) converts the computed parameters into meaningful data retrieval information. In the meanwhile, the feedback manager (209) keeps providing haptic or tactile feedback to the user at each milestone, for instance, at the selection of each data item while receiving the rotation based gesture. In one example, tactile feedback can be provided as the spiral length keeps increasing, informing the user that more data volume has been picked. Further, the tactile feedback strength increases along with the length of the spiral, indicating the volume or data strength. In another example, a number of haptic pulses can be provided once the user pauses or terminates the spiral gesture. Similarly, the number of haptic pulses can inform about the number of objects linked with that spiral length during data reception or transmission. In the end, the gesture controller (202) processes the computed parameters in relation to the identified application/content and the identified context to control the flow of information and perform a predefined action in response to the received rotation based gesture. The output controller (203) provides output information generated in response to the rotation based gesture to the user through the display screen.
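The pulse behaviour described above can be sketched as follows. This is a minimal, non-limiting example: the `vibrate` callback stands in for whatever haptic actuator API the feedback manager (209) would use, and the scaling constants are illustrative assumptions only.

```python
def feedback_for_spiral(spiral_length, linked_objects, vibrate,
                        base_strength=0.2, strength_per_unit=0.01,
                        max_strength=1.0):
    """Tactile feedback whose strength grows with spiral length, plus
    one haptic pulse per object linked with that spiral length.

    `vibrate` is a hypothetical callback taking a strength in [0, 1].
    Returns the strength used, capped at `max_strength`.
    """
    strength = min(max_strength,
                   base_strength + strength_per_unit * spiral_length)
    # One pulse per linked object, so the pulse count informs the user
    # about the number of objects tied to the current spiral length.
    for _ in range(linked_objects):
        vibrate(strength)
    return strength
```

The growing strength gives the user a continuous sense of "more data picked", while the discrete pulse count conveys the object count at a pause or termination of the gesture.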
In the background, a history of recently or frequently used gesture patterns is stored and maintained in the gesture database (207), which can be utilized later by the user for quick selection of rotation based gestures.
Figure 3 illustrates an exemplary computing system (300), wherein various embodiments of the invention can be implemented. The computing system (300) can be any system having a gesture based user interface and capable, in terms of software and hardware, of receiving touch or air based user inputs. Typical examples of the computing system (300) include, but are not limited to, a mobile phone, tablet, gaming device, laptop, desktop computer, PDA, television, ATM, ticketing machine, consumer appliance, etc. The computing system (300) can include a set of instructions that can be executed to cause the computing system (300) to perform any one or more of the methods in accordance with the invention. The computing system (300) may operate as a standalone device or may be connected, for example, using a network, to other computing systems or peripheral devices.
In a networked deployment, the computing system (300) may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computing system in a peer-to-peer (or distributed) network environment. The computing system (300) can also be implemented as or incorporated into a variety of devices, which are capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Furthermore, while a single computing system (300) is illustrated in the figure, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
The computing system (300) may include a processing unit (301) e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both. The processing unit (301) may be a component in a variety of systems. For example, the processing unit (301) may be part of a standard personal computer or a workstation. The processing unit (301) may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processing unit (301) may implement a software program, such as code generated manually (i.e., programmed).
The computing system (300) may include a memory unit (302) that can communicate via a bus (303). The memory unit (302) may be a main memory, a static memory, or a dynamic memory. The memory unit (302) may include, but is not limited to, computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media, and the like. In one example, the memory unit (302) includes a cache or random access memory for the processing unit (301). In alternative examples, the memory unit (302) is separate from the processing unit (301), such as a cache memory of a processor, the system memory, or other memory. The memory unit (302) may be an external storage device or database for storing data. Examples include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card, memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device operative to store data. The memory unit (302) is operable to store instructions executable by the processing unit (301). The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processing unit (301) executing the instructions stored in the memory unit (302). The functions, acts or tasks are independent of the particular type of instruction set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firmware, micro-code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing and the like.
As shown, the computing system (300) may or may not further include an output unit (304), such as an audio unit and/or a display unit. Examples of the display unit include, but are not limited to, a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later developed display device for outputting determined information. The output unit (304) may act as an interface for the user to listen to and/or see the functioning of the processing unit (301), or specifically as an interface with the software stored in the memory unit (302) or in a removable storage device. Additionally, the computing system (300) may include an input unit (305) configured to allow a user to interact with any of the components of the system (300). The input unit (305) may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, remote control or any other device operative to interact with the computing system (300). Sometimes, a single IO unit, such as a touch screen display, can serve the function of the output unit (304) as well as the input unit (305).
The computing system (300) may also include a disk or optical drive unit (306). The disk drive unit (306) may include a computer-readable medium (307) in which one or more sets of instructions (308), e.g. software, can be embedded. Further, the instructions (308) may embody one or more of the methods or logic as described. In a particular example, the instructions (308) may reside completely, or at least partially, within the memory unit (302) or within the processing unit (301) during execution by the computing system (300). The memory unit (302) and the processing unit (301) also may include computer-readable media as discussed above.
The present invention contemplates a computer-readable medium that includes instructions (308) or receives and executes instructions (308) responsive to a propagated signal so that a device connected to a network (309) can communicate voice, video, audio, images or any other data over the network (309). Further, the instructions (308) may be transmitted or received over the network (309) via a communication port or interface (310) or using the bus (303). The communication port or interface (310) may be a part of the processing unit (301) or may be a separate component. The communication port or interface (310) may be created in software or may be a physical connection in hardware. The communication port or interface (310) may be configured to connect with the network (309), external media, the output unit (304), or any other components in the computing system (300) or combinations thereof. The connection with the network (309) may be a physical connection, such as a wired Ethernet connection or may be established wirelessly as discussed later. Likewise, the additional connections with other components of the system (300) may be physical connections or may be established wirelessly. The network (309) may alternatively be directly connected to the bus (303).
The network (309) may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, 802.1Q or WiMax network. Further, the network (309) may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to TCP/IP based networking protocols.
In an alternative example, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement various parts of the computing system (300).
The present invention can be implemented on a variety of electronic and computing systems. For instance, one or more examples described may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
Any one or more of the methods or logic as described may be implemented in part by software programs executable by a computing system. Further, in a non-limiting example, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computing system processing can be constructed to implement various parts of the computing system (300).
The computing system (300) is not limited to operation with any particular standards and protocols. For example, standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) may be used. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed are considered equivalents thereof.
Figure 4 illustrates a variety of exemplary rotation based gestures, which may be either 2D or 3D gestures. The 2D gestures may be provided as touch or air input, while the 3D gestures need to be provided as air input. Examples of rotation based 2D gestures include, but are not limited to: a circular gesture as shown in Figure 4a, a semi-circular gesture as shown in Figure 4b, and a spiral gesture as shown in Figure 4c. Examples of rotation based 3D gestures include, but are not limited to, a helical gesture as shown in Figure 4d and a conical spiral gesture as shown in Figure 4e. Those skilled in the art will appreciate that the shape of a user provided gesture may not exactly map to the shapes mentioned here. Therefore, the concept of approximation needs to be used here to identify the user provided gesture. For all these rotation based gestures, one or more parameters may be calculated. Examples of these parameters include, but are not limited to: type of gesture (circular, semi-circular, spiral, helical, conical spiral, etc.), priority (high, medium, low, etc.), speed (high, medium, low, etc.), direction (clockwise, anti-clockwise), length (exact or relative in case of multiple gestures), depth of 3D gesture (exact or relative in case of multiple gestures), color (if different), radius (exact or relative), circumference (area), count of rotations (exact or relative), nucleus (coordinates), start point (coordinates, distance from nucleus, etc.), end point (coordinates, distance from nucleus, etc.), and continuity/discontinuity (count and extent of discontinuity, etc.).
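As a non-limiting sketch, several of these parameters (nucleus, length, count of rotations, direction, start and end points) could be estimated from a chronologically ordered list of 2D sample points, such as touch coordinates reported by the input controller. Approximating the nucleus by the centroid of the samples is an assumption made here purely for illustration.

```python
import math

def gesture_parameters(points):
    """Estimate basic parameters of a 2D rotation based gesture from a
    chronologically ordered list of (x, y) sample points."""
    if len(points) < 3:
        raise ValueError("need at least 3 samples to estimate rotation")

    # Nucleus: approximated as the centroid of all sampled points.
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)

    # Total length: sum of distances between consecutive samples.
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))

    # Total swept angle around the nucleus yields the count of rotations
    # and the direction (positive sweep = anti-clockwise in screen maths).
    swept = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        ang = math.atan2(y - cy, x - cx)
        delta = ang - prev
        # Unwrap angle jumps across the -pi/pi boundary.
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta < -math.pi:
            delta += 2 * math.pi
        swept += delta
        prev = ang

    return {
        "nucleus": (cx, cy),
        "length": length,
        "rotations": abs(swept) / (2 * math.pi),
        "direction": "anti-clockwise" if swept > 0 else "clockwise",
        "start_point": points[0],
        "end_point": points[-1],
    }
```

Feeding this function the samples of one full anti-clockwise circle yields a rotation count near 1.0 and a length near the circle's circumference; a real implementation would also classify gesture type and detect discontinuities from timing gaps between samples.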
Figures 5(a) to 5(e) illustrate exemplary use cases pertaining to an image sharing scenario in a chat application using rotation based gestures. Figure 5(a) illustrates a mobile phone (500) running a chat application (503). The mobile phone comprises an optional gesture sensor or front camera (501) and a display screen (502), which may be a touch screen or a normal screen. In accordance with an embodiment of the invention, a rotation based gesture (504), for instance, a spiral gesture as shown in the figure, may be received on the touch screen display or may be captured from the gesture sensor/front camera (501), as the case may be. In said embodiment, the system automatically selects images (505) to be shared in the ongoing chat based on the context and at least one parameter of the rotation based gesture. For example, the number of images to be shared can be based on the total length of the rotation based gesture. The total length implies the length from the start point to the end point of the rotation based gesture. Further, the source folder for the images may be automatically selected depending upon the context of the chat. For example, if the last line of the chat mentions a city, namely Delhi, an image folder having the word Delhi in its name or in its location tag may be selected. A set of predefined rules may be created to tackle different contextual situations. For example, there can be a rule that selects the word at the nucleus of the rotation based gesture as the word for establishing context. Similarly, any other rule can be created as per requirements.
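A minimal, non-limiting sketch of this contextual selection rule (gesture length mapped to an image count, source folder matched against the last chat line) might look like the following; the `pixels_per_image` scaling and the folder naming convention are illustrative assumptions.

```python
def select_images_for_chat(gesture_length, folders, chat_lines,
                           pixels_per_image=120):
    """Pick a source folder whose name appears in the last chat line, and
    a number of images proportional to the total gesture length.

    `folders` maps folder name -> ordered list of image file names;
    `chat_lines` is the chat transcript, newest line last.
    """
    count = max(1, int(gesture_length // pixels_per_image))
    last_line_words = {w.strip(".,!?").lower()
                       for w in chat_lines[-1].split()}
    # Contextual rule: a folder whose name occurs in the last chat line.
    source = next((name for name in folders
                   if name.lower() in last_line_words), None)
    if source is None:
        return None, []
    return source, folders[source][:count]
```

A fuller implementation would also consult location tags and other predefined contextual rules, such as the word at the nucleus of the gesture, before falling back to a default folder.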
In continuation of the previous use case, Figure 5(b) illustrates a more advanced use case. While a rotation based gesture is being received, a thumbnail of each image getting selected is shown to the user. In this way, the user will know what image is getting selected with the growing length of the rotation based gesture. The thumbnails of already selected images (506-1, 506-2, 506-3, 506-4) can be of smaller size and indicated on the trails of the rotation based gesture. On the other hand, the thumbnail of the image currently getting selected (507) can be of bigger size and indicated near the current end point of the rotation based gesture. In one implementation, a plurality of control options (508) can be indicated along with the image currently getting selected. In one example, these control options can be utilized to accept or reject the image currently getting selected. Furthermore, a preview window (509) may be shown that depicts images selected so far, the image currently getting selected, and/or images in the pipeline that are available for selection. All these categories of images may be indicated in different styles. For example, the images selected so far may be highlighted, the currently selected image may be enlarged, while images in the pipeline may appear as regular images. Apart from that, the source folder (510) of the images may also be indicated somewhere on the screen with a dropdown menu option so that the user can override the source folder selected automatically based on the context.
In continuation of the previous use cases, Figure 5(c) illustrates how the screen will appear once the rotation based gesture is accepted by the system. It can be seen that all the images selected in response to the rotation based gesture will be shared in the current chat window. It can be gathered that when the number of images shared is very large, the ongoing chat may disappear from the screen, which may not be desirable as the user may need to scroll up and down too much. To avoid this kind of situation, a rotation based gesture having a discontinuity (511) in between can be utilized, as illustrated in Figure 5(d). The exemplary effect of the same can be seen in Figure 5(e), wherein the selected images are shared in the ongoing chat after breaks. In this way, the flow of reading is not disturbed by the sudden sharing of a large number of images.
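One non-limiting way to realize this break behaviour is to treat each continuous segment of a discontinuous gesture as one batch of images; the segment lengths would come from the parameter calculator's continuity/discontinuity output, and the `pixels_per_image` scaling is again an assumption.

```python
def share_in_batches(images, segment_lengths, pixels_per_image=120):
    """Split selected images into batches, one batch per continuous
    segment of a discontinuous rotation based gesture, so that a large
    selection is shared with breaks instead of flooding the chat."""
    batches, start = [], 0
    for seg in segment_lengths:
        # Each segment contributes images in proportion to its length.
        n = max(1, int(seg // pixels_per_image))
        batch = images[start:start + n]
        if batch:
            batches.append(batch)
        start += n
    return batches
```

The chat application could then interleave these batches with the ongoing conversation, preserving the flow of reading.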
Now another use case is described, wherein a sender wants a receiver to view files shared in a chat in a particular order intended by the sender. For this purpose, the sender can share the files through a rotation based gesture as described above, while the receiver will also need to draw a rotation based gesture on his device to download the files in the order specified by the sender. In one implementation, the downloadable/downloaded files can be shown as thumbnails on the trails of the rotation based gesture drawn at the receiver side. Since such a real time co-ordinated view would need high data speeds, this particular use case may be more suitable for high speed networks, such as a 5G network.
Figure 6 illustrates an exemplary use case pertaining to sharing files via Bluetooth using rotation based gestures. In accordance with an embodiment of the invention, a rotation based gesture (601) may be received on a folder or file explorer. In said embodiment, the system automatically selects the files to be shared via Bluetooth based on the context and at least one parameter of the rotation based gesture. For example, the number of files to be shared can be based on the total length of the rotation based gesture. The total length implies the length from the start point to the end point of the rotation based gesture. The sequence of the files may be selected according to a preconfigured order based on priority, recently used files, frequently used files, or last saved files. In one implementation, the files getting selected while receiving the rotation based gesture can be indicated on the screen through a preview window (602).
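The length-to-count mapping above can be sketched as follows. The sketch is illustrative only: the `PIXELS_PER_FILE` quantum, the helper names, and the use of a sort key to stand in for the preconfigured order are assumptions, not details of the invention.

```python
import math

# Assumed quantum: every ~80 px of gesture length selects one more file.
PIXELS_PER_FILE = 80.0

def gesture_length(points):
    """Total path length from the start point to the end point,
    summed over the sampled touch-point segments."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def files_to_share(points, candidates, order_key):
    """Pick files for sharing: the count comes from the gesture length,
    the sequence from a preconfigured order (here, descending order_key)."""
    count = int(gesture_length(points) // PIXELS_PER_FILE)
    return sorted(candidates, key=order_key, reverse=True)[:count]

# Usage: a gesture sampled as (x, y) points; files ordered by last-saved time.
points = [(0, 0), (100, 0), (100, 100), (0, 100)]   # total length 300 px -> 3 files
files = [("a.jpg", 5), ("b.jpg", 9), ("c.jpg", 1), ("d.jpg", 7)]
print(files_to_share(points, files, order_key=lambda f: f[1]))
# -> [('b.jpg', 9), ('d.jpg', 7), ('a.jpg', 5)]
```

A real implementation would sample the touch points from the input controller and take the ordering policy from configuration, but the proportionality of length to file count is the essence of the mapping.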
Figure 7 illustrates a number of recently or frequently used rotation based gestures, which may be invoked through a predefined input gesture. The stored gestures can be extracted from a gesture database, such as the database (207). A few examples of stored gestures are depicted in the figure, which include a short length gesture, a discontinuous gesture, a low count spiral, a high count spiral, a medium count spiral with high radius, etc. The user selection of any of these stored gestures eliminates the need to draw the rotation based gesture again, thereby enabling the operation to be performed even more quickly.
Figures 8 and 9 illustrate exemplary use cases pertaining to a timeline view of images using rotation based gestures. As per the present invention, the rotation based gesture may be used to view images from the gallery according to a timeline. More specifically, Figure 8 illustrates that a clockwise rotation based gesture (801) may be performed on an image file depicting a person in order to filter out other images of the same person from the gallery which are captured after capturing that particular image. The length or depth of the rotation based gesture may define the duration of the timeline to be considered. Alternatively, any other appropriate parameter, such as the count of spirals, may be used in place of length or depth. While the rotation based gesture is being received by the system, the user can see the images selected so far on the trails of the rotation based gesture. The user can also stop providing the gesture at any point of time to freeze the forward movement on the timeline. Additionally, a preview window (802) may also be displayed on the screen that depicts one or more images selected in response to the rotation based gesture on the original image.
Similarly, Figure 9 illustrates that an anti-clockwise rotation based gesture (901) may be performed on an image file depicting a person in order to filter out other images of the same person from the gallery which are captured before capturing that particular image. The length or depth of the rotation based gesture may define the duration of the timeline to be considered. Alternatively, any other appropriate parameter, such as the count of spirals, may be used in place of length or depth. While the rotation based gesture is being received by the system, the user can see the images selected so far on the trails of the rotation based gesture. The user can also stop providing the gesture at any point of time to freeze the backward movement on the timeline. Additionally, a preview window (902) may also be displayed on the screen that depicts one or more images selected in response to the rotation based gesture on the original image.
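The direction-to-timeline mapping of Figures 8 and 9 can be sketched as follows. The days-per-spiral constant and the function shape are illustrative assumptions; the invention only requires that a gesture parameter define the timeline window and that direction select forward versus backward.

```python
from datetime import datetime, timedelta

# Assumed mapping: each full spiral extends the timeline window by 30 days.
DAYS_PER_SPIRAL = 30

def timeline_filter(images, anchor_time, spiral_count, clockwise):
    """Filter images of the same person captured after (clockwise gesture) or
    before (anti-clockwise gesture) the anchor image, within a window whose
    duration is set by the spiral count of the gesture.

    images: list of (name, capture_time) pairs, already filtered to the
            person depicted in the anchor image.
    """
    window = timedelta(days=DAYS_PER_SPIRAL * spiral_count)
    if clockwise:
        lo, hi = anchor_time, anchor_time + window     # forward on the timeline
    else:
        lo, hi = anchor_time - window, anchor_time     # backward on the timeline
    return [name for name, t in images if lo <= t <= hi and t != anchor_time]

# Usage: one spiral around an image captured on 1 June 2020.
images = [("p1.jpg", datetime(2020, 5, 20)),
          ("p2.jpg", datetime(2020, 6, 10)),
          ("p3.jpg", datetime(2020, 8, 1))]
anchor = datetime(2020, 6, 1)
print(timeline_filter(images, anchor, 1, clockwise=True))    # -> ['p2.jpg']
print(timeline_filter(images, anchor, 1, clockwise=False))   # -> ['p1.jpg']
```

Stopping the gesture freezes `spiral_count`, and with it the window, which matches the freeze behaviour described for both figures.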
Figure 10 illustrates an exemplary use case pertaining to the manipulation of multiple images together using rotation based gestures. As per the present invention, the user can provide a rotation based gesture (1001) on multiple images together, which can perform a predefined action on those multiple images. For example, a common rotation based gesture on two family photographs can filter out images of one or more family members from the image gallery using face recognition techniques. In another example, a clockwise gesture in this regard will filter images of members common to both photographs, while an anti-clockwise gesture will filter images of members uncommon to the two photographs. Those skilled in the art will appreciate that these are just exemplary scenarios; any other possible action can also be performed using the same gesture by configuring predefined rules for the same.
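The common/uncommon rule above reduces to set operations once face recognition has labelled the people in each photograph. The sketch below assumes that labelling has already happened; the function name and the use of Python sets are illustrative, not part of the invention.

```python
def filter_members(photo_a, photo_b, clockwise):
    """Apply the exemplary rule of Figure 10 to two photographs whose members
    have already been identified by face recognition:
    clockwise      -> members common to both photographs (intersection),
    anti-clockwise -> members uncommon to the two photographs (symmetric
                      difference)."""
    a, b = set(photo_a), set(photo_b)
    return sorted(a & b) if clockwise else sorted(a ^ b)

# Usage: two family photographs with partially overlapping members.
print(filter_members(["mom", "dad", "kid"], ["dad", "kid", "aunt"], clockwise=True))
# -> ['dad', 'kid']
print(filter_members(["mom", "dad", "kid"], ["dad", "kid", "aunt"], clockwise=False))
# -> ['aunt', 'mom']
```

The returned member list would then drive the gallery filter that selects those members' images.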
Figure 11 illustrates an exemplary use case pertaining to the manipulation of an overlay object on an image file using a rotation based gesture (1101). As shown in the figure, an object, such as a watermark, text, signature, etc., can be inserted over a currently displayed image by a clockwise gesture, while the same can be removed by an anti-clockwise gesture, or vice versa. As shown, a plurality of overlay objects appears on the trails of the gesture while the gesture is being received. The gesture can be stopped at any point of time to select and insert a currently displayed overlay object. Alternatively, an overlay object appearing on the trails of the gesture can also be selected and inserted if the user is not interested in the currently displayed object.
Figure 12 illustrates an exemplary use case pertaining to text highlighting in a browser/document using a rotation based gesture (1201). Here, the text to be highlighted can be selected from a predefined list with the help of the rotation based gesture and the control options associated with it. As described previously, said control options may appear on the screen for every currently picked text. At the same time, the system highlights one text after another as the length of the rotation based gesture grows. As shown, when the user starts providing the rotation based gesture, the first text (1202) is highlighted. When the user continues to increase the length of the rotation based gesture, the system highlights the second text (1203), the third text (1204), and so on. Similarly, the user can provide another rotation based gesture, say in the opposite direction of the previous rotation based gesture, to un-highlight the highlighted text in a similar fashion.
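The progressive highlighting above can be sketched as a simple mapping from gesture length to a prefix of the predefined text list. The pixels-per-item quantum and function name are assumptions made for the example.

```python
def highlighted(texts, gesture_length, px_per_item=60.0):
    """Return the prefix of the predefined text list highlighted so far:
    each ~60 px of gesture length (an assumed quantum) highlights one more
    entry, so the first text lights up first, then the second, and so on.
    A gesture in the opposite direction would shrink gesture_length and
    un-highlight entries in the same progressive fashion."""
    n = min(len(texts), int(gesture_length // px_per_item))
    return texts[:n]

# Usage: three predefined texts, gesture grown to 125 px.
texts = ["first text", "second text", "third text"]
print(highlighted(texts, 125))
# -> ['first text', 'second text']
```

Redrawing the screen with this prefix on every touch-move event produces the one-after-another highlighting effect of Figure 12.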
Those skilled in the art will appreciate that the use cases listed above are exemplary in nature; there could be many more situations wherein the concepts of the present invention could be utilized to achieve one or more advantages of the present invention. Furthermore, embodiments of the invention have been described in detail for purposes of clarity and understanding. However, it will be appreciated that certain changes and modifications may be practiced within the scope of the appended claims. Thus, although the invention is described with reference to specific embodiments and figures thereof, the embodiments and figures are merely illustrative, and not limiting of the invention. Rather, the scope of the invention is to be determined solely by the appended claims.
Claims:
We claim:
1. A method (100) for controlling flow of information, the method (100) comprising:
receiving (101) a rotation based gesture;
identifying (102) context of the rotation based gesture in respect of a currently displayed screen;
computing (103) at least one parameter pertaining to the rotation based gesture; and
controlling (104) flow of information based on the identified context and the at least one computed parameter.
2. The method (100) as claimed in claim 1, wherein the at least one parameter comprises one or more of following parameters of the rotation based gesture: type, priority, speed, direction, length, depth, color, radius, circumference, count of rotations, nucleus, start point, end point, continuity, and discontinuity.
3. The method (100) as claimed in claim 1 further comprising:
popping up (105) a preview window while receiving the rotation based gesture, the preview window depicting files selected so far and/or files available for selection with the rotation based selection.
4. The method (100) as claimed in claim 1 further comprising:
progressively displaying (106) a thumbnail of each file getting selected while receiving the rotation based gesture, the thumbnail being displayed on trails of the rotation based gesture.
5. The method (100) as claimed in claim 1 further comprising:
temporarily displaying (107) a plurality of control options for a file currently selected by the rotation based gesture.
6. The method (100) as claimed in claim 1 further comprising:
storing (108) the received rotation based gesture in a gesture database (207).
7. The method (100) as claimed in claim 6 further comprising:
providing (109) a number of recent rotation based gestures for user selection.
8. The method (100) as claimed in claim 1 further comprising:
providing (110) tactile or haptic feedback while receiving the rotation based gesture.
9. The method (100) as claimed in claim 1, wherein the rotation based gesture is a circular gesture or a semi-circular gesture or a spiral gesture or a conical spiral gesture or a helical gesture.
10. The method (100) as claimed in claim 1, wherein the rotation based gesture is a touch based gesture or air based gesture.
11. The method (100) as claimed in claim 1 further comprising:
performing (111), in response to the rotation based gesture, a predefined action with the controlled flow of information.
12. A system (200) for controlling flow of information, the system comprising:
an input controller (201) to receive a rotation based gesture;
a context identifier (205) to identify context of the rotation based gesture in respect of a currently displayed screen;
a parameter calculator (208) to compute at least one parameter pertaining to the rotation based gesture; and
a gesture controller (202) to control flow of information based on the identified context and the at least one computed parameter.
13. The system (200) as claimed in claim 12 further comprising:
an output controller (203) to:
pop up a preview window while receiving the rotation based gesture, the preview window depicting files selected so far and/or files available for selection with the rotation based selection; and/or
progressively display a thumbnail of each file getting selected while receiving the rotation based gesture, the thumbnail being displayed on trails of the rotation based gesture; and/or
temporarily display a plurality of control options for a file currently selected by the rotation based gesture.
14. The system (200) as claimed in claim 12 further comprising:
a gesture database (207) to store the received rotation based gesture.
15. The system (200) as claimed in claim 12 further comprising:
a feedback manager (209) to provide tactile or haptic feedback while receiving the rotation based gesture.
16. The system (200) as claimed in claim 12 further comprising:
an application/content identifier (204) to identify at least one application or content being displayed on the currently displayed screen.
17. The system (200) as claimed in claim 12 further comprising:
a mapping module (206) to map the at least one computed parameter with flow of information to be controlled.