Abstract: ATTACHED
The following specification fully and particularly describes the nature of this
disclosure:
Technical Field
Embodiments of the present disclosure relate to devices and associated methods
for providing a user interface, and more specifically, but not limited to, a system
and method for iteratively generating an optimized user interface based on
predefined user input.
Background
A user interface allows for human-machine interaction. In a typical computing environment, the goal of this interaction is effective operation and control of the machine on the user's end. A user interface also provides for feedback from such a machine, thus allowing the end user to make appropriate decisions.
A graphical user interface is one such type of user interface that allows users to interact with electronic devices through graphical icons. The actions in graphical user interfaces are performed through direct manipulation of the graphical elements. Thus, in a typical GUI interaction the user interacts with the underlying machine (electronic device) by manipulating visual widgets that allow for interactions in line with the kind of data they hold.
One of the most common interactions (by a user) with a typical graphical user interface involves selecting, dragging and dropping a predefined set of icons (widgets) onto a virtual work area (screen) as nodes, and connecting said nodes via "links", in an attempt to visually depict the workflow of a process to be followed or executed.
Such workflow nodes are therefore created from the palette using the well-known "drag and drop" user interface pattern. However, complexity and the time spent increase manifold when such dragging and dropping involves multiple nodes. This is a particularly tedious task for the user if there is more than one node to be drawn for a given scenario. To overcome such disadvantages, different products have come up with different user interface patterns like a context pad, auto-add from a context pad, etc.
However, such solutions fail to overcome the limitations, including but not limited to, the need to drag and drop the (graphical) nodes onto the virtual screen, the need to connect each node manually, and the need to enter the node name separately after node creation.
Thus, there is a long-felt need to provide a mechanism that meets the above challenges and also allows a user to intuitively (and not mechanically) arrange the steps/blocks of a process.
Summary
For purposes of summarizing, certain aspects, advantages, and novel features of the disclosure have been described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any one particular embodiment of the disclosure. Thus, the present disclosure may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
The present invention provides a method and an apparatus (system) for iteratively generating an optimized user interface at a client device in a network environment, the method comprising: providing a first widget configured for receiving user input; generating a second widget configured for receiving text input in response to said user input; providing text input to said second widget; checking the text input for syntax at the client device; forwarding the checked text input to a repository server; interpreting the received text input at the repository server and in response providing an indication of a corresponding node; and receiving said indication at the client device and dynamically generating the corresponding node. The first widget within the purview of the present invention is a specialized button widget and the second widget is a text box/text area widget.
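For illustration only, the following TypeScript sketch shows one possible client-side realization of these steps. The function names (checkSyntax, requestNodeIndication, attachAdditionalButton), the /repository/interpret endpoint and the DOM handling are assumptions made for this sketch and are not prescribed by the disclosure.

```typescript
// Minimal sketch of the claimed method flow, assuming a browser client and an
// HTTP endpoint exposed by the repository server. All identifiers are illustrative.

interface NodeIndication {
  nodeType: string;   // e.g. "startEvent", "task", "serviceTask"
  label: string;
}

// Check the text input for syntax at the client device (assumed rule:
// non-empty words made of letters, digits and spaces).
function checkSyntax(text: string): boolean {
  return /^[A-Za-z0-9 ]+$/.test(text.trim());
}

// Forward the checked text to the repository server, which interprets it and
// responds with an indication of the corresponding node.
async function requestNodeIndication(text: string): Promise<NodeIndication> {
  const response = await fetch("/repository/interpret", {   // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  return (await response.json()) as NodeIndication;
}

// The first widget (button) generates the second widget (text box); each entry
// is checked, forwarded, and the indicated node is generated dynamically.
function attachAdditionalButton(button: HTMLButtonElement, workArea: HTMLElement): void {
  button.addEventListener("click", () => {
    const textBox = document.createElement("input");         // second widget
    textBox.placeholder = "Enter node name, e.g. Start";
    textBox.addEventListener("change", async () => {
      if (!checkSyntax(textBox.value)) {
        alert("Text input does not conform to the predefined standards");
        return;
      }
      const indication = await requestNodeIndication(textBox.value);
      const node = document.createElement("div");            // dynamically generated node
      node.className = `node ${indication.nodeType}`;
      node.textContent = indication.label;
      workArea.appendChild(node);
    });
    workArea.parentElement?.appendChild(textBox);
  });
}
```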
An apparatus of the present invention allows for iteratively generating an optimized user interface at a client device in a network environment and comprises a plurality of client devices to provide a first widget configured to receive user input, generate a second widget to receive text input in response to said user input, provide text input to said second widget, check said text input for syntax at the client device, and forward the checked text input to a repository server via a plurality of network router(s); and at least one repository server to interpret the received text input and in response provide an indication of a corresponding node, wherein said indication is received at the client device, via the network router(s), to dynamically generate the corresponding node.
The apparatus and corresponding method of the present invention, in a typical embodiment, offer a mechanism for enabling a user to dynamically generate graphical user elements.
The apparatus and corresponding method of the present invention allow for real-time interpretation of user-entered text data for generation of corresponding graphical user elements.
In an embodiment of the present invention, checking the text input for syntax comprises providing an error message when the text input does not conform to predefined standards, and the checked text input is forwarded to the repository server via a network router.
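A minimal sketch of this embodiment is given below, assuming a simple predefined standard in which node names are non-empty strings of letters, digits and spaces; the rule itself and the name checkTextSyntax are illustrative assumptions.

```typescript
// Returns null when the text conforms to the (assumed) predefined standard,
// otherwise returns a human-readable error message to be presented to the user.
function checkTextSyntax(text: string): string | null {
  const trimmed = text.trim();
  if (trimmed.length === 0) {
    return "Error: node name must not be empty";
  }
  if (!/^[A-Za-z0-9 ]+$/.test(trimmed)) {
    return "Error: node name may only contain letters, digits and spaces";
  }
  return null; // conforms; safe to forward to the repository server
}
```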
In an embodiment of the present invention, interpreting the received text input at the repository server comprises matching the received text input with a repository maintained at the repository server.
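On the repository-server side, such interpretation may amount to a lookup against the maintained repository. The sketch below uses an in-memory map as a stand-in for that repository; the map contents and the function name interpret are assumptions made for illustration.

```typescript
// Hypothetical indication of the corresponding node, returned to the client.
interface NodeIndication {
  nodeType: string;   // e.g. "startEvent", "endEvent", "serviceTask"
  label: string;
}

// Stand-in for the repository maintained at the repository server.
const repository = new Map<string, string>([
  ["start", "startEvent"],
  ["end", "endEvent"],
  ["service", "serviceTask"],
]);

// Matches the received text against the repository; text with no match is
// interpreted as a generic task node carrying the entered name.
function interpret(text: string): NodeIndication {
  const key = text.trim().toLowerCase();
  const nodeType = repository.get(key) ?? "task";
  return { nodeType, label: text.trim() };
}
```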
All the embodiments as herein described with respect to the present invention are applicable to the method and the corresponding system.
Thus, every embodiment and related feature of the instant invention is equally implementable via a system as well as a corresponding method detailing the steps as described and claimed.
These and other embodiments of the present disclosure will also become readily apparent to those skilled in the art from the following detailed description of the embodiments having reference to the attached figures, the disclosure not being limited to any particular embodiments disclosed.
Brief Description of the Drawings
For a better understanding of the embodiments of the systems and methods described herein, and to show more clearly how they may be carried into effect, reference will now be made, by way of example, to the accompanying drawings, wherein:
FIGURE 1 illustrates an existing first default user interface.
FIGURE 2 illustrates an existing first user interface with an exemplary
process.
FIGURE 3 illustrates a first existing user interface with a newly added node.
FIGURE 4 illustrates a diagrammatic representation of a first existing user
interface with an editable newly added user node.
FIGURE 5 illustrates a first existing user interface wherein one node is joined to another node by manually manipulating a connector.
FIGURE 6 illustrates a second existing default user interface.
FIGURE 7 illustrates the second existing user interface wherein a palette icon is
selected in a like manner from a palette.
FIGURE 8 illustrates the second existing user interface wherein a node is dragged from the palette and dropped as part of a process.
FIGURE 9 illustrates the second existing user interface wherein the newly added node is made editable.
FIGURE 10 illustrates the second existing user interface wherein a connector
icon is used to connect the newly added node in like manner.
FIGURE 11 illustrates a default third existing user interface wherein a palette icon is dragged from a palette in like manner.
FIGURE 12 illustrates the third existing user interface depicting a newly added node.
FIGURE 13 illustrates the third existing user interface depicting joining of one node to another node via a connector icon.
FIGURE 14 illustrates a default user interface of the present invention with a predefined button.
FIGURE 15 illustrates the user interface of the present invention depicting a
text entry box on selection of the predefined button.
FIGURE 16 illustrates the user interface of the present invention wherein user is enabled to enter text items in text entry box leading to dynamic generation of corresponding nodes.
FIGURE 17 illustrates the user interface of the present invention depicting dynamic generation of an exemplary node.
FIGURE 18 illustrates the user interface of the present invention depicting
dynamic generation of an END node signaling end of an exemplary
process.
FIGURE 19 illustrates the user interface of the present invention depicting dynamic completion of one exemplary process.
FIGURE 20 illustrates the user interface of the present invention depicting enablement of a user to change node type via a context menu.
FIGURE 21 illustrates the user interface of the present invention depicting an exemplary change of node type.
FIGURE 22 illustrates an exemplary architecture of the underlying apparatus of the present invention.
Detailed Description
Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
The specification may refer to "an", "one" or "some" embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or
intervening elements may be present. Furthermore, "connected" or "coupled" as used herein may include operatively connected or coupled. As used herein, the term "and/or" includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures. It should be appreciated that the functions, structures, elements and the protocols used in communication are irrelevant to the present disclosure. Therefore, they need not be discussed in more detail here.
Also, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
The features provided by the disclosed system in the present disclosure, may be wirelessly accessed remotely, in one or more embodiments, and/or through a wireless network. Such types of wireless network service providers operate and maintain the computing systems and environment, such as server system and architectures. Server architecture may include the infrastructure (e.g. hardware, software, and communication lines) that offers wireless network services. The
operations in embodiments of the present invention may, in certain embodiments, be performed wirelessly over an air interface.
For the most part, the operations described herein are operations performed by a network, device, router device, computer or a machine, or in some embodiments in conjunction with a human operator or user that interacts with the computer or the machine. The programs, modules, processes, methods, data, and the like, described herein are but an exemplary implementation and are not related, or limited, to any particular computer, apparatus, or computer language. Rather, various types of general purpose computing machines or devices may be used with programs constructed in accordance with the teachings described herein.
It should be understood that embodiments of the present disclosure may be included in various types of communication networks intended to be within the scope of the present disclosure, although not limited thereto. The terms "network" and "system" are often used interchangeably.
Any of the trademarks, service marks, collective marks, copyrights, design rights or similar rights that are mentioned in this document are the property of their respective owners. Their mention herein does not grant any rights to use any otherwise protected materials. Use of any such or similar incorporeal property is at one's own risk.
A person of ordinary skill in the art within the purview of the instant invention may note that, for the sake of clarity and for effectively appreciating the technical contribution of the instant invention, a few existing proposed solutions are briefly described herewith with reference to the drawings.
Figure 1 illustrates a first existing default user interface with a start event node (101) and an end event node (102).
Figure 2 illustrates a first existing user interface with an exemplary process wherein a user (not shown) is enabled to select a palette icon (205) from a palette (204).
The palette icon (205) is dragged (by the user) from the palette (204) to a process depicted by a start event node and end event node (202).
Figure 3 illustrates a first existing user interface with a newly added node.
A new palette icon (305) as selected from the palette (304) is dragged and dropped as a new node (303).
Figure 4 illustrates a diagrammatic representation of a first existing user interface with an editable newly added user node.
A newly added user node (401) may be named by making it editable, for example, by executing an event such as double clicking it. A text box appears in response to said event which can be filled in via user input for entering name of the newly added node (401).
Figure 5 illustrates an existing user interface wherein one node is joined to another node by manually manipulating a connector icon.
A newly added node (502) may be joined (e.g. to a task (501)) by selecting and dragging a connector icon (503) from the palette and then dragging it from the source node (501) to the target node (502).
Figure 6 illustrates a second default existing user interface with a default step (602) for an exemplary process starting at 601.
Figure 7 illustrates the second existing user interface wherein a palette icon is selected in a like manner from a palette.
Figure 8 illustrates the second existing user interface wherein a node is dragged from the palette and dropped.
Particularly, in this known method a connection is split and joined automatically through its ends when a new node is dropped over it.
Figure 9 illustrates the second existing user interface wherein the newly added node is made editable.
Figure 10 illustrates the second existing user interface wherein a connector icon is used to connect the newly added node in like manner.
Figure 10 particularly illustrates a method of drawing graphical elements using a context button pad (as shown over node "Step 2"), which attempts to minimize the delay due to interaction with the palette.
Figure 11 illustrates a default third existing user interface wherein a palette icon is dragged from a palette in like manner.
Figure 12 illustrates the third existing user interface depicting a newly added node.
Figure 13 illustrates the third existing user interface depicting joining of one node to another node via a connector icon.
Figure 14 illustrates a default user interface of the present invention with a predefined button.
A work area (1401) is shown, capable of displaying graphical user elements that form part of a process to be depicted. A work area, within the context of the invention, is a predefined bounded region on the user interface, the user interface comprising a plurality of widgets. A plurality of graphical user elements can be manipulated on, added to, removed from, and connected to/from such a work area (1401). Graphical user elements that form part of a process being depicted generally cannot be displayed outside the work area. The nodes within the purview of the present invention comprise a start node, an end node, and connectors.
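One possible data model for the work area, its nodes and their connectors is sketched below in TypeScript; the type and field names are illustrative assumptions rather than terms defined by the disclosure.

```typescript
// Kinds of graphical user elements that may appear on the work area (1401).
type NodeKind = "start" | "end" | "task" | "service" | "gateway";

interface WorkflowNode {
  id: string;
  kind: NodeKind;
  name: string;          // e.g. "My Task"
  x: number;             // position inside the bounded work area
  y: number;
}

interface Connector {
  from: string;          // id of the source node
  to: string;            // id of the target node
}

// The work area is a bounded region holding the nodes and the connectors between
// them; elements are not displayed outside its bounds.
interface WorkArea {
  width: number;
  height: number;
  nodes: WorkflowNode[];
  connectors: Connector[];
}
```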
A palette (1402) also forms part of the user interface of the present invention.
The user interface of the present invention advantageously includes at least one additional button (1403). The additional button, within the purview of the instant invention, refers to the first widget and is a specialized button widget. The technical features offered by the additional button (1403) are described in the following paragraphs.
Figure 15 illustrates the user interface of the present invention depicting a text entry box on selection of the predefined button.
The additional button is receptive to predefined events. For example, in a typical scenario, one of the major attributes of the additional button is to receive an INPUT event and in response provide a text area/text box (also referred to as the second widget) (1501).
The text box (1501) is receptive to text input and is further explained in the following paragraphs.
The additional button is configurable for receiving user inputs/text input via a plurality of input devices, including but not limited to a mouse, a keyboard, a stylus and so on.
Figure 16 illustrates the user interface of the present invention wherein the user is enabled to enter text items in the text entry box, leading to dynamic generation of corresponding nodes.
The text box/text area (1601) is configured for receiving text input. Importantly, the present invention is also configured for interpreting the text entered in the text box/text area (1601) as a predefined set of commands and performing a corresponding further action.
For example, a text entry "Start" (1602) in text box/text area (1601) may be interpreted as a user command for generating start node (1604). Similarly,
making a text entry "My Task" (1602) leads to generation of a corresponding node with the name "My Task" (1603).
On similar lines, various elements including connectors are generated and linked dynamically over the user work area to form a workflow of a user-defined process as soon as a corresponding entry is made in the text box/text area (1601).
Said technical scheme of generating a process workflow avoids the tedious need of dragging and dropping the nodes on the user work area and also of connecting the nodes manually.
By entering the task names and expressions as natural language, corresponding nodes and connections are dynamically created.
Figure 17 illustrates the user interface of the present invention depicting dynamic generation of an exemplary node.
Herein, "start", "My Task", "service" entries (1702) when entered in text box 20 (1701) lead to dynamic generation of corresponding nodes as start node (1703), My Task node (1704) and service node (1705) respectively. Also, said nodes are also connected via appropriate connectors in line with the context of nodes as created.
Figure 18 illustrates the user interface of the present invention depicting dynamic generation of an END node (1801) signaling end of an exemplary process.
Figure 19 illustrates the user interface of the present invention depicting dynamic completion of one exemplary process via the technical scheme as recited above.
Figure 20 illustrates the user interface of the present invention depicting enablement of a user to change node type via a context menu.
Herein, the gateway node (2001) is illustrated as being changed by, e.g., selecting and double clicking over it. The "type" of the gateway node (2001) can be changed to any of the node types illustrated in the drop-down menu (2002).
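A minimal sketch of such a node-type change is given below, assuming an HTML select element serves as the drop-down menu (2002); the element handling and the list of available types are illustrative.

```typescript
// Available node types offered in the drop-down; illustrative list only.
const availableTypes = ["gateway", "task", "serviceTask", "startEvent", "endEvent"];

// Populates the drop-down with the available node types and applies the chosen
// type to the selected node when the user picks an option.
function attachTypeMenu(node: { kind: string }, menu: HTMLSelectElement): void {
  menu.innerHTML = "";
  for (const type of availableTypes) {
    const option = document.createElement("option");
    option.value = type;
    option.textContent = type;
    option.selected = type === node.kind;
    menu.appendChild(option);
  }
  menu.addEventListener("change", () => {
    node.kind = menu.value;   // e.g. the gateway node (2001) changed to another type
  });
}
```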
Figure 21 illustrates the user interface of the present invention after an exemplary change of node type.
Thus, in a nutshell, to create nodes and connect them using conventional methods, a user needs to follow three steps:
1. Select the node and drag and drop it onto the editor
2. Double click and enter the node name
3. Select the connector arrow and connect the relative nodes
In contrast, creating and connecting node(s) within the purview of the instant invention comprises:
1. Selecting the additional button and entering the task name
2. Selecting the node type from the available options (menu) and changing the node type
One of the indicators of the technical advantages achieved by the present invention is better understood with the following exemplary scenario. If a user needs to create 100 nodes, the user, in a traditional scenario, needs to drag and drop 100 nodes, or has to create the process flow using other design patterns.
As a sample illustration of the time saved: if a user spends 30 seconds to create a node with an existing (traditional) method, then using the technical scheme of the present invention a node can be created in 10 seconds, based on the following determination.
Productivity:
(Time Saved) x (Employee Cost) x (# of employees) = Cost Savings
(8 hours a week) x ($30/hour) x (100 employees) = $24,000 per week
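Restated as arithmetic, using only the figures quoted above (the node count implied by the 8-hours-per-week saving is derived here for illustration and is not stated in the original):

```latex
% Cost-savings determination, restated from the figures above.
\[
  8~\tfrac{\text{hours}}{\text{week}} \times \$30/\text{hour} \times 100~\text{employees}
  = \$24{,}000~\text{per week}
\]
% At the per-node figures above, 30 s - 10 s = 20 s is saved per node, so an
% 8-hour weekly saving corresponds to roughly 1,440 nodes per employee per week:
\[
  \frac{8 \times 3600~\text{s}}{(30 - 10)~\text{s/node}} = 1440~\text{nodes per employee per week}
\]
```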
Further, said technical scheme of the present invention results in reduced communication between client and server units, resulting in saving of CPU (Central Processing Unit) cycles and available bandwidth, thereby substantially increasing the efficiency of the underlying network.
Now, a typical architecture of an exemplary underlying apparatus of the present invention is explained with reference to Figure 22.
The apparatus within the purview of the present invention comprises a file repository server (2201), a network router (2207) and desktop PCs (2214, 2215).
A file repository server (2201) comprises a plurality of hardware integers including a repository (2206), a web server (2205), hard disks (2204), CPU + RAM (2203) and network cards (2202).
The CPU (2203) is connected to a CPU bus (not shown), which is essentially a set of interconnection wires to which all subsystems are connected. In general, only one pair of devices can talk to each other at a time, so communication over the bus must be coordinated to prevent message collisions. This coordination is often handled by the CPU (2203).
The central processing unit (CPU) (2203) executes instructions contained in memory (2203). These instructions are executed at a rate specified by the computer's clock (not shown).
The CPU (2203) needs to access two different types of memory (2203) in order to execute a program. There are two types of memories used in micro-controllers. These are read-only memory (ROM) (not shown) and random access memory (RAM) (2203).
Read-only memory (ROM) is used to store permanent programs, operating drivers, and data.
Random access memory or RAM (2203) is used to temporarily store data and instructions.
Similarly, a desktop PC (2214, 2215) includes a Graphical Editor Window Application (2220), a keyboard and mouse (2219), hard disks (2218), CPU + RAM (2217) and network cards (2216).
A person of ordinary skill in the art would appreciate that the network router (2207) also comprises the above-mentioned hardware subsystems.
Also, the hardware integers of the instant invention are coupled and connected via a hardware link (2208).
1. A method for iteratively generating an optimized user interface at a client device in a network environment, the method comprising:
providing a first widget configured for receiving user input;
generating a second widget configured for receiving text input in
response to said user input;
providing text input to said second widget;
checking text input for syntax at client device;
forwarding checked text input to a repository server;
interpreting received text input at repository server and in response
providing an indication of a corresponding node;
receiving said indication at the client device and dynamically
generating the corresponding node.
2. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the user interface comprises a predefined bounded region having a plurality of widgets.
3. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the first widget is a specialized button widget.
4. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the second widget is a text box/text area widget.
5. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the text input is provided by a plurality of input devices comprising a mouse, a keyboard and a stylus.
6. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein checking the text input for syntax comprises providing an error message when the text input does not conform to predefined standards.
7. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the checked text input is forwarded to the repository server via a network router.
8. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein interpreting the received text input at the repository server comprises matching the received text input with a repository maintained at the repository server.
9. A method for iteratively generating an optimized user interface at a client device as claimed in claim 1, wherein the corresponding node comprises a start node, an end node, connectors, etc.
10. An apparatus for iteratively generating an optimized user interface at a client device in a network environment, comprising
a plurality of client devices to:
o provide a first widget configured to receive user input,
o generate a second widget to receive text input in response to said user input,
o provide text input to said second widget,
o check said text input for syntax at the client device,
o forward checked text input to a repository server via a plurality of network router(s);
at least one repository server to interpret the received text input and in response provide an indication of a corresponding node, wherein said indication is received at the client device, via the network router(s), to dynamically generate the corresponding node.
11. An apparatus for iteratively generating an optimized user interface at a client device in a network environment by performing the method steps of claims 1-9.