Abstract: Disclosed is a method and system for optimizing generation of test cases. The method comprises receiving, by the processor 202, one or more parameters to be used for establishing an interaction with a target application. Further, one or more interactive elements associated with the target application are identified by the processor 202. Upon identification of the interactive elements, one or more test cases are generated by the processor 202 by using the one or more interactive elements. The one or more test cases comprise one or more steps created in a predefined informative format.
Claims:
1. A method for optimizing generation of test cases, the method comprising:
receiving one or more parameters to be used for establishing an interaction with a target application, wherein the interaction is established through one or more browsers;
identifying one or more interactive elements associated with the target application; and
generating one or more test cases by using the one or more interactive elements, wherein the one or more test cases comprise one or more test steps created in a predefined informative format, such that the predefined informative format groups the one or more test steps in a logical set according to a link identified between the one or more test steps.
2. The method as claimed in claim 1, wherein the one or more parameters comprise at least one of a Uniform Resource Locator (URL) of the target application, words to be ignored, explore to level, and restrict to sub-domain.
3. The method as claimed in claim 1, wherein the target application comprises a web-based application.
4. The method as claimed in claim 1, wherein the one or more browsers comprise at least one of a Selenium web browser and a standard web browser.
5. The method as claimed in claim 1, wherein the one or more interactive elements comprise at least one of web links, web forms, web form elements, and custom interactive elements.
6. The method as claimed in claim 1, wherein the predefined informative format comprises a tree type structure arranging the interactive elements in a hierarchy.
7. The method as claimed in claim 1, wherein the one or more test cases comprise a user customized test case, wherein the user customized test case is generated by a user by combining one or more test steps and by using one or more assertions.
8. The method as claimed in claim 1, further comprising:
using one or more test cases for generating a new test scenario, wherein the one or more test cases are provided as parameters for generating new test scenarios.
9. The method as claimed in claim 8, further comprising:
generating an execution plan for executing the one or more test scenarios; and
generating a report towards a success status of the execution of the one or more test scenarios.
10. A system for optimizing generation of test cases, the system comprising:
a processor; and
a memory coupled to the processor, wherein the memory stores a set of instructions to be executed by the processor, wherein the processor is configured to:
receive one or more parameters to be used for establishing an interaction with a target application, wherein the interaction is established through one or more browsers;
identify one or more interactive elements associated with the target application; and
generate one or more test cases by using the one or more interactive elements, wherein the one or more test cases comprise one or more test steps created in a predefined informative format, such that the predefined informative format groups the one or more test steps in a logical set according to a link identified between the one or more test steps.
11. The system as claimed in claim 10, wherein the one or more parameters comprise at least one of a Uniform Resource Locator (URL) of the target application, words to be ignored, explore to level, and restrict to sub-domain.
12. The system as claimed in claim 10, wherein the target application comprises a web-based application.
13. The system as claimed in claim 10, wherein the one or more browsers comprise at least one of a Selenium web browser and a standard web browser.
14. The system as claimed in claim 10, wherein the one or more interactive elements comprise at least one of web links, web forms, web form elements, and custom interactive elements.
15. The system as claimed in claim 10, wherein the predefined informative format comprises a tree type structure arranging the interactive elements in a hierarchy.
16. The system as claimed in claim 10, wherein the one or more test cases comprise a user customized test case, wherein the user customized test case is generated by a user by combining one or more test steps and by using one or more assertions.
17. The system as claimed in claim 10, wherein the processor is configured to:
use one or more test cases for generating a new test scenario, wherein the one or more test cases are provided as parameters for generating the new test scenario.
18. The system as claimed in claim 17, wherein the processor is configured to:
generate an execution plan for executing the one or more test scenarios; and
generate a report towards a success status of the execution of the one or more test scenarios.
Description:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
SYSTEM AND METHOD FOR OPTIMIZING GENERATION OF TEST CASES
Applicant:
OPTIMIZEQ PRIVATE LIMITED
A company Incorporated in India
Having address:
A502, Synchronicity Bldg, Chandivali, Andheri E,
Mumbai – 400072, Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to generation of test cases, and more particularly to a system and method for optimizing the generation of test cases.
BACKGROUND
[002] Functional testing is a form of automated testing that deals with how an application functions or, in other words, with the relation of the application to its users and, especially, to the rest of the application's layers. Traditionally, functional testing is implemented by a team of testers, independent of the developers involved in development of the test cases or testing tools.
[003] Many test automation tools provide record and playback features allowing users to interactively record a user's actions and replay the user's actions any number of times, comparing actual results to expected results. However, reliance on such record and playback features poses major reliability and maintainability problems. Relabelling a button or moving the button to another part of the window may require the test to be re-recorded. Record and playback also often adds irrelevant activities or incorrectly records some activities. Such involvement of irrelevant activities also makes the automation tool time consuming in operation.
SUMMARY
[004] Before the system and method for optimizing generation of test cases are described, it is to be understood that this application is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosures. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only and is not intended to limit the scope of the present application.
[005] This summary is provided to introduce aspects related to a system and method for optimizing generation of test cases. This summary is not intended to identify essential features of the claimed system and method for optimizing generation of test cases. The subject matter is not intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system and method for optimizing generation of test cases is disclosed. In one aspect, the system comprises a memory and a processor coupled to the memory. Further, the processor may be capable of executing instructions in the memory to perform one or more steps described now. The processor may be configured to receive one or more parameters to be used for establishing an interaction with a target application. The interaction is established through one or more browsers. Further, the processor may be configured to identify one or more interactive elements associated with the target application. Furthermore, the processor is configured to generate one or more test cases by using the one or more interactive elements. The one or more test cases comprise one or more test steps created in a predefined informative format, such that the predefined informative format groups the one or more test steps in a logical set according to a link identified between the one or more test steps.
[007] In another implementation, a method for optimizing generation of test cases is disclosed. The method comprises receiving one or more parameters to be used for establishing an interaction with a target application, wherein the interaction is established through one or more browsers. Further, the method comprises identifying one or more interactive elements associated with the target application. Furthermore, the method comprises generating one or more test cases by using the one or more interactive elements. The one or more test cases comprise one or more test steps created in a predefined informative format, such that the predefined informative format groups the one or more test steps in a logical set according to a link identified between the one or more test steps.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[009] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[010] Figure 1 illustrates a system architecture diagram 100 of a system 102 for optimizing generation of test cases, in accordance with an embodiment of the present subject matter.
[011] Figure 2 illustrates a block level diagram of the system 102, in accordance with an embodiment of the present subject matter.
[012] Figure 3A illustrates a method 300 for optimizing generation of test cases, in accordance with an embodiment of the present subject matter.
[013] Figure 3B illustrates an embodiment of a method 300 for optimizing generation of test cases, in accordance with an embodiment of the present subject matter.
[014] Figure 4 illustrates a working flowchart of the method 300 for optimizing generation of test cases, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[015] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for optimizing generation of test cases, similar or equivalent to those described herein, can be used in the practice or testing of embodiments of the present disclosure, the exemplary system and method for optimizing generation of test cases are now described. The disclosed embodiments for optimizing generation of test cases are merely examples of the disclosure, which may be embodied in various forms.
[016] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments for optimizing generation of test cases. However, one of ordinary skill in the art will readily recognize that the present disclosure for optimizing generation of test cases is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[017] Generally, many test automation tools provide record and playback features allowing users to interactively record a user's actions and replay the user's actions any number of times, comparing actual results to expected results. However, reliance on such record and playback features poses major reliability and maintainability problems. Relabelling a button or moving the button to another part of the window may require the test to be re-recorded. Record and playback also often adds irrelevant activities or incorrectly records some activities. Such involvement of irrelevant activities also makes the automation tool time consuming in operation.
[018] The present subject matter overcomes the problem of recording the test steps forming a test case. The present subject matter identifies and generates the test steps automatically. Such generation of test cases saves time and generates a comprehensive set of test steps, compared to the manual process of recording the steps. The present invention also generates the test cases automatically, without manual intervention.
[019] Furthermore, the present subject matter may be domain agnostic; therefore, the presently disclosed system browses through the target application, navigates through all the hyperlinks, executes forms of the target application, and navigates through all sections of the target application. The target application here refers to a web-based application. Domain agnostic refers to the task of transferring knowledge from a source domain to data from multiple heterogeneous target domains.
[020] Referring now to Figure 1, a system architecture diagram 100 of a system 102 for optimizing generation of test cases, in accordance with an embodiment of the present subject matter may be described. In one example, the system 102 may be connected with mobile devices 104-1 through 104-N (collectively referred as 104) through a communication network 106.
[021] It should be understood that the system 102 and the mobile devices 104 correspond to computing devices. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, or a smart phone and the like. It may be understood that the mobile devices 104 may correspond to a variety of portable computing devices, such as a laptop computer, a desktop computer, a notebook, a smart phone, a tablet, a phablet, and the like.
[022] In one implementation, the communication network 106 may be a wireless network, a wired network, or a combination thereof. The communication network 106 can be implemented as one of the different types of networks, such as intranet, Local Area Network (LAN), Wireless Personal Area Network (WPAN), Wireless Local Area Network (WLAN), wide area network (WAN), the internet, and the like. The communication network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, MQ Telemetry Transport (MQTT), Extensible Messaging and Presence Protocol (XMPP), Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the communication network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[023] Referring now to Figure 2, a block diagram 200 of the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[024] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, a command line interface, and the like. The I/O interface 204 may allow a user to interact with the system 102. Further, the I/O interface 204 may enable the system 102 to communicate with the mobile devices 104, and other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[025] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of modules 208. The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM), and/or non-volatile memory, such as Read Only Memory (ROM), Erasable Programmable ROM (EPROM), Electrically Erasable and Programmable ROM (EEPROM), flash memories, hard disks, optical disks, and magnetic tapes.
[026] The memory 206 may include data generated as a result of the execution of one or more of the modules 208. The memory 206 is connected to a plurality of modules 208. The system 102 comprises a receiving module 212, an identifying module 214, and a test case generation module 216.
[027] The data 210 may include a repository 238 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 240 for storing data generated as a result of the execution of modules other than the ones mentioned above.
[028] In one implementation, the system 102 is configured to optimize generation of test cases. The processor 202 may be configured to receive one or more parameters from the user through the I/O interface 204. The parameters that are received comprise at least one of the Uniform Resource Locator (URL) of the target application, words to be ignored, explore to a level of the target application, and restrict to a sub-domain of the target application. The target application comprises a web-based application, and the one or more browsers comprise at least one of a Selenium web browser and a standard web browser. The parameters are then used for establishing an interaction with the target application. The interaction is established through one or more browsers.
[029] To have a better understanding of the received parameters, such as words to be ignored, explore to a level of the target application, and restrict to a sub-domain of the target application: in an embodiment, the system 102 is domain agnostic; therefore, the system 102 browses through the target application, navigates through all hyperlinks associated with the target application, executes forms, and navigates through all sections of the target application. The sections of the target application include hyperlinks, forms, lists, checkboxes, and the like.
[030] The system 102 may browse through each section of the target application, whether the sections are internal sections of the target application or external sections of the target application. The external sections may point to other applications within the same domain or another domain; for example, a target application internal.abc.com may be linked to another application external.xyz.com. For the external sections, the user may choose to exclude them by putting the external sections into an exclusion list. Therefore, through the exclusion, the user may limit all browsing to internal.abc.com.
[031] For example, a customer may want to restrict browsing to a few sections, or may want to exclude a few sections of the target web application. In such a case, the system 102 enables the user to restrict a few parameters before establishing interaction with the target application, by allowing the user to input the one or more parameters to be used for establishing the interaction. For example, the words to be ignored may be news, financials, audits, and the like; the explore to level of the target application may be 2 or 3; and to restrict to a sub-domain of the target application, a name/text of the sub-domain is specified. The system 102 may therefore ignore the words news, financials, and audits while browsing through the sections of the target application. The system 102 may not browse beyond level 2 or 3 of the target application. And the system may not enter a sub-domain that is restricted.
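The gating effect of the received parameters may be sketched as follows. This is an illustrative helper only; the function and parameter names (should_visit, ignored_words, max_level, allowed_subdomain) are not taken from the specification.

```python
from urllib.parse import urlparse

def should_visit(url, link_text, depth, ignored_words, max_level,
                 allowed_subdomain=None):
    """Decide whether a discovered link should be explored, applying the
    received parameters: words to be ignored, explore to level, and an
    optional sub-domain restriction. (Illustrative sketch.)"""
    if depth > max_level:                               # "explore to level"
        return False
    text = link_text.lower()
    if any(word in text for word in ignored_words):     # "words to be ignored"
        return False
    if allowed_subdomain:                               # "restrict to sub-domain"
        if urlparse(url).hostname != allowed_subdomain:
            return False
    return True
```

With the example parameters above (ignored words "news", "financials", "audits"; level 3; sub-domain internal.abc.com), a "Latest News" link, any link deeper than level 3, and any link on external.xyz.com would all be skipped.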
[032] Further, the processor 202 is configured to identify one or more interactive elements associated with the target application. The one or more interactive elements comprise at least one of web links, web forms, web form elements, and custom interactive elements of the target application. The custom interactive elements may be videos, simulations, or custom components that are embedded within the target application.
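The identification step can be sketched with the Python standard library's HTML parser; a real implementation would typically drive a live browser (for example, via Selenium) rather than parse static markup, and the class name here is illustrative.

```python
from html.parser import HTMLParser

class InteractiveElementFinder(HTMLParser):
    """Collects interactive elements (web links, forms, and form elements)
    from a page's HTML. Minimal illustrative sketch only."""
    INTERACTIVE = {"a", "form", "input", "select", "textarea", "button"}

    def __init__(self):
        super().__init__()
        self.elements = []  # list of (tag, attributes) pairs

    def handle_starttag(self, tag, attrs):
        if tag in self.INTERACTIVE:
            self.elements.append((tag, dict(attrs)))

finder = InteractiveElementFinder()
finder.feed('<a href="/about">About Us</a><form><input name="q"></form>')
```

After the feed, finder.elements holds the link, the form, and the form element, each with its attributes available for later test step generation.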
[033] Upon identification of the one or more interactive elements, the processor 202 may further generate one or more test cases by using the one or more interactive elements. The one or more test cases comprise one or more test steps created in a predefined informative format. The predefined informative format comprises a tree type structure arranging the interactive elements in a hierarchy. The predefined informative format groups the one or more test steps in a logical set according to a link identified between the one or more test steps. The link identified may be any kind of HTML link, for example, a link to a 'contact us' or an 'about us' page, or a link to an external application such as external.xyz.com associated with the target application. The one or more test cases may comprise a user customized test case. The user customized test case is generated by using one or more assertions and by combining one or more test steps. The one or more assertions are conditions put within the test cases in order to allow a tester to check those conditions. For example, to check whether an identified link is correct or incorrect, or whether its text is correct or incorrect according to business needs, an assertion is put to check the identified link. In another example, if the portal should have a link called 'About Us' but instead the link reads 'Our Company', this may be tested automatically through test assertions.
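The 'About Us' example above can be expressed as a small assertion helper. The function name and the result dictionary's keys are illustrative, not part of the specification.

```python
def assert_link_text(found_text, expected_text):
    """A test assertion: checks that an identified link carries the expected
    text, e.g. 'About Us' rather than 'Our Company'. (Illustrative sketch.)"""
    return {
        "expected": expected_text,
        "actual": found_text,
        "status": "PASS" if found_text == expected_text else "FAIL",
    }

# The portal shows 'Our Company' where the business expects 'About Us':
result = assert_link_text("Our Company", "About Us")
```

Embedding such assertions in generated test steps is what lets the tester check a business condition automatically instead of inspecting each link by hand.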
[034] In an exemplary embodiment, the processor 202 is configured to gather information from the interactive elements, such as the positioning of the interactive elements, attributes associated with the interactive elements, and descriptions of the interactive elements. The information gathered from the interactive elements is then combined to form test cases. The test cases may be linked logically considering the flow of the application. For example, consider a login page, where a user will be required to enter a username followed by a password. The system 102 would therefore form a logical aggregation of test steps to enter the login id and password, such as add a login id and then add a password, which follows the flow of the application.
[035] In an embodiment, one or more test cases may be used for generating a new test scenario. A test scenario is a combination of one or more test cases that are to be executed together; the one or more test cases are provided as the parameters for generating new test scenarios. For example, consider a login page for any target application. The login page provides a personalized experience to the end-user. On the login page, there may be three options provided for receiving inputs from the user: Username (first option), Password (second option), and Submit (third option). For each of the three options, the end-user is required to take an action (input information) while executing that option. The three options are aggregated to form one or multiple test cases by various permutations and combinations. For example, the end-user entered a username but did not enter a password, or entered the password but not the username. There may be various such possible combinations that may form the test cases. Such one or more test cases are combined together for generation of a test scenario.
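The permutations-and-combinations idea for the login example can be sketched with itertools: each input field is either filled or left empty, and every combination yields one candidate test case. The function and field names are illustrative only.

```python
from itertools import product

FIELDS = ["username", "password"]

def login_test_cases():
    """Enumerate candidate test cases for a login form by taking every
    filled/empty combination of the input fields, followed by Submit.
    (Illustrative sketch of the combination idea.)"""
    cases = []
    for filled in product([True, False], repeat=len(FIELDS)):
        steps = [f"enter {field}" if fill else f"leave {field} empty"
                 for field, fill in zip(FIELDS, filled)]
        steps.append("click Submit")
        cases.append(steps)
    return cases

cases = login_test_cases()  # two fields give four combinations
```

The four resulting cases cover both fields filled, only the username, only the password, and both empty; such cases would then be combined into a test scenario.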
[036] In an embodiment, the processor 202 may be configured to generate an execution plan for executing the one or more test scenarios. The execution plan may be a combination of the test scenarios that need to be executed at a time. For example, once the test cases are generated automatically by the system 102, a quality assurance engineer or tester may need to assemble the generated test cases according to an execution plan they want to execute. The execution plan may require only a few test scenarios to be tested as a part of a workflow. The test scenarios that are part of the workflow form a part of the execution plan. The execution plan has a plan name, the data set required for executing the test cases, the time interval for how frequently the execution plan is to be run, and the total time required for running the execution plan. Further, the processor 202 may be configured to generate a report towards a success status of the execution of the one or more test scenarios. The report may comprise a summary of the report execution, test plan, test scenario, data set, and execution status, that is, pass or fail. The report may also comprise details of each row executed from the data set; for each data set row, details of each test case executed; for each test case executed, details of the test steps executed; and for each test step executed, the PASS/FAIL status and details of any errors encountered in executing the test step. The report may comprise the test results of the executed test scenarios and show whether each test scenario has passed or failed according to the execution plan.
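The execution plan attributes and the PASS/FAIL report described above may be sketched as follows; the field and function names are illustrative assumptions, not from the specification.

```python
from dataclasses import dataclass, field

@dataclass
class ExecutionPlan:
    """Captures the execution plan attributes described above: a plan name,
    the data set, a run interval, the total running time, and the scenarios
    that form part of the workflow. (Illustrative field names.)"""
    plan_name: str
    data_set: list
    run_interval_hours: int
    total_time_minutes: int
    scenarios: list = field(default_factory=list)

def run_and_report(plan, execute):
    """Execute each scenario of the plan and build a PASS/FAIL report."""
    rows = [{"scenario": s, "status": "PASS" if execute(s) else "FAIL"}
            for s in plan.scenarios]
    overall = "PASS" if all(r["status"] == "PASS" for r in rows) else "FAIL"
    return {"plan": plan.plan_name, "results": rows,
            "execution_status": overall}

plan = ExecutionPlan("nightly-login", data_set=[], run_interval_hours=24,
                     total_time_minutes=15,
                     scenarios=["valid login", "empty password"])
report = run_and_report(plan, execute=lambda s: s == "valid login")
```

Here the "empty password" scenario fails, so the per-scenario rows show one PASS and one FAIL, and the overall execution status reported for the plan is FAIL.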
[037] Referring now to Figure 3A, a method 300 for optimizing generation of test cases is described, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[038] The order in which the method 300 for optimizing generation of test cases is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[039] At block 302, the one or more parameters are received from the user through the I/O interface 204 by the processor 202 through the receiving module 212. The parameters that are received comprise at least one of the Uniform Resource Locator (URL) of the target application, words to be ignored, explore to level, and restrict to sub-domain. The parameters are to be used for establishing an interaction with the target application. The interaction is established through one or more browsers. The target application comprises the web-based application, and the one or more browsers comprise at least one of a Selenium web browser and a standard web browser.
[040] At block 304, one or more interactive elements associated with the target application are identified by the processor 202 through the identifying module 214. The one or more interactive elements comprise at least one of web links, web forms, web form elements, and custom interactive elements of the target application.
[041] At block 306, upon identification of the one or more interactive elements, one or more test cases are generated by the processor 202 through the test case generation module 216 by using the one or more interactive elements. The one or more test cases comprise one or more test steps created in the predefined informative format. The predefined informative format comprises the tree type structure arranging the interactive elements in the hierarchy. The predefined informative format groups the one or more test steps in the logical set according to a link identified between the one or more test steps. The one or more test cases may comprise the user customized test case. The user customized test case is generated by the user by combining one or more test steps and by using one or more assertions.
[042] In an embodiment, the method 300 for optimizing generation of test cases may additionally generate the new test scenario using one or more test cases. The one or more test cases are provided as the parameters for generating new test scenarios.
[043] Referring to Figure 3B, in another embodiment of the method 300 for optimizing generation of test cases, at block 308 the processor 202 may additionally generate the execution plan for executing the one or more test scenarios. The tester may need to assemble the generated test cases according to an execution plan they want to execute. The execution plan may require only a few test scenarios to be tested as a part of its workflow. The test scenarios that are part of the workflow form a part of the execution plan. The execution plan has a plan name, the data set required for executing the test cases, the time interval for how frequently the execution plan is to be run, and the total time required for running the execution plan.
[044] At block 310, the report towards the success status of the execution of the one or more test scenarios may be generated. The report may comprise the test results of the executed test scenarios and show whether each scenario has passed or failed according to the execution plan.
[045] Referring now to Figure 4, an example embodiment showing the workflow of the method 300 for optimizing generation of test cases is described.
[046] At step 312, the user logs into the system 102 to initiate the generation of test cases.
[047] At step 314, a new project for generation of test cases is created. The user inputs the one or more parameters for establishing an interaction with the target application, which may comprise at least one of the Uniform Resource Locator (URL) of the target application, words to be ignored, explore to level, and restrict to sub-domain.
[048] At step 316, upon entering the parameters, the identification of the interactive elements is initiated by the user. The identification of the interactive elements is an auto discovery done by bots. Bots are automated programs that execute a defined plan of code scripts. A bot may have various entry variables and exit conditions, and bots may invoke other bots depending on scenarios. The bots are programmed to identify elements according to their pre-defined conditions and to ignore other elements. Bots are activated upon the entering of the parameters. Various bots, such as a link bot, a form bot, a list bot and the like, are activated to identify the interactive elements in the target application. One or more interactive elements associated with the target application, comprising at least one of web links, web forms, web form elements, and custom interactive elements, are thereby identified.
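The bot-driven auto discovery above can be illustrated with a minimal sketch. The disclosure uses browser-driven bots (e.g. over Selenium); this sketch substitutes Python's stdlib `html.parser` for the browser so it stays self-contained, and the `DiscoveryBot` class and its pre-defined tag conditions are assumptions.

```python
from html.parser import HTMLParser

# Minimal "link bot"/"form bot" sketch using the stdlib HTML parser
# instead of a real browser. The tag-to-kind table is an assumed,
# simplified stand-in for the bots' pre-defined conditions.

class DiscoveryBot(HTMLParser):
    """Collects interactive elements (web links, web forms, web form
    elements) and ignores all other elements."""
    INTERACTIVE = {"a": "web link", "form": "web form",
                   "input": "form element", "select": "form element"}

    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        kind = self.INTERACTIVE.get(tag)
        if kind:                     # pre-defined condition: ignore other tags
            self.found.append((kind, tag, dict(attrs)))

bot = DiscoveryBot()
bot.feed('<a href="/home">Home</a><form><input name="email"></form><p>skip</p>')
```

Feeding the fragment above discovers the link, the form, and the input element while ignoring the paragraph.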
[049] At step 318, the status of the identification is checked: whether the identification step is completed, whether the new project is aborted, or whether the identification of the interactive elements has failed.
[050] At step 320, if the user needs to abort the identification of the interactive elements under any circumstances, the user may do so.
[051] At step 322, upon aborting the identification of the interactive elements, an abort message is raised in the system 102. In this case the interactive elements, that is, at least one of web links, web forms, web form elements, and custom interactive elements of the target application, are only partially identified.
[052] At step 324, the identification of the interactive elements may fail due to an error or technical problem.
[053] At step 326, a failure message is raised in the system.
[054] At step 328, the system completes the identification of the interactive elements.
[055] At step 330, one or more test cases are generated by using the one or more interactive elements. The one or more test cases comprise one or more steps created in the hierarchical tree format according to the interactive elements identified at step 316.
[056] At step 332, the user can create his or her own test assertions by combining one or more test cases, and the test assertions are then linked to the test cases.
[057] At step 334, the user may create new customized test cases by adding new test steps or by combining one or more test steps and one or more test assertions.
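A customized test case built from test steps and test assertions, as described above, could look like the following sketch. The callable-based representation of steps and assertions is an assumption made for illustration.

```python
# Hypothetical sketch: steps drive the target application's state and
# assertions check it; representing both as callables is an assumed
# simplification, not the disclosed mechanism.

def make_custom_case(steps, assertions):
    """Combine test steps with test assertions into one runnable case."""
    def run(app_state):
        for step in steps:
            step(app_state)                      # drive the target application
        return all(check(app_state) for check in assertions)
    return run

# usage: one step logs in, one assertion verifies the session flag
case = make_custom_case(
    steps=[lambda s: s.update(logged_in=True)],
    assertions=[lambda s: s.get("logged_in") is True],
)
result = case({})
```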
[058] At step 336, the processor 202 generates the new test scenario by combining one or more test cases that may be run together for testing the target application. The one or more test cases are provided as the parameter for generating new test scenarios.
[059] At step 338, the user, Quality Assurance or the tester may generate an execution plan for executing the one or more test scenarios. That is, the tester may need to assemble the generated test cases according to the execution plan they want to execute. The execution plan may require only a few scenarios to be tested as part of its workflow. The execution plan has a plan name, the data set required for executing the test cases, how frequently the execution plan is to be run, and the time required for running the execution plan.
[060] At step 340, once the execution plan is generated, the user/Quality assurance/tester may run the execution plan.
[061] At step 342, a report is generated on the success status of the execution of the one or more test scenarios. The report may comprise the test results of the test scenarios executed and show whether each scenario has passed or failed according to the execution plan.
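The per-scenario pass/fail report described above can be sketched as a small function. The report structure shown is an assumption for illustration only.

```python
# Sketch of the success-status report described above; the dictionary
# layout is an illustrative assumption.

def build_report(results):
    """results maps scenario name -> bool. Returns per-scenario status
    plus an overall success flag for the execution plan."""
    return {
        "scenarios": {name: ("passed" if ok else "failed")
                      for name, ok in results.items()},
        "success": all(results.values()),
    }

report = build_report({"login": True, "checkout": False})
```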
[062] In an embodiment, at step 344, upon identification of the interactive elements, the system may identify data sets and system defined data variables. That is, for each interactive element identified, a new data variable is created by the system. The user may also create user defined data variables. These data variables are then utilised in a data set to assign values. During the identification of elements in the auto discovery, or during test plan execution, when the system is required to enter values in any of the interactive elements, it refers to the data variables and the defined values.
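The per-element data variables and their binding to a data set, as described above, might be sketched as follows. The `var_` naming convention and both function names are illustrative assumptions.

```python
# Sketch: a system defined data variable is created for every identified
# interactive element; the user may add user defined variables; a data
# set then assigns values. Names here are illustrative assumptions.

def create_data_variables(elements, user_defined=None):
    variables = {f"var_{name}": None for name in elements}   # system defined
    variables.update(user_defined or {})                     # user defined
    return variables

def bind_dataset(variables, dataset):
    """Assign data-set values to matching variables, for use during
    auto discovery or test plan execution."""
    return {name: dataset.get(name, default)
            for name, default in variables.items()}

variables = create_data_variables(["first_name", "phone"],
                                  user_defined={"var_custom": "x"})
bound = bind_dataset(variables, {"var_first_name": "Ada"})
```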
[063] At step 346, if the user is a technical person, the user may generate user defined data variables for each interactive form that is identified. The user defined data variable generation is an optional step.
[064] At step 348, alternatively, the user can provide data hints by providing name values and keywords for the identified interactive elements. When the Auto Discovery for identification of the interactive elements is run and it encounters an Input Element, the system tries to enter a value in the Input Element. The system picks up the value from: 1) the data set, if a data set is selected as default for auto discovery and the input element matches a variable in the data set; or 2) the data hint, if there is no default data set defined or if the default data set does not have a variable that matches the input element. Data hints are instructions to the system on what values to use for input elements based on certain attributes of the input element. For example, data hints help the system make a good guess on the value to enter for, say, an input element that requires a first name, and distinguish it from another element that requires a phone number. The user may also define custom data hints before running the auto discovery process. The system has some built-in data hints which it uses if the user has not defined any custom data hint.

At step 350, the user may modify the data set of the identified interactive elements by dynamic editing. Dynamic editing refers to editing the value of a variable in a data set by simply clicking on the old value and overwriting it with the new value, without having to first download the data set, edit the value, and upload the edited data set. The user may also upload the identified interactive elements in a CSV format to add to the data set of the interactive elements.
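The value-resolution order described above (default data set first, then data hints, falling back to built-in hints) can be sketched as a single lookup function. The keyword-substring matching rule and the sample hint values are simplifying assumptions, not the disclosed matching logic.

```python
# Sketch of the value-resolution order described above: 1) default
# data set, 2) custom data hints, else built-in data hints. The
# substring-matching rule and hint values are illustrative assumptions.

BUILT_IN_HINTS = {"first": "John", "phone": "5550100"}  # assumed samples

def resolve_value(element_name, default_dataset=None, custom_hints=None):
    dataset = default_dataset or {}
    if element_name in dataset:                  # 1) match in default data set
        return dataset[element_name]
    hints = custom_hints or BUILT_IN_HINTS       # 2) data hints (custom first)
    for keyword, value in hints.items():
        if keyword in element_name.lower():      # match on element attribute
            return value
    return ""                                    # no hint applies

v1 = resolve_value("first_name", default_dataset={"first_name": "Ada"})
v2 = resolve_value("phone_number")
```

With these assumptions, the data set supplies `first_name` directly, while `phone_number` falls back to the built-in phone hint.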
[065] At step 352, the data set is linked with the identification module 212, in case the identification of the interactive elements needs to be re-run. The system then performs the steps 338, 340, and 342: the user, Quality Assurance or the tester may generate an execution plan for executing the one or more test scenarios, run the execution plan once it is generated, and obtain the report on the success status of the execution. The report may comprise the test results of the test scenarios executed and show whether each scenario has passed or failed according to the execution plan.
[066] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include the following.
[067] Some embodiments may enable a system 102 and a method 300 to generate the test cases automatically without manual intervention.
[068] Some embodiments may enable a system 102 and a method 300 to generate the test cases in an optimised time period.
[069] Some embodiments may enable a system 102 and a method 300 to create a comprehensive list of test steps based on the functionalities in the web-based system.
[070] Some embodiments may enable a system 102 and a method 300 to provide greater test coverage.
[071] Some embodiments may enable a system 102 and a method 300 to eliminate errors that would otherwise be present in the manual test step generation process.
[072] Some embodiments may enable a system 102 and a method 300 to provide faster creation of test cases and assertions.
[073] Some embodiments may enable a system 102 and a method 300 to provide the widest possible test coverage of the target application, eliminating the need to manually create hundreds of test cases to cover all possible use cases.
[074] Although implementations for system and method for optimizing generation of test cases have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for optimizing generation of test cases.
| # | Name | Date |
|---|---|---|
| 1 | 202021056497-STATEMENT OF UNDERTAKING (FORM 3) [25-12-2020(online)].pdf | 2020-12-25 |
| 2 | 202021056497-REQUEST FOR EXAMINATION (FORM-18) [25-12-2020(online)].pdf | 2020-12-25 |
| 3 | 202021056497-FORM FOR STARTUP [25-12-2020(online)].pdf | 2020-12-25 |
| 4 | 202021056497-FORM FOR SMALL ENTITY(FORM-28) [25-12-2020(online)].pdf | 2020-12-25 |
| 5 | 202021056497-FORM 18 [25-12-2020(online)].pdf | 2020-12-25 |
| 6 | 202021056497-FORM 1 [25-12-2020(online)].pdf | 2020-12-25 |
| 7 | 202021056497-FIGURE OF ABSTRACT [25-12-2020(online)].jpg | 2020-12-25 |
| 8 | 202021056497-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [25-12-2020(online)].pdf | 2020-12-25 |
| 9 | 202021056497-EVIDENCE FOR REGISTRATION UNDER SSI [25-12-2020(online)].pdf | 2020-12-25 |
| 10 | 202021056497-DRAWINGS [25-12-2020(online)].pdf | 2020-12-25 |
| 11 | 202021056497-DECLARATION OF INVENTORSHIP (FORM 5) [25-12-2020(online)].pdf | 2020-12-25 |
| 12 | 202021056497-COMPLETE SPECIFICATION [25-12-2020(online)].pdf | 2020-12-25 |
| 13 | 202021056497-RELEVANT DOCUMENTS [18-02-2021(online)].pdf | 2021-02-18 |
| 14 | 202021056497-MARKED COPIES OF AMENDEMENTS [18-02-2021(online)].pdf | 2021-02-18 |
| 15 | 202021056497-FORM 13 [18-02-2021(online)].pdf | 2021-02-18 |
| 16 | 202021056497-AMENDED DOCUMENTS [18-02-2021(online)].pdf | 2021-02-18 |
| 17 | 202021056497-Proof of Right [16-05-2021(online)].pdf | 2021-05-16 |
| 18 | 202021056497-FORM-26 [16-05-2021(online)].pdf | 2021-05-16 |
| 19 | Abstract1.jpg | 2021-10-19 |
| 20 | 202021056497-FER.pdf | 2022-07-19 |
| 21 | 202021056497-FER_SER_REPLY [13-12-2022(online)].pdf | 2022-12-13 |
| 22 | 202021056497-COMPLETE SPECIFICATION [13-12-2022(online)].pdf | 2022-12-13 |
| 23 | 202021056497-CLAIMS [13-12-2022(online)].pdf | 2022-12-13 |
| 1 | SearchHistoryE_11-07-2022.pdf | |