Abstract: Disclosed is a system and method for automated testing of a software application. The method comprises creating one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps. The creating comprises defining one or more test objects for the test case, assigning a pre-defined test action for a test step using a keyword, assigning first test data to the test case, and assigning second test data to the test step, wherein the second test data is reused for the one or more test steps. The method further comprises executing the one or more test cases, wherein an execution of the one or more test cases depends on one or more predefined conditions.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
A SYSTEM AND METHOD FOR AUTOMATED TESTING OF A SOFTWARE
APPLICATION
Applicant
Tata Consultancy Services Limited, a Company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021.
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application claims priority to Indian Provisional Patent
Application No. 3596/MUM/2012, filed on Dec 24, 2012, the entirety of which is hereby incorporated by reference.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to software
testing, and more particularly to a system and method for automated testing of a software application.
BACKGROUND
[003] Automation in software testing plays an important role in reducing time and
cost required for testing a software application. Although running an automated test is faster than manual testing, the time saved is diverted to developing and maintaining automated test scripts. In addition, many talented testers prefer manual testing so that they can spend their time devising creative test cases rather than becoming proficient with complex automation languages.
[004] There exist many tools providing automated testing of software applications,
wherein automated test scripts are created through record-and-playback or through scripting by automation engineers. In this model, creation and maintenance of the automated test scripts require a higher degree of technical skill. Moreover, the existing tools provide a high level of automation only in the test execution phase, and the automation percentage reduces drastically across the entire testing life cycle, thereby increasing the cost.
SUMMARY
[005] This summary is provided to introduce aspects related to system(s) and
method(s) for automated testing of a software application and the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a method for automated testing of a software
application is disclosed. The method comprises creating one or more test cases for a test
scenario, wherein the test scenario depends on the software application under test, and
wherein a test case of the one or more test cases comprises one or more test steps, and wherein
the test case is reused as a test step while creating the one or more test cases. The creating
further comprises defining one or more test objects for the test case, wherein a test object of
the one or more test objects corresponds to the test step and assigning a pre-defined test action
for the test step using a keyword wherein the keyword is reused for the one or more test
objects. The creating further comprises assigning first test data to the test case, wherein the
first test data is reused for a pre-defined number of iterations and for the one or more test
cases and assigning second test data to the test step, wherein the second test data is reused for
the one or more test steps. The method further comprises executing the one or more test cases,
wherein an execution of the one or more test cases depends on one or more predefined
conditions, wherein the creating, the defining one or more test objects, the assigning a pre-defined test action, the assigning first test data, the assigning second test data, and the
executing are performed by a processor using programmed instructions stored in a memory.
[007] In one implementation, a system for automated testing of a software
application is disclosed. The system comprises a processor; and a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprises a creating module configured to create one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps, and wherein the test case is reused as a test step while creating the one or more test cases. The creating module further comprises a defining module configured to define one or more test objects for the test case, wherein a test object of the one or more test
objects corresponds to the test step and an assigning module configured to assign a predefined test action for the test step using a keyword, wherein the keyword is reused for the one or more test objects, assign first test data to the test case, wherein the first test data is reused for a pre-defined number of iterations, and wherein the first test data is reused for the one or more test cases and assign second test data to the test step, wherein the second test data is reused for the one or more test steps. The system further comprises an execution module configured to execute the one or more test cases, wherein an execution of the one or more test cases depends on one or more pre-defined conditions.
[008] In one implementation, a computer program product having embodied thereon
a computer program for automated testing of a software application is disclosed, the computer program product comprises a program code for creating one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps, and wherein the test case is reused as a test step while creating the one or more test cases. The program code for creating comprises a program code for defining one or more test objects for the test case, wherein a test object of the one or more test objects corresponds to the test step, and a program code for assigning a pre-defined test action for the test step using a keyword, wherein the keyword is reused for the one or more test objects. The program code for creating further comprises a program code for assigning first test data to the test case, wherein the first test data is reused for a pre-defined number of iterations and for the one or more test cases and a program code for assigning second test data to the test step, wherein the second test data is reused for the one or more test steps. The computer program product further comprises a program code for executing the one or more test cases, wherein an execution of the one or more test cases depends on one or more predefined conditions.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figure 1 illustrates a network implementation of a system for automated testing
of a software application, in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system, in accordance with an embodiment of the
present subject matter.
[0012] Figure 3 illustrates a test scenario in accordance with an exemplary
embodiment of the present subject matter.
[0013] Figure 4 illustrates a test case comprising one or more test steps in accordance
with an exemplary embodiment of the present subject matter.
[0014] Figure 5 illustrates a method for automated testing of a software application, in
accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0015] Systems and methods for automated testing of a software application are
described. At first, one or more test cases may be created for a test scenario, wherein the test scenario depends on the software application under test. The one or more test cases may comprise one or more test steps, wherein the test case of the one or more test cases may be reused as a test step. A creation of the test case comprises defining test objects for the test case and assigning a pre-defined test action for a test step using a keyword.
[0016] Further, test data may be assigned to the test case, wherein the test data may be
stored in an external database or may be assigned locally for the test case. As the test data is stored in the external database, the test data may be reused for the one or more test cases. Also, the test data may be assigned for the test step, wherein the test data may be reused for the one or more test steps. Further, the one or more test cases may be executed using the test data, wherein an execution of the one or more test cases depends on one or more predefined conditions. The pre-defined conditions comprise at least one of a failure of the execution of the test step, or a success of the execution of the test step.
[0017] While aspects of the described system and method for automated testing of a
software application may be implemented in any number of different computing systems,
environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0018] Referring now to Figure 1, a network implementation 100 of a system 102 for automated
testing of a software application is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 provides for creating one or more test cases for a test scenario of the software application under test. The test case of the one or more test cases may comprise one or more test steps. The test case may be created by defining one or more test objects for the test case and by assigning test data to the test case and to a test step of the test case. The system 102 further executes the one or more test cases to generate one or more reports of the execution.
[0019] Although the present subject matter is explained considering that the system
102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2... 104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0020] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include
a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0021] Referring now to Figure 2, the system 102 is illustrated in accordance with an
embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0022] The I/O interface 204 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the user devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0023] The memory 206 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0024] The modules 208 include routines, programs, objects, components, data
structures, etc., which perform particular tasks, functions or implement particular abstract data types. In one implementation, the modules 208 may include a creating module 212, a defining module 214, an assigning module 216, an execution module 218, a generation module 220, a report generation module 222, a repeating module 224 and other modules 226. The other
modules 226 may include programs or coded instructions that supplement applications and functions of the system 102.
[0025] The data 210, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 210 may also include a system database 228, and other data 230. The other data 230 may include data generated as a result of the execution of one or more modules in the other modules 226.
[0026] In one implementation, at first, a user may use a user device 104 to access
the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. The working of the system 102 is explained in detail in conjunction with Figures 3, 4 and 5 below. The system 102 may be used for automated testing of a software application. Specifically, in the present implementation, the creating module 212 may be configured to create one or more test cases for a test scenario, wherein the test scenario depends on the software application under test. Further, a test case of the one or more test cases comprises one or more test steps, wherein the test case may be reused as a test step.
[0027] The creating module 212 further comprises the defining module 214
configured to define one or more test objects for the test case, wherein a test object of the one or more test objects corresponds to the test step. In an exemplary embodiment of the system 102, the one or more test objects may be defined using an object repository file. The object repository file may be saved in an object repository supported by the C# platform. The object repository file comprises classes and methods, wherein the methods define one or more properties of the test object. The system 102 enables customization of the one or more properties and of values of the one or more properties. The customization may be performed by a user. By way of a specific example, the one or more properties for the test object may comprise 'Display Text', 'Type', and 'Class'. Also, regular expressions may be used for the one or more properties to handle dynamic values of the one or more properties. The system 102 also enables deletion of unwanted properties to create an optimized object repository.
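By way of an illustrative, non-limiting sketch only, such an object repository class may resemble the following C# fragment, wherein the class name 'LoginPageRepository' and the individual test objects are hypothetical, while the properties 'Display Text', 'Type' and 'Class' and the use of a regular expression follow the examples above:

```csharp
using System.Collections.Generic;

// A minimal, hypothetical object repository class: each method returns
// the property map that identifies one test object of the application.
public class LoginPageRepository
{
    // Properties for a 'User Name' edit box; the property names
    // 'Display Text', 'Type' and 'Class' follow the examples above.
    public Dictionary<string, string> UserNameEdit() => new Dictionary<string, string>
    {
        ["Display Text"] = "User Name",
        ["Type"] = "Edit",
        ["Class"] = "WebEdit"
    };

    // A regular expression value handles a dynamic property, e.g. a
    // banner whose display text changes for every logged-in user.
    public Dictionary<string, string> WelcomeBanner() => new Dictionary<string, string>
    {
        ["Display Text"] = "Welcome: .*",  // regex for a dynamic value
        ["Type"] = "Text",
        ["Class"] = "WebElement"
    };
}
```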
[0028] Further, the assigning module 216 may be configured to assign a pre-defined
test action for the test step using a keyword, wherein the keyword may be reused for the one
or more test objects. The keyword may be selected from a pre-defined set of keywords. The pre-defined set of keywords may be stored in one or more Excel files, wherein the one or more Excel files may be stored in the system database 228. Further, the system 102 facilitates an implementation of new keywords or new test actions. The system 102 also facilitates a customization of existing keywords and existing test actions. The implementation and the customization may be performed using a plurality of functions. The plurality of functions may be created or customized by the user in C# or Visual Basic .NET (VB.Net). The plurality of functions may be methods defined in a class. Further, the plurality of functions may be defined in a user defined library file. A format of the user defined library file may be based on a programming language chosen by the user. By way of a specific example, *.cs may be the format for the C# programming language and *.vb may be the format for the VB.Net programming language.
[0029] By way of a specific example, the implementation of a new test action requires adding a corresponding function to the user defined library file, wherein a name of the function follows the format «ud_functionname».
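By way of an illustrative, non-limiting sketch only, a user defined library file (a *.cs file for the C# option) may resemble the following fragment, wherein the keyword 'ud_VerifyTitle' and its signature are hypothetical:

```csharp
using System;

// Hypothetical user defined library file (*.cs for the C# option).
// Each method implements one custom keyword; the 'ud_' prefix
// follows the naming format described above.
public class UserDefinedLibrary
{
    // Illustrative custom test action: verify that an actual window
    // title matches the expected value from the 'Parameters' column.
    public static bool ud_VerifyTitle(string actualTitle, string expectedTitle)
    {
        bool match = string.Equals(actualTitle, expectedTitle,
                                   StringComparison.OrdinalIgnoreCase);
        Console.WriteLine(match
            ? $"PASS: title '{actualTitle}' matches"
            : $"FAIL: expected '{expectedTitle}', got '{actualTitle}'");
        return match;
    }
}
```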
[0030] The assigning module 216 further assigns first test data to the test case,
wherein the first test data may be reused for a pre-defined number of iterations. The first test data may be further reused for the one or more test cases. The assigning module 216 may be further configured to assign second test data to the test step, wherein the second test data may be reused for the one or more test steps. The first test data and the second test data may be assigned from test data stored in an external database. The external database may comprise an Excel workbook. In another embodiment of the system 102, the first test data and the second test data may be assigned locally for the test case and the one or more test steps.
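By way of an illustrative sketch only, the two data levels may be modelled as follows, assuming (as an interpretation, not stated verbatim above) that the bracketed numbers in cells such as 'V_Name[4,5]' denote a row range in the data sheet:

```csharp
// Hypothetical model of the two data levels: first test data is bound
// to the whole test case and may be reused across test cases and
// iterations; second test data is bound to an individual test step.
public record FirstTestData(string Source, int Iterations);            // e.g. ("External", 2)
public record SecondTestData(string Variable, int FromRow, int ToRow); // e.g. ("V_Name", 4, 5)
```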
[0031] In an embodiment of the system 102, a test case workbook may be used to
create the one or more test cases. In an exemplary embodiment of the system 102, the test case workbook may be an Excel file. The test case workbook may further comprise one or more worksheets. The one or more worksheets may be provided in a test case development workbook. The one or more worksheets may comprise a test driver sheet, a test case sheet and a reusable test case sheet.
| SNO | Test_Case_Sheet_Name | Test_Case_Name |
|---|---|---|
| 1 | sheet_nm | create_tc |
| 2 | sheet_nm | update_tc |
| 3 | sheet_nm | save_tc |

Test driver sheet
[0032] The test driver sheet shown above may define a 'Test_Case_Sheet_Name' (a test scenario name) and a 'Test_Case_Name' for the test scenario. The test driver sheet may comprise three columns, wherein a first column may be a serial number (SNO), and wherein the SNO is unique for a test driver sheet row. The system 102 may use the SNO to refer to the test driver sheet. The Test_Case_Sheet_Name is a text field representing the test driver sheet or a test scenario name for the test case. A combination of the Test_Case_Sheet_Name field and the test case is unique and may be used by the system 102 to refer to the one or more test cases. The Test_Case_Name is a text field representing a name of the test case. A combination of the Test_Case_Name field and the test scenario is unique and may be used by the system 102 to refer to the one or more test cases.
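By way of an illustrative sketch only, one row of the test driver sheet may be modelled as the following hypothetical C# record, with the composite key reflecting the uniqueness described above:

```csharp
// Hypothetical record for one test driver sheet row. The combination
// of Test_Case_Sheet_Name and Test_Case_Name forms the unique key
// used to refer to a test case; the '/' separator is arbitrary.
public record TestDriverRow(int Sno, string TestCaseSheetName, string TestCaseName)
{
    public string Key => $"{TestCaseSheetName}/{TestCaseName}";
}
```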
[0033] A sheet shown below represents a test scenario sheet.
| Step No | Test Case | Test Object | Test Action | LOC | Parameters | Execute | Test Data |
|---|---|---|---|---|---|---|---|
| 0 | Login | | | | | | External[2] |
| 1 | | Welcome: Mercury Tours | OPEN | | | Y | V_URL |
| 2 | | userNAME | SETINPUT | | | Y | V_Name[4,5] |
| 3 | | Desc_Edit_Password | SETINPUT | | | Y | V_Password[4,5] |
| 0 | Execute | | | | | | |
| 1 | | Welcome: Mercury Tours | OPEN | | | Y | http://newtour.demoaut.com |
| 2 | | userNAME | SETINPUT | | | Y | Nishanth |
| 3 | | Desc_Edit_Password | SETINPUT | | | Y | AUTO_V_Password |
| 4 | | Submit | Click | | | Y | |

Test Scenario Sheet
[0034] In another embodiment, the test scenario sheet may be used to create the one or
more test cases. For the test scenario there may be one or more test cases. The test scenario sheet may comprise one or more columns comprising a 'Step No', a 'Test Case', an 'Object', an 'Action', a 'LOC', one or more 'Parameters', 'Execute' and 'Data'. The 'Step No' is a unique number used to identify the test step in the test case. When the 'Step No' is "0", a corresponding row in the test scenario sheet defines a name for the test case. Also, for the corresponding row, the column 'Execute' does not define a test action to be executed.
[0035] The column 'Test Case' defines a name for the test case. The name is defined
only for a row having the 'Step No' '0'. The rest of the rows in the test scenario sheet for the test case do not define a value for the column 'Test Case'. Further, the column 'Object' comprises a drop-down box listing a plurality of test objects covering a plurality of testing tools. By way of a specific example, if a plurality of test object repositories for software testing frameworks comprising QuickTest Professional (QTP) and Selenium is specified, a combined list of names of the plurality of test objects may be listed in the column 'Object'.
[0036] Further, the column 'Test Action' comprises a drop-down box listing a
plurality of test actions provided in a default action map or a custom action map. The plurality of test actions may be the pre-defined set of keywords used to replicate a user specific action. The plurality of test actions enables the user to test the software applications without any scripting. The column 'LOC' specifies a number of rows to be skipped or moved when the one or more test steps comprise conditional statements. The conditional statements may comprise IF/ELSEIF/ELSE/ENDIF/WHILE/WHEN. Further, the column 'Parameters' specifies one or more parameters for the one or more test steps. When there is more than one parameter, parameter values may be specified separated by commas, thereby avoiding addition of new columns for the one or more parameters. By way of a specific example, the one or more parameters may comprise 'Quantity' and 'Size', and the parameter values may comprise '1' for the parameter 'Quantity' and 'large' for the parameter 'Size'. The column 'Execute' specifies only two values, "Y" or "N". "Y" represents an execution of a particular test step and "N" represents a non-execution of the particular test step. The column 'Data' may be used to enter the test data.
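By way of an illustrative sketch only, one row of the test scenario sheet may be modelled as the following hypothetical C# record, reflecting the column semantics described above (the 'Step No' 0 convention, the "Y"/"N" 'Execute' flag, and comma-separated parameter values):

```csharp
using System;
using System.Linq;

// Hypothetical representation of one test scenario sheet row; the
// column names follow the sheet layout described above.
public record TestStepRow(
    int StepNo, string TestCase, string TestObject, string TestAction,
    int Loc, string Parameters, string Execute, string TestData)
{
    // 'Step No' 0 marks the row that names the test case.
    public bool IsTestCaseHeader => StepNo == 0;

    // 'Execute' holds only "Y" or "N".
    public bool ShouldExecute => Execute == "Y";

    // Multiple parameter values are comma separated in one cell,
    // avoiding extra columns (e.g. "1,large" for Quantity and Size).
    public string[] ParameterValues =>
        string.IsNullOrEmpty(Parameters)
            ? Array.Empty<string>()
            : Parameters.Split(',').Select(p => p.Trim()).ToArray();
}
```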
[0037] The one or more test cases may be reused across a plurality of test case sheets.
By way of a specific example, to refer to a particular test case from another test case sheet, the test action may be defined in a format «fn_TestCaseSheetName!TestCaseName», wherein 'fn' denotes a particular test case which may be reused and 'TestCaseName' denotes the name of the particular test case. When the particular test case to be reused is in the same test case sheet, the test action may be defined as «fn_TestCaseName». The name of the test case sheet may be defined as «Reusable_Testcases» to store the one or more test cases which may be reused.
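By way of an illustrative sketch only, the reusable test case convention may be parsed as in the following hypothetical C# fragment:

```csharp
// Hypothetical parsing of a reusable test case reference: a test
// action of the form "fn_TestCaseName" invokes another test case
// as a single step, per the convention described above.
public static class ReusableTestCase
{
    public const string Prefix = "fn_";

    public static bool TryParse(string testAction, out string testCaseName)
    {
        if (testAction != null && testAction.StartsWith(Prefix))
        {
            testCaseName = testAction.Substring(Prefix.Length);
            return true;
        }
        testCaseName = null;
        return false;
    }
}
```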
[0038] Further, the test data to be assigned for the execution of the test case may be
stored in the Excel workbook and may be reused across the one or more test cases. The Excel workbook may comprise two worksheets, a data sheet and a driver sheet. One or more data sheets may be used to define the test data irrespective of the test case. The test data may be organized based on functionality and is common for a whole test case execution. The driver sheet may be used to map the test data and the test case. A normalization technique may also be used to maintain the test data in the Excel workbook. In one cycle of the execution, the plurality of test case sheets may be executed as the plurality of test case sheets share the Excel workbook.
[0039] Further, the one or more data sheets may be used for specifying the test data,
wherein the test data may be grouped into different worksheets based on the functionality. The name of a test data variable may be used as a column name and values for the test data variable may be specified under the column in consecutive rows. As a result, the test data variable may have multiple sets of values. In an embodiment of the system 102, the test data variable may be named starting with 'V_', thereby enabling the system 102 to understand that the test data is a variable and the value is specified in the data sheet using the keyword 'V_'.
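By way of an illustrative sketch only, the 'V_' convention may be resolved against a data sheet as in the following hypothetical C# fragment:

```csharp
using System.Collections.Generic;

// Hypothetical data sheet lookup: a cell value starting with "V_"
// names a variable whose values are listed under a column of the
// same name, one value per row (so one variable can carry several
// data sets).
public class DataSheet
{
    private readonly Dictionary<string, List<string>> _columns;

    public DataSheet(Dictionary<string, List<string>> columns) =>
        _columns = columns;

    // Literal cell values pass through unchanged; "V_" values are
    // resolved against the data sheet for the requested data set (row).
    public string Resolve(string cell, int rowIndex) =>
        cell.StartsWith("V_") && _columns.ContainsKey(cell)
            ? _columns[cell][rowIndex]
            : cell;
}
```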
[0040] In an embodiment of the system, the one or more test cases may be executed
by the execution module 218 in multiple iterations and multiple test data sets may be used. Specifying the test data at a test step level enables the normalization technique used for maintaining the test data.
| Step No | Test Case | Test Object | Test Action | LOC | Parameters | Execute | Test Data |
|---|---|---|---|---|---|---|---|
| 0 | Login | | | | | | External[2] |
| 1 | | Welcome: Mercury Tours | OPEN | | | Y | V_URL |
| 2 | | userNAME | SETINPUT | | | Y | V_Name[4,5] |
| 3 | | Desc_Edit_Password | SETINPUT | | | Y | V_Password[4,5] |
[0041] The above sheet illustrates a particular test case 'Login' comprising one or
more test steps. External[2] may be specified in a 'Data' column of the 0th row of the particular test case in the test case sheet. As a result, the test case 'Login' may be executed for '2' iterations and the test data is assigned to the test case from the external database, wherein the external database may be at least a data sheet of the one or more data sheets. In another embodiment of the system 102, the test data may be assigned locally for the test case and the one or more test steps of the test case.
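By way of an illustrative sketch only, the 'External[2]' specifier may be parsed as in the following hypothetical C# fragment, which treats any other cell value as locally assigned data executed for a single iteration:

```csharp
using System.Text.RegularExpressions;

// Hypothetical parser for the 'Data' cell of a test case's 0th row:
// "External[2]" means run the test case for 2 iterations with data
// drawn from the external data sheet.
public static class DataSpecifier
{
    private static readonly Regex External =
        new Regex(@"^External\[(\d+)\]$", RegexOptions.IgnoreCase);

    // Returns the iteration count, or 1 when the cell does not use
    // the External[...] form (i.e. locally assigned data).
    public static int Iterations(string dataCell)
    {
        var m = External.Match(dataCell ?? "");
        return m.Success ? int.Parse(m.Groups[1].Value) : 1;
    }
}
```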
[0042] The system 102 further comprises a generation module 220 configured to
generate a plurality of automation scripts for the one or more test cases. The one or more test cases may be selected by the user for generating the plurality of automation scripts from the test driver sheet. Further, the user may modify the one or more test cases using the creating module 212. In a next step, the user may select a programming language for generating the automation scripts.
[0043] Further, the execution module 218 may be configured to execute the one or
more test cases using the first test data and the second test data, wherein an execution of the one or more test cases depends on one or more pre-defined conditions. The one or more pre-defined conditions comprise at least one of a failure of the execution of the test step, or a success of the execution of the test step. The execution module 218 controls a flow of the execution even when there is a failure of the test step, and may execute the remainder of the test case even when the execution of a test step fails. The execution module 218 further comprises a repeating module 224 configured to repeat the execution of the one or more test cases for a specified number of times in a loop.
[0044] In another embodiment of the system 102, the execution module 218 further
enables exiting a particular test case on failure of a particular test step. By way of a specific example, when the particular test step fails and the particular test case cannot be executed successfully, the execution of the particular test case may be stopped. The execution module 218 also enables exiting a particular test run on failure of a particular test step.
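By way of an illustrative sketch only, the flow control described above may resemble the following hypothetical C# fragment, wherein each test step is represented as a function returning a pass/fail result:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical execution loop: a failed step is recorded but, by
// default, does not stop the test case; an optional flag exits the
// case on first failure; the whole case can be repeated a specified
// number of times in a loop.
public class ExecutionModule
{
    public void Run(IReadOnlyList<Func<bool>> steps,
                    int repeatCount = 1,
                    bool exitCaseOnFailure = false)
    {
        for (int iteration = 1; iteration <= repeatCount; iteration++)
        {
            foreach (var step in steps)
            {
                bool passed = step();
                if (!passed)
                {
                    Console.WriteLine($"Iteration {iteration}: step failed");
                    if (exitCaseOnFailure) break; // abandon remaining steps
                    // otherwise continue with the remaining steps
                }
            }
        }
    }
}
```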
[0045] In an embodiment, the system 102 may be configured to
provide an option to use run time values of one or more variables. An output value of the test step may be used as input data for another test step. The output value may be captured and stored in a variable, wherein the stored value may be used in another test step. The output value may be used for another test step only within a single test run and may not be used in another test run.
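By way of an illustrative sketch only, the run time variable option may resemble the following hypothetical C# store, cleared at the start of each test run so that values do not carry over between runs:

```csharp
using System.Collections.Generic;

// Hypothetical run time variable store: the output of one test step
// is saved under a name and read back as input by a later step. The
// store is scoped to a single test run.
public class RunTimeVariables
{
    private readonly Dictionary<string, string> _values = new Dictionary<string, string>();

    public void Set(string name, string value) => _values[name] = value;

    public string Get(string name) =>
        _values.TryGetValue(name, out var v) ? v : null;

    // Called at the start of a new test run.
    public void Clear() => _values.Clear();
}
```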
[0046] The system 102 further comprises a report generation module 222 configured
to generate one or more reports of the execution at one or more levels. The one or more levels comprise a test step level, a test case level and a test scenario level. The one or more reports comprise a plurality of Excel test reports generated by the report generation module 222. Results generated by the execution of the one or more test cases may be updated in an Excel test report by the time the execution is complete and control returns to the system 102.
[0047] In an exemplary embodiment of the system 102, a result for the test step is
updated in the test scenario sheet. The test scenario sheet defines the test case against the test step. The results are updated by adding additional columns after the "Data" column in the test scenario sheet. The result comprises one or more fields. The one or more fields comprise 'Expected Result [Iteration no]', 'Actual Result [Iteration no]' and 'Status'. The field 'Expected Result [Iteration no]' gives an expected result in case of validation test steps. The field 'Actual Result [Iteration no]' gives an actual run time result for a particular validation test step. The field 'Status' is the result of the test step and may have the values 'PASS', 'FAIL' or 'SKIP'. The field 'Status' may have the value 'PASS' for the validation test steps that are executed successfully, 'FAIL' for validation test steps that are not executed successfully and 'SKIP' for the test steps that are not executed as the execution flag for the test step is "N".
[0048] The report at the test case level comprises a consolidated result of the test case,
including the iterations specified in the test driver sheet, in a separate column named "Result". The reports may comprise an HTML test report, a test execution summary, a test case summary report and a test log. The HTML test report may be generated by the report generation module 222 at the end of the execution of a test suite, wherein the test suite comprises the plurality of test cases. By way of a specific example, the HTML test report may be stored in a "test results" folder which may be subdivided based on a date of the execution and further subdivided based on names of one or more test suites. The HTML test report may comprise the test execution summary in a consolidated form as well as individual test execution summaries. The test execution summary in the consolidated form is generated when more than one test case sheet is executed, while an individual test execution summary is generated for each test case sheet. The test execution summary may comprise a plurality of fields, wherein the plurality of fields comprise 'test suite', 'total test case count', 'pass', 'failed', 'skipped' and 'trend analysis'. The field 'test suite' states the name of the test suite. The field 'pass' further comprises a 'test case count' stating a count of the test cases passed and a 'percentage' stating a percentage of the test cases passed. The field 'failed' further comprises a 'test case count' stating a count of the test cases failed and a 'percentage' stating a percentage of the test cases failed. The field 'skipped' further comprises a count of the test cases skipped and a 'percentage' stating a percentage of the test cases skipped. The 'trend analysis' field comprises a bar graph displaying the Pass, Failed and Skipped values.
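By way of an illustrative sketch only, the summary fields may be computed as in the following hypothetical C# record:

```csharp
using System;

// Hypothetical computation of the test execution summary fields:
// counts and percentages of passed, failed and skipped test cases.
public record ExecutionSummary(string TestSuite, int Passed, int Failed, int Skipped)
{
    public int Total => Passed + Failed + Skipped;

    private double Pct(int count) =>
        Total == 0 ? 0 : Math.Round(100.0 * count / Total, 2);

    public override string ToString() =>
        $"{TestSuite}: total={Total}, " +
        $"pass={Passed} ({Pct(Passed)}%), " +
        $"failed={Failed} ({Pct(Failed)}%), " +
        $"skipped={Skipped} ({Pct(Skipped)}%)";
}
```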
[0049] The test case summary report details the names of the plurality of test cases
executed in a particular test suite and the results of each individual test case. The test case summary report comprises a plurality of fields, wherein the plurality of fields comprise 'total no of test cases', 'passed', 'failed', and 'skipped'. The test log may be created for each test case sheet workbook and may be used for debugging in case of errors.
WORKING EXAMPLE
[0050] Referring to Figure 3, in an exemplary embodiment of the system 102, a
particular test scenario 300 is described. In a first step (302) of the test scenario, a Uniform Resource Locator (URL) may be launched. In a next step (304), a username may be entered by a user. Further at step 306, a password may be entered by the user. In a next step (308), the
user may click on 'Sign in'. Subsequent to clicking on 'Sign in', at step 310, it may be checked whether a 'Book a Ticket' page exists. Further at step 312, the user may click on 'logout'. In a next test step 314, the user may close the browser. Further, the test data may be used for the particular test scenario and the 'logout' functionality.
[0051] Still referring to Figure 3, the use of a test case 'Log Out' as the test step in the
particular test scenario is described. The test case 'Log Out' may be created as a reusable test case, wherein the test case 'Log Out' is further reused as the test step in the particular test scenario. The test case 'Log Out' comprises the test step 'Click on the logout link'.
[0052] Further, referring to Figure 4, a test case 'SampleTC' may be created by the
creating module 212 for the particular test scenario illustrated in Figure 3. The test case comprises a plurality of test steps shown in column 2, comprising Launch Browser, Enter User Name, Enter Password, Click on Login Button, Logout function and Close Browser. The defining module 214 may be configured to define a plurality of test objects (as shown in column 3) for the test case, wherein a test object from the plurality of test objects corresponds to a test step of the plurality of test steps. The plurality of test objects comprise 'UIUSEREdit' for the test step Enter User Name, 'UIPASSWORDEdit' for the test step Enter Password and 'UILoginButton' for the test step Click on Login Button. Further, the assigning module 216 assigns a plurality of test actions (as shown in column 4) from the pre-defined set of test actions for the plurality of test steps using keywords, wherein the keywords may be reused for the plurality of test objects. The test actions may comprise 'LAUNCHBROWSER', 'SET', 'CLICK', 'fnLogout' and 'CLOSE'. The 'Logout function' may be a pre-defined test case which may be reused as the test step in the test case 'SampleTC'.
[0053] Further, the assigning module 216 assigns the first test data (as shown in
column 8) to the test case, wherein the first test data may be reused for a pre-defined number of iterations. The column "Test Data" depicts assigning of 'EXTERNAL[2]' as the first test data. 'EXTERNAL[2]' specifies that the test case may be executed for '2' iterations and the first test data may be assigned from the data sheet workbook, wherein the data sheet workbook is the external database. Further, the assigning module 216 assigns the second test data to the test step, wherein the second test data may be reused for the one or more test steps. The second test data may be further assigned locally for the test step. In the test case
'SampleTC', the second test data may be assigned to the test step Enter User Name and the test step Enter Password by assigning V_Username[2,3] and V_Password[2,3]. Further, column 9 illustrates an expected result on execution of the plurality of test steps. The columns 'LOC', 'Parameters' and 'Execute' are as described in the test scenario sheet.
[0054] Referring now to Figure 5, a method 500 for automated testing of a software
application is shown, in accordance with an embodiment of the present subject matter. The method 500 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 500 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0055] The order in which the method 500 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 500 or alternate methods. Additionally, individual blocks may be deleted from the method 500 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 500 may be considered to be implemented in the above described system 102.
[0056] At block 502, one or more test cases for a test scenario may be created. In one
implementation, the one or more test cases may be created by the creating module 212.
[0057] At block 504, one or more test objects may be defined for the test case. In one
implementation, the one or more test objects may be defined by the defining module 214.
[0058] At block 506, a pre-defined test action may be assigned for the test step using a
keyword. In one implementation, the pre-defined test action may be assigned by the assigning module 216.
[0059] At block 508, first test data may be assigned to the test case. In one
implementation, the first test data may be assigned to the test case by the assigning module 216.
[0060] At block 510, second test data may be assigned to the test step. In one
implementation, the second test data may be assigned to the test step by the assigning module 216.
[0061] At block 512, the one or more test cases may be executed. In one
implementation, the one or more test cases may be executed by the execution module 218.
[0062] Although implementations for methods and systems for automated testing of a
software application have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for automated testing of a software application.
WE CLAIM:
1. A method for automated testing of a software application, the method comprising:
creating one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps, and wherein the test case is reused as a test step while creating the one or more test cases, the creating comprising:
defining one or more test objects for the test case, wherein a test object of the one or more test objects corresponds to the test step;
assigning a pre-defined test action for the test step using a keyword, wherein the keyword is reused for the one or more test objects;
assigning first test data to the test case, wherein the first test data is reused for a pre-defined number of iterations and for the one or more test cases; and
assigning second test data to the test step, wherein the second test data is reused for the one or more test steps; and
executing the one or more test cases, wherein an execution of the one or more test cases depends on one or more predefined conditions,
wherein the creating, the defining one or more test objects, the assigning a predefined test action, the assigning first test data, the assigning second test data, and the executing are performed by a processor using programmed instructions stored in a memory.
2. The method as claimed in claim 1 further comprising generating a plurality of automation scripts for the one or more test cases.
3. The method as claimed in claim 1 further comprising generating one or more reports of the execution at one or more levels.
4. The method as claimed in claim 3, wherein the one or more levels comprises a test step level, a test case level and a test scenario level.
5. The method as claimed in claim 1, wherein the first test data and the second test data is stored in an external database.
6. The method as claimed in claim 1, wherein the first test data and the second test data is assigned locally for the test case and the one or more test steps.
7. The method as claimed in claim 1, wherein the predefined conditions comprise at least one of a failure of the execution of the test step, or a success of the execution of the test step.
8. The method as claimed in claim 1, wherein the executing further comprises repeating the execution of the one or more test cases for a specified number of times in a loop.
9. A system for automated testing of a software application, the system comprising:
a processor; and
a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprises: a creating module configured to create one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps, and wherein the test case is reused as a test step while creating the one or more test cases, the creating module further comprising:
a defining module configured to define one or more test objects for the test case, wherein a test object of the one or more test objects corresponds to the test step;
an assigning module configured to
assign a pre-defined test action for the test step using a keyword, wherein the keyword is reused for the one or more test objects;
assign a first test data to the test case, wherein the first test data is reused for a pre-defined number of iterations, and for the one or more test cases; and
assign a second test data to the test step, wherein the second test data is reused for the one or more test steps; and an execution module configured to execute the one or more test cases, wherein an execution of the one or more test cases depends on one or more pre-defined conditions.
10. The system as claimed in claim 9, wherein the system further comprises a generation module configured to generate a plurality of automation scripts for the one or more test cases.
11. The system as claimed in claim 9, wherein the system further comprises a report generation module configured to generate one or more reports of the execution at one or more levels.
12. The system as claimed in claim 11, wherein the one or more levels comprises a test step level, a test case level and a test scenario level.
13. The system as claimed in claim 9, wherein the first test data and the second test data is stored in an external database.
14. The system as claimed in claim 9, wherein the first test data and the second test data is assigned locally for the test case and the one or more test steps.
15. The system as claimed in claim 9, wherein the one or more pre-defined conditions comprise at least one of a failure of the execution of the test step, or a success of the execution of the test step.
16. The system as claimed in claim 9, wherein the execution module further comprises a repeating module configured to repeat the execution of the one or more test cases for a specified number of times in a loop.
17. A computer program product having embodied thereon a computer program for automated testing of a software application, the computer program product comprising:
a program code for creating one or more test cases for a test scenario, wherein the test scenario depends on the software application under test, and wherein a test case of the one or more test cases comprises one or more test steps, and wherein the test case is reused as a test step while creating the one or more test cases, the program code for creating comprising:
a program code for defining one or more test objects for the test case, wherein a test object of the one or more test objects corresponds to the test step;
a program code for assigning a pre-defined test action for the test step using a keyword, wherein the keyword is reused for the one or more test objects;
a program code for assigning first test data to the test case, wherein the first test data is reused for a pre-defined number of iterations and for the one or more test cases;
a program code for assigning second test data to the test step, wherein the second test data is reused for the one or more test steps; and
a program code for executing the one or more test cases, wherein an execution of the one or more test cases depends on one or more predefined conditions.