FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: PERFORMANCE TESTING REQUIREMENT CHECKLIST
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.
TECHNICAL FIELD
[0001] The present subject matter described herein, in general, relates to performance
testing and, in particular, to performance testing of software applications.
BACKGROUND
[0002] Software applications are tested for functionality, bugs, and glitches in order to
determine whether or not they meet required goals. A software application is also tested for quality assurance, verification, validation, and reliability estimation. While the former is functional testing, the latter, performed for purposes of quality assurance, is generally referred to as performance testing. Testing a software application is a trade-off between budget, time, and required quality.
[0003] In order to test the functionality of a software application, functional testing
requirements may be gathered. In one approach, a functional testing requirement specification is prepared. The functional testing requirement specification is indicative of the properties for which the application needs to be tested. Based on the functional testing requirement specification, test cases may be designed and executed on the software application to verify its functionality. If execution of the test cases reveals that the software application is not performing in an expected manner, the software application is reported to be malfunctioning.
[0004] The systematic approach of preparing the functional testing requirement
specification and performing the functional testing in accordance with it is often not followed in the case of performance testing.
SUMMARY
[0005] This summary is provided to introduce concepts related to systems and
methods for generating a performance testing requirement checklist, and the concepts are further described below in the detailed description. This summary is not intended to identify
essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0006] In one implementation, a system for generating a performance testing
requirement checklist is provided. The system comprises a processor and a memory coupled to the processor. The memory comprises a checklist generator configured to generate the performance testing requirement checklist for gathering performance testing requirements. The performance testing requirement checklist comprises a set of questions associated with at least one performance testing phase. The at least one performance testing phase is selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0008] Figure 1 illustrates a network implementation of a system for generating a
performance testing requirement checklist, in accordance with an embodiment of the present subject matter.
[0009] Figure 2 shows a flowchart illustrating a method for generating a performance
testing requirement checklist, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0010] Performance testing is generally conducted to ensure the quality of a software
application. Performance testing often provides a software product with a competitive advantage. Conventionally, however, performance testing may not be given due importance and may not be performed systematically in the software development lifecycle. For example, prior to conducting performance testing of a software application, the performance testing requirement may not have been completely understood, resulting in incomplete or
inefficient performance testing that in turn leads to rework and wastage of an organization's testing resources.
[0011] Typically, information relating to a performance testing requirement is either
sent over email or stated over the phone, thereby leading to gaps in the information received. For example, a client who wishes to have a software application tested may provide the performance testing requirement to an organization conducting the performance testing in an unsystematic manner. The performance testing requirement may even be undocumented, which may also lead to miscommunication and differences in expectations between the organization and the client. Further, such gaps may lead to rework. Therefore, the effectiveness of performance testing may rely on understanding the performance testing requirement.
[0012] In one embodiment of the present subject matter, in order to gather information
relating to the performance testing requirement systematically and completely, a performance testing requirement checklist may be used.
[0013] Checklists in general are used for various purposes. Such checklists may be
required to ensure quality and save time. Apart from gathering information, checklists also ensure complete compliance with a process. In one example, each time an aircraft takes a flight, a checklist is used to ensure that all components of the aircraft are working at a satisfactory level. Similarly, a document proofread checklist may ensure that all parameters in the checklist are satisfied. Likewise, a software testing checklist may ensure that all steps are performed while gathering the performance requirements for testing a software application.
[0014] Systems and methods for generating a performance testing requirement checklist
(PTRC) are described herein. The PTRC disclosed herein is capable of gathering information systematically and completely. Specifically, the PTRC may be used to gather performance testing requirements for assessing the performance of a software application. In order to gather all requirements related to performance testing, a set of questions may be provided in the PTRC. The set of questions is designed to capture all requirements related to performance testing of a software application in a systematic manner. The set of questions may be associated with various performance testing phases. The performance testing phases may be a pre-
engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase.
[0015] While aspects of described system and method for generating a performance
testing requirement checklist may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0016] Referring now to Figure 1, a network implementation 100 of a system 102 for
generating a performance testing requirement checklist is illustrated, in accordance with an embodiment of the present subject matter. Further, the system 102 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It will be understood that the system 102 may be accessed by employees of an organization through one or more client devices 104-1, 104-2,... 104-N, collectively referred to as client devices 104 hereinafter, or applications residing on client devices 104. Examples of the client devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The client devices 104 are communicatively coupled to the system 102 through a network 106.
[0017] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0018] In one embodiment, the system 102 may include at least one processor 108, an
I/O interface 110, and a memory 112. The at least one processor 108 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors,
central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 108 is configured to fetch and execute computer-readable instructions stored in the memory 112.
[0019] The I/O interface 110 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 110 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 110 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 110 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 110 may include one or more ports for connecting a number of devices to one another or to another server.
[0020] The memory 112 may include any computer-readable medium known in the art
including, for example, volatile memory such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 112 may include modules 114 and data 116.
[0021] The modules 114 include routines, programs, objects, components, data
structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 114 may include a checklist generator 118, a dashboard generator 120, and other modules 122. The other modules 122 may include programs or coded instructions that supplement applications and functions of the system 102.
[0022] The data 116, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 114. The data 116 may also include other data 124. The other data 124 may include data generated as a result of the execution of one or more modules in the other modules 122.
[0023] As shown in Figure 1, the system 102 may include several modules for
generating a performance testing requirement checklist (PTRC). The PTRC may be stored in a database 126 that is connected to the system 102, as shown in Figure 1. Although in the
present embodiment the database 126 is shown to be outside the system 102, in another embodiment, the database 126 may be a part of the system 102. In one implementation, the PTRC may be used by an organization such as an Information Technology (IT) organization, a product manufacturing organization, a telecommunication organization, or other conglomerates. The present subject matter may be explained mainly considering the organization to be an IT organization; however, it will be appreciated by a person skilled in the art that the organization may be any organization involved in any line of business.
[0024] In the present implementation, the checklist generator 118 may generate the
PTRC. The PTRC may be used for gathering performance testing requirements for assessing the performance of a software application. Specifically, the checklist generator 118 may provide a set of questions in the PTRC. The set of questions is designed to gather substantially all requirements related to performance testing of the software application. The set of questions is associated with a non-functional specification of the software application. The non-functional specification defines performance targets of the software application. In one example, a performance target of a software application could be to have a response time of less than 2 seconds. Having a response time of less than 2 seconds would mean that the software application must perform certain activities in less than 2 seconds.
[0025] In another example, a performance target of a software application could be to
stay stable, without hanging, during a high volume of user traffic. This could be a performance target of a banking-related software application which is used by millions of users simultaneously. Therefore, the performance targets of a software application may include a response time target, expected throughput, expected hits per second, resource usage targets, and the like. It may be understood that the performance of a software application may be tested based upon the performance targets that need to be met to certify that the software application's performance is satisfactory.
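By way of illustration only, and not as a limitation of the present subject matter, the performance targets described above may be represented as structured data. The following Python sketch assumes class and field names (for example, PerformanceTargets and max_response_time_seconds) that are not part of the disclosure and are used merely to clarify the notion of a non-functional specification:

from dataclasses import dataclass

@dataclass
class PerformanceTargets:
    # Performance targets such as those a non-functional specification may define.
    max_response_time_seconds: float       # e.g., a response time of less than 2 seconds
    expected_throughput_tps: float         # expected throughput in transactions per second
    expected_hits_per_second: float        # expected hits per second
    max_cpu_utilisation_percent: float     # resource usage target for CPU
    max_memory_utilisation_percent: float  # resource usage target for memory

# Example targets for a banking-related application expected to stay stable under heavy traffic.
targets = PerformanceTargets(
    max_response_time_seconds=2.0,
    expected_throughput_tps=500.0,
    expected_hits_per_second=1200.0,
    max_cpu_utilisation_percent=70.0,
    max_memory_utilisation_percent=80.0,
)

In this example, each field corresponds to one performance target against which the software application may later be assessed.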
[0026] In one implementation, the set of questions may relate to at least one
performance testing phase. The at least one performance testing phase may be selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase. As the set of questions is associated with various performance testing phases, the set of questions may
therefore ensure that all testing related requirements are gathered for testing the performance of a software application.
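Purely as an illustrative sketch, the PTRC may be viewed as a mapping from each performance testing phase to its set of questions. The Python sketch below assumes this representation; the individual questions are examples paraphrased from the description and are not an exhaustive or authoritative list:

# Illustrative sketch: the PTRC as a mapping from phase to example questions.
PTRC = {
    "pre-engagement": [
        "Who are the stakeholders, such as the client and the testing team, and what is the project scope?",
        "What is the infrastructure of the software application and what project resources are available?",
    ],
    "requirement": [
        "What is the objective of the performance testing, and which test environment and test tool will be used?",
        "What are the workload details, such as user traffic, and what are the performance targets?",
    ],
    "planning": [
        "What are the objective and scope of the project, the performance testing approach, and the risk mitigation?",
    ],
    "script design": [
        "Which boilerplate, naming conventions, run time settings, and test data are to be used in the scripts?",
    ],
    "test readiness": [
        "Are the test environment, test scripts, test data, application under test, and test server ready?",
    ],
    "test execution": [
        "Has a smoke test been done and have all stakeholders been informed about the test execution?",
    ],
    "reporting": [
        "Has a quick summary been sent to the stakeholders after the test execution?",
    ],
    "sign off": [
        "Has the closure process been completed and has the project been released completely?",
    ],
}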
[0027] In order to gather information related to the pre-engagement phase, the set of
questions may relate to pre-engagement activities, such as scope definitions for the stakeholders involved, for example a client and a testing team, infrastructure of the software application, and project resource information. In one example, for the pre-engagement phase, the set of questions is designed to gather customer related information and project scope related information.
[0028] Further, in order to gather information related to the next phase, i.e., the
requirement phase, the set of questions may relate to an objective of performance testing, a test environment, testing tools available with the customer, workload details, and performance targets. For example, for the requirement phase, the set of questions may be used to gather information on a functionality/usage of the software application to be tested; hardware requirements, i.e., what should be the specification of the hardware which is to be used to run the software application; technology stack to be used, such as the test environment and the test tool; software application architecture and technology; workload details, such as user traffic, and the like.
[0029] In one example, a particular test environment and/or a test tool may be
required for performance testing. As a part of gathering the performance testing requirement, it needs to be determined who would be investing in procuring such tools. For instance, the client may already have a license for the test environment and the test tool. The licensed test environment and the licensed test tool may be provided by the client. Therefore, the client may request the testing team to do the performance testing using the licensed test environment and the licensed test tool. Further, in another example, workload details relating to the performance testing need to be confirmed. Workload details may include a load profile that needs to be simulated during performance testing. The load profile may contain a number of transactions to be performance tested, a mix percentage of the number of transactions, concurrent user distribution against each transaction, duration of the performance testing, a steady state period, a ramp up/down period, an average number of users, a maximum number of users, and the like.
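Only as an illustration of the workload details discussed above, the load profile may be captured as structured data during the requirement phase. The Python sketch below assumes the class and field names (LoadProfile, TransactionLoad, and so on); they are not part of the disclosure:

from dataclasses import dataclass
from typing import List

@dataclass
class TransactionLoad:
    name: str                  # transaction to be performance tested
    mix_percent: float         # share of this transaction in the overall transaction mix
    concurrent_users: int      # concurrent user distribution against this transaction

@dataclass
class LoadProfile:
    transactions: List[TransactionLoad]
    duration_minutes: int      # duration of the performance testing
    steady_state_minutes: int  # steady state period
    ramp_up_minutes: int       # ramp up period
    ramp_down_minutes: int     # ramp down period
    average_users: int         # average number of users
    maximum_users: int         # maximum number of users

# Example load profile to be simulated during performance testing.
profile = LoadProfile(
    transactions=[
        TransactionLoad("login", mix_percent=20.0, concurrent_users=200),
        TransactionLoad("funds_transfer", mix_percent=80.0, concurrent_users=800),
    ],
    duration_minutes=60,
    steady_state_minutes=40,
    ramp_up_minutes=10,
    ramp_down_minutes=10,
    average_users=800,
    maximum_users=1000,
)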
[0030] After gathering information in the requirement phase, planning for testing the
software application may be done. In order to gather information related to the planning phase, the set of questions may relate to a test planning process and strategy. For example, for the planning phase, the set of questions may be used to gather information on an objective and scope of the project; the performance testing approach; functionality of the software application; performance testing coverage; risk mitigation, i.e., ways to minimize the risks involved; the test environment and monitoring tools used by the customer, and the like. Therefore, in the planning phase, considerable information may be needed to plan the testing of the software application properly. Further, in the planning phase, detailed information about the test environment in which the performance testing has to be conducted may be obtained. In one example, if the test environment is unlike the production environment, then the production environment configuration may be provided to understand the percentage difference between the test environment and the production environment.
[0031] Further, in the next phase, i.e., in the script design phase, the set of questions
may relate to industry script standards and naming conventions which are to be used to design scripts. For example, in the script design phase, the set of questions may be used to gather information on boilerplate and naming conventions; run time settings; test data; recording options, and the like. The set of questions may ensure that the same scripts are used both onsite and offshore; that the boilerplate and naming conventions are included in the scripts; that the runtime settings are proper; and whether test data is included in the scripts or not.
[0032] Further, in the test readiness phase, the set of questions may relate to readiness
of the test environment; test scripts; test data; the software application under test; the test server, and the like. Further, in the next phase, i.e., in the test execution phase, the set of questions may ensure that pre-execution standards and post-execution standards are implemented. For example, in the test execution phase, the set of questions may be used to ensure that a smoke test is done; all stakeholders are informed about the test execution; and all test results are collated after the test execution is complete. The smoke test may ensure that there are no glitches in the software application.
[0033] Further, in the reporting phase, the set of questions may relate to reporting
standards, reporting formats, and a reporting schedule. Specifically, the set of questions includes whether a quick summary has been sent to the stakeholders/clients after the test execution or not; whether the test report with the correct performance counters has been shared with all stakeholders or not; whether to send monthly reports or quarterly reports; whether the client is satisfied or not; whether the client has provided feedback or not; whether the testing team has deleted unnecessary files; whether the project has been released completely or not, and the like. Further, in the sign off phase, the set of questions may relate to a progress of the complete testing project; a closure process, and the like.
[0034] In one implementation, the set of questions mentioned above may be answered
using the I/O interface 110. Specifically, the set of questions may be answered using the client devices 104. In one example, the customer and/or the testing team may use the client devices 104 to answer the set of questions by accessing the database 126 through the I/O interface 110. Based upon a percentage of the set of questions responded/answered to in each of the at least one performance testing phase, a dashboard may be generated by the dashboard generator 120. Specifically, the dashboard may be displayed on the I/O interface 110 and may summarize a status of the set of questions in each of the performance testing phases. For example, if only 70% of the set of questions are answered in the requirement phase and 50% of the set of questions are answered in the planning phase, then the dashboard generator 120 may present this information on the dashboard. In one example, the dashboard may show a bar chart indicating the percentage of the set of questions responded/answered to in each of the performance testing phases.
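As an illustrative sketch only, the percentage of questions responded to in each phase, which the dashboard generator 120 is described as presenting, may be computed as follows in Python; the function name completion_by_phase and the dictionary-based answer store are assumptions made for the example:

def completion_by_phase(checklist, answers):
    # Return, for each performance testing phase, the percentage of its
    # checklist questions that have received a non-empty answer.
    summary = {}
    for phase, questions in checklist.items():
        answered = sum(1 for question in questions if answers.get(question))
        summary[phase] = 100.0 * answered / len(questions) if questions else 0.0
    return summary

For instance, a result such as {"requirement": 70.0, "planning": 50.0} corresponds to the example above and could be rendered on the dashboard as a bar chart, one bar per phase.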
[0035] Therefore, the system 102 may be used to build the PTRC which may be used
to gather all the information needed for testing the performance of a software application, thereby saving time and avoiding rework.
[0036] Referring now to Figure 2, a method 200 for generating a performance testing
requirement checklist is shown, in accordance with an embodiment of the present subject matter. The method 200 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform
particular functions or implement particular abstract data types. The method 200 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0037] The order in which the method 200 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 200 or alternate methods. Additionally, individual blocks may be deleted from the method 200 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 200 may be considered to be implemented in the above described system 102.
[0038] At block 202, a set of questions for gathering performance testing requirements
may be provided. The set of questions may be associated with at least one performance testing phase. The at least one performance testing phase may be selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase. In one example, the set of questions may be provided by the checklist generator 118.
[0039] At block 204, a performance testing requirement checklist is generated based
upon the set of questions. In one example, the performance testing requirement checklist is generated by the checklist generator 118.
[0040] At block 206, one or more answers for the set of questions may be received. In
one example, the I/O interface 110 may be used to receive the one or more answers.
[0041] At block 208, a dashboard is generated. The dashboard is indicative of a
percentage of the set of questions responded to in each of the at least one performance testing phase. The dashboard may be displayed on the I/O interface 110 and may summarize a status of the set of questions in each of the performance testing phases. In one example, the dashboard may be generated by the dashboard generator 120.
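As a closing illustration only, and assuming the PTRC mapping and the completion_by_phase helper sketched earlier, blocks 202 to 208 of the method 200 may be expressed as plain Python functions; the function names are assumptions and not part of the claimed method:

def provide_questions():
    # Block 202: provide a set of questions associated with the performance testing phases.
    return PTRC

def generate_checklist(questions):
    # Block 204: generate the performance testing requirement checklist from the questions.
    return {phase: list(qs) for phase, qs in questions.items()}

def receive_answers(checklist, submitted):
    # Block 206: receive one or more answers, e.g., collected through an I/O interface;
    # questions without a submitted answer are kept with an empty value.
    return {q: submitted.get(q, "") for qs in checklist.values() for q in qs}

def generate_dashboard(checklist, answers):
    # Block 208: indicate the percentage of questions responded to in each phase.
    return completion_by_phase(checklist, answers)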
[0042] Although implementations for methods and systems for generating a
performance testing requirement checklist have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for generating a performance testing requirement checklist.
I/We claim:
1. A system (102) for generating a performance testing requirement checklist, the system
(102) comprising:
a processor (108); and
a memory (112) coupled to the processor (108), the memory (112) comprising
a checklist generator (118) configured to generate the performance testing requirement checklist for gathering performance testing requirements, wherein the performance testing requirement checklist comprises a set of questions associated with at least one performance testing phase, and wherein the at least one performance testing phase is selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase.
2. The system (102) of claim 1, further comprising an Input/Output interface (110) configured to receive one or more answers for the set of questions.
3. The system (102) of claim 1, further comprising a dashboard generator (120) configured to generate a dashboard indicating a percentage of the set of questions responded to in each of the at least one performance testing phase.
4. The system (102) of claim 1, wherein the set of questions for the pre-engagement phase is associated with at least one of stakeholders, infrastructure, and project resource information.
5. The system (102) of claim 1, wherein the set of questions for the requirement phase is associated with at least one of an objective of performance testing, a test environment, a test tool, workload, and performance targets.
6. The system (102) of claim 1, wherein the set of questions is associated with a non-functional specification of a software application, wherein the non-functional specification defines the performance targets of the software application.
7. A method for generating a performance testing requirement checklist, the method comprising:
providing a set of questions for gathering performance testing requirements, wherein the set of questions is associated with at least one performance testing phase, and wherein the at least one performance testing phase is selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase; and
generating the performance testing requirement checklist based upon the set of questions.
8. The method of claim 7, further comprising receiving one or more answers for the set of questions.
9. The method of claim 7, further comprising generating a dashboard indicating a percentage of the set of questions responded to in each of the at least one performance testing phase.
10. The method of claim 7, wherein the set of questions for the planning phase is associated with at least one of an objective of a project, a performance testing approach, and a test environment.
11. A computer-readable medium having embodied thereon a computer program for executing a method for generating a performance testing requirement checklist, the method comprising:
providing a set of questions for gathering performance testing requirements, wherein the set of questions is associated with at least one performance testing phase, and wherein the at least one performance testing phase is selected from a pre-engagement phase, a requirement phase, a planning phase, a script design phase, a test readiness phase, a test execution phase, a reporting phase, and a sign off phase; and
generating the performance testing requirement checklist based upon the set of questions.