
Method And System For Extrapolating Performance Of An Application

Abstract: Disclosed is a method and system for extrapolating performance of an application in a computerized target environment. The system comprises a first test script module, a second test script module, and an extrapolation module. The first test script module is configured to perform a first load test on the application to obtain a first set of load test results. The second test script module is configured to enable a second load test on the application in order to obtain a second set of load test results. The extrapolation module is configured to extrapolate the performance of the application accessible by a second set of client devices in the computerized target environment based on statistical analysis performed on the first set of load test results and the second set of load test results.


Patent Information

Application #
Filing Date
12 June 2013
Publication Number
22/2015
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI 400021, MAHARASHTRA, INDIA

Inventors

1. DUTTAGUPTA, SUBHASRI
TATA CONSULTANCY SERVICES LIMITED, AKRUTI BUSINESS PORT, ROAD NO 13 MIDC ANDHERI EAST, MUMBAI- 400093 INDIA
2. KHANAPURKAR, AMOL BHASKAR
TATA CONSULTANCY SERVICES LIMITED, AKRUTI BUSINESS PORT, ROAD NO 13 MIDC ANDHERI EAST, MUMBAI- 400093 INDIA
3. SINGH, RUPINDER VIRK
TATA CONSULTANCY SERVICES LIMITED, AKRUTI BUSINESS PORT, ROAD NO 13 MIDC ANDHERI EAST, MUMBAI- 400093 INDIA
4. NAMBIAR, MANOJ KARUNAKARAN
TATA CONSULTANCY SERVICES LIMITED, AKRUTI BUSINESS PORT, ROAD NO 13 MIDC ANDHERI EAST, MUMBAI- 400093 INDIA

Specification

FORM 2
THE PATENTS ACT 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
METHOD AND SYSTEM FOR EXTRAPOLATING PERFORMANCE OF AN
APPLICATION
APPLICANT:
Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification describes the invention and the manner in which it is to be performed.

The present invention comprises an improvement in, or a modification of the invention claimed in the specification of the Indian Patent Application No. 1405/MUM/2011.
TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to load analysis
of an application, and more particularly to a method and system for extrapolating performance of the application in a computerized target environment.
BACKGROUND
[002] Before an IT application is launched, load testing is performed on the IT
application in a test/source environment comprising application servers, web servers and database servers running on one or more commodity servers. The performance of the IT application is gathered in terms of throughput, response time and consumption of various resources on the servers by performing load analysis of a large number of users concurrently accessing the IT application. Such load analysis is performed before the IT application is actually deployed at a production/target environment. Now, when the IT application is moved to the production/target environment, the performance metrics of the application may be different from those of the test/source environment. One of the reasons for the difference in performance metrics may be the difference in configuration of the resources in the test/source environment and the production/target environment. For example, the test/source environment may have a configuration of an 8-core 2.66 GHz Xeon CPU with 1 MB L2 cache and 8 GB physical RAM, which may be different from the configuration of the resources deployed at the production/target environment.
[003] In the present scenario, the performance of the IT application may be
determined by using three different methods implemented by an event simulator, an analytical model and an architectural simulator respectively. The event simulator performs a simulation that involves careful analysis of each component of the IT application by representing them accurately in a queuing model and implementing the business function flow through the system. On the other hand, the analytical model requires knowledge of the exact configuration
of the resources deployed at the target environment. If the configuration of the resources of the source environment is different from that of the target environment, then it becomes difficult to determine the performance of the IT application when the IT application is deployed in the target environment. Further, the analytical model is limited to testing only the performance of specific IT applications. The architectural simulator is useful for predicting the performance of the IT application when the deployment is limited to only a single server. Further, it does not allow execution of a multi-tier application. Moreover, running the IT application for different workloads would involve executing it multiple times, which would require significant simulation time.
[004] In general, load testing provides a reduction in resources, in terms of time
and cost. However, such a reduction is achieved only when the load testing is able to mimic the scenarios of the target environment or production environment in the testing environment. Further, mimicking the scenarios of the production environment may add to the cost of the load testing, which may in turn defeat the purpose of applying the load testing. Often, load analysis performed using conventional load analysis techniques may not be valid for the target environment or the production environment. Hence, there exists a technical challenge of validating test analysis performed in the test/source environment with respect to the production/target environment, primarily due to variance in the configuration of resources in either environment.
SUMMARY
[005] Before the present systems and methods are described, it is to be understood
that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosures. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to systems and methods for extrapolating performance of an application in a computerized target environment, and the aspects are further described below in the detailed description. This
summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for extrapolating performance of an
application in a computerized target environment is disclosed. In one aspect, the computerized target environment may be a production environment where the application may be deployed. The system comprises a processor and a memory coupled to the processor for executing a plurality of modules present in the memory. The plurality of modules comprises a first test script module, a second test script module and an extrapolation module. The first test script module is configured to perform a first load test on the application to obtain a first set of load test results. The first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment. In one aspect, the first set of load test results comprises throughput, response time and think time. In one implementation, the first set of client devices may be in the range of about 100-200. Moreover, the second test script module is configured to enable a second load test on the application to obtain a second set of load test results. The second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment. In one aspect, the second set of load test results comprises service demand of a central processing unit (CPU), service demand of a disk, service demand of a network, and service demand of a memory. Based on the first set of test results and the second set of test results, the extrapolation module is further configured to extrapolate the performance of the application accessible by a second set of client devices in the computerized target environment. The performance is extrapolated based on statistical analysis performed on the first set of load test results and the second set of load test results using a linear regression or S-curve technique.
The second set of client devices represents a threshold value of maximum number of client devices that may access the application satisfying a throughput requirement of the computerized target environment determined through the statistical analysis.
[007] In another implementation, a method for extrapolating performance of an
application in a computerized target environment is disclosed. The method initially performs a first load test on the application using a first test script module to obtain a first set of load test results. The first set of load test results is indicative of performance of the application
accessible by a first set of client devices in a computerized test environment. The method
further enables a second load test on the application using a second test script module to
obtain a second set of load test results. The second set of load test results is indicative of
performance of the application accessible by a single client device in the computerized target
environment. Based on the first set of test results and the second set of test results, the
method further extrapolates the performance of the application accessible by a second set of
client devices in the computerized target environment based on statistical analysis performed
on the first set of load test results and the second set of load test results.
[008] In yet another implementation, a computer program product having embodied
thereon a computer program for extrapolating performance of an application in a computerized target environment is disclosed. The computer program product comprises a program code for performing a first load test on the application using a first test script module to obtain a first set of load test results. The first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment. The computer program product further comprises a program code for enabling a second load test on the application using a second test script module to obtain a second set of load test results. The second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment. The computer program product further comprises a program code for extrapolating the performance of the application accessible by a second set of client devices in the computerized target environment based on statistical analysis performed on the first set of load test results and the second set of load test results.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The foregoing detailed description of embodiments is better understood when
read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the present document example constructions of the invention; however, the invention is not limited to the specific methods and apparatus disclosed in the document and the drawings.

[0010] The detailed description is described with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0011] Figure 1 illustrates a network implementation of a system for extrapolating
performance of an application in a computerized target environment, in accordance with an embodiment of the present subject matter.
[0012] Figure 2 illustrates the system, in accordance with an embodiment of the
present subject matter.
[0013] Figure 3 illustrates an accuracy of a method for extrapolating performance of
an application in a computerized target environment for a specific application, in accordance with an embodiment of the present subject matter.
[0014] Figure 4 illustrates various steps of a method for extrapolating performance of
an application, in accordance with an embodiment of the present subject matter.
[0015] The figures depict various embodiments of the present invention for purposes
of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION
[0016] Some embodiments of this invention, illustrating all its features, will now be
discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the exemplary systems and methods are now
described. The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.
[0017] Various modifications to the embodiment will be readily apparent to those
skilled in the art and the generic principles herein may be applied to other embodiments. For example, although the present invention will be described in the context of a system and method for extrapolating performance of an application in a computerized target environment, one of ordinary skill in the art will readily recognize that the method and system can be utilized in any situation where there is need to extrapolate the performance of the application in a computerized test environment or the computerized target environment. Thus, the present invention is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0018] Systems and methods for extrapolating performance of an application to be
deployed at a computerized target environment by testing the application in a computerized test environment are described. In one aspect, the application may be any IT enabled application that may be accessed by a plurality of users from any remote location, deployed at a server on the computerized target environment. Further, the computerized test environment may be any test environment where the testing of the IT enabled application is performed in order to measure the performance of a plurality of components associated with the IT enabled application. On the other hand, the computerized target environment may be a production environment where the IT enabled application is actually deployed after testing the performance of the plurality of components associated with the IT enabled application. In one aspect, the systems and methods further assume that the computerized target environment is available for performing small tests or running a benchmark in order to gather specific architectural characteristics. Further, the testing of the IT enabled application is performed to analyze the performance parameters of the plurality of components of the IT enabled application. The plurality of components that are associated with the IT enabled application may comprise multiple components with a multi-tiered architecture. The three tiers of the application are Web tier, Application tier and Database tier, wherein the three tiers may further comprise a Web server, an Application server and a Database server respectively. For simplicity, it is assumed that the Web server, the Application server and the Database server
are going to remain the same between the computerized target environment and the computerized test environment. The only change between the computerized target environment and the computerized test environment may be an alteration in configurations amongst different servers as aforementioned.
[0019] In order to extrapolate the performance of the IT enabled application, a load
test on the IT enabled application is performed at the computerized test environment to obtain a first set of load test results. The first set of load test results is indicative of performance of the application in terms of throughput, response time, think time, memory utilized, etc., accessed by a first set of client devices. After performing the load test on the application, the application is executed on the computerized target environment to obtain a second set of load test results. The second set of load test results is indicative of performance of the application in terms of service demand of one or more resources such as service demand of CPU, service demand of Disk, service demand of Network, and percentage of memory utilized. In one aspect, the second set of load test results is obtained by accessing the IT enabled application using a single client device.
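The two result sets described above can be captured in simple containers; the following is a minimal Python sketch in which the class names, field names and values are purely illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass

# Sketch: containers for the two sets of load test results described
# above (all names and figures are illustrative).
@dataclass
class FirstLoadTestResults:
    throughput: float     # pages/sec for the first set of client devices
    response_time: float  # seconds per request
    think_time: float     # seconds between successive requests

@dataclass
class SecondLoadTestResults:
    sd_cpu: float         # service demand of the CPU, seconds/request
    sd_disk: float        # service demand of the disk
    sd_network: float     # service demand of the network
    memory_pct: float     # percentage of memory utilized

first = FirstLoadTestResults(throughput=75.0, response_time=0.4, think_time=2.0)
second = SecondLoadTestResults(sd_cpu=0.03, sd_disk=0.0075,
                               sd_network=0.002, memory_pct=12.0)
```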
[0020] Based on the first set of load test results and the second set of load test results,
the systems and methods are further enabled to extrapolate the performance of the IT enabled application for a specific number of client devices until the application encounters a first bottleneck in the computerized target environment. In one aspect, the extrapolation is performed by using a mixture of statistical techniques such as linear regression technique and S-curve technique. In one aspect, the performance may be described in terms of scalability, response time, and throughput of the application. Accordingly, based on the extrapolation, the application may be deployed on the computerized target environment or necessary changes may be incorporated in the application before deploying the application on the computerized target environment.
[0021] Since the present load analysis technique does not require knowledge of
the application or the deployment infrastructure, it saves time and resources. Further, the extrapolation technique only requires the load testing results for a small number of client devices, which are used to determine application performance for a large number of user devices
on the computerized target environment. Accordingly, the present technique may be applied to a variety of applications without involving any change.
[0022] The systems and methods, related to extrapolating performance of an
application in the computerized target environment as described herein, can be implemented on a variety of computing systems such as a server, a desktop computer, a notebook or a portable computer, a mainframe computer, a mobile computing device, and an entertainment device.
[0023] While aspects of described system and method for extrapolating the
performance of the IT enabled application in a computerized target environment may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0024] The presently described embodiments will be best understood by reference to
the drawings, wherein like parts are designated by like numerals throughout. Moreover, flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
[0025] Referring now to Figure 1, a network implementation 100 of a system 102 for
extrapolating performance of an application in a computerized target environment is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 performs a first load test on the application to obtain a first set of load test results. The first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment. In one embodiment, the system 102 requires a second load test on the application to obtain a second set of load test results. The second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment. After obtaining the first set of load test results and the second set of load test results, the system 102 is further adapted to predict performance of the application. The system 102 predicts the
performance of the application by using a mixture of statistical techniques such as linear regression technique and S-curve technique. The performance is extrapolated for a large number of client devices until the application encounters a first bottleneck in the computerized target environment.
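The S-curve part of the statistical mixture above can be sketched by fitting a logistic curve to throughput measurements, modelling the flattening of throughput as the first bottleneck is approached. The sketch below uses SciPy's `curve_fit` on synthetic data; every name and figure is an illustrative assumption, not from the specification:

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch: fit an S-curve (logistic function) to throughput measurements
# so that saturation near the first bottleneck can be modelled.
def s_curve(n, x_max, k, n0):
    """Logistic curve: throughput approaches x_max pages/sec at high load."""
    return x_max / (1.0 + np.exp(-k * (n - n0)))

# Illustrative (devices, throughput) points generated from the model itself.
devices = np.array([50.0, 100.0, 200.0, 400.0, 800.0, 1600.0])
throughput = s_curve(devices, 120.0, 0.01, 300.0)

# Recover the curve parameters from the measurements.
params, _ = curve_fit(s_curve, devices, throughput, p0=[100.0, 0.005, 200.0])
x_max_fit = params[0]  # estimated saturation throughput (pages/sec)
```

The fitted saturation level gives the throughput ceiling at which adding further client devices no longer helps.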
[0026] Although the present subject matter is explained considering that the system
102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2... 104-N, collectively referred to as user 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0027] In one implementation, the network 106 may be a wireless network, a wired
network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0028] Referring now to Figure 2, the system 102 is illustrated in accordance with an
embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions.
Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0029] The I/O interface 204 may include a variety of software and hardware
interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0030] The memory 206 may include any computer-readable medium or computer
program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or nonvolatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0031] The modules 208 include routines, programs, objects, components, data
structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a first test script module 212, a second test script module 214, an extrapolation module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102.
[0032] The data 210, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 208. The data 210 may also include a system database 220, and other data 222. The other data 222 may include data generated as a result of the execution of one or more modules in the other module 218.
[0033] In one implementation, at first, a user may use the client device 104 to access
the system 102 via the I/O interface 204. The users may register themselves using the I/O
interface 204 in order to use the system 102. The working of the system 102 is explained in detail with reference to Figures 3 and 4 below.
[0034] Further referring to Figure 2, a detailed working of the components of the
system 102 is illustrated, in accordance with an embodiment of the present subject matter. In one implementation, a method and system for extrapolating the performance of an application in the computerized target environment by initially testing the application in a computerized test environment is disclosed herein. In one aspect, the computerized test environment and the computerized target environment may comprise a central processing unit (CPU), a disk, a network, and a memory. In order to extrapolate the performance of the application, the first test script module 212 is configured to perform a first load test on the application in the computerized test environment to obtain a first set of load test results. In one aspect, the first load test is performed by executing the first test script module 212 on the application. The first test script module 212 comprises a virtual load occupied by a first set of client devices that may be in the range of 100-200 devices.
[0035] In one embodiment, the first set of load test results obtained may be stored in
the system database 220. In one embodiment of the invention, the first set of load test results obtained by performing the first load test on the application using the first test script module 212 is indicative of performance of the application in terms of throughput and response time. More specifically, the first set of load test results may comprise, but are not limited to, throughput, response time, and resources utilized such as CPU, disk, network and memory. The first set of load test results may indicate the performance of the application accessible by the first set of client devices in the computerized test environment. In one example, the time taken by the application to respond to a request sent by a client device from the first set of client devices may be understood as the response time of the application. Further, the number of requests handled by the application per unit time may be understood as the throughput of the application.
[0036] In an exemplary embodiment of the invention, in order to predict the
performance of the application, the first test script module 212 is adapted to verify the scalability of the application. The scalability of the application is verified based on the throughput of the application. In one aspect, when none of the hardware or software resources
on the computerized test environment are bottlenecked, the throughput is expected to increase in proportion with the higher number of devices. In one example, as the number of devices of the first set of client devices increases from 100 to 200 devices, the throughput of the application is expected to increase. Based on this, the scalability is verified by performing the first load test on the application for at least two distinct virtual load levels, for example 100 devices and 200 devices. In this example, the at least two distinct virtual load levels are referred to herein as N1 and N2 (N2 > N1), wherein N1 indicates 100 devices and N2 indicates 200 devices. Consider that the throughput for N1 is X1 pages/sec and for N2 is X2 pages/sec. The extrapolation technique assumes that the computerized target environment is higher in configuration than the computerized test environment in terms of the number of cores and the size of the memory. As a result, it is reasonable to assume that even on the computerized target environment the throughput of the application for the virtual load levels N1, N2 is equal to X1 and X2 respectively.
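The scalability check described above can be sketched as follows; the throughput figures and the 5% tolerance are illustrative assumptions, not values from the specification:

```python
# Sketch: verify scalability from two virtual load levels N1 and N2
# (N2 > N1), using illustrative throughput numbers.
N1, N2 = 100, 200          # client devices in the two first-load tests
X1, X2 = 48.0, 95.0        # measured throughput, pages/sec

# With no bottlenecked resource, throughput should grow roughly in
# proportion to the number of client devices.
expected_X2 = X1 * (N2 / N1)
deviation = abs(X2 - expected_X2) / expected_X2
scalable = deviation <= 0.05   # tolerate 5% deviation from linear scaling
```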
[0037] In one embodiment, the second test script module 214 is configured to enable a
second load test on the application to obtain a second set of load test results in the computerized target environment. In one aspect, the computerized target environment may be a production environment where the application may be deployed. In one aspect, the second load test is performed by executing the second test script module 214 wherein the second test script module 214 comprises a virtual load occupied by a single client device.
[0038] The second set of load test results is stored in the system database 220. The
second set of load test results may comprise, but are not limited to, a service demand of a central processing unit (CPU), a service demand of a disk, a service demand of a network, and a percentage of memory utilized, which indicate the performance of the application accessible by the single client device in the computerized target environment. In an exemplary embodiment of the invention, the service demand of the central processing unit (CPU) is determined by executing the second test script module 214 to perform the second load test on the application in the computerized target environment. After performing the second load test, a resource utilization of CPU is gathered by using an external system monitoring tool. In one aspect of the invention, the system 102 further enables the second test script module 214 to perform the second load test on the application reiteratively. In an embodiment, no other application is executed on the computerized target environment during the performance of the second load test of the application. In one example, if a sample application script for a telecom application

is run for a given number of iterations by the second test script module 214, then the CPU, disk, and network busy times in seconds may be obtained. In one aspect, the second test script module 214 further offsets the utilization of hardware resources, such as the central processing unit (CPU), the disk, the network, and the memory, for the no-load or default-load situation, i.e., the utilization measured before and after the second load test performed on the application in the computerized target environment, in order to obtain the second set of load test results.
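The no-load offset step described above can be sketched as a simple baseline subtraction. This is an assumption-laden illustration; the function name and sample values are hypothetical, not from the specification.

```python
# Hedged sketch of the baseline ("no-load") offset: idle resource usage
# sampled before/after the test is subtracted so that only utilization
# attributable to the application under test remains.

def offset_baseline(measured_pct, baseline_pct):
    """Subtract no-load utilization from measured utilization (both in percent)."""
    return max(measured_pct - baseline_pct, 0.0)  # clamp at zero for noisy samples

print(offset_baseline(27.5, 2.5))  # CPU busy attributable to the test, in percent
```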
[0039] After performing the second load test on the application using the second test
script module 214, the throughput of the application is gathered from the computerized target environment. Based on the throughput, a service demand of the CPU is computed based on the utilization law. For example, if the utilization of the CPU for the single user test is C% and the throughput is X pages/sec, then the service demand of the CPU on the computerized target environment is obtained by using the formula below:

SD = C / (100 × X)

where 'SD' is the service demand of the CPU, 'C' is the utilization of the CPU in percentage, and 'X' is the throughput. Based on the same principle as aforementioned, the service demand of the disk, the service demand of the network, and the percentage of memory utilized are obtained respectively.
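The service demand computation can be expressed directly in code. The function name is illustrative; the formula SD = C / (100 × X) follows from the utilization law, Utilization = Throughput × Service demand.

```python
# Sketch of the utilization-law computation: service demand is the busy
# fraction of the resource divided by the throughput that produced it.

def service_demand(utilization_pct, throughput):
    """Service demand in seconds per page.

    utilization_pct -- resource utilization C in percent
    throughput      -- throughput X in pages/sec
    """
    return (utilization_pct / 100.0) / throughput

# Example: 25% CPU busy at 300 pages/sec
sd_cpu = service_demand(25.0, 300.0)
print(round(sd_cpu * 1000, 3), "ms per page")  # ~0.833 ms
```

The same function applies unchanged to disk and network utilization gathered by the monitoring tool.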
[0040] After performing the first load test and the second load test on the application
at the computerized test environment and the computerized target environment respectively, the system 102 further configures the extrapolation module 216 to extrapolate the performance of the application accessible by a second set of client devices in the computerized target environment. In order to extrapolate the performance of the application, the extrapolation module 216 is further configured to extract the first set of load test results and the second set of load test results from the system database 220. In one aspect, the second set of client devices represents a threshold value of the maximum number of client devices that may access the application while satisfying the throughput of the computerized target environment. Further, the threshold value of the maximum number of client devices is determined by performing the statistical analysis on the first set of load test results and the second set of load test results extracted from the system database 220, using a combination of existing techniques such as the linear regression technique or the S-curve technique. The extrapolation module 216 may extrapolate the performance of the application using extrapolation techniques disclosed in the Indian Patent Application No. 1405/MUM/2011.
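The specification defers the exact extrapolation technique to Indian Patent Application No. 1405/MUM/2011, so the following is only one simple reading, not the patented method: a straight-line fit through the two measured (load, throughput) points, capped by the asymptotic bound 1 / max service demand from classical bottleneck analysis. All names and the capping rule are assumptions for illustration.

```python
# Illustrative extrapolation sketch: linear regression through two measured
# points, limited by the bottleneck resource's throughput ceiling.

def extrapolate_throughput(n1, x1, n2, x2, max_service_demand_sec, load):
    """Predict throughput (pages/sec) at a higher load level."""
    slope = (x2 - x1) / (n2 - n1)         # regression slope from the two points
    linear = x1 + slope * (load - n1)     # unconstrained linear estimate
    bound = 1.0 / max_service_demand_sec  # bottleneck-device throughput limit
    return min(linear, bound)

# Using the worked-example figures from the description:
# N1=500 users at 101 pages/sec, N2=700 users at 142 pages/sec,
# disk service demand 0.994 ms (the largest service demand).
print(extrapolate_throughput(500, 101.0, 700, 142.0, 0.000994, 4000))  # ~818.5
```

Past the point where the linear estimate crosses the bound, predicted throughput flattens, which is consistent with the description's observation that throughput stops growing once the threshold load is exceeded.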
[0041] Figure 3 illustrates an exemplary plot indicating load analysis results obtained
for a test application using the load analysis technique in accordance with an embodiment of the present subject matter. The test application is an e-commerce J2EE application. The application is an on-line application where users can browse and search for various types of pets in five top-level categories. It displays details including prices, inventory, and images for all items within each category. Further, with authenticated login, it provides a full shopping cart facility that includes a credit card option for billing and shipping. The throughput of the test application is extrapolated when it is run on a low-range server for load levels N1 = 500 users and N2 = 700 users. The low-range server may have features comprising an Intel Core Duo CPU with 2.33 GHz processor speed, 4 MB cache, and 2 GB RAM. For this application, the disk is the hardware resource with the maximum service demand on the computerized test environment. The throughput values obtained by performing the load test using the above-mentioned load levels are 101 pages/sec and 142 pages/sec as the first set of load test results. After obtaining the throughput, the single user test is performed on a mid-range server, and the CPU service demand is 0.833 ms and the disk service demand is 0.994 ms as the second set of load test results. The mid-range server may have features comprising a Quad-Core AMD Opteron processor 275 with 2.19 GHz processor speed, 2 MB L2 cache, and 4 GB RAM. After obtaining the first set of load test results and the second set of load test results, the extrapolation is performed on the application for a large number of users using the obtained throughput values and service demand information. The application scales up to 4000 users, after which the throughput starts to drop.
As can be seen, the extrapolation results are close to the actual load test results, and it may be concluded that, with actual load testing results from a small number of client devices, the present load analysis technique can be used to extend the analysis to a large number of user devices with an accuracy of not less than 90%.
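The scaling limit in the worked example can be cross-checked arithmetically with classical operational bounds, an illustration rather than the patented technique: the resource with the largest service demand caps system throughput at 1 / D_max.

```python
# Cross-check of the worked example: with the single-user service demands
# measured on the mid-range server, the disk (largest demand) sets the ceiling.

d_cpu = 0.000833   # CPU service demand from the single-user test, seconds
d_disk = 0.000994  # disk service demand, seconds

d_max = max(d_cpu, d_disk)
ceiling = 1.0 / d_max
print(f"throughput ceiling ~= {ceiling:.0f} pages/sec")  # ~1006
```

A ceiling of roughly 1006 pages/sec is consistent with the disk being the bottleneck resource and with throughput eventually flattening and dropping at high user counts.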

ADVANTAGES OF THE INVENTION
[0042] The present invention enables a system and method to extrapolate the
performance of the application to be deployed at the computerized target environment without actually deploying the application at the computerized target environment.
[0043] The present invention further predicts the throughput of the application at
higher load and further determines the factors affecting the throughput.
[0044] The present invention further identifies the maximum number of client devices
that may access the application while satisfying the throughput, and further identifies bottlenecks in the performance of the application when the maximum number of client devices is exceeded.
[0045] The present invention further predicts the performance of the application by
determining the scalability of the application and how the application scales from a small number of users to a large number of users.
[0046] The present invention further assists in determining the consumption of various
resources when a specific number of client devices access the application deployed at the computerized target environment.
[0047] Referring now to Figure 4, a method 400 for extrapolating performance of an
application in a computerized target environment is shown, in accordance with an embodiment of the present subject matter. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 400 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0048] The order in which the method 400 is described is not intended to be construed
as a limitation, and any number of the described method blocks can be combined in any order to implement the method 400 or alternate methods. Additionally, individual blocks may be

deleted from the method 400 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 400 may be considered to be implemented in the above described system 102.
[0049] At block 402, a first load test on the application may be performed to obtain a
first set of load test results. The first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment. In one implementation, the first load test may be performed by the first test script module 212.
[0050] At block 404, a second load test on the application is performed in order to
obtain a second set of load test results. The second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment. In one implementation, the second load test may be performed by the second test script module 214.
[0051] At block 406, the performance of the application accessible by a second set of
client devices in the computerized target environment is extrapolated by performing statistical analysis on the first set of load test results and the second set of load test results.
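Blocks 402-406 can be sketched end to end as follows. All functions and data here are hypothetical stand-ins for the modules described above, and the extrapolation step uses the simple linear-fit-plus-bound reading discussed earlier, not the deferred patented technique.

```python
# Minimal end-to-end sketch of method 400 (blocks 402, 404, 406).

def first_load_test():
    # block 402: multi-device load test on the computerized test environment
    return {"loads": (500, 700), "throughputs": (101.0, 142.0)}

def second_load_test():
    # block 404: single-device test on the computerized target environment,
    # yielding per-resource service demands in seconds
    return {"sd_cpu": 0.000833, "sd_disk": 0.000994}

def extrapolate(first, second, load):
    # block 406: linear fit through the two measured points, capped by the
    # bottleneck bound 1 / max service demand
    (n1, n2), (x1, x2) = first["loads"], first["throughputs"]
    slope = (x2 - x1) / (n2 - n1)
    bound = 1.0 / max(second.values())
    return min(x1 + slope * (load - n1), bound)

print(extrapolate(first_load_test(), second_load_test(), 4000))  # ~818.5
```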
[0052] Although implementations for methods and systems for extrapolating
performance of an application in a computerized target environment have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for extrapolating performance of an application.

WE CLAIM:
1. A method for extrapolating performance of an application in a computerized target environment, the method comprising:
performing, by a processor, a first load test on the application using a first test script module to obtain a first set of load test results, wherein the first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment;
enabling, by the processor, a second load test on the application using a second test script module to obtain a second set of load test results, wherein the second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment; and
extrapolating, by the processor, the performance of the application accessible by a second set of client devices in the computerized target environment based on statistical analysis performed on the first set of load test results and the second set of load test results.
2. The method of claim 1, wherein the computerized test environment and the computerized target environment may comprise a central processing unit (CPU), a disk, a network, and a memory.
3. The method of claim 1, wherein the first set of load test results comprises throughput, response time, and resources such as CPU, disk, network and memory used.
4. The method of claim 1, wherein the computerized target environment is a production environment where the application is deployed.
5. The method of claim 1, wherein the second set of load test results comprises a service demand of a central processing unit (CPU), a service demand of a disk, a service demand of a network, and a percentage of memory used.

6. The method of claim 1, wherein the statistical analysis is performed using at least one of a linear regression technique and an S-curve technique.
7. The method of claim 1, wherein the first set of client devices is in the range of about 100-200.
8. The method of claim 1, wherein the second set of client devices represents a threshold value of the maximum number of client devices that can access the application while satisfying a throughput requirement of the computerized target environment determined through the statistical analysis.
9. The method of claim 8, wherein exceeding the threshold value may result in a bottleneck in the performance of the application in the computerized target environment.
10. A system (102) for extrapolating performance of an application in a computerized target environment, the system comprising:
a processor (202); and
a memory (206) coupled to the processor (202), wherein the processor (202) is capable of executing a plurality of modules (208) stored in the memory (206), and wherein the plurality of modules (208) comprises:
a first test script module (212) configured to perform a first load test on the application to obtain a first set of load test results, wherein the first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment;
a second test script module (214) configured to enable a second load test on the application to obtain a second set of load test results, wherein the second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment; and
an extrapolation module (216) configured to extrapolate the performance of the application accessible by a second set of client devices in the computerized target

environment based on statistical analysis performed on the first set of load test results and the second set of load test results.
11. The system of claim 10, wherein the computerized test environment and the computerized target environment comprise a central processing unit (CPU), a disk, a network, and a memory.
12. The system of claim 10, wherein the extrapolation module (216) is configured to perform the statistical analysis using at least one of a linear regression technique and an S-curve technique.
13. A computer program product having embodied thereon a computer program for extrapolating performance of an application in a computerized target environment, the computer program product comprising:
a program code for performing a first load test on the application using a first test-script to obtain a first set of load test results, wherein the first set of load test results is indicative of performance of the application accessible by a first set of client devices in a computerized test environment;
a program code for enabling a second load test on the application using a second test-script to obtain a second set of load test results, wherein the second set of load test results is indicative of performance of the application accessible by a single client device in the computerized target environment; and
a program code for extrapolating the performance of the application accessible by a second set of client devices in the computerized target environment based on statistical analysis performed on the first set of load test results and the second set of load test results.

Documents

Orders

Section Controller Decision Date
15 grant Subhra banerjee 2023-03-14

Application Documents

# Name Date
1 2000-MUM-2013-IntimationOfGrant14-03-2023.pdf 2023-03-14
2 ABSTRACT.jpg 2018-08-11
3 2000-MUM-2013-FORM 3.pdf 2018-08-11
4 2000-MUM-2013-PatentCertificate14-03-2023.pdf 2023-03-14
5 2000-MUM-2013-Written submissions and relevant documents [11-01-2023(online)].pdf 2023-01-11
6 2000-MUM-2013-FORM 26(6-9-2013).pdf 2018-08-11
7 2000-MUM-2013-FORM-26 [29-12-2022(online)]-1.pdf 2022-12-29
8 2000-MUM-2013-FORM 2.pdf 2018-08-11
9 2000-MUM-2013-FORM-26 [29-12-2022(online)].pdf 2022-12-29
10 2000-MUM-2013-FORM 2(TITLE PAGE).pdf 2018-08-11
11 2000-MUM-2013-FORM 18.pdf 2018-08-11
12 2000-MUM-2013-Correspondence to notify the Controller [28-12-2022(online)].pdf 2022-12-28
13 2000-MUM-2013-US(14)-HearingNotice-(HearingDate-02-01-2023).pdf 2022-12-01
14 2000-MUM-2013-FORM 1.pdf 2018-08-11
15 2000-MUM-2013-FORM 1(3-7-2013).pdf 2018-08-11
16 2000-MUM-2013-CLAIMS [18-03-2020(online)].pdf 2020-03-18
17 2000-MUM-2013-COMPLETE SPECIFICATION [18-03-2020(online)].pdf 2020-03-18
18 2000-MUM-2013-DRAWING.pdf 2018-08-11
19 2000-MUM-2013-DESCRIPTION(COMPLETE).pdf 2018-08-11
20 2000-MUM-2013-FER_SER_REPLY [18-03-2020(online)].pdf 2020-03-18
21 2000-MUM-2013-CORRESPONDENCE.pdf 2018-08-11
22 2000-MUM-2013-OTHERS [18-03-2020(online)].pdf 2020-03-18
23 2000-MUM-2013-CORRESPONDENCE(6-9-2013).pdf 2018-08-11
24 2000-MUM-2013-FER.pdf 2019-09-18
25 2000-MUM-2013-ABSTRACT.pdf 2018-08-11
26 2000-MUM-2013-CORRESPONDENCE(3-7-2013).pdf 2018-08-11
27 2000-MUM-2013-CLAIMS.pdf 2018-08-11

Search Strategy

1 2019-08-1415-15-39_14-08-2019.pdf