
System And Method For Steady State Performance Testing Of A Multiple Output Software System

Abstract: This disclosure relates generally to software performance testing, and more particularly to a system and method for steady state performance testing of a multiple output software system. According to one exemplary embodiment, a processor-implemented performance test steady-state determination method is described. The method may include executing, via one or more hardware processors, a performance test of a web-based application; calculating, via the one or more hardware processors, a plurality of output metrics based on the performance test; determining, via the one or more hardware processors, whether each of the output metrics has achieved steady state within micro, macro, and global initial time windows; and providing an overall steadiness indication based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows. (Fig. 2)


Patent Information

Application #:
Filing Date: 10 March 2015
Publication Number: 12/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-02-20
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. SOURAV SAM BHATTACHARYA
13418 North Clifftop Drive, Fountain Hills, Arizona 85268, United States of America

Specification

CLAIMS

WE CLAIM:
1. A processor-implemented performance test steady-state determination method, comprising:
executing, via one or more hardware processors, a performance test of a web-based application;
calculating, via the one or more hardware processors, a plurality of output metrics based on the performance test;
determining, via the one or more hardware processors, whether each of the output metrics has achieved steady state within micro, macro, and global initial time windows; and
providing an overall steadiness indication based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows.
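Read as a whole, claim 1 describes a per-metric, per-window steadiness check aggregated into one overall indication. A minimal sketch of that flow, assuming the test harness has already sliced each metric's samples into the three windows (all names here are illustrative, not taken from the filing):

```python
def steadiness_indication(metric_windows, window_is_steady):
    """Aggregate per-window steadiness into an overall indication.

    metric_windows:   {metric_name: {"micro": [...], "macro": [...], "global": [...]}}
    window_is_steady: predicate deciding steadiness of one window's samples
                      (e.g. the max/min deviation test of claim 8).
    Returns (overall_steady, per_metric_steady).
    """
    per_metric = {
        name: all(window_is_steady(samples) for samples in windows.values())
        for name, windows in metric_windows.items()
    }
    # The overall indication holds only if every output metric is steady
    # in its micro, macro, and global windows.
    return all(per_metric.values()), per_metric
```

A caller might pass a deviation-based predicate such as `lambda s: max(s) - min(s) < 0.05 * (sum(s) / len(s))` as `window_is_steady`.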

2. The method of claim 1, further comprising:
generating, via the one or more hardware processors, an overall steadiness score based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows; and
determining whether to provide the overall steadiness indication based on the overall steadiness score.

3. The method of claim 2, wherein the overall steadiness score is generated as:
OS = Σ_{i=1..N} W_i × (A_i + B_i + C_i),
wherein OS is the overall steadiness score, N is a number of output metrics, W_i is a priority weight corresponding to an i-th metric, and A_i, B_i, and C_i are values indicating steadiness of the i-th metric within the micro, macro, and global time windows respectively.

4. The method of claim 3, wherein A_i, B_i, and C_i are binary values.
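One hedged reading of the score in claims 2–4 is a weighted sum of the three binary window flags; the combining rule and all names below are illustrative assumptions, since the filed equation did not survive extraction:

```python
def overall_steadiness_score(weights, window_flags):
    """Weighted overall steadiness score.

    weights:      priority weight per output metric (W_i).
    window_flags: (micro, macro, global) binary steadiness flags per metric.
    """
    # Assumed combining rule: each metric contributes its priority weight
    # times the number of windows in which it reached steady state.
    return sum(w * (micro + macro + glob)
               for w, (micro, macro, glob) in zip(weights, window_flags))
```

Claim 2's gating step would then compare this score against a chosen cut-off before emitting the overall steadiness indication.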
5. The method of claim 1, wherein the micro, macro, and global initial time windows are defined based on user input either statically, or dynamically, or both, within a continuum of tests.

6. The method of claim 5, wherein the micro initial time window is set by a user to about 1% of an overall time span for the performance test, the macro initial time window is set by the user to about 10% of the overall time span for the performance test, and the global initial time window is set by the user to about 100% of the overall time span for the performance test.
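Claims 5–7 size the three windows from the overall test span (itself derived from the input traffic profile). A minimal sketch under the fractions named in claim 6, with illustrative function and parameter names:

```python
def initial_time_windows(total_span_s, micro_frac=0.01, macro_frac=0.10):
    """Derive the micro/macro/global initial time windows (in seconds)
    from the overall time span of the performance test."""
    return {
        "micro": micro_frac * total_span_s,   # about 1% of the run
        "macro": macro_frac * total_span_s,   # about 10% of the run
        "global": total_span_s,               # the full run
    }
```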

7. The method of claim 6, wherein the overall time span for the performance test is determined based on an input traffic profile for the performance test.

8. The method of claim 1, wherein determining whether one of the output metrics has achieved steady state within one of the time windows comprises:
calculating an average value for that metric within that time window;
determining whether a maximum value for that metric within that time window deviates by less than a predetermined threshold value from the average value; and
determining whether a minimum value for that metric within that time window deviates by less than a predetermined threshold value from the average value.
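Claim 8's deviation test can be sketched as below; a single symmetric relative threshold is an assumption (the claim allows distinct predetermined thresholds for the maximum and minimum sides), and `rel_threshold` is an illustrative name:

```python
def window_is_steady(samples, rel_threshold=0.05):
    """True if both the max and min samples in the window deviate
    from the window average by less than the relative threshold."""
    avg = sum(samples) / len(samples)
    bound = rel_threshold * abs(avg)
    return (max(samples) - avg < bound) and (avg - min(samples) < bound)
```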

9. A performance test steady-state determination system, comprising:
a hardware processor; and
a memory storing instructions executable by the hardware processor for:
executing, via one or more hardware processors, a performance test of a web-based application;
calculating, via the one or more hardware processors, a plurality of output metrics based on the performance test;
determining, via the one or more hardware processors, whether each of the output metrics has achieved steady state within micro, macro, and global initial time windows; and
providing an overall steadiness indication based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows.

10. The system of claim 9, the memory further storing instructions for:
generating, via the one or more hardware processors, an overall steadiness score based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows; and
determining whether to provide the overall steadiness indication based on the overall steadiness score.

11. The system of claim 10, wherein the overall steadiness score is generated as:
OS = Σ_{i=1..N} W_i × (A_i + B_i + C_i),
wherein OS is the overall steadiness score, N is a number of output metrics, W_i is a priority weight corresponding to an i-th metric, and A_i, B_i, and C_i are values indicating steadiness of the i-th metric within the micro, macro, and global time windows respectively.

12. The system of claim 11, wherein A_i, B_i, and C_i are binary values.
13. The system of claim 9, wherein the micro, macro, and global initial time windows are defined based on user input either statically, or dynamically, or both, within a continuum of tests.

14. The system of claim 13, wherein the micro initial time window is set by a user to about 1% of an overall time span for the performance test, the macro initial time window is set by the user to about 10% of the overall time span for the performance test, and the global initial time window is set by the user to about 100% of the overall time span for the performance test.

15. The system of claim 14, wherein the overall time span for the performance test is determined based on an input traffic profile for the performance test.

16. The system of claim 9, wherein determining whether one of the output metrics has achieved steady state within one of the time windows comprises:
calculating an average value for that metric within that time window;
determining whether a maximum value for that metric within that time window deviates by less than a predetermined threshold value from the average value; and
determining whether a minimum value for that metric within that time window deviates by less than a predetermined threshold value from the average value.

17. A non-transitory computer-readable medium storing performance test steady-state determination instructions for:
executing, via one or more hardware processors, a performance test of a web-based application;
calculating, via the one or more hardware processors, a plurality of output metrics based on the performance test;
determining, via the one or more hardware processors, whether each of the output metrics has achieved steady state within micro, macro, and global initial time windows; and
providing an overall steadiness indication based on the determination of whether each of the output metrics has achieved steady state within the micro, macro, and global time windows.

Dated this 10th day of March, 2015
Shwetha A Chimalgi
Of K&S Partners
Agent for the Applicant
Technical Field
This disclosure relates generally to software performance testing, and more particularly to a system and method for steady state performance testing of a multiple output software system.

Documents

Application Documents

# Name Date
1 1160-CHE-2015 FORM-9 10-03-2015.pdf 2015-03-10
2 1160-CHE-2015 FORM-18 10-03-2015.pdf 2015-03-10
3 IP30383-spec.pdf 2015-03-13
4 IP30383-fig.pdf 2015-03-13
5 FORM 5-IP30383 - Conventional.pdf 2015-03-13
6 FORM 3-IP30383 - Conventional.pdf 2015-03-13
7 abstract 1160-CHE-2015.jpg 2015-03-16
8 1160-CHE-2015 POWER OF ATTORNEY 05-06-2015.pdf 2015-06-05
9 1160-CHE-2015 FORM-1 05-06-2015.pdf 2015-06-05
10 1160-CHE-2015 CORRESPONDENCE OTHERS 05-06-2015.pdf 2015-06-05
11 1160-CHE-2015-FER.pdf 2019-12-31
12 1160-CHE-2015-Information under section 8(2) [19-06-2020(online)].pdf 2020-06-19
13 1160-CHE-2015-FORM 3 [19-06-2020(online)].pdf 2020-06-19
14 1160-CHE-2015-FER_SER_REPLY [19-06-2020(online)].pdf 2020-06-19
15 1160-CHE-2015-US(14)-HearingNotice-(HearingDate-21-09-2022).pdf 2022-08-24
16 1160-CHE-2015-POA [29-08-2022(online)].pdf 2022-08-29
17 1160-CHE-2015-FORM 13 [29-08-2022(online)].pdf 2022-08-29
18 1160-CHE-2015-Correspondence to notify the Controller [29-08-2022(online)].pdf 2022-08-29
19 1160-CHE-2015-AMENDED DOCUMENTS [29-08-2022(online)].pdf 2022-08-29
20 1160-CHE-2015-Written submissions and relevant documents [06-10-2022(online)].pdf 2022-10-06
21 1160-CHE-2015-PETITION UNDER RULE 137 [06-10-2022(online)].pdf 2022-10-06
22 1160-CHE-2015-Certified Copy of Priority Document [21-11-2022(online)].pdf 2022-11-21
23 1160-CHE-2015-PatentCertificate20-02-2023.pdf 2023-02-20
24 1160-CHE-2015-IntimationOfGrant20-02-2023.pdf 2023-02-20

Search Strategy

1 search29(2)_10-12-2019.pdf
2 search29(1)_10-12-2019.pdf
3 search09AE_07-01-2021.pdf

ERegister / Renewals

3rd: 20 May 2023

From 10/03/2017 - To 10/03/2018

4th: 20 May 2023

From 10/03/2018 - To 10/03/2019

5th: 20 May 2023

From 10/03/2019 - To 10/03/2020

6th: 20 May 2023

From 10/03/2020 - To 10/03/2021

7th: 20 May 2023

From 10/03/2021 - To 10/03/2022

8th: 20 May 2023

From 10/03/2022 - To 10/03/2023

9th: 20 May 2023

From 10/03/2023 - To 10/03/2024

10th: 07 Mar 2024

From 10/03/2024 - To 10/03/2025

11th: 07 Mar 2025

From 10/03/2025 - To 10/03/2026