
Method And Device For Improving Software Performance Testing

Abstract: Embodiments of the present disclosure disclose a method and a device for improving software performance testing. The method comprises receiving input data from one or more test management systems. The method further comprises identifying at least one behavior model based on the input data. The method further comprises correlating the at least one behavior model with one or more affected parameters to determine one or more performance issues in the input data. The method further comprises verifying the one or more performance issues by reassessing the at least one behavior model. FIGURE 5


Patent Information

Application #:
Filing Date: 10 February 2015
Publication Number: 09/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ipo@knspartners.com
Parent Application:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. SURYA VORGANTI
Plot No: 1/562, Krishna Nagar, 4th Main Road, Perumbakkam, Chennai, Tamil Nadu.

Specification

CLAIMS:
We claim:
1. A method for improving software performance testing, comprising:
receiving, by a performance test computing device, input data from one or more test management systems;
identifying, by the performance test computing device, at least one behavior model based on the input data;
correlating, by the performance test computing device, the at least one behavior model with one or more affected parameters to determine one or more performance issues in the input data; and
verifying, by the performance test computing device, the one or more performance issues by reassessing the at least one behavior model.
2. The method as claimed in claim 1, wherein the input data comprises a plurality of log files selected from at least one of operating system logs, web server logs, application server logs, or database server logs.
3. The method as claimed in claim 1, wherein identifying the at least one behavior model comprises:
obtaining one or more headers from a plurality of log files;
comparing the plurality of log files to obtain one or more system violations; and
classifying the one or more headers into the at least one behavior model based on predefined patterns, wherein the at least one behavior model is one of a memory model, a process model, a hardware model, and a virtual model.
4. The method as claimed in claim 1, wherein correlating at least one behavior model comprises:
generating one or more patterns for the at least one behavior model; and
correlating the one or more patterns with the one or more affected parameters to identify the one or more performance issues, wherein the one or more affected parameters are one of memory leaks, code instability, and page response time.
5. The method as claimed in claim 4 further comprises determining at least one of system behavior for the performance testing and predefined resources available at the time of the performance test to identify the one or more performance issues.
6. The method as claimed in claim 1, wherein verifying the one or more performance issues by reassessing the at least one behavior model comprises:
generating one or more headers based on a modular technique; and
comparing the one or more headers using one or more predefined patterns to reassess the at least one behavior model.
7. The method as claimed in claim 1 further comprises generating one or more reports comprising the one or more performance issues identified by the performance test computing device.
8. A performance test computing device for improving software performance testing, comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions which, on execution, cause the processor to:
receive input data from one or more test management systems;
identify at least one behavior model based on the input data;
correlate the at least one behavior model with one or more affected parameters to determine one or more performance issues in the input data; and
verify the one or more performance issues by reassessing the at least one behavior model.
9. The performance test computing device as claimed in claim 8, wherein the input data comprises a plurality of log files selected from at least one of operating system logs, web server logs, application server logs, or database server logs.
10. The performance test computing device as claimed in claim 8, wherein identifying the at least one behavior model comprises:
obtaining one or more headers from a plurality of log files;
comparing the plurality of log files to obtain one or more system violations; and
classifying the one or more headers into the at least one behavior model based on predefined patterns, wherein the at least one behavior model is one of a memory model, a process model, a hardware model, and a virtual model.
11. The performance test computing device as claimed in claim 8, wherein correlating at least one behavior model comprises:
generating one or more patterns for the at least one behavior model; and
correlating the one or more patterns with the one or more affected parameters to identify the one or more performance issues, wherein the one or more affected parameters are one of memory leaks, code instability, and page response time.
12. The performance test computing device as claimed in claim 11 further comprises determining at least one of system behavior for the performance testing and predefined resources available at the time of the performance test to identify the one or more performance issues.
13. The performance test computing device as claimed in claim 8, wherein verifying the one or more performance issues by reassessing the at least one behavior model comprises:
generating one or more headers based on a modular technique; and
comparing the one or more headers using one or more predefined patterns to reassess the at least one behavior model.
14. The performance test computing device as claimed in claim 8 further comprises generating one or more reports comprising the one or more performance issues identified by the performance test computing device.
15. A non-transitory computer readable medium including instructions stored thereon that when processed by at least one processor cause a system to perform operations comprising:
receiving input data from one or more test management systems;
identifying at least one behavior model based on the input data;
correlating the at least one behavior model with one or more affected parameters to determine one or more performance issues in the input data; and
verifying the one or more performance issues by reassessing the at least one behavior model.
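Read together, claims 1, 3, 4, and 6 describe a pipeline: classify log-file headers into behavior models using predefined patterns, correlate each model with affected parameters to flag performance issues, then verify the issues by reassessing the models. The specification excerpt here does not publish the actual patterns, mappings, or module names, so everything below (the regexes, the model-to-parameter table, and all function names) is a hypothetical illustration of that claimed flow, not the patented implementation:

```python
import re

# Hypothetical predefined patterns mapping log-header keywords to the four
# behavior models the claims enumerate (memory, process, hardware, virtual).
# The patent does not disclose its actual patterns; these are stand-ins.
MODEL_PATTERNS = {
    "memory":   re.compile(r"heap|gc|oom|alloc", re.I),
    "process":  re.compile(r"thread|fork|cpu|sched", re.I),
    "hardware": re.compile(r"disk|nic|iowait|temp", re.I),
    "virtual":  re.compile(r"vm|hypervisor|swap|page", re.I),
}

# Hypothetical mapping from behavior model to the affected parameters the
# claims list: memory leaks, code instability, page response time.
AFFECTED_PARAMETERS = {
    "memory":  ["memory leak"],
    "process": ["code instability"],
    "virtual": ["page response time"],
}

def classify_headers(headers):
    """Claim 3: classify log headers into behavior models via predefined patterns."""
    models = {}
    for header in headers:
        for model, pattern in MODEL_PATTERNS.items():
            if pattern.search(header):
                models.setdefault(model, []).append(header)
    return models

def correlate(models):
    """Claim 4: correlate each model's headers with affected parameters to flag issues."""
    issues = []
    for model, headers in models.items():
        for parameter in AFFECTED_PARAMETERS.get(model, []):
            issues.append((parameter, headers))
    return issues

def verify(issues, headers):
    """Claim 6: reassess the behavior models and keep only issues that reappear."""
    reassessed = correlate(classify_headers(headers))
    return [issue for issue in issues if issue in reassessed]

headers = ["GC pause exceeded threshold",
           "thread pool exhausted",
           "swap page fault spike"]
issues = correlate(classify_headers(headers))
confirmed = verify(issues, headers)
```

In this sketch the verification step simply re-runs the classification and correlation over the same headers and intersects the results, which is one plausible reading of "reassessing the at least one behavior model"; the claims leave the concrete reassessment technique open.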
Dated this 10th day of February, 2015

SRAVAN KUMAR GAMPA
OF K & S PARTNERS
AGENT FOR THE APPLICANT
TECHNICAL FIELD

The present subject matter is related, in general, to performance testing and, more particularly but not exclusively, to a method and a device for improving software performance testing.

Documents

Application Documents

# Name Date
1 656-CHE-2015 FORM-9 10-02-2015.pdf 2015-02-10
2 656-CHE-2015 FORM-18 10-02-2015.pdf 2015-02-10
3 656-CHE-2015-Request For Certified Copy-Online(12-02-2015).pdf 2015-02-12
4 656CHE2015_CertifiedCopyRequest.pdf ONLINE 2015-02-13
5 abstract 656-CHE-2015.jpg 2015-02-16
6 IP29839-spec.pdf 2015-03-12
7 IP29839-fig.pdf 2015-03-12
8 FORM 5-IP29839.pdf 2015-03-12
9 FORM 3-IP29839.pdf 2015-03-12
10 656CHE2015_CertifiedCopyRequest.pdf 2015-03-13
11 656-CHE-2015 POWER OF ATTORNEY 24-07-2015.pdf 2015-07-24
12 656-CHE-2015 FORM-1 24-07-2015.pdf 2015-07-24
13 656-CHE-2015 CORRES PONDENCE OTHERS 24-07-2015.pdf 2015-07-24
14 656-CHE-2015-FER.pdf 2019-09-27
15 656-CHE-2015-FER_SER_REPLY [27-03-2020(online)].pdf 2020-03-27
16 656-CHE-2015-US(14)-HearingNotice-(HearingDate-30-01-2024).pdf 2024-01-10
17 656-CHE-2015-POA [15-01-2024(online)].pdf 2024-01-15
18 656-CHE-2015-FORM 13 [15-01-2024(online)].pdf 2024-01-15
19 656-CHE-2015-Correspondence to notify the Controller [15-01-2024(online)].pdf 2024-01-15
20 656-CHE-2015-AMENDED DOCUMENTS [15-01-2024(online)].pdf 2024-01-15
21 656-CHE-2015-Written submissions and relevant documents [14-02-2024(online)].pdf 2024-02-14

Search Strategy

1 search_19-09-2019.pdf
2 amended_searchAE_19-06-2020.pdf