Abstract: Systems and methods for testing software applications based on business process models are described herein. In one example, the method comprises receiving, by a processor, at least one business process model, wherein the at least one business process model is indicative of a business process associated with the software application, and analyzing, by the processor, the at least one business process model to identify at least one test scenario. The method further comprises generating, by the processor, a set of test cases and test data for the at least one test scenario, and producing, by the processor, a set of test automation scripts based on one or more keywords associated with the at least one test scenario. Fig. 1
CLAIMS:
We claim:
1. A software application testing (SAT) system for testing a software application, based on at least one business process model, the SAT system comprising:
a processor;
a memory coupled to the processor;
a data input module, executable by the processor, to receive the at least one business process model from a user, wherein the at least one business process model is indicative of at least one business process associated with the software application;
a test scenario identification module, executable by the processor, to analyze the at least one business process model to identify at least one test scenario;
a test case generation module, executable by the processor, to generate a set of test cases and test data for the at least one test scenario; and
a test script generation module, executable by the processor, to produce a set of test automation scripts based on one or more keywords associated with the at least one test scenario.
2. The SAT system as claimed in claim 1, wherein the test scenario identification module further:
parses the at least one business process model;
identifies at least one distinct standalone business process from the parsed at least one business process model;
ascertains one or more start points of the at least one distinct standalone business process;
determines one or more path flows of the at least one distinct standalone business process from the ascertained one or more start points; and
identifies the at least one test scenario based on the one or more path flows.
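The claim language above is implementation-agnostic. Purely for illustration, the steps recited in claim 2 can be sketched in Python, assuming the parsed business process model is a directed acyclic graph held as an adjacency dict; the activity names and graph shape below are invented for the example and are not part of the claims:

```python
# Illustrative sketch of claim 2: ascertain start points of a parsed business
# process model and determine every path flow (each end-to-end path is treated
# as a candidate test scenario). Model shape: {activity: [next activities]}.

def start_points(model):
    """Activities that no other activity flows into."""
    targets = {t for nexts in model.values() for t in nexts}
    return [a for a in model if a not in targets]

def path_flows(model, start):
    """Depth-first enumeration of every path from a start point to an end activity."""
    stack = [[start]]
    paths = []
    while stack:
        path = stack.pop()
        nexts = model.get(path[-1], [])
        if not nexts:                      # end activity reached: path is a scenario
            paths.append(path)
        for nxt in nexts:
            stack.append(path + [nxt])
    return paths

# Hypothetical parsed model of a standalone "checkout" business process.
model = {
    "Login": ["AddToCart"],
    "AddToCart": ["Pay", "SaveForLater"],
    "Pay": [],
    "SaveForLater": [],
}
scenarios = [p for s in start_points(model) for p in path_flows(model, s)]
```

Each entry in `scenarios` corresponds to one path flow through the process, which the claimed system would treat as a distinct test scenario.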
3. The SAT system as claimed in claim 2, wherein the test scenario identification module further:
identifies a plurality of levels of testing from the parsed at least one business process model, wherein the plurality of levels comprises at least one of functional tests, system tests, and system integration tests; and
links the at least one distinct standalone business process common to the plurality of levels of testing to identify the at least one test scenario.
4. The SAT system as claimed in claim 1, wherein the test case generation module further:
receives the test data from a user;
optimizes the received test data at an operational level based on the at least one test scenario; and
generates the set of test cases for the at least one test scenario.
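For illustration only, the test case generation of claim 4 can be sketched as follows; the "operational level" optimisation shown here is a simple invented rule (drop data rows that exercise no activity of the scenario) and the field names are hypothetical:

```python
# Illustrative sketch of claim 4: receive test data from a user, optimise it
# against a scenario, and generate test cases. The optimisation rule and the
# data shape are assumptions made for this example only.

def optimise_test_data(test_data, scenario):
    """Keep only data rows whose activity is actually exercised by the scenario."""
    return [row for row in test_data if row["activity"] in scenario]

def generate_test_cases(scenario, test_data):
    data = optimise_test_data(test_data, scenario)
    return [{"scenario": scenario, "step": row["activity"], "input": row["value"]}
            for row in data]

scenario = ["Login", "Pay"]
test_data = [
    {"activity": "Login", "value": "user1"},
    {"activity": "Refund", "value": "txn-9"},   # not in this scenario: optimised away
    {"activity": "Pay", "value": "card"},
]
cases = generate_test_cases(scenario, test_data)
```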
5. The SAT system as claimed in claim 1, wherein the SAT system further comprises a change management module, coupled to the processor, to:
receive an updated business process model from the user, wherein the updated business process model corresponds to an updated business process associated with the software application;
parse the updated business process model;
compare the parsed updated business process model with a previous business process model so as to generate a comparative snapshot of the impact of the changes due to the updated business process model;
analyze the comparative snapshot to identify a test scenario which has been one of added, modified, and deleted in the updated business process model; and
map the test cases and the test data to the one of added, modified and deleted test scenarios.
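The change-management comparison of claim 5 can be illustrated, outside the claim language, by a minimal diff of two models keyed by scenario name; the model shape and names are invented for the example:

```python
# Illustrative sketch of claim 5: compare a previous and an updated business
# process model and classify scenarios as added, modified, or deleted.
# Models are assumed (for this sketch only) to map scenario name -> path flow.

def comparative_snapshot(previous, updated):
    prev, curr = set(previous), set(updated)
    return {
        "added":    sorted(curr - prev),
        "deleted":  sorted(prev - curr),
        "modified": sorted(name for name in prev & curr
                           if previous[name] != updated[name]),
    }

previous = {"Checkout": ["Login", "Pay"], "Refund": ["Login", "Refund"]}
updated  = {"Checkout": ["Login", "AddToCart", "Pay"], "Wishlist": ["Login", "Save"]}
snapshot = comparative_snapshot(previous, updated)
```

Existing test cases and test data would then be re-mapped against the `added`, `modified`, and `deleted` buckets of the snapshot.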
6. The SAT system as claimed in claim 1, wherein the SAT system further comprises a risk management module, coupled to the processor, to associate a risk index with the at least one test scenario based on the business criticality of the at least one test scenario, wherein the risk index is indicative of a priority of the at least one test scenario.
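The risk-index association of claim 6 (and its use for execution ordering in claim 14) can be sketched as follows; the numeric criticality scale is purely illustrative:

```python
# Illustrative sketch of claims 6 and 14: associate a risk index with each
# scenario from its business criticality, then order execution by that index
# (higher index runs first; the scale and default are assumptions).

def assign_risk_index(scenarios, criticality):
    return {name: criticality.get(name, 1) for name in scenarios}

def execution_order(risk_index):
    return sorted(risk_index, key=risk_index.get, reverse=True)

criticality = {"Payment": 9, "Login": 7, "ProfileEdit": 2}
risk = assign_risk_index(["Login", "Payment", "ProfileEdit"], criticality)
order = execution_order(risk)
```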
7. The SAT system as claimed in claim 1, wherein the SAT system further comprises a test suite output module, coupled to the processor, to:
format the test cases and the test scripts in a user-defined template; and
upload the test cases and the test scripts to at least one communicatively coupled test management tool for execution.
8. The SAT system as claimed in claim 1, wherein the SAT system further comprises a test suite output module, coupled to the processor, to:
fetch the test cases generated by the test case generation module;
map the test cases to pre-defined fields or place holders in a user-defined template; and
generate test automation scripts, based on keywords associated with the test cases and the user-defined template.
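The keyword-driven script generation of claim 8 can be illustrated with a minimal sketch; the template string, keyword mapping, and test-case shape below are all hypothetical, not the claimed user-defined template:

```python
# Illustrative sketch of claim 8: map test case steps to pre-defined
# placeholders in a user-defined template and emit a keyword-driven
# automation script. Template and keyword map are invented for the example.

TEMPLATE = "step {n}: {keyword}({target})"

KEYWORD_MAP = {"Login": "open_session", "Pay": "submit_payment"}

def generate_script(test_case):
    lines = []
    for n, step in enumerate(test_case["steps"], start=1):
        keyword = KEYWORD_MAP.get(step, "perform")   # fall back to a generic keyword
        lines.append(TEMPLATE.format(n=n, keyword=keyword, target=step))
    return "\n".join(lines)

script = generate_script({"steps": ["Login", "Pay"]})
```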
9. A computer implemented method of testing of a software application based on at least one business process model, the method comprising:
receiving, by a processor, the at least one business process model, wherein the at least one business process model is indicative of at least one business process associated with the software application;
analyzing, by the processor, the at least one business process model to identify at least one test scenario;
generating, by the processor, a set of test cases and test data for the at least one test scenario; and
producing, by the processor, a set of test automation scripts based on one or more keywords associated with the at least one test scenario.
10. The method as claimed in claim 9, wherein the analyzing further comprises:
parsing, by the processor, the at least one business process model;
identifying, by the processor, at least one distinct standalone business process from the parsed at least one business process model;
ascertaining, by the processor, one or more start points of the at least one distinct standalone business process;
determining, by the processor, one or more path flows of the at least one distinct standalone business process from the ascertained one or more start points; and
identifying, by the processor, the at least one test scenario based on the one or more path flows.
11. The method as claimed in claim 10, wherein the analyzing further comprises:
identifying a plurality of levels of testing from the parsed at least one business process model, wherein the plurality of levels comprises at least one of functional tests, system tests, and system integration tests; and
linking the at least one distinct standalone business process common to the plurality of levels of testing to identify the at least one test scenario.
12. The method as claimed in claim 9, wherein the generating further comprises:
receiving, by the processor, the test data from a user;
optimizing, by the processor, the received test data at an operational level based on the at least one test scenario; and
generating, by the processor, the set of test cases for the at least one test scenario.
13. The method as claimed in claim 9, wherein the method further comprises:
receiving, by the processor, an updated business process model from a user, wherein the updated business process model corresponds to an updated business process associated with the software application;
parsing, by the processor, the updated business process model;
comparing, by the processor, the parsed updated business process model with a previous business process model so as to generate a comparative snapshot of the impact of the changes due to the updated business process model;
analyzing, by the processor, the comparative snapshot to identify a test scenario which has been one of added, modified, and deleted in the updated business process model; and
mapping, by the processor, the test cases and the test data to the one of added, modified and deleted test scenarios.
14. The method as claimed in claim 9, wherein the method further comprises:
associating, by the processor, a risk index with the at least one test scenario based on the business criticality of the at least one test scenario, wherein the risk index is indicative of the criticality and the priority of the at least one test scenario; and
executing, by the processor, the test cases associated with the at least one test scenario in an order based on the risk index.
15. The method as claimed in claim 9, wherein the method further comprises:
formatting, by the processor, the test cases and the test scripts in a user-defined template; and
uploading, by the processor, the test cases and the test scripts, to at least one communicatively coupled test management tool, for execution.
16. The method as claimed in claim 9, wherein the method further comprises:
fetching, by the processor, the generated test cases;
mapping, by the processor, the test cases to pre-defined fields or place holders in a user-defined template; and
generating, by the processor, test automation scripts, based on keywords associated with the test cases and the user-defined template.
17. A non-transitory computer readable medium comprising a set of computer executable instructions, which, when executed on a computing system, cause the computing system to perform the steps of:
receiving at least one business process model, wherein the at least one business process model is indicative of a business process associated with a software application;
analyzing the at least one business process model to identify at least one test scenario;
generating a set of test cases and test data for the at least one test scenario; and
producing a set of test automation scripts based on one or more keywords associated with the at least one test scenario.
18. The non-transitory computer readable medium as claimed in claim 17, wherein the set of computer executable instructions, when executed on the computing system, causes the computing system to further perform the steps of:
parsing the at least one business process model;
identifying at least one distinct standalone business process from the parsed at least one business process model;
ascertaining one or more start points of the at least one distinct standalone business process;
determining one or more path flows of the at least one distinct standalone business process from the ascertained one or more start points; and
identifying the at least one test scenario based on the one or more path flows.
19. The non-transitory computer readable medium as claimed in claim 17, wherein the set of computer executable instructions, when executed on the computing system, causes the computing system to further perform the steps of:
receiving an updated business process model from a user, wherein the updated business process model corresponds to an updated business process associated with the software application;
parsing the updated business process model;
comparing the parsed updated business process model with a previous business process model so as to generate a comparative snapshot of the impact of the changes due to the updated business process model;
analyzing the comparative snapshot to identify a test scenario which has been one of added, modified, and deleted in the updated business process model; and
mapping the test cases and the test data to the one of added, modified and deleted test scenarios.
20. The non-transitory computer readable medium as claimed in claim 17, wherein the set of computer executable instructions, when executed on the computing system, causes the computing system to further perform the steps of:
associating a risk index with the at least one test scenario based on the business criticality of the at least one test scenario, wherein the risk index is indicative of the criticality and the priority of the at least one test scenario; and
executing the test cases associated with the at least one test scenario in an order based on the risk index.
Dated this 12th day of February, 2014
SRAVAN KUMAR GAMPA
K&S PARTNERS
AGENT FOR THE APPLICANT
SPECIFICATION
TECHNICAL FIELD
The present subject matter relates to testing of software applications, and, particularly but not exclusively, to testing, of software applications, based on business process models.
| # | Name | Date |
|---|---|---|
| 1 | IP26220-Spec.pdf | 2014-02-12 |
| 2 | IP26220-Fig.pdf | 2014-02-12 |
| 3 | FORM 5.pdf | 2014-02-12 |
| 4 | FORM 3.pdf | 2014-02-12 |
| 5 | Form-9(Online).pdf | 2014-02-13 |
| 6 | abstract658-CHE-2014.jpg | 2014-02-19 |
| 7 | 658-CHE-2014-FER.pdf | 2019-11-14 |
| 1 | SearchStrategyMatrix658CHE2014_31-10-2019.pdf | |