Abstract: A SYSTEM AND A METHOD FOR PROVIDING AUTOMATED PERFORMANCE DETECTION OF APPLICATION PROGRAMMING INTERFACES. A system and a method for automating performance detection of one or more application programming interfaces (APIs) is provided. The present invention provides for retrieving one or more test cases and associated test data as per respective test case IDs and generating one or more test requests by applying a data enrichment technique. Further, the present invention provides for executing one or more generated test requests on an API under test, analyzing a response received from the API under test, performing response validation, detecting any defects in the API based on the received response, and generating a detailed report of the executed test requests. Furthermore, the present invention provides a visual interface for selecting test cases, creating test cases, editing test cases, editing test requests, and displaying execution of test requests and test reports.
We claim:
1. A method for automating performance detection of one or
more application programming interfaces, performed by a
performance detection engine interfacing with an API subsystem,
a test management database and a report database, the
performance detection engine executing instructions stored in a
memory via a processor, said method comprising:
generating, by the performance detection engine, one or more test requests from one or more test cases and associated test data retrieved for an API under test by retrieving one or more request templates from a request knowledge base based on unique test case IDs associated with the retrieved one or more test cases and compiling one or more test cases and associated test data with the request template corresponding to respective test cases;
analyzing, by the performance detection engine, a response received from the API under test on execution of the one or more test requests, wherein the received response is compared with an actual response associated with the executed test request; and
validating, by the performance detection engine, the response received from the API under test, wherein the API under test is labelled as defective if the response to the executed test does not match with the actual response.
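The generating, analyzing and validating steps of claim 1 can be illustrated with a minimal sketch. This is a hypothetical, non-limiting example: the names `TestCase`, `generate_request` and `validate`, and the use of plain dictionaries for templates and responses, are assumptions made for illustration only and do not appear in the claims.

```python
from dataclasses import dataclass


@dataclass
class TestCase:
    """A test case with its unique ID, associated test data, and the
    reference ('actual') response the claims compare against."""
    case_id: str
    test_data: dict
    actual_response: dict


def generate_request(knowledge_base: dict, case: TestCase) -> dict:
    # Retrieve the request template indexed by the unique test case ID
    # and compile it with the test case's associated test data.
    template = knowledge_base[case.case_id]
    return {**template, **case.test_data}


def validate(received: dict, case: TestCase) -> str:
    # The API under test is labelled defective when the received
    # response does not match the reference response.
    return "working" if received == case.actual_response else "defective"
```

In this sketch the knowledge base is a dictionary keyed by test case ID; a production system would instead query the request knowledge base and test management database the claims describe.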
2. The method as claimed in claim 1, wherein retrieving one
or more test cases and associated test data comprises analyzing,
by the performance detection engine, an API under test from the
one or more APIs comprised by the API subsystem and retrieving
one or more test cases and associated test data from the test
management database based on a first set of rules, wherein the
first set of rules comprises examining the functions and
protocols comprised by the API and evaluating the test cases
based on said functions and protocols.
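The first set of rules in claim 2 can be sketched as a filter over stored test cases: the functions and protocols comprised by the API under test are examined, and only matching test cases are retrieved. The field names (`functions`, `protocols`, `function`, `protocol`) are illustrative assumptions, not terms defined by the claims.

```python
def retrieve_test_cases(api: dict, test_management_db: list) -> list:
    # Examine the functions and protocols comprised by the API under test.
    functions = set(api["functions"])
    protocols = set(api["protocols"])
    # Evaluate stored test cases against those functions and protocols,
    # keeping only the cases applicable to this API.
    return [case for case in test_management_db
            if case["function"] in functions and case["protocol"] in protocols]
```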
3. The method as claimed in claim 1, wherein the test cases are edited via a visual interface by an end-user via a client device.
4. The method as claimed in claim 1, wherein one or more test requests are arranged for execution in an order of preference and edited, on invocation of the visual interface, by the end-user via the client device.
5. The method as claimed in claim 1, wherein the generated test request comprises information associated with test data, test scenario, test case description, test steps, expected response, and actual response.
6. The method as claimed in claim 1, wherein a check is performed to determine if all the test requests have been executed and a detailed report of the executed test requests is generated.
7. A system for automating performance detection of one or more application programming interfaces on invocation of a visual interface by an end-user, said system interfacing with an API subsystem, a test management database and a report database, the system comprising:
a memory storing program instructions; a processor configured to execute program instructions stored in the memory; and a performance detection engine in communication with the processor and configured to:
generate one or more test requests from the retrieved one or more test cases and associated test data by retrieving one or more request templates from a request knowledgebase based on unique test case IDs associated with the retrieved one or more test cases and compiling one or more test cases and associated test data with the request template corresponding to respective test cases;
analyze a response received from the API under test on execution of the test request, wherein the received response is
compared with an actual response associated with the executed test request; and
validate the response received from the API under test, wherein the API under test is labelled as defective if the response to the executed test does not match with the actual response.
8. The system as claimed in claim 7, wherein the visual interface allows interaction with the performance detection engine and is configured with graphical icons to select one or more APIs from the API subsystem, create test cases, select one or more parameters of a test case, edit test data, edit test requests, and display step by step execution of test requests and test results.
9. The system as claimed in claim 7, wherein the request knowledgebase is a collection of request templates supporting multiple protocols, where the request templates are indexed based on unique test case IDs associated with one or more test cases stored in the test-management database.
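A request knowledgebase of the kind described in claim 9 can be pictured as a collection of protocol-specific templates indexed by unique test case ID. The template contents below are invented placeholders for illustration; the claims do not prescribe any particular template shape.

```python
# Illustrative in-memory knowledgebase: templates for several supported
# protocols, indexed by the unique test case IDs of the stored test cases.
REQUEST_KNOWLEDGEBASE = {
    "TC-001": {"protocol": "HTTP", "method": "GET", "path": "/accounts"},
    "TC-002": {"protocol": "SOAP", "action": "GetQuote"},
    "TC-003": {"protocol": "JSON/REST", "method": "POST", "path": "/orders"},
}


def lookup_template(case_id: str) -> dict:
    # Retrieval by unique test case ID; an unknown ID raises KeyError.
    return REQUEST_KNOWLEDGEBASE[case_id]
```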
10. The system as claimed in claim 9, wherein protocols supported by request templates are selected from SOAP, HTTP, JSON/REST, SWIFT, ACCORD and FIX.
11. The system as claimed in claim 7, wherein the performance detection engine comprises an interfacing and data collection unit in communication with the processor, said interfacing and data collection unit configured to interact with the API subsystem for testing one or more APIs comprised by the API subsystem, and retrieve one or more test cases and associated test data from the test-management database.
12. The system as claimed in claim 11, wherein a test case comprises a test scenario, a test description, test steps, expected response and actual response.
13. The system as claimed in claim 7, wherein the performance detection engine comprises a data compilation unit in
communication with the processor, said data compilation unit configured to generate test requests from the retrieved one or more test cases and associated test data by applying a data enrichment technique, wherein each generated test request comprises information associated with test data, test scenario, test case description, test steps, expected response, and actual response.
14. The system as claimed in claim 7, wherein the performance detection engine comprises a request execution unit in communication with the processor, said request execution unit is configured to arrange the generated one or more test requests in the order of preference, trigger said one or more test requests in the order of preference and determine if all the test requests have been executed.
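The request execution unit of claim 14 can be sketched as follows: test requests are arranged by a preference value, triggered in that order, and a completion check determines whether all requests have been executed. The `preference` key and the injected `send` callable are assumptions for this sketch.

```python
def execute_all(test_requests: list, send) -> tuple:
    # Arrange the generated test requests in the order of preference.
    ordered = sorted(test_requests, key=lambda r: r["preference"])
    # Trigger each test request in that order via the supplied sender.
    responses = [send(r) for r in ordered]
    # Determine whether all the test requests have been executed.
    all_executed = len(responses) == len(test_requests)
    return responses, all_executed
```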
15. The system as claimed in claim 7, wherein the performance detection engine comprises an analysis and validation unit in communication with the processor, said analysis and validation unit is configured to analyze and validate a response received on execution of one or more test requests from the API under test, wherein the received response is compared with the actual response associated with the executed test request and the API under test is labelled as working, if the response to the executed test request is the same as the actual response associated with the test request.
16. The system as claimed in claim 15, wherein the API under test is labelled as defective if the response to the executed test request does not match with the actual response associated with the test request.
17. The system as claimed in claim 16, wherein the analysis and validation unit provides a debug mode via the visual interface to correct errors in the API under test.
18. The system as claimed in claim 7, wherein the performance detection engine comprises an orchestration and report generation unit in communication with the processor, said
orchestration and report generation unit is configured to generate a detailed report of the executed test requests and display a result window via the visual interface, wherein the result window comprises a portion with a list of executed test requests and a test request description portion providing details of the executed test requests.
19. The system as claimed in claim 18, wherein the report is classified based on severity of the result generated based on the executed test requests including errors, warnings, and informational messages.
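The severity classification of claim 19 can be sketched as grouping executed-test-request results into the three named categories (errors, warnings, informational messages). The `severity` and `description` keys are illustrative assumptions.

```python
from collections import defaultdict


def classify_report(results: list) -> dict:
    # Group executed test-request results by severity for the report;
    # results without an explicit severity are treated as informational.
    report = defaultdict(list)
    for result in results:
        severity = result.get("severity", "informational")
        report[severity].append(result["description"])
    return dict(report)
```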
20. The system as claimed in claim 18, wherein the result window includes a print dialog to print test reports, wherein the print dialog allows selection of information from the detailed report for printing.
21. A computer program product comprising:
a non-transitory computer-readable medium having computer-readable program code stored thereon, the computer-readable program code comprising instructions that, when executed by a processor, cause the processor to:
generate one or more test requests from the retrieved one or more test cases and associated test data by retrieving one or more request templates from a request knowledgebase based on unique test case IDs associated with the retrieved one or more test cases and compiling one or more test cases and associated test data with the request template corresponding to respective test cases;
analyze a response received from the API under test on execution of the test request, wherein the received response is compared with an actual response associated with the executed test request; and
validate the response received from the API under test, wherein the API under test is labelled as defective if the
response to the executed test does not match with the actual response.
| # | Name | Date |
|---|---|---|
| 1 | 201741044932-STATEMENT OF UNDERTAKING (FORM 3) [14-12-2017(online)].pdf | 2017-12-14 |
| 2 | 201741044932-PROOF OF RIGHT [14-12-2017(online)].pdf | 2017-12-14 |
| 3 | 201741044932-POWER OF AUTHORITY [14-12-2017(online)].pdf | 2017-12-14 |
| 4 | 201741044932-FORM 1 [14-12-2017(online)].pdf | 2017-12-14 |
| 5 | 201741044932-DRAWINGS [14-12-2017(online)].pdf | 2017-12-14 |
| 6 | 201741044932-COMPLETE SPECIFICATION [14-12-2017(online)].pdf | 2017-12-14 |
| 7 | 201741044932-REQUEST FOR CERTIFIED COPY [21-12-2017(online)].pdf | 2017-12-21 |
| 8 | 201741044932-FORM 18 [21-12-2017(online)].pdf | 2017-12-21 |
| 9 | Correspondence by Agent_Form 1,Power of Attorney_26-12-2017.pdf | 2017-12-26 |
| 10 | 201741044932-FORM 3 [27-04-2018(online)].pdf | 2018-04-27 |
| 11 | 201741044932-FER.pdf | 2021-10-17 |
| # | Name |
|---|---|
| 1 | documentD1E_14-09-2020.pdf |
| 2 | 2020-09-1411-20-11E_14-09-2020.pdf |