Abstract: The present disclosure describes a method and system for uninterrupted automation testing of an end-user application. The automated testing system receives, from the user, information about the test automation tool, the test scenarios of the end-user application, and the screen flow for each of the test scenarios. The system then identifies objects and control objects for each of the screens present in each of the test scenarios by using Natural Language Processing (NLP). The objects, control objects, automation steps, and corresponding test data are stored in a database. At run time, the pre-stored automation steps are executed, and modified screens are identified from failed execution logs. The system then identifies the modified objects and control objects in each modified screen and updates their object properties. The system also maps the modified control objects to the corresponding next screen using NLP. Finally, the automation steps are updated for the modified objects and control objects, and the test data is updated based on the updated automation steps.
1. A method of performing uninterrupted automated testing of an end-user application,
the method comprising:
retrieving, by an automated testing system, automation steps and test data for one or more test scenarios of a plurality of test scenarios, from a database associated with the automated testing system;
identifying, by the automated testing system, one or more modified screens from a plurality of screens for each test scenario of an end-user application by executing the automation steps using the test data; and
performing, by the automated testing system, for each of the one or more modified screens:
determining a modification in one or more control objects in the one or
more modified screens;
identifying next screen for the one or more modified screens based on
process flow of the end-user application; and
mapping the one or more modified control objects to corresponding next
screen for the one or more modified screens using Natural Language Processing
(NLP).
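The claim-1 flow can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the step executor is stubbed, the process flow is a plain dict, and the NLP mapping step is reduced to a lookup; all names are illustrative.

```python
def find_modified_screens(steps, execute):
    """A screen is treated as modified when executing its automation step fails."""
    return [s["screen"] for s in steps if not execute(s)]

def map_to_next_screen(modified_screens, process_flow):
    """Map each modified screen to the next screen given by the process flow
    (the NLP matching from the claims is stubbed as a dict lookup)."""
    return {screen: process_flow.get(screen) for screen in modified_screens}

steps = [
    {"screen": "Login", "action": "enter_credentials"},
    {"screen": "Search", "action": "enter_query"},
]
# Hypothetical executor: the Search screen's step fails (object not found).
execute = lambda step: step["screen"] != "Search"
process_flow = {"Login": "Home", "Search": "Results"}

modified = find_modified_screens(steps, execute)
mapping = map_to_next_screen(modified, process_flow)
```

In a real system, `execute` would drive the test automation tool, and the mapping step would compare object labels against screen descriptions rather than look up a dict.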
2. The method as claimed in claim 1, further comprising storing the automation steps and
the test data for the one or more test scenarios in the database, which is retrievable by
the automated testing system, wherein the storing comprises:
receiving, by the automated testing system, information about a test automation tool, the test scenarios of the end-user application, and the screen flow for each of the test scenarios, from a user of the end-user application, wherein the screen flow comprises the plurality of screens for each of the test scenarios;
obtaining, by the automated testing system, a plurality of objects associated with the end-user application along with predefined properties of the plurality of objects using the test automation tool;
generating, by the automated testing system, the automation steps for the plurality of screens in each of the test scenarios;
obtaining, by the automated testing system, the test data for the plurality of objects of the plurality of screens from a test data management system using Natural Language Processing (NLP); and
storing, by the automated testing system, the automation steps and the test data in the database associated with the automated testing system.
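The storage step of claim 2 can be sketched as follows. This is a minimal stand-in, assuming an in-memory dict for the database and a placeholder for the NLP-driven test-data fetch; the tool name and scenario shapes are illustrative.

```python
def store_scenarios(db, tool, scenarios):
    """Persist generated automation steps and fetched test data per scenario.
    `scenarios` maps a scenario name to its ordered list of screens."""
    for name, screens in scenarios.items():
        steps = [{"screen": s, "tool": tool} for s in screens]
        # Stand-in for obtaining test data from a test data management system via NLP.
        test_data = {s: f"data-for-{s}" for s in screens}
        db[name] = {"steps": steps, "test_data": test_data}
    return db

db = store_scenarios({}, "selenium", {"checkout": ["Cart", "Payment"]})
```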
3. The method as claimed in claim 2, further comprising:
selecting, by the automated testing system, the one or more control objects along with predefined properties of the one or more control objects, from the plurality of objects, using Natural Language Processing (NLP); and
storing, by the automated testing system, the one or more control objects along with predefined properties of the one or more control objects in the database.
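Claim 3's selection of control objects (objects that drive navigation, such as buttons and links) can be sketched with a keyword filter standing in for the NLP step. The keyword set and object schema are assumptions for illustration only.

```python
# Action words that suggest an object controls navigation (illustrative set).
CONTROL_KEYWORDS = {"submit", "next", "continue", "login", "search", "back"}

def select_control_objects(objects):
    """Pick control objects from the full object list by type and label."""
    selected = []
    for obj in objects:
        words = set(obj["label"].lower().split())
        if obj["type"] in {"button", "link"} and CONTROL_KEYWORDS & words:
            selected.append(obj)
    return selected

objects = [
    {"id": "btn1", "type": "button", "label": "Submit order"},
    {"id": "txt1", "type": "textbox", "label": "Card number"},
]
chosen = select_control_objects(objects)
```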
4. The method as claimed in claim 1, wherein the one or more test scenarios of the plurality
of test scenarios are identified by:
retrieving, by the automated testing system, log of execution of automation steps of the plurality of test scenarios;
identifying, by the automated testing system, one or more modified screens of the plurality of screens having object identification related failure based on the log; and
determining, by the automated testing system, the one or more test scenarios comprising the one or more modified screens.
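Claim 4's log analysis can be sketched as a scan for object-identification failures. The log format below is entirely hypothetical; real automation tools emit their own formats.

```python
import re

LOG = """\
[PASS] scenario=checkout screen=Cart
[FAIL] scenario=checkout screen=Payment reason=ObjectNotFound
[FAIL] scenario=search screen=Results reason=Timeout
"""

def modified_screens_from_log(log):
    """Screens whose failure reason indicates an object could not be identified."""
    pattern = re.compile(r"\[FAIL\] scenario=(\w+) screen=(\w+) reason=ObjectNotFound")
    return {(m.group(1), m.group(2)) for m in pattern.finditer(log)}

def affected_scenarios(log):
    """Scenarios containing at least one such modified screen."""
    return {scenario for scenario, _ in modified_screens_from_log(log)}
```

Note that the Timeout failure is deliberately excluded: only object-identification failures signal a modified screen under this claim.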
5. The method as claimed in claim 1, further comprising:
generating, by the automated testing system, updated automation steps for the one or more modified screens;
obtaining, by the automated testing system, updated test data for the updated automation steps from the test data management system; and
storing, by the automated testing system, the updated automation steps and the updated test data in the database associated with the automated testing system.
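The update step of claim 5 can be sketched as regenerating the stored steps for the modified screens and refreshing their test data. The database shape matches the earlier storage sketch; the regeneration logic (a version bump) and data refresh are illustrative stand-ins.

```python
def update_steps(db, scenario, modified_screens):
    """Regenerate steps for the modified screens and overwrite stored entries."""
    entry = db[scenario]
    for screen in modified_screens:
        for step in entry["steps"]:
            if step["screen"] == screen:
                step["version"] = step.get("version", 1) + 1  # stand-in for regeneration
        # Stand-in for fetching updated test data from the test data management system.
        entry["test_data"][screen] = f"refreshed-data-for-{screen}"
    return db

db = {"checkout": {"steps": [{"screen": "Cart"}, {"screen": "Payment"}],
                   "test_data": {"Cart": "old", "Payment": "old"}}}
update_steps(db, "checkout", ["Payment"])
```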
6. The method as claimed in claim 2, wherein the test automation tool is selected based on type of the end-user application.
7. The method as claimed in claim 1, wherein the process flow is generated by constructing a tree structure of the plurality of screens of each of the test scenarios.
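Claim 7's process flow, a tree built from the per-scenario screen sequences, can be sketched as below. The node class and traversal are illustrative; the claims do not prescribe a particular tree representation.

```python
class ScreenNode:
    def __init__(self, name):
        self.name = name
        self.children = []

def build_flow_tree(screen_flows):
    """Merge screen-name sequences (one per scenario) into a shared tree."""
    root = ScreenNode("ROOT")
    for flow in screen_flows:
        node = root
        for name in flow:
            child = next((c for c in node.children if c.name == name), None)
            if child is None:
                child = ScreenNode(name)
                node.children.append(child)
            node = child
    return root

def next_screen(root, name):
    """First child of the named screen, i.e. its successor in the process flow."""
    stack = [root]
    while stack:
        node = stack.pop()
        if node.name == name:
            return node.children[0].name if node.children else None
        stack.extend(node.children)
    return None

tree = build_flow_tree([["Login", "Home", "Search"], ["Login", "Home", "Profile"]])
```

Shared prefixes ("Login", "Home") are merged into a single branch, so the tree compactly encodes which next screens are reachable from each screen across all scenarios.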
8. An automated testing system for uninterrupted automated testing of an end-user application, comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
retrieve automation steps and test data for one or more test scenarios of a plurality of test scenarios, from a database associated with the automated testing system;
identify one or more modified screens from a plurality of screens for each test scenario of an end-user application by executing the automation steps using the test data; and
perform for each of the one or more modified screens:
determine a modification in one or more control objects in the one or more modified screens;
identify next screen for the one or more modified screens based on process flow of the end-user application; and
map the one or more modified control objects to corresponding next screen for the one or more modified screens using Natural Language Processing (NLP).
9. The automated testing system as claimed in claim 8, wherein the processor is further configured to store automation steps and the test data for the one or more test scenarios in the database, which is retrievable by the automated testing system, wherein the storing comprises:
receiving information about a test automation tool, the test scenarios of the end-user application, and the screen flow for each of the test scenarios, from a user of the end-user application, wherein the screen flow comprises the plurality of screens for each of the test scenarios;
obtaining a plurality of objects associated with the end-user application along with predefined properties of the plurality of objects using the test automation tool;
generating the automation steps for the plurality of screens in each of the test scenarios;
obtaining the test data for the plurality of objects of the plurality of screens from a test data management system using Natural Language Processing (NLP); and
storing the automation steps and the test data in the database associated with the automated testing system.
10. The automated testing system as claimed in claim 9, wherein the processor is
configured to:
select the one or more control objects along with predefined properties of the one or more control objects, from the plurality of objects, using Natural Language Processing (NLP); and
store the one or more control objects along with predefined properties of the one or more control objects in the database.
11. The automated testing system as claimed in claim 8, wherein the processor is
configured to identify the one or more test scenarios of the plurality of test scenarios by:
retrieving log of execution of automation steps of the plurality of test scenarios;
identifying one or more modified screens of the plurality of screens having object identification related failure based on the log; and
determining the one or more test scenarios comprising the one or more modified screens.
12. The automated testing system as claimed in claim 8, wherein the processor is further
configured to:
generate updated automation steps for the one or more modified screens;
obtain updated test data for the updated automation steps from the test data management system; and
store the updated automation steps and the updated test data in the database associated with the automated testing system.
13. The automated testing system as claimed in claim 9, wherein the test automation tool is selected based on type of the end-user application.
14. The automated testing system as claimed in claim 8, wherein the process flow is generated by constructing a tree structure of the plurality of screens of each of the test scenarios.
| # | Name | Date |
|---|---|---|
| 1 | 201841036900-STATEMENT OF UNDERTAKING (FORM 3) [29-09-2018(online)].pdf | 2018-09-29 |
| 2 | 201841036900-REQUEST FOR EXAMINATION (FORM-18) [29-09-2018(online)].pdf | 2018-09-29 |
| 3 | 201841036900-POWER OF AUTHORITY [29-09-2018(online)].pdf | 2018-09-29 |
| 4 | 201841036900-FORM 18 [29-09-2018(online)].pdf | 2018-09-29 |
| 5 | 201841036900-FORM 1 [29-09-2018(online)].pdf | 2018-09-29 |
| 6 | 201841036900-DRAWINGS [29-09-2018(online)].pdf | 2018-09-29 |
| 7 | 201841036900-DECLARATION OF INVENTORSHIP (FORM 5) [29-09-2018(online)].pdf | 2018-09-29 |
| 8 | 201841036900-COMPLETE SPECIFICATION [29-09-2018(online)].pdf | 2018-09-29 |
| 9 | abstract 201841036900.jpg | 2018-10-01 |
| 10 | 201841036900-Request Letter-Correspondence [09-10-2018(online)].pdf | 2018-10-09 |
| 11 | 201841036900-Power of Attorney [09-10-2018(online)].pdf | 2018-10-09 |
| 12 | 201841036900-Form 1 (Submitted on date of filing) [09-10-2018(online)].pdf | 2018-10-09 |
| 13 | 201841036900-Proof of Right (MANDATORY) [04-03-2019(online)].pdf | 2019-03-04 |
| 14 | Correspondence by Agent_Form30,Form1_07-03-2019.pdf | 2019-03-07 |
| 15 | 201841036900-FORM 3 [16-05-2021(online)].pdf | 2021-05-16 |
| 16 | 201841036900-PETITION UNDER RULE 137 [17-05-2021(online)].pdf | 2021-05-17 |
| 17 | 201841036900-OTHERS [17-05-2021(online)].pdf | 2021-05-17 |
| 18 | 201841036900-FER_SER_REPLY [17-05-2021(online)].pdf | 2021-05-17 |
| 19 | 201841036900-DRAWING [17-05-2021(online)].pdf | 2021-05-17 |
| 20 | 201841036900-COMPLETE SPECIFICATION [17-05-2021(online)].pdf | 2021-05-17 |
| 21 | 201841036900-CLAIMS [17-05-2021(online)].pdf | 2021-05-17 |
| 22 | 201841036900-FER.pdf | 2021-10-17 |
| 23 | 201841036900-US(14)-HearingNotice-(HearingDate-27-09-2023).pdf | 2023-09-07 |
| 24 | 201841036900-POA [08-09-2023(online)].pdf | 2023-09-08 |
| 25 | 201841036900-FORM 13 [08-09-2023(online)].pdf | 2023-09-08 |
| 26 | 201841036900-Correspondence to notify the Controller [08-09-2023(online)].pdf | 2023-09-08 |
| 27 | 201841036900-AMENDED DOCUMENTS [08-09-2023(online)].pdf | 2023-09-08 |
| 28 | 201841036900-US(14)-ExtendedHearingNotice-(HearingDate-13-10-2023).pdf | 2023-10-06 |
| 29 | 201841036900-Correspondence to notify the Controller [10-10-2023(online)].pdf | 2023-10-10 |
| 30 | 201841036900-Written submissions and relevant documents [28-10-2023(online)].pdf | 2023-10-28 |
| 31 | 201841036900-Information under section 8(2) [28-10-2023(online)].pdf | 2023-10-28 |
| 32 | 201841036900-FORM 3 [28-10-2023(online)].pdf | 2023-10-28 |
| 33 | 201841036900-FORM-26 [30-10-2023(online)].pdf | 2023-10-30 |
| 34 | 201841036900-PatentCertificate29-02-2024.pdf | 2024-02-29 |
| 35 | 201841036900-IntimationOfGrant29-02-2024.pdf | 2024-02-29 |