Abstract: END-TO-END LOW CODE AUTOMATED TEST CASE CREATION, EXECUTION AND REPORTING FRAMEWORK Conventional test automation platforms have challenges such as requiring manual effort for test case creation and needing script development for each test case. A method and system for an end-to-end low code automated test case creation, execution, and reporting framework is disclosed. It addresses the challenges in existing code testing solutions by creating pre-configured registries and catalogs that assist in creating automated test cases and test steps on the fly. It provides a one-stop solution for component testing, platform testing, or enterprise (end-to-end) testing managed through configurations. The system provides a solution for testing all types of packages and functions such as APIs, UI screens, and database tables, and supports multi-platform and multi-technology. All functions are created as generic functions that are not dependent on any platform or technology. Once a test case request is received, the system searches for use cases and test steps for execution. [To be published with Figure 1]
Description: FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
END-TO-END LOW CODE AUTOMATED TEST CASE CREATION, EXECUTION AND REPORTING FRAMEWORK
Applicant
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India
Preamble to the description:
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The embodiments herein generally relate to the field of code testing and, more particularly, to a method and system for end-to-end low code automated test case creation, execution, and reporting framework.
BACKGROUND
[002] Existing low code/no code test case generation platforms concentrate more on individual components. Thus, a test case generated for automated User Interface (UI) testing cannot be used for other components such as Application Programming Interface (API) testing. There is no single tool for the different test phases. There is no solution for creating a test case with generic scripts once and re-using it across different workflows through keyword driven rules orchestration, which would help to save the cost and time involved in creating test cases.
SUMMARY
[003] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems.
[004] For example, in one embodiment, a method for an end-to-end low code automated test case creation, execution, and reporting framework is provided. The method includes determining, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions received from the CI/CD pipeline, wherein the scenarios are determined by a package manager data model, executed by one or more hardware processors, that stores a modelled relationship between packages and functions of each of the plurality of applications, wherein the impacted packages and functions list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and database (DB) tables.
[005] Further, the method includes executing a component test phase on receiving a trigger from the CI/CD pipeline for the component test phase, the component test phase for each application among the plurality of applications comprising steps of: (a) segregating the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database; (b) obtaining for each of the plurality of groups a set of rules defined in the scenario registry, wherein the set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables; (c) generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables, wherein the number of test cases to be created is based on the steps configured for each rule in the scenario registry; (d) defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response, wherein an element is part of XML or JSON; (e) executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, associated test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps; and (f) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[006] Further, the method includes executing a platform test phase to test each platform among a plurality of platforms of an enterprise on receiving the trigger from the CI/CD pipeline, wherein each platform is built using a set of applications among the plurality of applications tested in the component test phase, the platform test phase for each platform comprising steps of: (a) identifying a platform associated with the set of applications tested during the component test phase; (b) obtaining the plurality of scenarios affected by the platform using the package manager data model; (c) obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing of each scenario; (d) generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on the steps configured for each rule; (e) defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element; (f) executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps; and (g) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[007] Further, the method includes executing an enterprise integration test phase for the enterprise on receiving the trigger from the CI/CD pipeline, wherein the enterprise is built on the plurality of platforms.
[008] In another aspect, a system for an end-to-end low code automated test case creation, execution, and reporting framework is provided. The system comprises a memory storing instructions; one or more Input/Output (I/O) interfaces; and one or more hardware processors coupled to the memory via the one or more I/O interfaces, wherein the one or more hardware processors are configured by the instructions to determine, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions received from the CI/CD pipeline, wherein the scenarios are determined by a package manager data model, executed by the one or more hardware processors, that stores a modelled relationship between packages and functions of each of the plurality of applications, wherein the impacted packages and functions list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and database (DB) tables.
[009] Further, the system is configured to execute a component test phase on receiving a trigger from the CI/CD pipeline for the component test phase, the component test phase for each application among the plurality of applications comprising steps of: (a) segregating the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database; (b) obtaining for each of the plurality of groups a set of rules defined in the scenario registry, wherein the set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables; (c) generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables, wherein the number of test cases to be created is based on the steps configured for each rule in the scenario registry; (d) defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response, wherein an element is part of XML or JSON; (e) executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, associated test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps; and (f) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[010] Further, the system is configured to execute a platform test phase to test each platform among a plurality of platforms of an enterprise on receiving the trigger from the CI/CD pipeline, wherein each platform is built using a set of applications among the plurality of applications tested in the component test phase, the platform test phase for each platform comprising steps of: (a) identifying a platform associated with the set of applications tested during the component test phase; (b) obtaining the plurality of scenarios affected by the platform using the package manager data model; (c) obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing of each scenario; (d) generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on the steps configured for each rule; (e) defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element; (f) executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps; and (g) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[011] Further, the system is configured to execute an enterprise integration test phase for the enterprise on receiving the trigger from the CI/CD pipeline, wherein the enterprise is built on the plurality of platforms.
[012] In yet another aspect, there are provided one or more non-transitory machine-readable information storage mediums comprising one or more instructions, which when executed by one or more hardware processors cause the one or more hardware processors to perform a method for end-to-end low code automated test case creation, execution, and reporting.
[013] The method includes determining, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions received from the CI/CD pipeline, wherein the scenarios are determined by a package manager data model, executed by the one or more hardware processors, that stores a modelled relationship between packages and functions of each of the plurality of applications, wherein the impacted packages and functions list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and database (DB) tables.
[014] Further, the method includes executing a component test phase on receiving a trigger from the CI/CD pipeline for the component test phase, the component test phase for each application among the plurality of applications comprising steps of: (a) segregating the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database; (b) obtaining for each of the plurality of groups a set of rules defined in the scenario registry, wherein the set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables; (c) generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables, wherein the number of test cases to be created is based on the steps configured for each rule in the scenario registry; (d) defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response, wherein an element is part of XML or JSON; (e) executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, associated test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps; and (f) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[015] Further, the method includes executing a platform test phase to test each platform among a plurality of platforms of an enterprise on receiving the trigger from the CI/CD pipeline, wherein each platform is built using a set of applications among the plurality of applications tested in the component test phase, the platform test phase for each platform comprising steps of: (a) identifying a platform associated with the set of applications tested during the component test phase; (b) obtaining the plurality of scenarios affected by the platform using the package manager data model; (c) obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing of each scenario; (d) generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on the steps configured for each rule; (e) defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element; (f) executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps; and (g) transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[016] Further, the method includes executing an enterprise integration test phase for the enterprise on receiving the trigger from the CI/CD pipeline, wherein the enterprise is built on the plurality of platforms.
[017] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[018] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[019] FIG. 1 is a functional block diagram of a system for end-to-end low code automated test case creation, execution, and reporting framework, in accordance with some embodiments of the present disclosure.
[020] FIG. 2 is a flow diagram illustrating a method for end-to-end low code automated test case creation, execution, and reporting framework, using the system depicted in FIG. 1, in accordance with some embodiments of the present disclosure.
[021] FIG. 3 through FIG. 9 depict the process of building each of the plurality of modules comprising a functions registry, an Application Programming Interface (API) definition catalog, a database definition catalog, a scenario registry, a test data registry, a validation catalog, and an orchestration registry, in accordance with some embodiments of the present disclosure.
[022] FIG. 10 depicts test phase specific customization enabled by the system of FIG. 1, in accordance with some embodiments of the present disclosure.
[023] FIG. 11 depicts a test case creator flow followed by the system of FIG. 1, in accordance with some embodiments of the present disclosure.
[024] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems and devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION OF EMBODIMENTS
[025] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
[026] Conventional test automation platforms have challenges: they require manual effort for test case creation and need script development for each test case. Further, it can be understood that a single suite is not suitable for all test phases. Furthermore, a skilled automation engineer is required for new products/changes to test automation, and such platforms can hardly provide 100% test coverage.
[027] There are a lot of no code/low code tools in the market that do automated test creation and execution; however, they are limited to a single product. For example, a tool that provides a testing platform for the User Interface (UI) does not do this for APIs and vice versa. A tool for a COTS product like Siebel or SAP cannot be used for other COTS products. There is no solution that enables automated test creation and execution orchestrating all of these under a single umbrella, bringing together the orchestration of UI testing, API testing, database testing, and the like.
[028] Embodiments herein provide a method and system for an end-to-end low code automated test case creation, execution, and reporting framework. The system, also referred to as the Testing as a Code (TaaC) framework, addresses the challenges in existing code testing solutions by creating pre-configured registries and catalogs that assist in creating automated test cases and test steps on the fly. Furthermore, the framework acts as a one-stop shop to use the same solution for different test phases such as component testing, platform testing, or enterprise (end-to-end or E2E) testing managed through configurations. The system provides a solution for testing all types of packages and functions such as APIs, UI screens, and database tables, and supports multi-platform and multi-technology. All functions are created as generic functions which are not dependent on any platform or technology. Once a test case request is received, the TaaC framework searches for use cases and test steps at the execution controller.
[029] The system disclosed has the intelligence to limit the steps to the required steps by understanding the pre-defined orchestration and the configuration. This is achieved by configuring steps outside the scope of the component or platform as 'Stubbed', which acts as a boundary for test step creation pertaining to a specific test phase. The TaaC framework is handy in a complex ecosystem like telecommunications, which continues to be run heavily on legacy and customized applications; however, the solution is generic enough to be deployed into any ecosystem. As the TaaC framework is not aligned to specific technologies and uses generic principles, the solution can still be re-used when the organization modernizes its ecosystem.
[030] The system (TaaC framework) provides catalog-based automated test case creation, configuration-driven automated test step creation, configuration-driven controlled execution specific to a test phase, and easy maintenance. Domain knowledge is sufficient to create configurations for new product developments (NPD) and any other functional changes.
[031] The TaaC framework is portable, reusable, and scalable, and has global reach as an industry-agnostic solution providing E2E configuration-driven test case creation, execution, and reporting. No development is needed for similar new product developments. The TaaC framework provides zero-touch test automation integrated into the CI/CD pipeline, with one solution to fit all test phases through controlled execution.
[032] With existing solutions there are certain technical challenges such as:
1. Cost: High delivery cost due to limited test automation across all test phases (from component testing to E2E testing).
2. Time: Increased time to market due to the amount of time taken for manual testing across test phases.
3. Quality: Reduced Right First Time and increased defect leakage due to limited test coverage.
[033] The TaaC framework can provide up to a 30% reduction in the overall delivery cost of enterprise testing, a minimum of 2X accelerated software delivery, up to a 96% improved Right First Time characteristic, and up to 96% test coverage.
[034] Referring now to the drawings, and more particularly to FIG. 1 through FIG. 11, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[035] FIG. 1 is a functional block diagram of a system 100 for an end-to-end low code automated test case creation, execution, and reporting framework, in accordance with some embodiments of the present disclosure. The system 100, also referred to as the TaaC framework, is integrated with a continuous integration and continuous deployment (CI/CD) pipeline. The CI/CD pipeline communicates with the system 100 to trigger each of the test phases such as the component test phase, the platform test phase, and the enterprise integration test phase.
[036] In an embodiment, the system 100 includes a processor(s) 104, communication interface device(s), alternatively referred as input/output (I/O) interface(s) 106, and one or more data storage devices or a memory 102 operatively coupled to the processor(s) 104. The system 100 with one or more hardware processors is configured to execute functions of one or more functional blocks of the system 100.
[037] Referring to the components of system 100, in an embodiment, the processor(s) 104, can be one or more hardware processors 104. In an embodiment, the one or more hardware processors 104 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the one or more hardware processors 104 are configured to fetch and execute computer-readable instructions stored in the memory 102. In an embodiment, the system 100 can be implemented in a variety of computing systems including laptop computers, notebooks, hand-held devices such as mobile phones, workstations, mainframe computers, servers, and the like.
[038] The I/O interface(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface and the like and can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular and the like. In an embodiment, the I/O interface (s) 106 can include one or more ports for connecting to a number of external devices or to another server or devices.
[039] The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[040] In an embodiment, the memory 102 includes a plurality of modules 110 such as a component test phase execution controller, a platform test phase execution controller, an enterprise integration test phase execution controller, and the like. These are called by the system on receiving triggers for test phase execution from the CI/CD pipeline for the impacted packages and functions of the code committed to the CI/CD pipeline.
[041] Further, the plurality of modules 110 include programs or coded instructions that supplement applications or functions performed by the system 100 for executing different steps involved in the process of end-to-end low code automated test case creation, execution, and reporting, being performed by the system 100. The plurality of modules 110, amongst other things, can include routines, programs, objects, components, and data structures, which performs particular tasks or implement particular abstract data types. The plurality of modules 110 may also be used as, signal processor(s), node machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the plurality of modules 110 can be used by hardware, by computer-readable instructions executed by the one or more hardware processors 104, or by a combination thereof. The plurality of modules 110 can include various sub-modules (not shown).
[042] Further, the memory 102 may comprise information pertaining to input(s)/output(s) of each step performed by the processor(s) 104 of the system 100 and methods of the present disclosure.
[043] Further, the memory 102 includes a database 108. The database (or repository) 108 may include a plurality of abstracted pieces of code for refinement and data that is processed, received, or generated as a result of the execution of the plurality of modules in the module(s) 110.
[044] Although the database 108 is shown internal to the system 100, it will be noted that, in alternate embodiments, the database 108 can also be implemented external to the system 100, and communicatively coupled to the system 100. The data contained within such an external database may be periodically updated. For example, new data may be added into the database (not shown in FIG. 1) and/or existing data may be modified and/or non-useful data may be deleted from the database. In one example, the data may be stored in an external system, such as a Relational Database Management System (RDBMS) or NoSQL.
[045] The database 108 further includes a plurality of modules that are preconfigured or built based on user inputs. The modules include a functions registry, an API definition catalog, a database definition catalog, a scenario registry, a test data registry, a validation catalog, an orchestration registry, a product catalog, a package manager data model, and a resource inventory. The configuring/building of the modules based on user input and information extracted from external resources is explained in conjunction with the process flow depicted in FIGS. 3 through 9.
[046] The building of the functions registry is depicted in FIG. 3. The functions registry is built based on a plurality of functions selected by a user, the plurality of functions comprising a system function category, a basic function category, and a specific function category, wherein each function is identified by a keyword identifier stored in the functions registry. The system function category is used to process different key capabilities for operations of an enterprise and in-built automation capabilities. The basic function category provides common re-usable generic functions across industries, comprising Parse and Interpret message, API validation, and UI validation.
[047] The functions are written, and a keyword is automatically assigned (and can be renamed) and put into the functions registry along with the description from the comment statement. This keyword needs to be used in the scenario registry and the orchestration registry. The specific function category comprises the functions specific to an industry or organization; these functions are highly customizable and are used for configuration of the orchestration registry.
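As a purely illustrative, non-limiting sketch (the disclosure does not prescribe any implementation language), the keyword assignment described above could be realized in Python roughly as below; the registry layout and the names FUNCTIONS_REGISTRY and register_function are assumptions made only for illustration.
# Hypothetical sketch: a functions registry where each function is stored under an
# auto-assigned keyword (renameable) together with a description taken from its
# comment/docstring, per the paragraph above. Not the actual implementation.
FUNCTIONS_REGISTRY = {}

def register_function(category, keyword=None):
    """Register a function under a category (system, basic, or specific)."""
    def decorator(func):
        key = keyword or func.__name__  # keyword auto-assigned; may be renamed
        FUNCTIONS_REGISTRY[key] = {
            "category": category,
            "description": (func.__doc__ or "").strip(),  # description from the comment statement
            "callable": func,
        }
        return func
    return decorator

@register_function(category="basic")
def api_validation(request, response, rules):
    """Validate an API response against the configured rules (basic function)."""
    return [rule for rule in rules if not rule(request, response)]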
[048] The building of the API definition catalog is depicted in FIG. 4. The API definition catalog has the capability to understand the type of API specification provided by the user and to parse that specification to identify the elements, paths, and rules for the elements.
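As a hedged, non-limiting illustration of the parsing capability described above, the sketch below flattens a simplified Swagger/OpenAPI-style schema into catalog entries holding the element path, data type, and rules; the catalog layout and the name build_api_definition_catalog are assumptions for illustration only.
# Hypothetical sketch: flatten a simplified OpenAPI-style schema into API definition
# catalog entries (element path, type, mandatory flag, length and pattern rules).
def build_api_definition_catalog(api_name, schema, prefix=""):
    entries = []
    required = set(schema.get("required", []))
    for name, props in schema.get("properties", {}).items():
        path = prefix + "/" + name
        if props.get("type") == "object":
            entries.extend(build_api_definition_catalog(api_name, props, path))
        else:
            entries.append({
                "api": api_name,
                "path": path,
                "type": props.get("type"),
                "mandatory": name in required,
                "maxLength": props.get("maxLength"),
                "pattern": props.get("pattern"),
            })
    return entries

search_address_schema = {
    "required": ["postcode"],
    "properties": {
        "postcode": {"type": "string", "maxLength": 8, "pattern": "^[A-Z0-9 ]+$"},
        "houseNumber": {"type": "string", "maxLength": 10},
    },
}
catalog = build_api_definition_catalog("searchAddress", search_address_schema)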
[049] The building of the database definition catalog is depicted in FIG. 5. The database definition catalog has the capability to understand the type of data model, including the relationship between the entities and the definition of each entity with the column definition such as data type, length, and constraints. The database specification may be in a format such as XML or JSON.
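The database parsing capability could, purely as a non-limiting sketch, be approximated as below for a JSON data model; the field names (entities, columns, relationships) are assumptions and not the actual TaaC format.
# Hypothetical sketch: read a JSON-style data model and capture entities, column
# definitions (data type, length, constraints) and relationships for the catalog.
import json

def build_db_definition_catalog(data_model_json):
    model = json.loads(data_model_json)
    catalog = {"entities": {}, "relationships": model.get("relationships", [])}
    for entity in model.get("entities", []):
        catalog["entities"][entity["name"]] = [
            {"column": col["name"], "type": col["type"],
             "length": col.get("length"), "constraints": col.get("constraints", [])}
            for col in entity.get("columns", [])
        ]
    return catalog

sample_model = '{"entities": [{"name": "CustomerOrder", "columns": [{"name": "order_id", "type": "VARCHAR", "length": 36, "constraints": ["PRIMARY KEY"]}]}], "relationships": [{"from": "CustomerOrder", "to": "OrderItem", "type": "1:N"}]}'
db_catalog = build_db_definition_catalog(sample_model)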
[050] The building of the scenario registry is depicted in FIG. 6. The scenario registry has a plurality of scenarios and prepopulated predefined rules for each of the plurality of scenarios, wherein the plurality of scenarios are created automatically based on the type of industry received as input from the user. A few examples of scenarios and corresponding rules are provided below:
Example 1: Once the code is committed, in the component test phase, the package manager data model indicates that the newly committed code is impacting the 'searchAddress' API. Now the system 100 looks for the rules under 'Test API'. Each rule points to an underlying pre-built basic function that has the logic to fulfil that rule. For each rule, all combinations of test cases are created by looking through the API definition catalog and the validation catalog. There is also flexibility for the user to enable or disable a specific rule; by default, the rules are enabled.
Table 1:
Scenario Rule Enable
Test API Mandatory validation Y
Dependent mandatory validation Y
Length validation Y
RegEx validation Y
Mandatory validation: What is the response if the mandatory element is missing.
Dependent mandatory validation: What is the response if the API dependent condition is not met.
Length validation: What is the response if the max length is exceeded, or min length is not met.
RegEx validation: What is the response if the Regular expressions are violated.
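The way the enabled rules of Table 1 could expand into test cases is sketched below as a non-limiting illustration; the in-memory registry layout and the name create_api_test_cases are assumptions, and the catalog entries are of the kind described above for the API definition catalog.
# Hypothetical sketch: expand the enabled 'Test API' rules into test cases by
# walking the API definition catalog entries of the impacted API.
SCENARIO_REGISTRY = {
    "Test API": [
        {"rule": "Mandatory validation", "enabled": True},
        {"rule": "Length validation", "enabled": True},
        {"rule": "RegEx validation", "enabled": True},
    ]
}

def create_api_test_cases(api_catalog_entries, scenario="Test API"):
    test_cases = []
    for rule in SCENARIO_REGISTRY[scenario]:
        if not rule["enabled"]:
            continue  # a user may disable a specific rule; rules are enabled by default
        for element in api_catalog_entries:
            if rule["rule"] == "Mandatory validation" and element["mandatory"]:
                test_cases.append("Omit mandatory element " + element["path"])
            elif rule["rule"] == "Length validation" and element.get("maxLength"):
                test_cases.append("Exceed maximum length of " + element["path"])
            elif rule["rule"] == "RegEx validation" and element.get("pattern"):
                test_cases.append("Violate the pattern of " + element["path"])
    return test_cases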
Example 2: Once the component test is completed, the platform test is triggered by the CI/CD pipeline. The package manager data model returns the distinct scenarios across all impacted packages. Say, in this example case, it is 'Place new order'. The system 100 looks for the scenario in the scenario registry to get the rule. In this case the rule refers to a 'Specific function', meaning that the system must look at the orchestration registry. The orchestration registry has the steps for the scenario 'Place new order', with each step referring to an underlying function in the functions registry.
[051] In the case of enterprise testing (E2E testing) of an orchestrated ordering journey, the rule will look as in Table 2. Table 2 is applicable for the platform test as well, not just for the E2E test phase.
Table 2:
Scenario Rule Enable
Place new order Orchestration look up Y
[052] The building of the test data registry is depicted in FIG. 7. The test data registry generates test data based on a test data generation technique aligned to the industry definition. The system 100 automatically creates a template for the test data based on the specification document. The fields in the template implicitly point to the path of the element in the message.
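A non-limiting sketch of how such a template could be derived from catalog entries is given below; the type-based default values and the name create_test_data_template are assumptions for illustration only.
# Hypothetical sketch: derive a test data template whose fields implicitly point
# to the element path in the message, with simple type-based sample values.
def create_test_data_template(api_catalog_entries):
    template = {}
    for element in api_catalog_entries:
        if element["type"] == "string":
            template[element["path"]] = "SAMPLE"[: element.get("maxLength") or 6]
        elif element["type"] in ("integer", "number"):
            template[element["path"]] = 1
        else:
            template[element["path"]] = None
    return template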
[053] FIG. 8A and FIG. 8B depict the flow for building the validation catalog, which performs API validation, UI validation, and database validation based on prepopulated business errors and technical errors.
[054] The building of the orchestration registry is depicted in FIG. 9. While configuring the orchestration registry specific to a test phase (component test or platform test), the steps outside the scope of the component or platform are configured as 'Stubbed'. This acts as a boundary for test step creation pertaining to a specific test phase.
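A non-limiting sketch of how the 'Stubbed' marking could bound test step creation is shown below; the entry layout, the in-scope platform set, and the stub URL shown are illustrative assumptions.
# Hypothetical sketch: steps whose platform is outside the scope of the current
# test phase are marked 'Stubbed' and answered from a stub location, acting as a
# boundary; in-scope steps remain required.
def plan_steps(orchestration_entries, in_scope_platforms):
    planned = []
    for step in orchestration_entries:
        if step["platform"] in in_scope_platforms:
            planned.append({**step, "required": "Y"})
        else:
            planned.append({**step, "required": "Stubbed", "stub_url": step.get("stub_url")})
    return planned

steps = [
    {"step": "placeOrder", "platform": "Provisioning & Fulfilment"},
    {"step": "activateService", "platform": "Network", "stub_url": "http://stub.example/activateService"},
]
plan = plan_steps(steps, in_scope_platforms={"Provisioning & Fulfilment"})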
[055] FIGS. 3 through 9 are better understood with the E2E use case example described later.
[056] Product catalog: Defines enterprise testing bundles with offered features. This will be refreshed automatically from the test environment to be used for testing. The function is written to parse and read the catalog as different organizations will have different structures.
[057] Package manager data model: Stores the modelled relationship between packages and functions of each of the plurality of applications, wherein the packages and functions list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and database (DB) tables. This module is used to manage the relationship between the packages/functions of the application and the scenarios in the scenario registry. During initial set up, the relationship between all packages and the scenarios must be established, which is then used to identify the impacted scenarios when the code is committed; only those scenarios will be executed.
[058] Resource inventory: This is the repository to manage the URLs, endpoints, credentials, and host details of the UI, API, database, etc.
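The resource inventory could, as a non-limiting sketch, be held as a simple mapping from test phase and resource to its access details; the keys, URLs, and credential references below are placeholders only, not actual values.
# Hypothetical sketch: resource inventory keyed by test phase, holding URLs,
# endpoints, credential references and host details for UI, API and database.
RESOURCE_INVENTORY = {
    "component": {
        "searchAddress_api": {"url": "https://component-env.example/address/search",
                              "credential_ref": "vault:addr-api"},
    },
    "platform": {
        "order_db": {"host": "platform-db.example", "port": 5432,
                     "credential_ref": "vault:order-db"},
    },
}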
[059] FIG. 2 is a flow diagram illustrating a method 200 for end-to-end low code automated test case creation, execution and reporting framework, using the system 100 depicted in FIG. 1, in accordance with some embodiments of the present disclosure.
[060] In an embodiment, the system 100 comprises one or more data storage devices or the memory 102 operatively coupled to the processor(s) 104 and is configured to store instructions for execution of steps of the method 200 by the processor(s) or one or more hardware processors 104. The steps of the method 200 of the present disclosure will now be explained with reference to the components or blocks of the system 100 as depicted in FIG. 1 and the steps of flow diagram as depicted in FIG. 2. Although process steps, method steps, techniques or the like may be described in a sequential order, such processes, methods, and techniques may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
[061] Referring to the steps of the method 200, at step 202 of the method 200, the one or more hardware processors 104 are configured by the instructions to determine, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions. The impacted packages and functions are identified and provided to the system 100 by the CI/CD pipeline. The scenarios are determined by the package manager data model, executed by the one or more hardware processors, that stores the modelled relationship between packages and functions of each of the plurality of applications. The impacted packages and functions list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and database (DB) tables.
[062] At step 204 of the method 200, the component test phase execution controller executed by the one or more hardware processors 104 is configured by the instructions to execute the component test phase on receiving a trigger from the CI/CD pipeline. The component test phase for each application among the plurality of applications comprises the steps of:
a. Segregating, the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database.
b. Obtaining for each of the plurality of groups a set of rules defined in the scenario registry. The set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables.
c. Generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables. The number of test cases to be created is based on the steps configured for each rule in the scenario registry.
d. Defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response. An element is part of XML or JSON.
e. Executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, corresponding test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps.
f. Once the component test phase is executed, the control is transferred back to the CI/CD pipeline post publishing the test results of each of the test cases.
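Steps (a) through (f) above could be tied together roughly as in the following non-limiting sketch; every helper name and object interface shown (rules_for, create_cases, expected_for, run, publish) is an assumption made only to illustrate the flow, not the actual module interfaces.
# Hypothetical sketch of the component test phase controller: segregate the
# impacted list, fetch rules, generate test cases with expected outcomes,
# execute them, publish results, and return control to the CI/CD pipeline.
def run_component_test_phase(impacted_list, registries, executor, pipeline):
    groups = {"API": [], "UI": [], "DB": []}
    for item in impacted_list:                                    # (a) segregate by impact type
        groups[item["type"]].append(item)
    results = []
    for group_type, items in groups.items():
        for item in items:
            rules = registries["scenario"].rules_for(item)        # (b) rules per group
            for case in registries["scenario"].create_cases(item, rules):   # (c) test cases per rule
                case.expected = registries["validation"].expected_for(case)  # (d) expected outcome
                results.append(executor.run(case,                 # (e) scenario, test data, orchestration
                                            registries["test_data"],
                                            registries["orchestration"]))
    pipeline.publish(results)                                     # (f) publish, then return control
    return results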
[063] At step 206 of the method 200, the platform test phase execution controller executed by the one or more hardware processors 104 is configured by the instructions to execute the platform test phase on receiving the trigger from the CI/CD pipeline. Each platform among a plurality of platforms of an enterprise is built on a set of applications among the plurality of applications tested in the component test phase. The platform test phase for each platform comprises the steps of:
a. Identifying a platform associated with the set of applications tested during the component test phase.
b. Obtaining the plurality of scenarios affected by the platform using the package manager data model.
c. Obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing each scenario.
d. Generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on the steps configured for each rule.
e. Defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element.
f. Executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps.
g. Transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[064] At step 208 of the method 200, the enterprise integration test phase execution controller executed by the one or more hardware processors 104 is configured by the instructions to execute the enterprise integration test phase for the enterprise testing on receiving the trigger from the CI/CD pipeline. The enterprise is built on the plurality of platforms.
[065] The enterprise integration test phase for the enterprise testing comprises the steps of:
a. Obtaining the plurality of scenarios affected by the plurality of platforms using the package manager data model.
b. Obtaining for the enterprise the set of rules defined in the scenario registry specifying the sequence of steps for testing each scenario.
c. Generating one or more test cases for each rule among a set of rules for the platform using the API definition catalog, the data definition catalog, and the product catalog, wherein the number of test cases to be used is determined based on the steps configured for each rule.
d. Defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element.
e. Executing each test case by a test executor by identifying an associated scenario from among the plurality of scenarios affected by the plurality of platforms using the scenario registry and referring to the orchestration registry to follow the plurality of orchestration steps.
f. Transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases.
[066] FIG. 10 depicts test phase specific customization enabled by the system of FIG. 1, in accordance with some embodiments of the present disclosure.
[067] FIG. 11 depicts a test case creator flow followed by the system of FIG. 1, in accordance with some embodiments of the present disclosure. The solution looks at all registries and catalogs, such as the scenario registry, orchestration registry, API catalog, validation catalog, and product catalog, and creates test cases and test steps.
[068] The test case creator could be scheduled to refresh on a periodic basis or through a manual trigger.
[069] FIG. 3 through FIG. 11 are better understood by an end-to-end example of enterprise level testing.
Described below are the modules pre-built by the system 100 (also referred to as the TaaC framework or TaaC) serving as prerequisites:
Table 4:
Module Description
UI Inbuilt UI to manage all registries and catalogs of the application
Functions registry (FIG. 3) General The functions are written, and a keyword is automatically assigned (can be renamed) and put into the functions registry along with the description from the comment statement. This keyword needs to be used in the scenario registry and orchestration registry
System function Used to process different key capabilities for operations of the system and in-built automation capabilities
Basic function Common re-usable generic functions across industries such as Parse message, API validation, UI validation etc.
Specific function These are the functions specific to an industry/organization and are highly customizable. This is the key for configuration of the orchestration registry. The functions required for automated creation of pre-requisites come along with the TaaC framework. Any new customizations or new functions to be created go into the specific functions category
Scenario registry (FIG. 6) The scenario registry is empty by default; however, when setting up, the user could select an auto mode that can create scenarios aligned to the industry. The TaaC framework comes with a solution to create these automated rules. For example, if this is to be used for the telecom sector, then the solution is based on the TMForum model
Orchestration registry (FIG. 9) Similar to the scenario registry, this registry is also empty by default and can be auto-populated based on industry standards by selecting 'Industry standard' mode during the initial set up
Test data registry (FIG. 7) Same as above, this registry is also empty by default and can be auto-populated by selecting 'Auto' during the initial set up. The TaaC framework is built with an algorithm to generate test data aligning to an industry-wide strategy or through scanning the existing test environment
Package manager data model, also referred to as package manager The package manager data model template is inbuilt with the TaaC framework. The user needs to configure all the package/function names along with the impacted areas
API definition catalog (FIG. 4) This is empty by default. The TaaC framework comes with the capability to identify the type of specification uploaded, a mechanism to parse and read the specification, and to convert it into the format required for the TaaC framework
Data definition catalog (FIG. 5) This is empty by default. The TaaC framework comes with the capability to identify the type of specification uploaded, a mechanism to parse and read the specification, and to convert it into the format required for the TaaC framework
Validation catalog (FIG. 8A and 8B) This is empty by default. The TaaC framework comes with the capability to generate API, UI, and database validations based on the API definition catalog for APIs, the API definition catalog and element properties for the UI, and the data definition catalog for the database
Test case creator (FIG. 11) This is empty by default. The TaaC framework comes with a solution to create all possible combinations of test cases automatically based on the different registries and catalogs
Test executor The TaaC framework comes with a solution to execute the test cases created by the test case creator
[070] What is the set up to be done before execution of code testing for an enterprise? A telecom industry use case is considered to understand the TaaC framework flow.
[071] Problem statement: An Enterprise dealing with provisioning of broadband and VOIP wants to implement End to End automation for the entire ecosystem.
The enterprise consists of different platforms
• Order provisioning & fulfilment platform
• Network platform (for performing activations)
• Billing platform
• Assurance platform
Say, each platform has one or more components. For example, the Order provisioning & fulfilment platform has the following components:
• A UI for the agents to capture the order from the customer
• Business to Business API services offered for other partners
• Internal API services that integrate the UI and other internal and 3rd party components such as suppliers (basically an integration layer)
• Order orchestration through a BPM driven by database and API
The enterprise intends to have a single automation framework for performing the component test, the platform test (integrating two or more components), and the enterprise test (end-to-end test integrating all the platforms). The enterprise follows the TM Forum (TMF) model and has implemented all the solutions aligned to the TMF model.
[072] Set up required: The first step is to set up the scenario registry using the automatic mode that creates all the different scenarios. For example herein, a subset (two scenarios shown) of the different scenarios automatically created and configured in the scenario registry is provided below.
Table 5:
Scenario Rule Enable
Test API Mandatory validation Y
Dependent mandatory validation Y
Length validation Y
RegEx validation Y
Place new order Orchestration look up Y
Test API:
• Mandatory validation: What is the response if the mandatory element is missing.
• Dependent mandatory validation: What is the response if the API dependent condition is not met.
• Length validation: What is the response if the max length is exceeded, or min length is not met.
• RegEx validation: What is the response if the Regular expressions are violated.
[073] Place new order: The rule configured is ‘Orchestration look up’ and hence the TaaC framework looks up the orchestration registry for the sequence of steps.
1) Set up the orchestration registry by selecting 'Industry standard', i.e., automatic creation of orchestration steps based on the TMF model. From the above, it can be assumed that the TaaC framework automatically creates orchestration registry entries for 'Place new order'. The set up looks like below:
Table 6:
Scenario Step (Function keyword) Step group Scenario rule Platform
Place new order searchAddress 1 Test API Provisioning & Fulfilment
searchAddress 1 Test UI Provisioning & Fulfilment
searchProduct 2 Test API Provisioning & Fulfilment
searchProduct 2 Test UI Provisioning & Fulfilment
selectProductFeature 3 Test API Provisioning & Fulfilment
selectProductFeature 3 Test UI Provisioning & Fulfilment
bookAppointment 4 Test API Provisioning & Fulfilment
bookAppointment 4 Test UI Provisioning & Fulfilment
placeOrder 5 Test API Provisioning & Fulfilment
placeOrder 5 Test UI Provisioning & Fulfilment
placeOrder 5 Test Database Provisioning & Fulfilment
allocateInventory 5 Test API Provisioning & Fulfilment
allocateInventory 5 Test Database Provisioning & Fulfilment
createSupplierOrder 6 Test API Provisioning & Fulfilment
createSupplierOrder 6 Test Database Provisioning & Fulfilment
supplierUpdateAccepted 7 Test API Provisioning & Fulfilment
supplierUpdateAccepted 7 Test Database Provisioning & Fulfilment
supplierUpdateAccepted 7 Test UI Provisioning & Fulfilment
….. ….. …..
activateService 15 Test API Network
activateServiceResponse 15 Test API Network
….. ….. …..
orderCompleted N Test API Provisioning & Fulfilment
orderCompleted N Test Database Provisioning & Fulfilment
orderCompleted N Test UI Provisioning & Fulfilment
Interpreting table 6: The ‘place new order scenario’ has N number of steps happening in a sequence. Each step requires one or more types of testing. For example
searchAddress – The journey starts from the UI, where the customer provides their post code and then searches for and selects an address. When the customer clicks search, the UI calls the underlying API and retrieves matching addresses through the API. So, for this step both API testing and UI testing are required. The group number indicates that it is a single step but requires different tests. The scenario rule points to the scenario registry again. If the 1st record for searchAddress is taken, it says 'Test API', which will pick the Test API rules such as mandatory validation, length validation, etc. from the scenario registry. The function (in the functions registry) pertaining to 'searchAddress' will have the details of which catalogs are to be referred to. In this case, it will need to look at the API definition catalog item 'searchAddress', the validation catalog item 'searchAddress', the test data registry item 'searchAddress', and the logic of how to frame the test step.
The application id refers to the individual application within the group. During component testing, each application is tested; during platform testing, these applications are integrated and tested.
2) Configure all the different components in the package manager data model and map the scenario.
Table 7: Example
Package / Function name Component Platform Scenario
Package A UI Provision & Fulfilment Test UI
Package A UI Provision & Fulfilment Place new order
Package A UI Provision & Fulfilment Terminate service
Package B API Provision & Fulfilment Test UI
Package B API Provision & Fulfilment Test API
Package B API Provision & Fulfilment Place new order
Package B API Provision & Fulfilment Terminate service
Note: This is not the exact model. This is purely for understanding. The data model has multiple tables and will be modelled properly in the actual implementation.
There is package A, which is part of the UI build. If a change happens in package A, then standalone UI testing is required. The screen built using package A is used in the scenarios 'Place new order' and 'Terminate service' and hence these scenarios are required to be tested.
Similarly, package B is part of an API and hence standalone API testing (Test API) is required. Package B is also consumed by the UI package as an underlying API and hence the UI capabilities need to be tested (Test UI). Package B impacts the scenarios 'Place new order' and 'Terminate service' and hence these are also required to be tested.
Likewise the package manager data model must be configured for the entire enterprise.
3) Set up the API definition catalog. For example herein, it is assumed that Swagger is used as the specification. The user uploads the Swagger document. The TaaC framework converts the definitions to the defined format and presents them to the user to make amendments. The user can then add any custom API rules such as dependent mandatory definitions.
4) Set up the data definition catalog. Assume the data definition is in JSON format. The user uploads the file. The TaaC framework identifies all the entities, their relationships, column definitions, key constraints, etc., and creates the specification in the format required for TaaC. The user is then able to make any amendments.
5) Set up the validation catalog: The user is able to create the validation catalog by referring to the API definition catalog and the data definition catalog for API and database validations. For UI validations, the user is required to capture the properties of the UI. This could be achieved through record & playback, as the TaaC framework comes with integration for UI test automation tools. The user also has the option to manually enter the UI properties. The TaaC framework automatically predicts and maps the UI elements to the underlying API elements to perform validations.
6) Create test project – For each new release, a project must be created with the different test phases configured against it. The CI/CD pipeline triggers the project as per the release name.
7) Set up the test phase specific orchestration registry – The user navigates to the project and creates a sub-project for orchestration registry customization. The user then selects one of the test phases.
a. Select platform test – Let us assume that there is a change in the platform 'Provision & fulfilment' and, as part of platform testing, actual integration with the platform 'Network' is not needed. At the same time, the network integration happens midway in the 'Place new order' journey, and hence the steps after the network integration that are part of provisioning & fulfilment need input to proceed. So a stubbed response needs to be set up for the platform 'Network'. The configuration looks as below:
Table 8:
Scenario Step (Function keyword) Step group Scenario rule Platform Required
Place new order searchAddress 1 Test API Provisioning & Fulfilment Y
searchAddress 1 Test UI Provisioning & Fulfilment Y
searchProduct 2 Test API Provisioning & Fulfilment Y
searchProduct 2 Test UI Provisioning & Fulfilment Y
selectProductFeature 3 Test API Provisioning & Fulfilment Y
selectProductFeature 3 Test UI Provisioning & Fulfilment Y
bookAppointment 4 Test API Provisioning & Fulfilment Y
bookAppointment 4 Test UI Provisioning & Fulfilment Y
placeOrder 5 Test API Provisioning & Fulfilment Y
placeOrder 5 Test UI Provisioning & Fulfilment Y
placeOrder 5 Test Database Provisioning & Fulfilment Y
allocateInventory 5 Test API Provisioning & Fulfilment Y
allocateInventory 5 Test Database Provisioning & Fulfilment Y
createSupplierOrder 6 Test API Provisioning & Fulfilment Y
createSupplierOrder 6 Test Database Provisioning & Fulfilment Y
supplierUpdateAccepted 7 Test API Provisioning & Fulfilment Y
supplierUpdateAccepted 7 Test Database Provisioning & Fulfilment Y
supplierUpdateAccepted 7 Test UI Provisioning & Fulfilment Y
….. ….. ….. ….. ….
activateService 15 Test API Network Stubbed
activateServiceResponse 15 Test API Network Stubbed
….. ….. ….. …… …..
orderCompleted N Test API Provisioning & Fulfilment Y
orderCompleted N Test Database Provisioning & Fulfilment Y
orderCompleted N Test UI Provisioning & Fulfilment Y
The actual data model includes the URL for stub location as well.
b. Select Enterprise test – Let us assume that there is a change in the platform 'Provision & fulfilment'. Because this is an end-to-end test for the entire enterprise, it is required to run the test integrating with all platforms, and hence the data is from the actual systems and not from the stub. The field 'Required' says 'Y' for all the steps.
Note: There is no need to select Component test as it is a standalone application testing that does not require orchestration and is driven by the scenario registry itself.
8) Set up the test data registry – In our example, the enterprise implementation is based on the TMF model, and hence let us assume that the test data is to be created automatically based on the industry standard. The TaaC framework comes with an inbuilt algorithm to generate test data and does so for each scenario. While setting up test data, the user could select a specific scenario or select all and generate. Every time a change to any project happens, this automatically refreshes at the backend. Basically, this creates test data for every function categorized as specific within the functions registry (as specific functions are nothing but primitive steps).
9) Set up resource inventory – The resource inventory needs to be populated with access details such as credentials, URL, etc., for each function configured as a specific function within the functions registry. The set-up allows configuring different test environments and assigning them to different test phases, i.e., there could be a dedicated test environment for component testing, a dedicated one for platform testing, and so on.
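As a rough illustration of this set-up, the snippet below keys access details by test phase and by specific-function keyword; all URLs, credential references, and the lookup_resource helper are placeholders assumed for the example.

```python
# Illustrative resource inventory layout: environment access details keyed by
# test phase and specific-function keyword. Values are placeholders, not
# details from the disclosure.
resource_inventory = {
    "component": {
        "searchAddress": {
            "url": "https://component-env.example.internal/address/v1/search",
            "credential_ref": "vault://component/searchAddress",
        },
    },
    "platform": {
        "searchAddress": {
            "url": "https://platform-env.example.internal/address/v1/search",
            "credential_ref": "vault://platform/searchAddress",
        },
    },
}

def lookup_resource(test_phase: str, function_keyword: str) -> dict:
    """Fetch access details for a specific function in the given test phase."""
    return resource_inventory[test_phase][function_keyword]
```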
10) Set up product catalogue – The enterprise products and features are modelled based on the enterprise product catalogue. This is required to be configured by the user as the products and features vary across organizations.
11) Set up test case creator – This is test phase specific, and hence the project needs to be set up upfront. The user selects the master project for the release and creates a new test case project for each test phase:
a. Select component test – Just select all shown components such as API, UI, Database and submit.
b. Select Platform test – Select a specific scenario or select all scenarios, then select the appropriate orchestration registry project (Step 7(a)) and submit.
c. Select Enterprise test – Select a specific scenario or select all scenarios, then select the appropriate orchestration registry project (Step 7(b)) and submit.
For each of these test phases, the TaaC generates N×N possible combinations of test cases along with test steps.
For component test, the test creation is based on the type of component. For example, for an API (an illustrative sketch follows the list below):
• generate test cases for mandatory rules present and missing
• generate test cases for dependent mandatory rules present and missing
• generate test cases for Length and Regex validations
• generate test cases for optional elements
• generate test cases for success response with mandatory elements missing in the response
• generate test cases for success response with length / Regex deviations in the response
• generate test cases for success response with optional elements present, optional elements missing in the response
• Alongside creating a test case, the test steps are created by looking at the validation catalog
• For different mandatory elements present / missing, look up the appropriate record in the validation catalog and define the expected response in the test step
• For different dependent mandatory elements present / missing, look up the appropriate record in the validation catalog and define the expected response in the test step
• For different Length and Regex validations, look up the appropriate record in the validation catalog and define the expected response in the test step
• Note: The wordings are templated with a dynamic variable set which is populated at run time based on the values retrieved from the API definition catalog and the validation catalog.
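The following minimal sketch illustrates one slice of this generation, creating a test case per missing mandatory element and attaching the expected outcome looked up from the validation catalog; the function name, element names, and catalog contents are assumed purely for illustration.

```python
# Hedged sketch of the component-test generation described above: enumerate
# "mandatory element missing" variants for an API and attach the expected
# outcome from a hypothetical validation catalog.
api_definition = {
    "function": "placeOrder",
    "mandatory": ["customerId", "productId", "appointmentId"],
}
validation_catalog = {
    # expected outcome configured against each element when it is missing
    ("placeOrder", "customerId"): {"code": 400, "message": "customerId is required"},
    ("placeOrder", "productId"): {"code": 400, "message": "productId is required"},
    ("placeOrder", "appointmentId"): {"code": 400, "message": "appointmentId is required"},
}

def generate_mandatory_missing_cases(definition: dict, catalog: dict) -> list:
    """One test case per omitted mandatory element, with its expected response."""
    cases = []
    for element in definition["mandatory"]:
        expected = catalog[(definition["function"], element)]
        cases.append({
            "test_case": f"{definition['function']} with mandatory element '{element}' missing",
            "omit": element,
            "expected_code": expected["code"],
            "expected_message": expected["message"],
        })
    return cases

for case in generate_mandatory_missing_cases(api_definition, validation_catalog):
    print(case)
```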
[074] For Platform and Enterprise tests, the test steps are created based on the orchestration registry, and each step points to the entry in the orchestration registry to enable execution in sequence. A sample automatically created test case with test steps for an enterprise test looks as follows:
Table 9:
Test ID: TC_001
Test case: Place an order for product X with feature combination A, B, C and complete it end to end
Test steps:
• Search address using the postcode ‘AL1 3UL’
• Select a random address.
• Perform product availability check and see if product X is available.
• Get product catalogue to select the features A, B, C to order
• Get an available appointment.
• Book engineer appointment for a random date and time slot
• Place the order
• Verify the inventory allocation
• Submit the order to the supplier
• Validate acknowledgement from supplier
• Send acknowledgement to customer
• Validate order commitment from supplier
• Send commitment to customer
• Validate successful service activation
• Validate order completion
• Send order completion to customer
• Send billing activation to billing system

Test ID: TC_002
Test case: Place an order for product X with feature combination A, B, C where product X is unavailable for the selected address
Test steps:
• Search address using the postcode ‘M24 1DZ’
• Select an address.
• Perform product availability check where no product is available to order for the selected address and validate the response code.
Note: The test cases could be re-used from an already created dump or could be created fresh every time.
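An assumed in-memory shape for such an automatically created test case is sketched below, where each step carries the function keyword and step group used to resolve the corresponding orchestration registry entry at run time; the field names are illustrative, not the framework's actual schema.

```python
# Illustrative shape of an automatically created enterprise test case. Step
# wording is templated, as noted above; the texts below are examples only.
tc_001 = {
    "test_id": "TC_001",
    "test_case": "Place an order for product X with feature combination A, B, C "
                 "and complete it end to end",
    "steps": [
        {"step_group": 1, "keyword": "searchAddress",
         "text": "Search address using the postcode 'AL1 3UL'"},
        {"step_group": 4, "keyword": "bookAppointment",
         "text": "Book engineer appointment for a random date and time slot"},
        {"step_group": 5, "keyword": "placeOrder", "text": "Place the order"},
        # ... remaining steps follow the orchestration registry sequence
    ],
}
```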
[075] The execution flow of the system 100 (method 200) is as follows (an illustrative sketch of the pipeline hand-off appears after the list):
1) The developer commits the code
2) The CI / CD pipeline performs build activities, deploys the build to the component test environment, and calls the TaaC (TaaC framework) by passing inputs such as the impacted package list, the test phase as component testing, and the release number. Assume Package A and Package B are the impacted packages.
3) The TaaC then picks the project matching the release number and all sub projects related to the component test phase.
4) The TaaC reviews the impacted package list with the package manager, which returns the mapping between the packages, primitive functions, and the component category such as UI or API.
5) Assuming test creation and other set-up are done upfront, the TaaC calls the test executor. The test executor picks the test cases created under the component test for the specified release, picks only the impacted APIs and UI screens using the execution controller, executes them, and validates the results against the expected results.
6) On successful completion, the control goes back to the CI / CD pipeline for deployment into the platform test environment. On approval to trigger the platform test, the CI / CD pipeline again calls the TaaC and passes the required inputs along with the test phase as Platform test.
7) The TaaC then picks the project matching the release number and all sub projects related to the Platform test phase.
8) The TaaC reviews the impacted package list with the package manager, which returns the mapping between the packages, primitive functions, and the scenarios.
9) The TaaC then calls the test executor and executes the already created test cases for the impacted scenarios. Wherever the orchestration registry marks a step as stubbed, the call is made to the stub, and the stubbed response is retrieved and passed on to the next step.
10) On successful completion, the control goes back to the CI / CD pipeline for deployment into the Enterprise test environment. On approval to trigger the Enterprise test, the CI / CD pipeline again calls the TaaC and passes the required inputs along with the test phase as Enterprise test.
11) The TaaC then picks the project matching the release number and all sub projects related to the Enterprise test phase.
12) The TaaC reviews the impacted package list with the package manager, which returns the mapping between the packages, primitive functions, and the scenarios.
13) The TaaC then calls the test executor and executes the already created test cases for the impacted scenarios. Wherever the orchestration registry marks a step as stubbed, the call is made to the stub, and the stubbed response is retrieved and passed on to the next step.
14) On successful completion, the control goes back to the CI / CD pipeline.
Note: At the end of each phase, the results are stored in the selected test management tool for each test case and test step.
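The following sketch, in Python and purely illustrative, mirrors the hand-off above: the CI / CD pipeline calls the TaaC with the impacted package list, test phase, and release number, and the executor routes stubbed steps to their stub URL (re-using the OrchestrationEntry shape sketched earlier). The endpoint path, payload field names, and timeouts are assumptions, not part of the disclosure.

```python
import requests

def trigger_taac(taac_url: str, release: str, test_phase: str, impacted_packages: list[str]):
    """Assumed CI/CD hand-off: pass impacted packages, test phase, and release to TaaC."""
    payload = {
        "release": release,                     # e.g. the release number for the project
        "testPhase": test_phase,                # "component" | "platform" | "enterprise"
        "impactedPackages": impacted_packages,  # e.g. ["Package A", "Package B"]
    }
    return requests.post(f"{taac_url}/executions", json=payload, timeout=30)

def execute_step(entry, live_url: str, request_body: dict) -> dict:
    """Call the stub when the orchestration registry marks the step as stubbed."""
    target = entry.stub_url if entry.required.lower() == "stubbed" else live_url
    response = requests.post(target, json=request_body, timeout=30)
    return response.json()  # passed on as input to the next step in the sequence
```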
[076] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[077] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means, and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[078] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[079] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[080] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[081] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.
Claims:
We Claim:
1. A processor implemented method, the method comprising:
determining (202) via one or more hardware processors, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions received from the CI/CD pipeline, wherein the scenarios are determined by a package manager data model executed by one or more hardware processors that stores modelled relationship between packages and functions of each of the plurality of applications, wherein the impacted packages and function list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and the database (DB) tables;
executing (204), by the one or more hardware processors, a component test phase on receiving a trigger from the CI/CD pipeline for the component test phase, the component test phase for each application among the plurality of applications comprising steps of:
segregating the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database;
obtaining for each of the plurality of groups a set of rules defined in the scenario registry, wherein the set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables;
generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables, wherein the number of test cases to be created is based on steps configured for each rule in the scenario registry;
defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response, wherein an element is part of XML or JSON;
executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, associated test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases;
executing (206), by the one or more hardware processors, a platform test phase to test each platform among a plurality of platforms of an enterprise on receiving the trigger from the CI/CD pipeline, wherein each platform is built using a set of applications among the plurality of applications tested in the component test phase, the platform test phase for each platform comprising steps of:
identifying a platform associated with the set of applications tested during the component test phase;
obtaining the plurality of scenarios affected by the platform using the package manager data model;
obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing of each scenario;
generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on steps configured for each rule;
defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element;
executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases; and
executing (208), by the one or more hardware processors, an enterprise integration test phase for the enterprise on receiving the trigger from the CI/CD pipeline, wherein the enterprise is built on the plurality of platforms.
2. The method as claimed in claim 1, wherein the enterprise integration test phase comprises steps of:
obtaining the plurality of scenarios affected by the plurality of platforms using the package manager data model;
obtaining for each of the plurality of platforms, the set of rules defined in the scenario registry defining the sequence of steps for testing of each scenario;
generating one or more test cases for each rule among a set of rules for the platform using the API definition catalog, data definition catalog and product catalog, wherein the number of test cases to be used is determined based on steps configured for each rule;
defining an expected outcome for each of the one or more test cases based on the validation catalog that stores expected outcome and message configured against each element; and
executing each test case by a test executor by identifying an associated scenario from among the plurality of scenarios affected by the plurality of platforms using the scenario registry and referring to the orchestration registry to follow the plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the one or more test cases.
3. The method as claimed in claim 1, comprising prompting a user to provide inputs for building a plurality of modules based on the inputs and extracting information from external sources, the plurality of modules comprising:
a functions registry based on a plurality of functions selected by a user, the plurality of functions comprising a system function category, a basic function category, and a specific function category, wherein each function is identified by a keyword identifier stored in the function registry and used by the scenario registry and the orchestration registry, and wherein
the system function category is used to process different key capabilities for operations of an enterprise and in-built automation capabilities,
the basic function category provides common re-usable generic functions across industries comprising Parse message, API validation, and UI validation, and
the specific function category comprises the functions specific to an Industry or Organization and are highly customizable, and is used for configuration of the orchestration registry;
an API definition catalog with capability to understand a type of API specification provided by the user and parse the type of API specification to identify the elements, path, rules for the elements;
the database definition catalog to understand the type of data model including the relationship between the entities, the definition of each entity with the column definition such as data type, length, and constraints;
the scenario registry comprising a plurality of scenarios and populated predefined rules for each of the plurality of scenarios, wherein the plurality of scenarios are created automatically based on type of industry received as input from the user;
the test data registry that generates test data based on a test data generation technique aligning to industry definition;
the validation catalog that performs API validation, UI validation and database validation based on prepopulated business errors and technical errors;
the orchestration registry comprising industry specific orchestration steps; and
a product catalog defining enterprise bundles with offered features.
4. A system (100) comprising:
a memory (102) storing instructions;
one or more Input/Output (I/O) interfaces (106); and
one or more hardware processors (104) coupled to the memory (102) via the one or more I/O interfaces (106), wherein the one or more hardware processors (104) are configured by the instructions to:
determine, for a committed code of each of a plurality of applications received from a continuous integration and continuous deployment (CI/CD) pipeline, a plurality of scenarios from a scenario registry for testing an impacted list comprising impacted packages and functions received from the CI/CD pipeline, wherein the scenarios are determined by a package manager data model executed by one or more hardware processors that stores modelled relationship between packages and functions of each of the plurality of applications, wherein the impacted packages and function list comprises Application Programming Interfaces (APIs), User Interface (UI) screens, and the database (DB) tables;
execute a component test phase on receiving a trigger from the CI/CD pipeline for the component test phase, the component test phase for each application among the plurality of applications comprising steps of:
segregating the impacted list of each application into a plurality of groups based on whether the packages and functions impact the APIs, the UI screens, or the database;
obtaining for each of the plurality of groups a set of rules defined in the scenario registry, wherein the set of rules define a sequence of steps to be executed for testing the determined plurality of scenarios for each of the impacted APIs, impacted UIs, and impacted DB tables;
generating one or more test cases for each rule among a set of rules for each of the impacted APIs, impacted UIs, and impacted DB tables, wherein the number of test cases to be created is based on steps configured for each rule in the scenario registry;
defining an expected outcome for each of the test cases based on a validation catalog that stores expected outcome code and message configured against each element in the request/response, wherein an element is part of XML or JSON;
executing each test case by a test executor by identifying an associated scenario from the determined plurality of scenarios, associated test data from a test data registry and referring to an orchestration registry to follow a plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases;
execute a platform test phase to test each platform among a plurality of platforms of an enterprise on receiving the trigger from the CI/CD pipeline, wherein each platform is built using a set of applications among the plurality of applications tested in the component test phase, the platform test phase for each platform comprising steps of:
identifying a platform associated with the set of applications tested during the component test phase;
obtaining the plurality of scenarios affected by the platform using the package manager data model;
obtaining for each platform the set of rules defined in the scenario registry specifying the sequence of steps for testing of each scenario;
generating one or more test cases for each rule among a set of rules for the platform, wherein the number of test cases to be used is determined based on steps configured for each rule;
defining an expected outcome for each of the test cases based on the validation catalog that stores expected outcome and message configured against each element;
executing each test case by a test executor by identifying the associated scenario from among the plurality of scenarios affected by the platform using the scenario registry, corresponding test data from the test data registry and referring to the orchestration registry to follow the plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the test cases; and
execute an enterprise integration test phase for the enterprise on receiving the trigger from the CI/CD pipeline, wherein the enterprise is built on the plurality of platforms.
5. The system (100) as claimed in claim 4, wherein the one or more hardware processors are configured to perform the enterprise integration test phase by:
obtaining the plurality of scenarios affected by the plurality of platforms using the package manager data model;
obtaining for each of the plurality of platforms, the set of rules defined in the scenario registry defining the sequence of steps for testing of each scenario;
generating one or more test cases for each rule among a set of rules for the platform using the API definition catalog, data definition catalog and product catalog, wherein the number of test cases to be used is determined based on steps configured for each rule;
defining an expected outcome for each of the one or more test cases based on the validation catalog that stores expected outcome and message configured against each element; and
executing each test case by a test executor by identifying an associated scenario from among the plurality of scenarios affected by the plurality of platforms using the scenario registry and referring to the orchestration registry to follow the plurality of orchestration steps; and
transferring control back to the CI/CD pipeline post publishing the test results of each of the one or more test cases.
6. The system (100) as claimed in claim 4, wherein the one or more hardware processors (104) are configured to prompt a user to provide inputs for building a plurality of modules based on the inputs and extracting information from external sources, the plurality of modules comprising:
a functions registry based on a plurality of functions selected by a user, the plurality of functions comprising a system function category, a basic function category, and a specific function category, wherein each function is identified by a keyword identifier stored in the function registry and used by the scenario registry and the orchestration registry, and wherein
the system function category is used to process different key capabilities for operations of an enterprise and in-built automation capabilities,
the basic function category provides common re-usable generic functions across industries comprising Parse message, API validation, and UI validation, and
the specific function category comprises the functions specific to an Industry or Organization and are highly customizable, and is used for configuration of the orchestration registry;
an API definition catalog with capability to understand a type of API specification provided by the user and parse the type of API specification to identify the elements, path, rules for the elements;
the database definition catalog to understand the type of data model including the relationship between the entities, the definition of each entity with the column definition such as data type, length, and constraints;
the scenario registry comprising a plurality of scenarios and populated predefined rules for each of the plurality of scenarios, wherein the plurality of scenarios are created automatically based on type of industry received as input from the user;
the test data registry that generates test data based on a test data generation technique aligning to industry definition;
the validation catalog that performs API validation, UI validation and database validation based on prepopulated business errors and technical errors;
the orchestration registry comprising industry specific orchestration steps; and
a product catalog defining enterprise bundles with offered features.
Dated this 25th Day of October 2023
Tata Consultancy Services Limited
By their Agent & Attorney
(Adheesh Nargolkar)
of Khaitan & Co
Reg No IN-PA-1086