Abstract: A computer-implemented system and method for automated functional test generation is disclosed. The proposed system utilizes business process models, which depict an application's functionality without going into implementation details, for test case generation and eliminates the continuous dependency upon domain experts for ensuring test quality and coverage. Further, the system provides automated tools to test design teams to specify tests, structure the specification and verify its correctness and coverage. Moreover, the proposed system eliminates the manual, effort-intensive tasks of test data preparation and test script creation and also manages test script maintenance to keep scripts updated each time there is a change in the application. Thereby, the proposed system provides an end-to-end approach from test design to automation, usable by business analysts and test teams alike.
FORM-2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
A COMPUTER-IMPLEMENTED SYSTEM FOR AUTOMATED FUNCTIONAL TEST GENERATION
TATA CONSULTANCY SERVICES LTD.,
an Indian Company
of Nirmal Building, 9th Floor,
Nariman Point, Mumbai - 400021,
Maharashtra, India
The Following Specification Particularly Describes The Invention And The Manner In Which
It Is To Be Performed
FIELD OF DISCLOSURE
The present disclosure relates to the field of quality assurance.
Particularly, the present disclosure relates to the field of functional testing of business applications.
DEFINITIONS OF TERMS USED IN THE SPECIFICATION
The term 'business process model' in this specification relates to a modeling technique that captures activities representing the operational workflows followed by an enterprise in carrying out its business.
The term 'data model' in this specification relates to a modeling technique that captures the various entities, their attributes, business rules and associations for an application under test or a particular domain.
The term 'functional testing' in this specification relates to a type of software testing in which the application under test is divided into scenarios/parts based on the process flows, and each scenario/part is tested by giving relevant input data to check whether the scenario/part produces the required output/response.
The term 'process flow' in this specification relates to events that occur during the operation of a process model.
The term 'pruning' in this specification relates to the selection of specific test cases from a test suite in order to limit the test coverage so as to meet predetermined conditions such as time and effort cost constraints.
The term 'use-case' in this specification is used to represent a sequence of User Interface (UI) actions required to perform a particular task. These UI actions are captured from the UI screens of an application under test.
BACKGROUND
Functional testing is a type of testing technique performed at several stages during the development and maintenance lifecycle of an application, such as application testing, integration testing and user acceptance testing, to evaluate the application's compliance with its functional requirements. Functional testing is typically performed by generating test suites that are configured to test if applications are meeting predetermined requirements.
In the current scenario, there is no fixed technique for building the content of functional test suites. Individual project teams define their own test methodology and test cases. The quality and coverage of these test suites is completely dependent upon the availability of expertise, which is further restricted by time.
The onus of building a test suite that achieves functional coverage including special cases, and of optimizing these test cases, is completely on the test design team, which currently does not have the support of any tools for ensuring coverage and optimization of the test suites. Commercially available tools offer test management and an automation platform but do not generate test content.
Moreover, tests are conventionally captured in document form, which is difficult to go through and maintain for large or complex applications. Further, it is hard to track coverage and check which functionality has been covered under which set of conditions from documents, specifically when there is a change in the application. Thus, document-based tests, their variations and selection need to be managed manually. There is no automated mechanism to verify the test documents against the requirements.
Therefore, there is felt a need for a system that:
• eliminates the manual test design process as well as the dependency upon
domain experts for ensuring test quality and coverage;
• provides tools to help test design teams to specify tests, structure the
specification and verify it for correctness, coverage or efficiency;
• eliminates the manual, effort-intensive tasks of test data preparation and
test script creation; and
• manages test script maintenance to keep scripts updated each time there is
a change in the application.
OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows:
An object of the present disclosure is to provide a system that eliminates the manual process of test design generation.
Another object of the present disclosure is to provide a system that eliminates continuous dependency upon domain experts for ensuring test quality and coverage.
Still another object of the present disclosure is to provide a system that facilitates in specifying tests, structuring the specification and verifying the tests for correctness, coverage or efficiency.
Yet another object of the present disclosure is to provide a system that eliminates the manual, effort-intensive tasks of test data preparation and test script creation.
One more object of the disclosure is to provide a system that manages test script maintenance to keep scripts updated, each time there is a change in the application.
Other objects and advantages of the present disclosure will be more apparent from the following description when read in conjunction with the accompanying figures, which are not intended to limit the scope of the present disclosure.
SUMMARY
In accordance with this disclosure, there is envisaged a computer-implemented system for automated functional test generation for an Application Under Test (AUT), the system comprising:
- a modeling unit for capturing business process flows along with an associated data model for the AUT, wherein the process flows are annotated with constraints;
- a test case generation unit for traversing and verifying the business process flows and solving the constraints to generate a plurality of test cases and further pruning the test cases by employing preliminary heuristics based techniques; and
- a test script generation unit to collate a recorded sequence of use-cases corresponding to each of the test cases to generate test scripts for automating the functional testing of the AUT.
Typically, the modeling unit comprises:
- modeling tools including Business Process Modeling Notation (BPMN) tool to capture the business process flows in the form of visual notations and UML Class Diagram notation tool to capture attributes in the data model;
- translation means to translate business rules and test conditions associated with the captured business process flow into constraints,
wherein the constraints are represented using Object Constraints Language notation; and
- annotating means to annotate the constraints to the corresponding
captured business process flows.
Preferably, the modeling unit is still further configured to:
- receive and reuse a set of business process flows as a part of the captured business process flows in the event that they are already created in the requirement phase of Software Development Lifecycle;
- accept manually translated business rules and test conditions associated with the captured business process flows in the form of constraints represented using Object Constraints Language notation; and
- facilitate manual annotation of the constraints to the corresponding captured business process flows.
Further, the test case generation unit comprises:
- path generation means to generate test scenarios using a depth-first search technique and further perform loop unrolling in the event that the process flow includes loops, wherein each scenario is represented as a sequence of task tuples and each task tuple consists of a task and its specified pre and post condition;
- verification means to convert each test scenario into a specification using Symbolic Analysis Laboratory tool and verify the test scenarios against the corresponding constraints to detect inconsistencies and eliminate infeasible scenarios;
- test data generation means adapted to generate test data for each of the scenarios by solving specified pre and post conditions for each task under a test scenario; and
- compilation means adapted to compile the test scenarios and their corresponding data into test cases, in the document form.
Still further, the test script generation unit further receives a mapping file to facilitate in test script generation, wherein the mapping file includes a mapping between attributes in the domain model and input fields specified in the use-cases.
Furthermore, the system comprises a central repository for storing data generated corresponding to an AUT, wherein the data is selected from the group consisting of the business process flows, the data model, the constraints, the test cases and the test scripts.
In accordance with this disclosure, there is provided a method for automated functional test generation for an Application Under Test (AUT), the method comprising the following steps:
- generating and capturing business process flows along with an associated data model for the AUT;
- annotating the process flows with constraints;
- traversing and verifying the business process flows and solving the constraints to generate a plurality of test cases;
- pruning the test cases by employing preliminary heuristics based techniques; and
- collating a recorded sequence of use-cases corresponding to each of
the test cases to generate test scripts for automating the functional
testing of the AUT.
Typically, the step of annotating the process flows with constraints includes the step of translating business rules and test conditions associated with a business process flow into the constraints using Object Constraints Language.
Preferably, the step of traversing and verifying the business process flows includes the following steps:
- generating test scenarios by traversing the business process flows
using a depth-first search technique, wherein each scenario is
represented as a sequence of task tuples and each task tuple consists of
a task and its specified pre and post condition;
- performing loop unrolling in the event that the business process flow includes loops; and
- converting each test scenario into a specification using Symbolic
Analysis Laboratory tool and verifying the test scenarios against the
corresponding constraints to detect inconsistencies and eliminate
infeasible scenarios.
Additionally, the step of solving the constraints to generate a plurality of test cases includes the steps of generating test data for each of the scenarios by solving the specified pre and post conditions for each task under a test scenario; and compiling the test scenarios and their corresponding data into test cases in the document form.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The computer-implemented system and method for automated functional test generation will now be described with reference to the non-limiting, accompanying drawings, in which:
FIGURE 1 illustrates a block diagram of the system for automated functional test generation in accordance with this disclosure; and
FIGURE 2 is a flowchart showing the steps involved in automated functional test generation in accordance with this disclosure.
DETAILED DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The computer-implemented system and method for automated functional test generation will now be described with reference to the accompanying drawings which do not limit the scope and ambit of the disclosure. The description provided is purely by way of example and illustration.
Today, the test design process is completely manual and dependent upon domain experts for ensuring test quality and coverage. There are no tools to help test design teams to specify tests, structure the specification or to verify it for correctness, coverage or efficiency. In addition, test data preparation and test script creation are both manual, effort-intensive tasks and require a team with platform programming skills for script creation. Further, test script maintenance is another major challenge since scripts need to change each time there is a change in the application.
To overcome the aforementioned challenges, the present disclosure envisages a computer-implemented functional test generation system that automates the test design and creation process and provides an end-to-end approach from test
design to automated script generation, which is usable by business analysts and test teams alike.
In accordance with this disclosure, the proposed system employs a business process model based test design and script generation technique. The business process model based technique enables creation of good quality functional test suites, guarantees functional coverage and enables efficient testing. Sample test data and scripts to automate test execution are also generated by the present disclosure. Thereby, the system proposed by the present disclosure eliminates the manual, effort-intensive tasks of test design, test data creation and script generation, and ensures easy enhancement and maintainability of the scripts. The present disclosure aims at providing a system for facilitating testing of applications from the business users' point of view.
In accordance with this disclosure, the functionality of the Application Under Test (AUT) is captured in the form of a business process model along with an associated domain data model for the AUT. The business process model can be easily specified by business analysts and does not need technical skills. In addition, test conditions and business rules can be specified as part of the model in the form of constraints using a Business Process Modeling Notation tool. This tool is a visual notation tool and provides a structured model form to the test design. Moreover, the visual notations make it easy to go through process flows and build in the missing portions, enabling creation of a test suite that achieves good functional coverage.
Further, the proposed system generates the test cases, in the document form, by traversing the business process model, wherein the business process flows are annotated with constraints including business rules and test conditions. Test data is also generated by this disclosure, by solving the test condition and business rule constraints. Moreover, the present disclosure verifies the traversed paths using model checking to ensure that redundant, erroneous or infeasible paths are eliminated.
Still further, to facilitate the process of test script generation, the present disclosure captures the application UI details by a one-time recording of user tasks. The end-to-end test scripts are then generated for the test cases onto an automation platform. Thus, the present disclosure facilitates automatic generation of test case documents and test scripts from these business process models.
The present disclosure also performs the task of test selection. Test selection is performed by this disclosure when exhaustive test coverage may not be desirable due to time and effort cost constraints. Hence, selection is provided for test optimization.
Furthermore, the present disclosure employs a top-down approach to design tests, beginning with the high-level process model as opposed to having just use cases, which are at the leaf level. This helps developers to be cognizant of the various usage scenarios upfront, avoiding late detection of gaps and defects when end-to-end scenario testing is done for the first time during the application test. The approach offers some independence from team configuration changes over time, since a lot of the necessary domain knowledge and test specifications are captured in model form. It offers the possibility of reuse across applications.
Referring to the accompanying drawings, FIGURE 1 shows a block diagram of the computer-implemented system 100 for model based automated functional test generation. The system 100 comprises three main components including a modeling unit 102, a test case generation unit 112 and a test script generation unit 122. In addition, the proposed system 100 comprises a central repository 110 for storing data generated corresponding to an AUT, wherein the data includes the
business process flows, the data model, the constraints, the test cases and the test scripts.
The modeling unit 102 captures business process flows along with an associated data model for the AUT, wherein the process flows are annotated with constraints. The modeling unit 102 is configured to receive and reuse a set of business process flows in the event that they are already created in the requirement phase of the Software Development Lifecycle (SDLC). The present disclosure proposes the use of business process flows since it is intuitive to think of functional testing as testing the business process flows of the system.
The modeling unit 102 comprises modeling tools 104 to capture the domain knowledge from the experts in the form of business process flows and the domain data. The modeling tools 104 include a Business Process Modeling Notation (BPMN) tool to capture and represent the business process flows in the form of visual notations. The advantage of using the BPMN tool is that the notation of this tool is understood by all stakeholders, whether requirement analysts, software developers, managers or end-users. Thus, the BPMN tool eliminates the dependency on skilled programmers for designing the tests.
Further, the modeling tools 104 include a UML Class Diagram notation tool to capture the data model for the AUT. The UML Class Diagram notation tool captures the various entities, their attributes and associations. The present disclosure only captures the entities from the domain model and not the classes that are created during the application design and implementation phase of the SDLC.
The modeling unit 102 further comprises translation means 106 to translate business rules and test conditions associated with a business process flow into
constraints, wherein the constraints are represented using Object Constraints Language (OCL) notation. Alternatively, the translation means 106 can receive manually translated business rules and test conditions associated with a captured business process flow in the form of constraints represented using OCL. The constraints are then annotated to the corresponding business process flows by annotating means 108. Like the translation means 106, the annotating means 108 also facilitates manual annotation of the constraints to the corresponding captured business process flows.
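Purely as a hedged illustration (the entity and attribute names here are hypothetical and not taken from the disclosure), a business rule such as "an order total must never be negative" could be translated into an OCL constraint of the form: context Order inv: self.total >= 0.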
The business process flows and the constraints captured by the modeling unit 102 are stored in the central repository 110 and forwarded to a test case generation unit 112. The test case generation unit 112 traverses and verifies the business process flows and solves the constraints to generate a plurality of test cases and further prunes the test cases by employing preliminary heuristics based techniques.
The test case generation unit 112, for performing the aforementioned functions, includes path generation means 114 which generates test scenarios by traversing the business process flows. The path generation means 114 uses branch coverage as the criterion for test scenario generation: all paths in the business process flow are covered using a depth-first search, and each path or branch becomes one test scenario for the AUT.
In accordance with this disclosure, each test scenario is a sequence of Task tuples, each Task tuple consisting of the Task and its specified pre and post conditions. For instance,
Test suite T = {S1, S2, ..., Sn}, where S1, S2, ..., Sn are the scenarios;
Si = {t1, t2, ..., tn}, where t1, t2, ..., tn are the Task tuples in scenario Si; and
tj = {prej, Tj, postj}, where prej, Tj and postj are the pre-condition, Task body and post-condition in task tuple tj.
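Purely by way of illustration, and not as part of the claimed subject matter, this scenario/task-tuple structure could be sketched in Python as follows (all names and the example conditions are hypothetical):

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TaskTuple:
        pre: str    # pre-condition prej, held here as a constraint expression in text
        task: str   # Task body Tj (name of the business task)
        post: str   # post-condition postj

    # A scenario Si is a sequence of Task tuples; a test suite T is a list of scenarios.
    suite: List[List[TaskTuple]] = [
        [TaskTuple(pre="balance >= amount", task="WithdrawCash",
                   post="balance = balance@pre - amount")],
    ]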
The path generation means 114 further performs loop-unrolling by traversing each loop exactly once in each path in the event that the business process flow includes loops.
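As a minimal sketch of this traversal, assuming (hypothetically) that the process flow is held as an adjacency-list graph, a depth-first enumeration that unrolls each loop exactly once could look like the following; the function and variable names are illustrative only:

    def enumerate_paths(flow, start, end):
        # Depth-first enumeration of paths from start to end. A node may be
        # visited at most twice on any one path, i.e. each loop is unrolled once.
        paths = []

        def dfs(node, path, visits):
            count = visits.get(node, 0)
            if count >= 2:
                return
            visits = {**visits, node: count + 1}
            path = path + [node]
            if node == end:
                paths.append(path)
                return
            for successor in flow.get(node, []):
                dfs(successor, path, visits)

        dfs(start, [], {})
        return paths

    # Example: a flow with a loop B -> C -> B and an exit C -> D.
    flow = {"A": ["B"], "B": ["C"], "C": ["B", "D"]}
    print(enumerate_paths(flow, "A", "D"))
    # [['A', 'B', 'C', 'B', 'C', 'D'], ['A', 'B', 'C', 'D']]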
The generated test scenarios are stored in the central repository 110 and are sent for verification to verification means 116 of the test case generation unit 112. The verification means 116 employs model checking to verify the process specification against the business rules, detect inconsistencies in the process flow itself and eliminate infeasible paths to optimize the number of scenarios generated.
In accordance with this disclosure, the verification means 116 converts each generated test scenario into a specification using the Symbolic Analysis Laboratory (SAL) tool, where the pre and post conditions are translated into logic expressions. The verification means 116 verifies the specification for consistency. The {pre-condition, Task specification, post-condition} triples are also verified to check that they are consistent with one another at each Task level, and also across the entire sequence of Tasks that make up the scenario, for the verification means 116 to be able to find a path. Business rules are verified by the verification means 116 amongst themselves for consistency. Pre and post conditions in a path are also verified against the business rules to ensure that there are no conflicts between them. When SAL fails to find a solution for any specification, there is a conflict. The cause of the conflict can be an error in the specification, or the conditions within the scenario may conflict, i.e. the path is invalid and can be eliminated. An error in the specification can be either a business rule being violated by a condition/action, or the conditions in the scenario conflicting among themselves. These errors are reported by SAL so that they can be manually rectified. Solutions are then generated for all valid scenarios by the verification means 116.
The optimized scenarios are then passed to test data generation means 118 for generating test data for each of the scenarios by solving the specified pre and post conditions for each task under a test scenario. In accordance with this disclosure, the test data (input) generated for each Task not only satisfies the pre-conditions for that Task but also those of subsequent Tasks in the flow, so that the AUT is able to execute the entire scenario. Also, the test data generation means 118 takes into account the changes brought about by the preceding Task(s), i.e. from the post-state of the previous Task. These criteria are naturally met by the test data generation means 118 using the constraint solving approach, whereas in the conventional approach, they have to be ensured manually.
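By way of a hedged illustration only: the disclosure names the Symbolic Analysis Laboratory (SAL) tool, but the underlying idea of checking a scenario's conditions for mutual consistency and extracting sample test data from a satisfying solution can be sketched with the publicly available Z3 solver as a stand-in (the conditions and variable names below are hypothetical):

    from z3 import Int, Solver, sat

    balance, amount = Int("balance"), Int("amount")

    solver = Solver()
    solver.add(amount > 0)             # business rule: a withdrawal is positive
    solver.add(balance >= amount)      # pre-condition of the WithdrawCash task
    solver.add(balance - amount >= 0)  # post-state must still satisfy the rule

    if solver.check() == sat:          # conditions are consistent: path is feasible
        model = solver.model()         # a satisfying solution doubles as test data
        print({"balance": model[balance], "amount": model[amount]})
    else:                              # conflict: the path is invalid/infeasible
        print("infeasible scenario: eliminate this path")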
The test case generation unit 112 further comprises compilation means 120 to compile the test scenarios and their corresponding data into test cases in the document form. The test scenarios, their corresponding data and the resultant test cases are stored in the central repository 110.
Furthermore, the test case generation unit 112 performs the task of pruning the test cases by employing preliminary heuristics based techniques. Typically, the generated test cases are exhaustive, and their number can become really large due to combinatorial explosion when the process hierarchy is large and complex. In such a case, running all tests is neither feasible nor desirable. The test suite is therefore pruned by using preliminary heuristics, such as when certain conditions are not required to be checked for all possible paths, e.g. when an error condition may need to be checked only once. The test case generation unit 112 provides selection on conditions, where conditions to be tested in only one or a specified number of paths can be stated. This results in a drastic reduction in the number of test cases.
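As a rough, hypothetical sketch of such a heuristic (dropping scenarios whose "check once" conditions have all been covered already; the data layout is assumed, not taken from the disclosure):

    def prune(scenarios, once_only):
        # Keep a scenario unless every 'check once' condition it exercises has
        # already been covered by an earlier kept scenario.
        covered, kept = set(), []
        for scenario in scenarios:
            conditions = scenario["conditions"] & once_only
            if not conditions or not conditions <= covered:
                kept.append(scenario)
                covered |= conditions
        return kept

    scenarios = [
        {"id": "S1", "conditions": {"insufficient_funds"}},
        {"id": "S2", "conditions": {"insufficient_funds"}},  # pruned: covered by S1
        {"id": "S3", "conditions": set()},
    ]
    print([s["id"] for s in prune(scenarios, {"insufficient_funds"})])  # ['S1', 'S3']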
The present disclosure also provides a test script generation unit 122. The unit 122 provides a tool that automates the scenarios by generating test scripts for them with the generated test data embedded in them. To automate test script generation, the unit 122 records a sequence of User Interface (UI) actions required to perform each atomic Task in a scenario. For each generated scenario, the recorded sequences for individual Tasks are threaded together, the generated input data is inserted, and the test script is created for automating the functional testing of the AUT. The unit 122 also takes as input a simple mapping between the attributes in the domain model and the input fields on the screen.
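A minimal sketch of this threading step, under the assumption of dictionary-based recorded sequences and a simple attribute-to-field mapping file (all names below are hypothetical):

    # Recorded UI action sequences, one per atomic Task: (action, UI target).
    recorded = {
        "Login":        [("open", "/login"), ("type", "user_field"), ("click", "ok_btn")],
        "WithdrawCash": [("type", "amount_field"), ("click", "withdraw_btn")],
    }
    # Mapping file: domain-model attribute -> input field on the screen.
    mapping = {"Customer.name": "user_field", "Withdrawal.amount": "amount_field"}
    field_to_attr = {field: attr for attr, field in mapping.items()}
    # Generated test data, keyed by domain-model attribute.
    test_data = {"Customer.name": "alice", "Withdrawal.amount": "100"}

    def build_script(scenario_tasks):
        # Thread the recorded sequences of the scenario's Tasks together and
        # embed the generated test data into each 'type' action.
        lines = []
        for task in scenario_tasks:
            for action, target in recorded[task]:
                if action == "type":
                    lines.append(f"type('{target}', '{test_data[field_to_attr[target]]}')")
                else:
                    lines.append(f"{action}('{target}')")
        return "\n".join(lines)

    print(build_script(["Login", "WithdrawCash"]))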
In accordance with the present disclosure, there is provided a method for automated functional test generation for an Application Under Test (AUT), the method comprising the following steps as seen in FIGURE 2: generating and capturing business process flows along with an associated data model for the AUT 1000; annotating the process flows with constraints 1002; traversing and verifying the business process flows and solving the constraints to generate a plurality of test cases 1004; pruning the test cases by employing preliminary heuristics based techniques 1006; and collating a recorded sequence of use-cases corresponding to each of the test cases to generate test scripts for automating the functional testing of the AUT 1008.
In accordance with this disclosure, the step of annotating the process flows with constraints includes the step of translating business rules and test conditions associated with a business process flow into the constraints using Object Constraints Language.
Further, the step of traversing and verifying the business process flows includes the following steps: generating test scenarios by traversing the business process flows using a depth-first search technique, wherein each scenario is represented as a sequence of task tuples and each task tuple consists of a task and its specified pre and post condition; performing loop unrolling in the event that the business process flow includes loops; and converting each test scenario into a specification using the Symbolic Analysis Laboratory tool and verifying the test scenarios against the corresponding constraints to detect inconsistencies and eliminate infeasible scenarios.
Furthermore, the step of solving the constraints to generate a plurality of test cases includes the steps of generating test data for each of the scenarios by solving the specified pre and post conditions for each task under a test scenario; and compiling the test scenarios and their corresponding data into test cases in the document form.
TECHNICAL ADVANTAGES
The technical advantages of the present disclosure include realization of a computer-implemented system and method for automated functional test generation. The proposed system uses business process models that depict functionality without going into implementation detail, and provides a seamless, end-to-end approach from test design to automation, usable by business analysts and test teams alike.
Further, the proposed system traverses the process model paths to generate scenarios, which are further divided into tasks. The tasks include pre and post conditions. These pre and post conditions enable the proposed system to identify redundant paths and errors, and to also generate test data for each step of the scenario.
Still further, since the proposed system captures the AUT functionalities/domain knowledge from experts as a set of business process flows, continuous dependence on the necessary domain knowledge for ensuring test quality and coverage is eliminated.
Furthermore, the proposed system defines a structured model form represented using visual notations for test generation and selection. The structured model form and visual notation make it easy to go through and build in the missing portions, enabling creation of a test suite that achieves good functional coverage.
In addition, the proposed system provides test selection for optimizing the test generation in the event that exhaustive test coverage is not desirable in a test cycle due to time and effort cost constraints.
Also, the proposed system uses a top-down approach to test design, beginning with the high-level process model as opposed to having just use cases, which are at the leaf level. This can help developers be cognizant of the various usage scenarios upfront, avoiding late detection of gaps and defects when end-to-end scenario testing is done for the first time during the application testing.
Throughout this specification the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not
the exclusion of any other element, integer or step, or group of elements, integers or steps.
The use of the expression "at least" or "at least one" suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the invention to achieve one or more of the desired objects or results.
Any discussion of documents, acts, materials, devices, articles or the like that has been included in this specification is solely for the purpose of providing a context for the invention. It is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the invention as it existed anywhere before the priority date of this application.
While considerable emphasis has been placed herein on the particular features of this invention, it will be appreciated that various modifications can be made, and that many changes can be made in the preferred embodiment without departing from the principles of the invention. These and other modifications in the nature of the invention or the preferred embodiments will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.
We Claim:
1. A computer-implemented system for automated functional test generation
for an Application Under Test (AUT), said system comprising:
- a modeling unit for capturing business process flows along with an associated data model for the AUT, wherein said process flows are annotated with constraints;
- a test case generation unit for traversing and verifying said business process flows and solving said constraints to generate a plurality of test cases and further pruning said test cases by employing preliminary heuristics based techniques; and
- a test script generation unit to collate a recorded sequence of use-cases corresponding to each of said test cases to generate test scripts for automating the functional testing of the AUT.
2. The system as claimed in claim 1, wherein said modeling unit comprises:
- modeling tools including Business Process Modeling Notation (BPMN) tool to capture said business process flows in the form of visual notations and UML Class Diagram notation tool to capture attributes in said data model;
- translation means to translate business rules and test conditions associated with said captured business process flows into constraints, wherein said constraints are represented using Object Constraints Language notation; and
- annotating means to annotate said constraints to said corresponding captured business process flows.
3. The system as claimed in claim 1, wherein said modeling unit is still further
configured to:
- receive and reuse a set of business process flows as a part of said captured business process flows in the event that they are already created in the requirement phase of Software Development Lifecycle;
- accept manually translated business rules and test conditions associated with said captured business process flows in the form of constraints represented using Object Constraints Language notation; and
- facilitate manual annotation of said constraints to said corresponding captured business process flows.
4. The system as claimed in claim 1, wherein said test case generation unit
comprises:
- path generation means to generate test scenarios using a depth-first search technique and further perform loop unrolling in the event that the process flow includes loops, wherein each scenario is represented as a sequence of task tuples and each task tuple consists of a task and its specified pre and post condition;
- verification means to convert each test scenario into a specification using Symbolic Analysis Laboratory tool and verify said test scenarios against said corresponding constraints to detect inconsistencies and eliminate infeasible scenarios;
- test data generation means adapted to generate test data for each of said scenarios by solving specified pre and post conditions for each task under a test scenario; and
- compilation means adapted to compile said test scenarios and their corresponding data into test cases, in the document form.
5. The system as claimed in claim 1, wherein said test script generation unit further receives a mapping file to facilitate in test script generation, wherein said mapping file includes a mapping between attributes in said domain model and input fields specified in said use-cases.
6. The system as claimed in claim 1, wherein said system comprises a central repository for storing data generated corresponding to an AUT, wherein said data is selected from the group consisting of said business process flows, said data model, said constraints, said test cases and said test scripts.
7. A method for automated functional test generation for an Application Under Test (AUT), said method comprising the following steps:
- generating and capturing business process flows along with an associated data model for the AUT;
- annotating said process flows with constraints;
- traversing and verifying said business process flows and solving said constraints to generate a plurality of test cases;
- pruning said test cases by employing preliminary heuristics based techniques; and
- collating a recorded sequence of use-cases corresponding to each of
said test cases to generate test scripts for automating the functional
testing of the AUT.
8. The method as claimed in claim 7, wherein the step of annotating said process flows with constraints includes the step of translating business rules and test conditions associated with a business process flow into said constraints using Object Constraints Language.
9. The method as claimed in claim 7, wherein the step of traversing and verifying said business process flows includes the following steps:
- generating test scenarios by traversing said business process flows
using a depth-first search technique, wherein each of said scenarios is
represented as a sequence of task tuples and each task tuple consists of
a task and its specified pre and post condition;
- performing loop unrolling in the event that said business process flow includes loops; and
- converting each test scenario into a specification using Symbolic
Analysis Laboratory tool and verifying said test scenarios against said
corresponding constraints to detect inconsistencies and eliminate
infeasible scenarios.
10. The method as claimed in claims 7 and 9, wherein the step of solving said constraints to generate a plurality of test cases includes the steps of generating test data for each of said scenarios by solving said specified pre and post conditions for each task under a test scenario; and compiling said test scenarios and their corresponding data into test cases in the document form.