
Method And System For Automated Regression Testing

Abstract: A method and system for testing an application is presented. Failed test cases are identified from past executions of a test execution job for testing the application using a test automation system (100). One or more selected patterns associated with the failed test cases are determined. The selected patterns include one or more screens traversed by the test automation system (100) during execution of each of the failed test cases, one or more screen navigation patterns, one or more checkpoints defined for each of the screens, and/or one or more functional calls included in each of the failed test cases. One or more related test cases that include the determined patterns associated with the failed test cases are identified from a test suite. The related test cases are added to a subsequent test execution job. The test automation system (100) executes the subsequent test execution job to test the application. FIG. 2


Patent Information

Application #
Filing Date
08 September 2016
Publication Number
11/2019
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
shery.nair@tataelxsi.co.in
Parent Application
Patent Number
Legal Status
Grant Date
2024-02-06
Renewal Date

Applicants

TATA ELXSI LIMITED
ITPB Road, Whitefield, Bangalore.

Inventors

1. SUNIL THARANGINI GOVINDARU
ITPB Road, Whitefield, Bangalore – 560048

Specification

DESC:
BACKGROUND

[0001] Embodiments of the present specification relate generally to test automation, and more particularly to a system and method for enhancing automated regression testing.
[0002] A software product undergoes numerous changes over its development lifecycle. However, as the software product enters later stages of its development, the software becomes more stable and entropy of changes in a corresponding user interface and other features is reduced significantly. Regression testing plays a very important role in ensuring product quality and stability in the later stages of software development. In particular, regression testing aids in verifying that software that was previously developed and tested still performs correctly after it was changed or interfaced with other software. These changes, for example, may include software enhancements, patches, and configuration changes.
[0003] With increasing use of rapid application development techniques such as agile models, software undergoes frequent changes that still need to be comprehensively, but rapidly tested. The testing aims to achieve desired functionality and to ensure that previously available functionality still operates accurately. Accordingly, most software development teams employ continuous integration approaches to streamline development and testing activities. Test automation may be included as an integral part of a continuous integration workflow to reduce the time taken for code changes to be incorporated into the final product, thereby minimizing cycle times and rolling out stable releases at a faster rate to the market.
[0004] Generally, test automation systems run an operator-defined set of test scripts when a software build becomes available. Some test automation strategists even prefer to have different sets of test scripts to run on different builds. Typically, the first set corresponds to a set of smoke test cases, which are completed in a few hours. Based on the outcome of the smoke test run, a continuous integration system decides whether to run subsequent larger batches of test scripts to have more automation coverage.
[0005] Generally, the test sets are defined manually in the continuous integration workflow, and therefore, the test cases, which form a part of each test set, are maintained manually according to the feature additions and user interface changes in the product under testing. After each regression run, there arises a need to enhance the test set by adding or removing test scripts based on the outcome of the previous regression runs and new features being integrated into the product under testing. For instance, if it is found that one test case which tests the feature “A” has failed in a regression run, a user may desire to check the regression batch and make sure that all tests which are related to feature “A” are included in the next regression run.
[0006] Certain systems for automating regression testing are known in the literature. One such system is described in the patent application WO 2015118536 A1. The patent application WO 2015118536 A1 relates to debugging, and includes an artificial intelligence (AI) system that is trained on each bug and on the steps to resolve the bug. The machine learning used by the AI system may later be used to predict what method of debugging or diagnosis is to be planned for defects reported in the software system. However, the patent application WO 2015118536 A1 fails to provide an approach for enhancing regression coverage during automated testing.
[0007] Accordingly, it may be desirable to develop a test automation framework that can intelligently augment a regression test suite with relevant test cases for improved regression coverage for new and old features in a software application.

BRIEF DESCRIPTION

[0008] In accordance with an aspect of the present disclosure, a method for testing an application is presented. The method includes identifying one or more failed test cases from one or more past executions of a test execution job for testing the application using a test automation system. The method further includes determining one or more selected patterns associated with the failed test cases, wherein the selected patterns comprise one or more screens traversed by the test automation system during execution of each of the failed test cases, one or more screen navigation patterns, one or more checkpoints defined for each of the one or more screens, and/or one or more functional calls included in each of the failed test cases. The method also includes identifying one or more related test cases from a test suite that include the determined patterns associated with the failed test cases. Moreover, the method includes adding the related test cases to a subsequent test execution job and testing the application by executing the subsequent test execution job using the test automation system.
[0009] According to an aspect, the selected patterns comprise all screens traversed by the test automation system during execution of the failed test cases, a subset of the screens where an error is encountered, an input screen corresponding to the subset of screens, and/or an output screen corresponding to the subset of screens.
[0010] According to an aspect, the selected patterns comprise all checkpoints evaluated by the test automation system during execution of the failed test cases, and/or all checkpoints included in a subset of the screens where an error is encountered.
[0011] According to an aspect, the selected patterns comprise all functional calls executed by the test automation system during execution of the failed test cases, and/or all functional calls included in a subset of the screens where an error is encountered.
[0012] According to an aspect, adding the related test cases to the subsequent test execution job comprises presenting a list of the related test cases to a user on a user interface associated with the test automation system, and adding one or more of the related test cases selected by the user from the list to the subsequent test execution job.
[0013] According to another aspect, adding the related test cases to the subsequent test execution job comprises automatically adding the related test cases to the subsequent test execution job.
[0014] According to yet another aspect, adding the related test cases to the subsequent test execution job comprises adding the related test cases to the subsequent test execution job, to all subsequent test execution jobs, a selected existing test execution job, or a new test execution job based upon user input, and/or pre-programmed instructions.
[0015] According to another aspect, the subsequent test execution job comprises regression testing the application using the test automation system.
[0016] In accordance with an aspect of the present disclosure, a system for testing an application is presented. The system includes a data storage device configured to store one or more test cases in a test suite for use in testing the application. The system further includes a screen navigation engine communicatively coupled to the data storage device, wherein the screen navigation engine is configured to identify one or more failed test cases from one or more past executions of a test execution job for testing the application. The screen navigation engine is also configured to determine one or more selected patterns associated with the failed test cases, wherein the selected patterns comprise one or more screens traversed by the test automation system during execution of each of the failed test cases, one or more screen navigation patterns, one or more checkpoints defined for each of the one or more screens, and/or one or more functional calls included in each of the failed test cases. The screen navigation engine is further configured to identify one or more related test cases from the test suite that include the determined patterns associated with the failed test cases. Additionally, the screen navigation engine is configured to add the related test cases to a subsequent test execution job, and test the application by executing the subsequent test execution job.
[0017] According to an aspect, the system further includes a user interface communicatively coupled to the screen navigation engine. The screen navigation engine is configured to present a list of the related test cases to a user on the user interface, and add one or more of the related test cases selected by the user from the list to the subsequent test execution job.
[0018] According to an aspect, the user interface is configured to provide the user with input options to select one or more past executions of the test execution job to identify the failed test cases, one or more of the selected patterns for identifying the failed test cases, and/or one or more subsequent test execution jobs to which the related test cases are to be added.
[0019] According to an aspect, the user interface is configured to provide the user with one or more input options to select all of the screens, the checkpoints, and the functional calls associated with the failed test cases, or a subset of one or more of the screens, the checkpoints, and the functional calls based upon a point of occurrence of an error during execution of the failed test cases.

BRIEF DESCRIPTION OF THE FIGURES

[0020] These and other features, aspects, and advantages of the claimed subject matter will become better understood when the following detailed description is read with reference to the accompanying drawings, in which:
[0021] FIG. 1 depicts a flow diagram illustrating an exemplary method of data collection by a test automation framework, according to an embodiment of the present specification;
[0022] FIG. 2 depicts a flow diagram illustrating an exemplary method for suggesting test cases by the automation framework, according to an embodiment of the present specification;
[0023] FIG. 3 depicts a flow diagram illustrating an exemplary method for suggesting suitable test cases to be included in the regression testing, according to an embodiment of the present specification; and
[0024] FIGs. 4-8 depict an exemplary workflow of the test automation system of FIG. 1 when used to augment the regression test suite.

DETAILED DESCRIPTION

[0025] The following description presents exemplary systems and methods for enhancing regression coverage during automated testing. Particularly, the embodiments described herein disclose exemplary methods and systems that aid in augmenting a regression test suite with selected test cases from an associated repository for subsequent regression testing. The test cases are selected if at least a portion of their navigation path matches one or more navigation patterns determined from failed test cases during a previous regression run. An exemplary environment that is suitable for practicing various implementations of the present method and system is discussed in detail with reference to FIGs. 1-2.
[0026] FIG. 1 illustrates an exemplary test automation system (100) configured to intelligently augment a test suite for testing a software application (102) running on a graphical user interface (GUI) (104) associated with a computing device (106). A typical GUI-based software application (102) includes one or more screens, which offer different features and functionalities. A user needs to follow one or more defined navigation patterns to reach a desired screen for accessing a particular feature or functionality. The software application (102) may be tested extensively during the development stage to ensure that the software application (102) performs as expected to allow the user to access desired features and functionalities by using the defined navigation patterns.
[0027] To that end, the test automation system (100) may include a state-based screen navigation engine (108) that is configured to identify and navigate from each selected screen to one or more of the other screens included in the software application (102) as defined in a test script. Particularly, the state-based screen navigation engine (108) may be configured to identify a screen by identifying one or more checkpoints associated with the selected screen. In one embodiment, checkpoints are an encapsulation of reference data, for example, co-ordinates of a specific section of the screen and the data to be expected in the specific section that may be used by the state-based screen navigation engine (108) for identifying the screen and/or verifying corresponding features. The reference data used as the checkpoints, for example, may include a reference image or character based information such as a selected string or a regular expression. In certain embodiments, the state-based screen navigation engine (108) may be configured to compare the reference data to the text or image that may be extracted from one or more specific sections of the screen using Optical Character Recognition (OCR). The comparison allows the state-based screen navigation engine (108) to match the extracted text or image to the reference data to identify the current screen. Once the current screen is identified, the state-based screen navigation engine (108) may be configured to determine the navigation pattern to reach any other desired screen based on stored screen-to-screen navigation information for the software application (102). In certain embodiments, test automation developers need only specify the target screen in their scripts, while the state-based screen navigation engine (108) is configured to detect the current state (screen), calculate the optimum path to reach the target screen, and navigate using the optimum path from the current screen to the target screen defined in the test script.
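The following Python sketch is illustrative only and does not appear in the specification; it shows one plausible way to represent a checkpoint, identify the current screen from OCR output, and compute a navigation path over stored screen-to-screen links. All names (Checkpoint, identify_screen, shortest_path, ocr_extract) are hypothetical, and regular-expression matching of reference data is omitted for brevity.

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class Checkpoint:
    """Reference data for one section of a screen (hypothetical structure)."""
    region: tuple          # (x, y, width, height) of the screen section
    expected_text: str     # reference string expected in that section

def identify_screen(screenshot, screen_checkpoints, ocr_extract):
    """Return the name of the screen whose checkpoints all match the OCR output.

    screen_checkpoints -- dict mapping screen name to a list of Checkpoint objects
    ocr_extract        -- callable(screenshot, region) -> extracted text
    """
    for screen_name, checkpoints in screen_checkpoints.items():
        if all(cp.expected_text in ocr_extract(screenshot, cp.region) for cp in checkpoints):
            return screen_name
    return None

def shortest_path(nav_graph, current, target):
    """Breadth-first search over stored screen-to-screen navigation links."""
    queue, seen = deque([[current]]), {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in nav_graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```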
[0028] According to aspects of the present disclosure, the screen navigation engine (108) may be configured to store screen navigation history, including the navigation pattern and corresponding test result, for each executed test script (110) during one or more previous test runs. In addition to the screen navigation history, the screen navigation engine (108) may also be configured to identify a list of checkpoints and/or features that were compared or verified in each screen during the test execution. To that end, the state-based screen navigation engine (108) may be implemented using, for example, one or more general-purpose processors, specialized processors, graphical processing units, microprocessors, programmable logic arrays, field-programmable gate arrays, and/or other suitable computing devices.
[0029] In certain embodiments, the screen navigation engine (108) may be configured to store the list of checkpoints that were verified and their corresponding verification result. Moreover, in certain embodiments, the screen navigation engine (108) may be configured to store test application programming interface (API) calls (112) executed during the test case execution. In one embodiment, the list of screen navigation patterns, the checkpoint/feature comparisons, and/or the test API calls (112) is stored against the test script name in at least one data storage device (114) that is communicatively coupled to the state-based screen navigation engine (108). Additionally, the data storage device (114) also stores test result information indicative of the test scripts (110) that passed or failed during one or more previous regression runs. To that end, the data storage device (114) may include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, compact disk (CD) ROMs, Digital Versatile Discs (DVDs), flash drives, solid-state drives, and any other known physical storage media. According to aspects of the present disclosure, the test automation system (100) is configured to augment a test suite for testing the software application (102) in a subsequent regression run based on the navigation pattern of the test cases that failed during one or more previous regression runs. An exemplary method for intelligently augmenting the test suite based on the stored navigation pattern of the failed test cases is described in greater detail with reference to FIG. 2.
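As a rough illustration of the kind of record that might be stored against each test script name, the following Python sketch groups the navigation pattern, checkpoint results, API calls, and overall result together. The structure and field names are assumptions for illustration, not part of the specification.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ExecutionRecord:
    """Hypothetical per-script record persisted after each test run."""
    script_name: str
    run_id: str                          # identifies the past execution (regression run)
    screens_visited: List[str]           # screen navigation pattern, in order
    checkpoint_results: Dict[str, bool]  # checkpoint identifier -> verification result
    api_calls: List[str]                 # test API calls made during the run
    passed: bool                         # overall test result

# Example record for one failed run of a hypothetical test script
record = ExecutionRecord(
    script_name="tc_volume_control",
    run_id="regression_2016_09_01",
    screens_visited=["Home", "Settings", "Audio"],
    checkpoint_results={"Audio.volume_slider": False},
    api_calls=["set_volume", "get_volume"],
    passed=False,
)
```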
[0030] FIG. 2 illustrates a flow chart (200) depicting an exemplary method for intelligently suggesting test cases for inclusion in one or more subsequent regression runs. The exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, or combinations thereof. The various operations are depicted in the blocks to illustrate the functions that are performed during various phases of the exemplary method. In the context of software, the blocks represent computer instructions that, when executed by one or more processors, perform the recited operations. The order in which the exemplary method is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein. For clarity, an embodiment of the present method will be described with reference to the system components depicted in FIG. 1.
[0031] The method begins at step (202), where the screen navigation engine (108) may be configured to identify all failed test cases from one or more previous regression runs. The failed test cases may be identified based on their corresponding test result status in one or more previous test executions. As previously noted, the screen navigation engine (108) may be configured to store the list of checkpoints that were verified and their corresponding verification result. Additionally, the screen navigation engine (108) may also store test API calls (112) executed during the test case execution. In one embodiment, the list of screen navigation patterns, the checkpoint/feature comparisons, and/or the test API calls (112) is stored against the test script name in at least one data storage device (114). Additionally, the screen navigation engine (108) also stores test result information indicative of the test scripts (110) that passed or failed during one or more previous regression runs.
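Step (202) essentially filters the stored records by result status. A minimal sketch, assuming the hypothetical ExecutionRecord structure shown earlier:

```python
def failed_cases(records, run_ids=None):
    """Select failed test scripts, optionally restricted to chosen past executions."""
    return [r for r in records
            if not r.passed and (run_ids is None or r.run_id in run_ids)]
```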
[0032] According to aspects of the present disclosure, at step (204), the screen navigation engine (108) is configured to automatically identify a set of test cases from the test suite that include one or more of the screens traversed, the API calls (112) made, and the checkpoints evaluated by each of the failed test cases for inclusion in subsequent regression runs. As previously noted, the test automation system collects the list of test scripts (110) that failed in past executions for a particular batch of test cases. The system then extracts target screens navigated to during execution of the test script. The sequence of target screens navigated during the test script execution may be stored as the screen navigation pattern. In one embodiment, the screen navigation pattern may include all screens navigated to during the test script execution. In another embodiment, however, the screen navigation pattern may include only a subset of the screens navigated to during the test script execution. For example, in one embodiment, the screen navigation pattern may include only the screen that encounters an error, the error-prone screen and corresponding input screen, and/or the error-prone screen and one or more corresponding input and output screens. In certain embodiments, the screen navigation engine (108) may allow selection of a minimum or maximum number of screens to be stored in the screen navigation pattern.
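The sketch below illustrates the pattern variants described above, extracting either all screens visited by a failed test case or only the error-prone screen with its neighbouring input and output screens. The function name and mode strings are hypothetical.

```python
def navigation_pattern(screens, error_index, mode="all"):
    """Extract a screen navigation pattern from a failed run (illustrative only).

    screens     -- ordered list of screens visited by the failed test case
    error_index -- position in the list where the error was encountered
    mode        -- 'all', 'error', 'error+input', or 'error+input+output'
    """
    if mode == "all":
        return screens
    if mode == "error":
        return screens[error_index:error_index + 1]
    if mode == "error+input":
        return screens[max(error_index - 1, 0):error_index + 1]
    if mode == "error+input+output":
        return screens[max(error_index - 1, 0):error_index + 2]
    raise ValueError(f"unknown mode: {mode}")
```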
[0033] As previously noted, in addition to the screen navigation pattern, the test automation system may also identify a list of checkpoints/features that are verified in each screen during the test execution along with corresponding verification results. In certain embodiments, the screen navigation engine (108) identifies those test cases from the system data storage that include a matching pattern of checkpoints/feature comparisons included in failed test cases. Similarly, the test automation system may also identify test cases from the system data storage that include test API calls (112) that match the API calls (112) in failed test cases. The stored lists of screen navigation patterns, checkpoint/feature comparisons, and/or test API calls (112) corresponding to failed test cases may be used to identify other related test cases that may be suitable for inclusion in subsequent regression runs to enhance the automated test coverage.
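One simple way to realise this matching is sketched below: a candidate is suggested if it shares the failed case's screen navigation pattern (as an ordered subsequence), its verified checkpoints, or its test API calls. This is a naive heuristic written for illustration against the hypothetical ExecutionRecord structure, not the matching method disclosed in the specification.

```python
def is_subsequence(pattern, sequence):
    """True if 'pattern' appears in order (not necessarily contiguously) in 'sequence'."""
    it = iter(sequence)
    return all(item in it for item in pattern)

def related_test_cases(failed, candidates):
    """Suggest candidates sharing a navigation, checkpoint, or API-call pattern
    with any failed record."""
    suggestions = set()
    for f in failed:
        for c in candidates:
            if c.script_name == f.script_name:
                continue
            if (is_subsequence(f.screens_visited, c.screens_visited)
                    or (f.checkpoint_results
                        and set(f.checkpoint_results) <= set(c.checkpoint_results))
                    or (f.api_calls and set(f.api_calls) <= set(c.api_calls))):
                suggestions.add(c.script_name)
    return sorted(suggestions)
```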
[0034] Further, at step (206), the screen navigation engine (108) may be configured to automatically add the identified test cases to a list of test cases to be included in one or more subsequent regression test runs. Alternatively, the screen navigation engine (108) may be configured to present the list of test cases identified for one or more of the previous regression runs as a suggestion to a user for inclusion in the subsequent regression run. FIGs. 3-7 depict an exemplary workflow corresponding to the screen navigation engine (108) of FIG. 1 when used to augment the regression test suite.
[0035] In particular, FIG. 3 depicts an exemplary GUI (300) that allows for creating a new job for evaluating test cases from a test suite for inclusion in subsequent regression runs. Additionally, the GUI (300) also includes options to set or select a desired recurrence trigger for scheduling the test execution job, for example, by selecting the Triggers tab (302). By selecting the Triggers tab (302), the user may schedule the job to reoccur after a defined time period, every day at a particular time, weekly, monthly, on certain days of the month, and so on. Furthermore, the GUI (300) may also include a Devices tab (304) to present the user with options to specify which devices in a particular batch of jobs are to be targeted for testing.
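For illustration, the recurrence and device options exposed by such a GUI might map onto a job definition along the following lines; all field names and values here are assumptions, not taken from the specification or the figures.

```python
# Hypothetical job definition mirroring the Triggers (302) and Devices (304) options
job = {
    "name": "nightly_regression",
    "trigger": {
        "recurrence": "daily",     # e.g. 'interval', 'daily', 'weekly', 'monthly'
        "time": "02:00",           # run every day at 02:00
        "days_of_month": None,     # used only for monthly recurrence
    },
    "devices": ["device_batch_1/stb_01", "device_batch_1/stb_02"],
    "test_cases": ["tc_volume_control", "tc_channel_change"],
}
```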
[0036] Further, FIG. 4 depicts a graphical representation of an exemplary screen (400) listing the job details for evaluating one or more test cases included in the test suite of FIG. 3. The screen (400) may include one or more options (402) to selectively add the test cases that may be evaluated for augmenting the test suite for a subsequent regression test run. Additionally, the screen (400) may include one or more other options (404) to move a test case to a higher order, a lower order, a random order, to remove the test case, and/or to navigate to the next screen that presents the user with a list of suggested test cases related to the failed test cases.
[0037] FIG. 5 depicts a graphical representation of an exemplary screen (500) that lists the past executions of the test suite. The screen (500) also depicts a comprehensive list of failed test cases included in the past executions in a designated portion (502), along with one or more options (504) to identify related test cases based on a screen navigation pattern, checkpoint/feature comparison, and/or test API calls (112) associated with each of the selected failed test cases. The screen (500) further includes an option, such as a button (506), that allows addition of all test cases automatically identified to be related to the failed test cases that are selected by the user to a subsequent testing job.
[0038] FIG. 6 depicts an exemplary screen (600) depicting a subset of test cases suggested by the screen navigation engine (108) for inclusion in a subsequent test run based on a pattern of checkpoint/feature comparisons identified from screens included in failed test cases. Specifically, the screen navigation engine (108) identifies all test cases in which the pattern of the checkpoint/feature comparisons made in the failed test cases substantially matches the pattern of the checkpoint/feature comparisons in other test cases in the test suite.
[0039] FIG. 7 depicts an exemplary screen (700) depicting a subset of test cases suggested by the screen navigation engine (108) for inclusion in a subsequent test run based on one or more screen navigation patterns included in failed test cases. Additionally, the screen (700) also provides a selectable option (702) that allows addition of the selected test cases to all testing jobs, the present job, another existing job, or to a new job based upon user input and/or pre-programmed instructions, thus easily augmenting the test suite for future regression runs.
[0040] Although the embodiments described herein disclose a semi-automated augmentation of the regression test suite, in certain embodiments, the stored patterns may be used to automatically add suitable test cases to a regression suite for use in subsequent executions without any user inputs. Additionally, the stored patterns and/or user selections may be communicated to a machine learning system that may be used to refine the stored patterns and/or to identify new patterns and new test cases for augmenting the regression test suite.
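How such a machine learning system would refine the patterns is not specified; as a toy sketch only, the snippet below nudges per-pattern-type weights toward the kinds of matches users actually accept into subsequent jobs. It is an assumption for illustration, not the disclosed method.

```python
def update_pattern_weights(weights, suggestions, accepted, lr=0.1):
    """Adjust pattern-type weights from user feedback (toy online update).

    weights     -- e.g. {'screens': 1.0, 'checkpoints': 1.0, 'api_calls': 1.0}
    suggestions -- dict mapping suggested script_name -> set of pattern types that matched it
    accepted    -- set of script names the user added to the next regression job
    """
    for name, matched_types in suggestions.items():
        sign = 1.0 if name in accepted else -1.0
        for t in matched_types:
            weights[t] = max(0.0, weights[t] + sign * lr)
    return weights
```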
[0041] Embodiments of the present systems and methods, thus, provide an efficient process for intelligently selecting test cases or test scripts from a repository of test cases or test scripts to optimize the automated test coverage. The test cases or test scripts are selected on the basis of past test execution and result data that the test automation system gathers over a period of time. Specifically, the present systems may be configured to detect and store patterns corresponding to screen navigations, checkpoint/feature verification, API calls (112), and/or other such patterns in test cases that failed during a past test run. Subsequently, the detected patterns may be matched with corresponding patterns in other test cases included in a test suite or repository to identify related test cases that may be added to a subsequent regression test run. Use of the present systems, thus, enhances the probability of finding related defects and ensures better test coverage, leading to shorter product development cycles.
[0042] Although specific features of various embodiments of the present systems and methods may be shown in and/or described with respect to some of the drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics, and any subset thereof, may be combined and/or used interchangeably in any suitable manner in the various embodiments.
[0043] While only certain features of the present systems and methods have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
CLAIMS:

1. A method for testing an application, comprising:
identifying one or more failed test cases from one or more past executions of a test execution job for testing the application using a test automation system (100);
determining one or more selected patterns associated with the failed test cases, wherein the selected patterns comprise one or more screens traversed by the test automation system (100) during execution of each of the failed test cases, one or more checkpoints defined for each of the one or more screens, one or more functional calls included in each of the failed test cases, or combinations thereof;
identifying one or more related test cases from a test suite that include the determined patterns associated with the failed test cases;
adding the related test cases to a subsequent test execution job; and
testing the application by executing the subsequent test execution job using the test automation system (100).

2. The method of claim 1, wherein the selected patterns comprise all screens traversed by the test automation system (100) during execution of the failed test cases, a subset of the screens where an error is encountered, an input screen corresponding to the subset of screens, an output screen corresponding to the subset of screens, or combinations thereof.

3. The method of claim 1, wherein the selected patterns comprise all checkpoints evaluated by the test automation system (100) during execution of the failed test cases, all checkpoints included in a subset of the screens where an error is encountered, or a combination thereof.

4. The method of claim 1, wherein the selected patterns comprise all functional calls executed by the test automation system (100) during execution of the failed test cases, all functional calls included in a subset of the screens where an error is encountered, or a combination thereof.

5. The method as claimed in claim 1, wherein adding the related test cases to the subsequent test execution job comprises:
presenting a list of the related test cases to a user on a user interface (104) associated with the test automation system (100); and
adding one or more of the related test cases selected by the user from the list to the subsequent test execution job.

6. The method as claimed in claim 1, wherein adding the related test cases to the subsequent test execution job comprises automatically adding the related test cases to the subsequent test execution job.

7. The method as claimed in claim 1, wherein adding the related test cases to the subsequent test execution job comprises adding the related test cases to the subsequent test execution job, to all subsequent test execution jobs, a selected existing test execution job, or a new test execution job based upon user input, pre-programmed instructions, or a combination thereof.

8. The method as claimed in claim 1, wherein the subsequent test execution job comprises regression testing the application using the test automation system (100).

9. A system (100) for testing an application, comprising:
a data storage device (114) configured to store one or more test cases in a test suite for use in testing the application;
a screen navigation engine (108) communicatively coupled to the data storage device (114), wherein the screen navigation engine (108) is configured to:
identify one or more failed test cases from one or more past executions of a test execution job for testing the application;
determine one or more selected patterns associated with the failed test cases, wherein the selected patterns comprise one or more screens traversed by the test automation system during execution of each of the failed test cases, one or more screen navigation patterns, one or more checkpoints defined for each of the one or more screens, one or more functional calls included in each of the failed test cases, or combinations thereof;
identify one or more related test cases from the test suite that include the determined patterns associated with the failed test cases;
add the related test cases to a subsequent test execution job; and
test the application by executing the subsequent test execution job.

10. The system (100) as claimed in claim 9, further comprising a user interface (104) communicatively coupled to the screen navigation engine (108), wherein the screen navigation engine (108) is configured to:
present a list of the related test cases to a user on the user interface (104); and
add one or more of the related test cases selected by the user from the list to the subsequent test execution job.

11. The system (100) as claimed in claim 10, wherein the user interface (104) is configured to provide the user with one or more input options to select one or more past executions of the test execution job for identifying the failed test cases, one or more of the selected patterns for identifying the failed test cases, one or more subsequent test execution jobs to which the related test cases are to be added, or combinations thereof.

12. The system (100) as claimed in claim 10, wherein the user interface (104) is configured to provide the user with one or more input options to select all of the screens, the checkpoints, and the functional calls associated with the failed test cases, or a subset of one or more of the screens, the checkpoints, and the functional calls based upon a point of occurrence of an error during execution of the failed test cases.

Documents

Application Documents

# Name Date
1 Power of Attorney [08-09-2016(online)].pdf 2016-09-08
2 Form 5 [08-09-2016(online)].pdf 2016-09-08
3 Form 3 [08-09-2016(online)].pdf 2016-09-08
6 Description(Provisional) [08-09-2016(online)].pdf 2016-09-08
7 abstract 201641030687 .jpg 2016-10-27
8 201641030687-FORM 18 [08-09-2017(online)].pdf 2017-09-08
9 201641030687-DRAWING [08-09-2017(online)].pdf 2017-09-08
10 201641030687-COMPLETE SPECIFICATION [08-09-2017(online)].pdf 2017-09-08
11 Form5_After Filing_13-07-2018.pdf 2018-07-13
12 Form26_General Power of Attorney_13-07-2018.pdf 2018-07-13
13 Form1_After Filing_13-07-2018.pdf 2018-07-13
14 Declaration_GPA_13-07-2018.pdf 2018-07-13
15 Correspondence by Agent_Form1, Form5, GPA, Declaration_13-07-2018.pdf 2018-07-13
16 201641030687-FER.pdf 2020-07-16
17 201641030687-FORM-26 [15-01-2021(online)].pdf 2021-01-15
18 201641030687-FER_SER_REPLY [15-01-2021(online)].pdf 2021-01-15
19 201641030687-COMPLETE SPECIFICATION [15-01-2021(online)].pdf 2021-01-15
20 201641030687-CLAIMS [15-01-2021(online)].pdf 2021-01-15
21 201641030687-PatentCertificate06-02-2024.pdf 2024-02-06
22 201641030687-IntimationOfGrant06-02-2024.pdf 2024-02-06

Search Strategy

1 SearchStrategyE_15-07-2020.pdf

ERegister / Renewals