Abstract: A system and method for providing a model-based script generation approach that utilizes a recorder, a model creator, a test model, a script generator, an execution engine, and a model updater for producing and maintaining modularly structured scripts. The system and method described herein utilize a recorder that records and stores information such as GUI screens, fields, scenarios, and the like. A script generator creates the test scripts by reading the information recorded by the recorder. The information recorded by the recorder is stored as a test model. The test model may have a plurality of abstractions such as user interface structures, scenarios, execution profile, and the like. The user interface actions and inputs provided may be automatically identified by the recorder. The system also provides a model editor for maintaining the test scripts by exporting the test case information stored in the test model to a user interface.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
TITLE OF INVENTION:
"A SYSTEM AND METHOD FOR GENERATING AND MAINTAINING REGRESSION TEST SCRIPTS"
Applicant
TATA Consultancy Services Limited, a company incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
This invention relates in general to the field of software testing, particularly to software test automation. More particularly, the invention relates to a system and method for creating and maintaining automated regression test scripts for software systems having a graphical user interface.
BACKGROUND OF THE INVENTION
Testing of software systems requires the creation and maintenance of test scripts, and producing scripts that provide adequate coverage and effectiveness in terms of uncovering bugs is a very challenging and resource-intensive activity. A number of organizations test software systems by compiling a list of test cases that are run against the software systems to ensure working quality and validate the design interface of the software systems. Organizations may appoint employees or contractors to run tests manually or may invest in automating the tests so that, once automated, they can be run at any time without manual intervention, thereby reducing the cost incurred in testing. Most organizations nowadays implement test automation as it results in a significant reduction in labor costs, since test automation eliminates the need for staff to perform manual execution. Further, automation of testing also reduces execution time, thereby allowing faster turnaround on results and more accurate and consistent results.
Test case automation can be performed in a plurality of ways; a modern approach includes using a recorder. A recorder generally records Keyword Tests, Script Code, and Low-Level Procedures. Existing tools and methods provide recorders that directly generate scripts from user actions. According to such a methodology, test logic and test data are embedded in the test script. Any change in the test has to be made directly to the test scripts and involves manual skill and effort. Since the tester has to spend most of the time creating and modifying scripts, the tester is required to be skilled in the scripting language. Such skills may not always be available. In existing tools the test knowledge is directly codified in the form of scripts. This causes a tight, explicit binding between the tests and the tool/platform. In case the tests have to be imported to another tool/platform, a complete re-development or certain critical, time-consuming modifications to the test scripts are required. Some of these tools have an object model such that a recorder creates a model; however, such models contain only the user interface (UI) objects, and these models are used as a reference rather than as an independent model. A change in the object model causes significant changes in scripts, which increases the maintenance effort. Model-based test design and automation has been extensively studied in recent years. Most of the research has focused on specifying system requirements as a model and generating test cases from it. Model-based approaches have mostly been applied to System Testing, and there have been fewer attempts to apply them to regression testing. Most of these approaches are better suited for generating test cases rather than executing test scripts, since they use design-level models. Test models, on the other hand, need to be tied more closely to the implementation in order to achieve a higher level of automation.
Different model-based test approaches require different skills, and these skills are needed ahead of usage. The main problem with existing tools is minimizing the human skill required while simultaneously maximizing the level of automation. The recorders of existing tools directly create scripts, as they do not provide an option to generate a model or a platform-independent test script. Moreover, none of the existing tools use a model-based approach for regression test automation. Specifically, in the case of regression test scripts, if a system is continuously changing, the regression test scripts also need to be changed to continue to work with the new system. Changes can be in the user interface, the business flow, the test data/inputs, or the functionality itself, and quickly developing an updated test script for every change is a challenge. Further, existing tools provide very little support for test script maintenance. Another challenge is the shortage of time for executing all of the regression tests. Existing tools require significant design and programming effort during script development, thus making automation a hard-to-finish task. Furthermore, knowledge of test automation tools and of methods for developing maintainable test suites is not easily available.
In view of the issues explained above, there is a growing need for improved methods and systems for generating automated test scripts. There is a further need for improved methods and systems for generating test scripts using model-based generation of test scripts from graphical user interface representations. The present specification, by means of one or more embodiments described in the following paragraphs, provides a system and method for test script development and maintenance that makes test script development faster as compared to existing methods. The described system and method are easier to learn and use as there is no design or programming effort involved. Further, by implementing the described system and method, the maintenance of scripts becomes much faster and easier, since any changes in the test are made to test models rather than test scripts. Therefore, the present application addresses the challenges mentioned above, viz. large development effort, cumbersome maintenance, and shortage of skills. The present approach addresses said issues by automatically extracting the test model and then allowing the tester to refine it.
OBJECTS OF THE INVENTION
The primary objective of the present application is automation of regression testing for software applications by generating a model that is a complete representation of the application model and user activity from a black-box perspective by extracting the test model at runtime using a proxy recorder.
Another major objective of the present application is to provide an approach that ensures that the test knowledge is stored in a script-independent model form and is independent of scripting syntax.
It is another objective of the present application to provide a system and method for efficient test script creation and maintenance that is testing platform independent.
Another objective of the present application is to provide a script-independent approach for regression test automation that requires any changes or updates to the test to be made on script-independent models, so that the testers do not directly work on scripts.
Yet another object of the invention is to use the said model to store different components from the plurality of components comprising a structural component, a behavioral component, a data component, a test-suite component, and a log component, which store the test scripts in a structured way and further provide different views of the data.
Yet another object of the invention is to provide a recorder that uses a proxy for capturing all the information of the test data and actions of the user interface screen.
Yet another object of the invention is to uniquely identify the objects on screen and store them in a globally available object library.
SUMMARY
The embodiments of the present application provide an approach that ensures that test knowledge is stored in a model form and is independent of scripting syntax. Therefore, when testers have to make changes to a test they have to only change the model and not the test script thereby making test script maintenance much more efficient. Moreover, since testers do not directly work on test scripts they need not learn programming and can do the test automation with minimal programming skills. Moreover, the present approach requires the tests to be stored in a script independent model thereby making the tests platform independent with a possibility of migrating to other tools/platforms.
Embodiments of the present application are directed towards a system (100) for generating and maintaining regression test scripts comprising a recorder (102), a test model (106), a script generator (108), an execution engine (110), a model updater (112), and a model editor (114), wherein the recorder is adapted to uniquely identify and capture key information regarding objects on a test screen and store it in the test model (106), the test model (106) is adapted to store the key information captured by the recorder in a script independent modular form, the script generator (108) is adapted to read the test model (106) and generate test scripts according to a framework, the execution engine (110) is adapted to execute the test scripts generated by the script generator (108) and to generate a test log based on the test script execution, the model updater (112) is adapted to edit and modify the key information stored in the test model (106), and the model editor (114) is adapted to export the key information stored in the test model (106) to a user interface for editing the key information and to import the edited key information to the test model (106).
Another embodiment of the present application is directed towards a method for generating and maintaining regression test scripts, wherein the method comprises the steps of recording key information related to one or more objects on a test screen by a recorder; storing the key information in a script independent modular structure in a test model by a model creator; reading the key information stored in the test model and generating test scripts by a script generator; executing the test scripts and generating test logs by an execution engine; and, storing the test logs in the test model by a model updater.
The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter which form the subject of the claims of the invention. It should be appreciated that the conception and specific embodiment disclosed may be readily
utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized that such equivalent constructions do not depart from the invention as set forth in the appended claims. The novel features which are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing summary, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings example constructions of the invention; however, the invention is not limited to the specific system and method disclosed in the drawings:
Figure 1 represents a block diagram of a system for automated test script generation, execution and maintenance in accordance with an exemplary embodiment.
Figure 2 represents a general flowchart illustrating the steps of test script generation and maintenance in accordance with an exemplary embodiment.
Figure 3 represents a general flowchart illustrating alternative steps of test script maintenance in accordance with an exemplary embodiment.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of this invention, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described.
Conventionally, test scripts are generated by using a recorder to record the user activities based on the user's interaction with the graphical user interface. The method of recording user activities may be preferably applicable when the application to be tested does not go through multiple changes. The system may go through a plurality of changes, such as changes in the user interface, business flow, test data/inputs, functionality, and the like. In case of a frequently changing system, however, the test scripts also have to be changed to continue to work with the changed system; therefore, scripts created through a recorder tend to be difficult to maintain.
Alternatively, test scripts may be programmed in a modular manner, but programming test scripts requires the testing team to have in-depth knowledge of programming languages, which may not always be possible. A plurality of architectures may be chosen for programmed test script generation, such as keyword-driven architecture, data-driven architecture, hybrid architecture, and the like. Keyword-driven architecture focuses on creating test libraries where the commonly used actions, such as login and search transactions, are provided as functions that may be used to create bigger and more complex tests. Such a methodology enables localizing changes such as functionality changes and user interface changes. However, the keyword-driven approach requires significant programming effort, requires the testing team to be skilled in using such tools, and takes significant effort to create test scripts. The data-driven approach is somewhat easier for generating test scripts with the help of the available tools. In this method, the inputs to the test are parameterized, that is, converted to variables and read from data tables or data files. Such an approach may be useful when data changes are frequently required. Further, the hybrid approach provides the benefits of both the keyword-driven approach and the data-driven approach for test script creation.
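The data-driven parameterization described above can be illustrated with a minimal sketch. Everything here is hypothetical: the `run_login_test` stub, the inline CSV data table, and the field names merely show how one test body can be driven by many rows of data.

```python
import csv
import io

# Hypothetical data table: each row drives one iteration of the same test.
DATA_TABLE = """username,password,expected
alice,secret1,success
bob,wrongpw,failure
"""

def run_login_test(username, password):
    # Stand-in for driving the UI; a real data-driven script would call
    # the automation tool's API here.
    if username == "alice" and password == "secret1":
        return "success"
    return "failure"

def data_driven_suite():
    # The test logic is written once; the inputs are parameterized and
    # read row by row from the data table.
    results = []
    for row in csv.DictReader(io.StringIO(DATA_TABLE)):
        outcome = run_login_test(row["username"], row["password"])
        results.append(outcome == row["expected"])
    return results
```

Changing the test data then requires only editing the table, not the test logic, which is the maintenance benefit the data-driven approach aims for.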
The present application provides a model-based script generation approach that delivers the best of the recording approach, by utilizing a recorder, and of the programming approach, by producing modularly structured scripts. Such an approach may be delivered by utilizing the existing test automation tools. The system and method described herein utilize a recorder that records and stores information such as GUI screens, fields, scenarios, and the like. A script generator is also utilized that creates the test scripts by reading the information recorded by the recorder. The information recorded by the recorder is stored in a test model that acts as an action library. The test model may have a plurality of abstractions such as user interface structures, scenarios, execution profile, and the like. The user interface actions and inputs provided may be automatically identified by the recorder. For example, the recorder may record inputting a username, inputting a password, and clicking of a Submit button. The test model may store such information and generate a Login function that may directly be called within a test script. The system may enable selecting the preferred kind of architecture for the test scripts, such as keyword-driven, data-driven, hybrid, and the like, based on the dynamics of the system. For example, if a system being tested is changing frequently, then a keyword-driven approach may be chosen, whereas for a stable system, a data-driven architecture may be chosen, as the data inputs may be changed frequently for many more test cases. The present system and method may enable a tester to choose the script architecture according to the need.
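The Login example above can be sketched as a generated, modular keyword function. The `RecordingUI` driver and its `set_text`/`click` methods are assumptions standing in for whatever automation tool actually drives the GUI; they are not part of the described system.

```python
# Minimal sketch of a generated, modular Login function, assuming a
# hypothetical UI driver with set_text/click operations.
class RecordingUI:
    """Fake driver that records the actions performed on it."""
    def __init__(self):
        self.actions = []

    def set_text(self, field, value):
        self.actions.append(("set_text", field, value))

    def click(self, button):
        self.actions.append(("click", button))

def login(ui, username, password):
    # Generated from the recorded scenario: enter credentials, submit.
    ui.set_text("username", username)
    ui.set_text("password", password)
    ui.click("Submit")
```

A test script that needs to log in would then call `login(ui, ...)` from the generated library rather than repeating the three low-level actions.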
With reference now to the figures and in particular with reference to figure 1, a block diagram of a system for automated test script generation, execution and maintenance is provided in which illustrative embodiments may be implemented. It should be appreciated that the figures are only exemplary and are not intended to assert or imply any limitation with regard to the environments in which different embodiments may be implemented. Many modifications to the depicted environments may be made.
Figure 1 illustrates a system (100) for automated test script generation, execution and maintenance in accordance with an exemplary embodiment. In an embodiment, the system (100) may comprise a recorder (102), a model creator (104), a test model (106), a script generator (108), an execution engine (110), a model updater (112), a model editor (114), and a test editor (116). The recorder (102) utilizes a proxy that may enable capturing all the information related to a test, such as test data and test actions. The recorder (102) may uniquely identify the objects on screen and store them in an object library. In an aspect, the recorder (102) may be adapted to capture the complete information of a GUI test screen along with the fields on which the test action is performed. For example, if the test screen has 30 fields and in a particular test case an action takes place on only 5 fields, the recorder (102) is adapted to still capture information on all 30 fields and store it in an object library. Such an object library may be utilized in the future, and the effort of re-recording may be saved. Thus, the recorder (102) is adapted to capture the complete information of the GUI test screen once any instance is encountered, thereby providing more information. The recorder (102) may be adapted to capture key information such as GUI screens, fields, behavior, test architecture, test data, scenarios, and the like. The recorder (102) stores such information to the test model (106) by means of the model creator (104). The model creator (104) utilizes the test case information recorded by the recorder (102) and stores the information in the test model in a plurality of abstractions as required. In an aspect, the model creator (104) may be integrated within the test model (106). In an aspect, the recorder (102) may also enable adding asserts for the test scripts during the recording process. Further, a model editor (114) is also provided for editing the test model (106). In certain embodiments, the system (100) may also enable editing existing test scripts or adding new test scripts through an interface. The test model (106) may be used as an interface to store test case information in a structured way, hence providing different views of the data. The test model (106) may include a plurality of components including, but not limited to, a structural component, a behavioral component, a data component, a test-suite component, a log component, and the like.
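One way to picture the test model and its components (structural, behavioral, data, test-suite, log) is as a simple structured store. The field names below are illustrative assumptions only, not the model defined by this specification, and `record_screen` sketches the recorder's behavior of capturing every field on a screen.

```python
from dataclasses import dataclass, field

@dataclass
class TestModel:
    # Structural component: screens and their fields (object library).
    object_library: dict = field(default_factory=dict)
    # Behavioral component: named scenarios as sequences of actions.
    scenarios: dict = field(default_factory=dict)
    # Data component: tabular datasets keyed by name.
    datasets: dict = field(default_factory=dict)
    # Test-suite component: ordered lists of scenario names.
    suites: dict = field(default_factory=dict)
    # Log component: execution logs per test case.
    logs: dict = field(default_factory=dict)

def record_screen(model, screen, fields):
    # The recorder captures *all* fields on a screen, even when the
    # current test case acts on only a few of them.
    model.object_library[screen] = list(fields)
```

Because each component is a separate part of the store, different views of the same data (structure, behavior, data, logs) can be presented independently.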
At the end of the recording process, the test model may contain abstractions of a plurality of types, such as user interface structures, scenarios, and execution profile. In an embodiment, the test model may enable generating a library of low-level actions and a modularly structured test script. For example, a ready Login function may be generated by the script generator that inputs the username and password and clicks Submit. In case such a Login function is required in a test script, it may be called from the generated test library. In certain embodiments the test model (106) may be based on a combination of Object Management Group (OMG) standards. For example, the test model (106) may be based on an amalgamation of the UML 2.0 Testing Profile (U2TP) and the Knowledge Discovery Meta-model (KDM). In aspects, the high-level structure of the test model (106) may be based on U2TP by implementing concepts from the test architecture, test behavior and test data sections. U2TP may not provide means to specify details about the interface of a System Under Test (SUT). Therefore, a Test UI section may be added to model the GUI structure of the System Under Test. The granular details, such as the interactions model in the behavior section, the structure of the data pool in the test data section, and the UI model in the test UI section, may be adapted from relevant packages of the KDM. Moreover, the behavior section may contain test scenarios defined as a sequence of statements of various types, such as UI actions, conditionals, assignments, and queries that make observations from the UI, database or environment, and the like. The test data may contain datasets organized as tabular structures of data elements.
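The behavior section's scenarios, described above as sequences of statements (UI actions, conditionals, assignments, queries), can be sketched as plain, script-independent data. The statement vocabulary below is a hypothetical illustration, not the actual U2TP/KDM-derived schema.

```python
# A scenario as a sequence of typed statements, independent of any
# scripting syntax. Statement kinds mirror those named in the text:
# UI actions, queries, conditionals (assignments would follow the same pattern).
login_scenario = [
    {"kind": "ui_action", "op": "set_text", "target": "username", "value": "alice"},
    {"kind": "ui_action", "op": "set_text", "target": "password", "value": "secret"},
    {"kind": "ui_action", "op": "click", "target": "Submit"},
    {"kind": "query", "source": "ui", "target": "welcome_banner"},
    {"kind": "conditional", "if": "welcome_banner", "then": "pass", "else": "fail"},
]

def statement_kinds(scenario):
    # A trivial observation over the model: which kinds of statements occur.
    return [s["kind"] for s in scenario]
```

Since the scenario is data rather than code, tools can analyze it (for coverage, change impact, and so on) without parsing any scripting language.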
The system (100) ensures that test scripts are not generated directly upon recording; instead, a test model (106) is created that stores the test knowledge, such as UI actions, conditionals, assignments, and queries that make observations from the UI, database or environment, in a modular form that is independent of scripting syntax. Such a test model (106) enables making changes to a test case by only changing the test model and not a test script. Such functionality may make test script maintenance much more efficient. Further, as the test cases in the test model (106) are stored in a script-independent model, this may make testing platform independent, with the possibility of migrating to other tools/platforms. In an aspect, the system (100) may be integrated with a plurality of tools, such as test management tools, test data generation tools, scenario generation tools, and the like. Integration with such tools may enable the test information captured in the test model to be analyzed for a plurality of purposes, such as automatically identifying change impact, providing visual representations of system flow, showing coverage metrics such as fields used/unused and scenario coverage, and the like.
In an embodiment, the test model may be a highly detailed model containing not only objects but other components as well, such as a structural model, a behavioral model, a data model, a test model, and a log model, which may enable getting a complete view of the application under test as well as of the various test cases. The test model (106) may also enable storing objects and maintaining an object library and a high-level logical component library.
Once the test model is populated by the recorder, the test model may be utilized for generating test scripts. The script generator (108) is adapted to read the test model (106) and generate test scripts in a variety of frameworks. In an aspect, the script generator (108) may enable generating scripts in a variety of frameworks that may be chosen by a tester. In another aspect, the script generator (108) may generate test scripts in a keyword-driven framework and may enable the scripts and test cases to be easily edited from the perspective of maintenance. Further, the script generator (108) may be implemented to generate scripts in different languages / automation platforms. Once the test scripts have been generated in the desired programming language by the script generator (108), a compatible execution engine (110) executes the generated test scripts. The execution engine (110) is also adapted to generate test logs that enable saving a log history for a particular test case. Upon test script execution, the model updater (112) stores test logs based on the execution of a particular test script. In an aspect, the model updater (112) is also adapted to store test reports, test case documents, and the like in the test model (106).
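A script generator reading model statements and emitting a script in a chosen framework might, in rough outline, look like the following. The output format is a made-up keyword-driven syntax, and the `steps` fragment is a hypothetical model excerpt; neither is the format actually generated by the system (100).

```python
def generate_script(scenario_name, statements):
    # Translate model statements into a made-up keyword-driven script
    # format; a real generator would target an actual automation platform.
    lines = ["# Generated test script for: " + scenario_name]
    for s in statements:
        if s["op"] == "set_text":
            lines.append("SetText %s %s" % (s["target"], s["value"]))
        elif s["op"] == "click":
            lines.append("Click %s" % s["target"])
    return "\n".join(lines)

# Example model fragment (hypothetical statement vocabulary).
steps = [
    {"op": "set_text", "target": "username", "value": "alice"},
    {"op": "set_text", "target": "password", "value": "secret"},
    {"op": "click", "target": "Submit"},
]
```

Supporting another framework or language would mean adding another such translation over the same model, leaving the model itself untouched.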
The system (100) also facilitates a model editor (114) that enables updating the test cases stored in the test model (106). In an aspect, the model editor (114) may also enable editing existing test cases or adding new test cases through an interface. The model editor (114) may provide an interface to edit the test model (106), with complete control over specifying flows, input data, and the like for maintenance purposes. The model editor also provides a test editor (116), a user interface for editing test cases, test data, the object library, and the component library. The model editor (114), along with the test editor (116), may make the maintenance of a regression test suite very easy and user friendly. The test editor (116) may also provide an easy medium to effectively tag validations to the respective test cases. The test model (106) stores the data in a structured way, and the model editor (114) may provide an export utility that exports test case information from the test model (106) to a UI, such as the test editor (116), that may display the data in a UI such as Excel. The test editor (116) may enable editing test case information easily in terms of flows, input data, object library, function library, and the like. The test editor may also enable importing the edited test case information back into the test model (106) to persist the changes made to the test case.
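The export/import cycle of the test editor can be sketched as a CSV round trip, assuming (hypothetically) that each row holds a step's operation, target, and value so it can be edited in a spreadsheet-like UI. The column layout is an assumption for illustration only.

```python
import csv
import io

def export_scenario(statements):
    # Export model statements to CSV rows so they can be edited in a
    # spreadsheet-like UI such as Excel.
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["op", "target", "value"])
    for s in statements:
        writer.writerow([s["op"], s["target"], s.get("value", "")])
    return buf.getvalue()

def import_scenario(text):
    # Import the (possibly edited) rows back into statement form so the
    # changes persist in the test model.
    rows = list(csv.reader(io.StringIO(text)))
    return [
        {"op": op, "target": target, "value": value}
        for op, target, value in rows[1:]
    ]
```

Because the round trip is lossless, edits made in the exported view can be merged back without the tester ever touching a generated script.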
Figure 2 explains the method (200) of test script generation and maintenance in accordance with an exemplary embodiment of the present invention. The method starts with recording (202) of key information inputted by a tester/user by a recorder such as the recorder (102) described above. The recording of key information may include capturing complete information of GUI test screen along with the fields on which the test action is performed. In an aspect, the recorded key information may include GUI screens, fields, behavior, test architecture, test data, scenarios, and the like. In an aspect, test scripts may not be directly generated upon the recording instead the recorded key information is stored (203) in a test model such as test model (106). The key information recorded by the recorder may be stored in the test model by means of a model creator such as model creator (104). In an aspect, the model creator may be integrated within the test model. Further, the key information related to test cases stored in the test model are stored in a script independent model that may make the testing platform independent with a possibility of migrating to other tools/platforms. Once the key information is stored in the test model the script generator (108) may read the information and generate test scripts. In an aspect, test scripts may be generated in a variety of frameworks by utilizing a suitable script generator. In another aspect, test scripts may be generated in a keyword driven framework by the script generator. Further, a suitable script generator may be implemented to generate scripts in different languages / automation platforms. Once the scripts have been generated by the script generator, the scripts are executed by an execution engine such as the execution engine (110). In an aspect, test logs may be generated at the time of the script execution. Such test logs may then be stored in the test model by means of a model updater such as the model updater (112).
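The record-store-generate-execute-log flow of method (200) can be summarized as a simple pipeline. Every function body here is a stand-in for the corresponding component (recorder, model creator, script generator, execution engine, model updater); the dictionary model and string script format are assumptions for illustration.

```python
def run_pipeline(recorded_steps):
    # Step 202: key information has been recorded (passed in directly here).
    # Step 203: the model creator stores it in the test model.
    model = {"scenario": list(recorded_steps), "logs": []}
    # The script generator reads the model and emits a script.
    script = ["%s %s" % (op, target) for op, target in model["scenario"]]
    # The execution engine runs the script and produces a test log
    # (here every step trivially "passes").
    log = ["OK: " + line for line in script]
    # The model updater stores the log back in the test model.
    model["logs"].append(log)
    return model
```

The key property the sketch preserves is that the log lands back in the model, so the model remains the single source of truth for both tests and their history.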
In an embodiment, the method (200) may allow selecting a preferred kind of test suite architecture for generating the test scripts. The kind of test suite architecture may be chosen depending on the dynamics of the system. For example, if a system is changing frequently, then a keyword-driven framework may be preferred, whereas for a stable system, a data-driven architecture may be more suitable.
Figure 3 represents a method (300) for test script maintenance in accordance with an exemplary embodiment. The model editor may provide an interface to edit the test model. Editing the existing test cases or adding new test cases to the test model (106) may be enabled by the model editor (114). The test model stores the data in a structured way, and the model editor provides a test editor for exporting the test cases from the test model to a user interface, such as Excel, that may display the test case information. The test case information may be edited easily in terms of flows, input data, object library, function library, and the like, and can be imported back into the test model to persist the changes. The test editor may also provide an easy medium to effectively tag validations to the respective test cases. Upon editing, the test cases are transmitted (306) to the model editor. The model editor then transmits (308) the edited test cases to the model updater, along with details related to the changes made to the test cases. The model updater then stores (310) the edited test cases, along with the information received from the model editor, in the test model.
The methodology and techniques described with respect to the exemplary embodiments can be performed using a machine or other computing device within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed above. In some embodiments, the machine operates as a standalone device. In some embodiments, the machine may be connected (e.g., using a network) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
In the drawings and specification, there has been disclosed a typical preferred embodiment of the invention, and although specific terms are employed, the terms are used in a descriptive sense only and not for purposes of limitation. The invention has been described in considerable detail with specific reference to these illustrated embodiments. It will be apparent, however, that various modifications and changes can be made within the spirit and scope of the invention as described in the foregoing specification. For example, the exemplary embodiments of the present invention were primarily directed to the generation of regression test scripts. However, one skilled in the art would recognize the applicability to a variety of test scripts.
Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the invention as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, means, methods and steps described in the specification. As one will readily appreciate from the disclosure, processes, machines, means, methods, or steps, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, means, methods, or steps.
We Claim:
1. A system (100) for generating and maintaining regression test scripts comprising
a recorder (102), a test model (106), a script generator (108), an execution engine
(110), a model updater (112), and a model editor (114) wherein
the recorder (102) is adapted to uniquely identify and capture key information regarding objects on a test screen and store it in the test model (106),
the test model (106) is adapted to store the key information captured by the recorder in a script independent modular form,
the script generator (108) is adapted to read the test model (106) and generate test scripts according to a framework,
the execution engine (110) is adapted to execute the test scripts generated by the script generator (108) and to generate a test log based on the test script execution,
the model updater (112) is adapted to edit and modify the key information stored in the test model (106), and
the model editor (114) is adapted to export the key information stored in the test model (106) to a user interface for editing the key information and to import the edited key information to the test model (106).
2. The system as claimed in claim 1, wherein the recorder (102) is adapted to capture the complete information of a GUI test screen, along with the fields on which a test action is performed, once an instance is encountered.
3. The system as claimed in claim 1, wherein the recorder (102) is adapted to capture information selected from a group comprising GUI screens, fields, behavior, test architecture, test data, and scenarios.
4. The system as claimed in claim 1, wherein the recorder (102) is adapted to enable adding asserts for test cases during the recording process.
5. The system as claimed in claim 1, wherein the test model (106) includes a model creator (104) for storing the recorded key information in a modular form.
6. The system as claimed in claim 1, wherein the test model (106) includes a plurality of components selected from a group comprising a structural component, a behavioral component, a data component, a test-suite component, and a log component.
7. The system as claimed in claim 1, wherein the test model (106) is adapted to store one or more objects, and maintain an object library and a high level logical component library.
8. The system as claimed in claim 1, wherein the test model (106) is based on UML 2.0 Testing Profile and the Knowledge Discovery Meta-model.
9. The system as claimed in claim 1, wherein the model editor (114) is adapted to facilitate a test editor (116) for editing test case information easily in terms of flows, input data, object library, and function library.
10. The system as claimed in claim 1, wherein the test editor (116) is a UI utility.
11. The system as claimed in claim 1, wherein the system (100) is adapted to be integrated with a plurality of tools selected from a group comprising test management tools, test data generation tools, and scenario generation tools.
12. A method for generating and maintaining regression test scripts, wherein the method comprises the steps of
recording key information related to one or more objects on a test screen by a
recorder;
storing the key information in a script independent modular structure in a test model by a model creator;
reading the key information stored in the test model and generating test scripts by a script generator;
executing the test scripts and generating test logs by an execution engine; and,
storing the test logs in the test model by a model updater.
13. The method as claimed in claim 12, wherein recording of the key information includes capturing the complete information of a GUI test screen along with the fields on which a test action is performed.
14. The method as claimed in claim 12, wherein generating test scripts includes selecting the script generator based on a preferred test suite architecture.
15. The method as claimed in claim 12, further comprising:
generating a UI utility by a model editor;
importing the key information stored in the test model to the UI utility;
editing the key information imported in the UI utility by the model editor; and,
exporting and storing the edited key information from the UI utility to the test model.
| # | Name | Date |
|---|---|---|
| 1 | 1474-MUM-2012-OTHERS [29-06-2018(online)].pdf | 2018-06-29 |
| 2 | 1474-MUM-2012-RELEVANT DOCUMENTS [30-09-2023(online)].pdf | 2023-09-30 |
| 3 | 1474-MUM-2012-FER_SER_REPLY [29-06-2018(online)].pdf | 2018-06-29 |
| 4 | 1474-MUM-2012-IntimationOfGrant21-05-2021.pdf | 2021-05-21 |
| 5 | 1474-MUM-2012-PatentCertificate21-05-2021.pdf | 2021-05-21 |
| 6 | 1474-MUM-2012-COMPLETE SPECIFICATION [29-06-2018(online)].pdf | 2018-06-29 |
| 7 | 1474-MUM-2012-Written submissions and relevant documents [05-10-2020(online)].pdf | 2020-10-05 |
| 8 | 1474-MUM-2012-CLAIMS [29-06-2018(online)].pdf | 2018-06-29 |
| 9 | 1474-MUM-2012-Correspondence to notify the Controller [20-09-2020(online)].pdf | 2020-09-20 |
| 10 | 1474-MUM-2012-ABSTRACT [29-06-2018(online)].pdf | 2018-06-29 |
| 11 | ABSTRACT1.jpg | 2018-08-11 |
| 12 | 1474-MUM-2012-FORM-26 [20-09-2020(online)].pdf | 2020-09-20 |
| 13 | 1474-MUM-2012-Response to office action [20-09-2020(online)].pdf | 2020-09-20 |
| 14 | 1474-MUM-2012-FORM 3.pdf | 2018-08-11 |
| 15 | 1474-MUM-2012-US(14)-HearingNotice-(HearingDate-21-09-2020).pdf | 2020-08-18 |
| 16 | 1474-MUM-2012-FORM 26(29-5-2012).pdf | 2018-08-11 |
| 17 | 1474-MUM-2012-ABSTRACT.pdf | 2018-08-11 |
| 18 | 1474-MUM-2012-FORM 2.pdf | 2018-08-11 |
| 19 | 1474-MUM-2012-CLAIMS.pdf | 2018-08-11 |
| 20 | 1474-MUM-2012-FORM 2(TITLE PAGE).pdf | 2018-08-11 |
| 21 | 1474-MUM-2012-CORRESPONDENCE(29-5-2012).pdf | 2018-08-11 |
| 22 | 1474-MUM-2012-FORM 18.pdf | 2018-08-11 |
| 23 | 1474-MUM-2012-CORRESPONDENCE(4-6-2012).pdf | 2018-08-11 |
| 24 | 1474-MUM-2012-FORM 1.pdf | 2018-08-11 |
| 25 | 1474-MUM-2012-CORRESPONDENCE.pdf | 2018-08-11 |
| 26 | 1474-MUM-2012-FORM 1(4-6-2012).pdf | 2018-08-11 |
| 27 | 1474-MUM-2012-DESCRIPTION(COMPLETE).pdf | 2018-08-11 |
| 28 | 1474-MUM-2012-FER.pdf | 2018-08-11 |
| 29 | 1474-MUM-2012-DRAWING.pdf | 2018-08-11 |
| 30 | 1474_MUM_2012_21-12-2017.pdf | |