
A Test Automation Framework System And A Method Thereof

Abstract: The present disclosure relates to a test automation framework system (100). The system (100) includes a scenario creator (110) to create at least one scenario based on the set of user requirements; a test creator (112) to create a test block having a set of test cases generated based on the created scenario using the set of predefined rules; a test task creator (114) to create a set of test tasks for each of the created test cases; a test workflow creator (116) to process the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns; a tab creator (118) to create at least one application tab in each of the test cases based on the set of user requirements; and a test run module (120) to execute the test tasks of each of the test cases of the scenario.


Patent Information

Application #
Filing Date
18 August 2021
Publication Number
08/2023
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
dewan@rkdewanmail.com
Parent Application

Applicants

INFOVISION LABS INDIA PRIVATE LIMITED
TEERTH TECHNOSPACE, B-804 A, B-805, B-806 & C-807 S. NO. 103, BANER, PUNE, MAHARASHTRA-411045, INDIA

Inventors

1. SAWALAPURKAR, Saurabh Shripad
Jaystambh Square New Town Badnera, Amravati, Maharashtra, 444701, India

Specification

FIELD
The present disclosure generally relates to an automation framework. More particularly, the present disclosure relates to a test automation framework system and a method thereof.

DEFINITIONS
As used in the present disclosure, the following terms are generally intended to have the meaning as set forth below, except to the extent that the context in which they are used indicates otherwise.
Automation Framework – The term “automation framework” hereinafter refers not to a single tool or process, but to a collection of tools and processes working together to support automated testing of any application. It integrates various functions like libraries, test data, and various reusable modules.
Fail – The term “fail” hereinafter refers to results that do not match the expected outcome.
Snapshot – The term “snapshot” refers to a screen capture.
Wait – The term “wait” refers to a period of time measured in seconds.
Log report – The term “log report” refers to a system log.
Excel – The term “excel” refers to a Microsoft Excel document.
BACKGROUND
The background information herein below relates to the present disclosure but is not necessarily prior art.
Software testing tasks are defined to perform certain actions in software workflows. These tasks may be performed manually by software testers or through an automated software testing tool. An automated software testing tool is a piece of software that performs these actions with little or no human involvement, which helps reduce possible human errors, maximize efficiency, and improve overall performance.
Generally, a software code is written to build a software application in accordance with the end users’ requirements. Once the software application is ready, the software testers verify the functionality of the application by executing various tests. This leads to a series of modifications or alterations in the code, in turn leading to frequent and continuous testing. For this, there is a need for an automated software testing tool with a robust framework to automate the testing actions, verify the results, and meet the expected conditions defined in the software requirement specification document or defined by the end users.
A software tester decides how to define unstated workflows, error conditions, and various data combinations. Software testing is a process of analyzing a software item to detect the differences between existing and required conditions. It aims to figure out defects/errors/bugs in code and to evaluate the features of the software item. This is a time-consuming job. Hence, there is a need for a test automation framework to save time and provide accuracy.
Whenever a software application is changed or new features are added, software testing is required. Traditionally, the software tester identifies the area of impact and executes a limited set of test cases. The cost of software testing is low during the initial phases of software development but increases multifold in later development phases. Therefore, small changes to the application can have a large impact on the software testing effort, particularly in the later phases of software development. Because of the increase in testing effort, it sometimes becomes impossible to test an application in a given timeframe.
Therefore, there is a need for a test automation framework system and a method thereof that alleviates the aforementioned drawbacks.

OBJECTS
Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as follows:
It is an object of the present disclosure to overcome one or more problems of the prior art or to at least provide a useful alternative.
An object of the present disclosure is to provide a test automation framework system and a method thereof.
Another object of the present disclosure is to provide a system that facilitates accurate detection and evaluation of bugs and errors.
Yet another object of the present disclosure is to provide a system to integrate Behavior Driven Development (BDD) approach with Test-Driven Development (TDD) for testing software applications.
Still another object of the present disclosure is to provide a system that solves real-time issues by enhancing and modifying the existing approaches with a newly designed mechanism.
Another object of the present disclosure is to provide a system that can be used for an automated application test data processing system.
Yet another object of the present disclosure is to provide a system wherein multiple users can interact and produce test cases of different test scenarios.
Still another object of the present disclosure is to provide an efficient, fast, and user-friendly process that makes test creations, test updates, analysis, debugging, migration, and overall maintenance process easy and effective.
Another object of the present disclosure is to provide a system that provides a customized approach and helps in the effective configuration and execution of automation flows.
Yet another object of the present disclosure is to provide a system that supports a dynamic test workflow creation approach with ease and reusability.
Still another object of the present disclosure is to provide easy data handling, customized page features, multi-application support, multi-environment support, easy migration, improved time to market, multi-way execution, and multi-reporting formats.
Other objects and advantages of the present disclosure will be more apparent from the following description when read in conjunction with the accompanying figures, which are not intended to limit the scope of the present disclosure.
SUMMARY
The present disclosure envisages a test automation framework system. The system includes a data repository, a scenario creator, a test creator, a test task creator, a test workflow creator, a tab creator, a test run module, a test case display, and a test report generator.
In an aspect, the test automation framework system is implemented as or by one or more processors.
The data repository is configured to store a set of user requirements and a set of predefined rules.
The scenario creator is configured to cooperate with the data repository to create at least one scenario based on the set of user requirements.
The test creator is configured to cooperate with the scenario creator to create a test block having a set of test cases generated based on the created scenario using the set of predefined rules, where each of the test cases includes a set of test tasks.
The test task creator is configured to cooperate with the test creator to create a set of test tasks for each of the created test cases.
The test workflow creator is configured to cooperate with the test creator to process the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns.
The tab creator is configured to cooperate with the test workflow creator to create at least one application tab in each of the test cases based on the set of user requirements.
The test run module is configured to cooperate with the tab creator to execute the test tasks of each of the test cases of the scenario.
The test case display is configured to cooperate with the test run module to display a live preview of the execution of each of the test tasks.
The test report generator is configured to cooperate with the test case display to generate at least one report, along with the log report and excel report, based on the execution of each of the test cases.
In an aspect, the application tab is divided into two main blocks:
• a first block containing all global parameters; and
• a second block containing functional or web elements to perform operations.
In an aspect, the application tab contains standard attributes including one or more of an element name, a page name, an object type, an object identification, an action taken, a failure suggestion, a mark object, a load-in-report, a snapshot, a wait, and a description.
In an aspect, the object type includes a global parameter, a page description, a web element, and a function block.
The present disclosure further envisages a method for implementing a test automation framework. The method includes the following steps:
• creating, by a scenario creator of a test automation framework system, at least one scenario based on a set of user requirements retrieved from a data repository;
• creating, by a test creator of the system, a test block having a set of test cases generated based on the created scenario using the set of predefined rules;
• creating, by a test task creator of the system, a set of test tasks for each of the created test cases;
• processing, by a test workflow creator of the system, the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns;
• creating, by a tab creator of the system, at least one application tab in each of the test cases based on the set of user requirements;
• executing, by a test run module of the system, the test tasks of each of the test cases of the scenario;
• displaying, by a test case display of the system, a live preview of execution of each of the test tasks; and
• generating, by a test report generator of the system, at least one report, along with the log report and excel report, based on the execution of each of the test cases.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWING
A test automation framework system and a method thereof of the present disclosure will now be described with the help of the accompanying drawing, in which:
FIG. 1 illustrates a block diagram of a test automation framework system, in accordance with an embodiment of the present disclosure;
FIGS. 2A to 2H illustrate exemplary screenshots depicting the working of the system of FIG. 1, in accordance with an embodiment of the present disclosure; and
FIG. 3A illustrates a method for implementing a test automation framework, in accordance with an embodiment of the present disclosure.
LIST OF REFERENCE NUMERALS
100 System
104 User device
106 Processing Engine
108 Data repository
110 Scenario creator
112 Test creator
114 Test task creator
116 Test workflow creator
118 Tab creator
120 Test run module
122 Test case display
124 Test report generator
300 Method

DETAILED DESCRIPTION
Embodiments of the present disclosure will now be described with reference to the accompanying drawing.
Embodiments are provided to thoroughly and fully convey the scope of the present disclosure to the person skilled in the art. Numerous details are set forth, relating to specific components, and methods, to provide a complete understanding of embodiments of the present disclosure. It will be apparent to the person skilled in the art that the details provided in the embodiments should not be construed to limit the scope of the present disclosure. In some embodiments, well-known processes, well-known apparatus structures, and well-known techniques are not described in detail.
The terminology used, in the present disclosure, is only for the purpose of explaining a particular embodiment and such terminology shall not be considered to limit the scope of the present disclosure. As used in the present disclosure, the forms "a,” "an," and "the" may be intended to include the plural forms as well, unless the context suggests otherwise. The terms “including,” and “having,” are open-ended transitional phrases and therefore specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not forbid the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The particular order of steps disclosed in the method and process of the present disclosure is not to be construed as necessarily requiring their performance as described or illustrated. It is also to be understood that additional or alternative steps may be employed.
A preferred embodiment of a test automation framework system (hereinafter referred to as “system 100”) and a method thereof (hereinafter referred to as “method 300”) of the present disclosure is now described in detail with reference to FIGS. 1 to 3.
In the test automation framework system 100 proposed herein, both Behavior Driven Development (BDD) and Test-Driven Development (TDD) are merged.
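By way of a non-limiting illustration of how a BDD-style scenario may drive TDD-style verification, a minimal Python sketch is provided below. The step names, the scenario contents, and the run_scenario helper are assumptions made purely for illustration and do not form part of the disclosed implementation.

```python
# Minimal sketch (assumed names): a BDD-style scenario expressed as plain-language
# steps, each bound to a TDD-style callable that performs and verifies a test task.

def given_user_on_login_page(ctx):
    ctx["page"] = "login"                       # simulate navigation to the page

def when_user_submits_credentials(ctx):
    ctx["logged_in"] = ctx["page"] == "login"   # simulate the action under test

def then_dashboard_is_shown(ctx):
    assert ctx["logged_in"], "expected dashboard after login"  # TDD-style assertion

SCENARIO = [given_user_on_login_page,
            when_user_submits_credentials,
            then_dashboard_is_shown]

def run_scenario(steps):
    ctx = {}
    for step in steps:
        step(ctx)            # each BDD step executes as a verifiable test task
    return "pass"

if __name__ == "__main__":
    print(run_scenario(SCENARIO))   # -> pass
```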
Referring to FIG. 1, the test automation framework system 100 may comprise a framework installable in a user device 104. In an aspect, the user device 104 may be selected from the group consisting of, but not limited to, a laptop, a tablet, an iPad, a personal computer, and so forth.
Although the test automation framework system 100 is shown to be implemented on a standalone user device (104) in FIG. 1, the test automation framework system 100 can be implemented on the server side in a client-server configuration. In the said client-server configuration, a server component may be a computer or virtual system where the framework is installed, while a client component may be a computer or virtual system where test execution may happen.
In another aspect, the framework may be selected from the group consisting of, but not limited to, an application, a third-party software, and a tool, for implementation in a standalone device or a client-server configuration.
Referring back to FIG. 1, the user device 104 may be implemented as a standalone computing system communicatively connected through a network to other devices. The user device 104 includes an interface(s) 104-1 and a memory 104-2. The interface(s) 104-1 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, network devices, and the like. The interface(s) 104-1 facilitate communication with various computing devices connected in a networked environment. The interface(s) 104-1 may also provide a communication pathway for one or more components of the user device 104. Examples of such components include, but are not limited to, input devices such as keyboards and computer mice.
The memory 104-2 may store one or more computer-readable instructions, which may be fetched and executed to set a system state as one of wakeable through a trigger and non-wakeable. The memory 104-2 may include any non-transitory computer-readable medium including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
The user device(s) 104 further includes the test automation framework system 100.
Further, the test automation framework system 100 includes a processing engine 106 and a data repository 108.
The data repository 108 is configured to store a set of user requirements and a set of predefined rules, provided by the client or user device 104.
The processing engine 106 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine 106. In the examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine 106 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the processing engine 106 may include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine 106. In such examples, the user device 104 may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the user device 104 and the processing resource. In other examples, the processing engine 106 may be implemented by electronic circuitry.
In an example, the processing engine 106 includes a scenario creator 110, a test creator 112, a test task creator 114, a test workflow creator 116, a tab creator 118, a test run module 120, a test case display 122, and a test report generator 124.
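The cooperation between the data repository 108 and the components of the processing engine 106 may be visualised with the following minimal Python sketch; the class names, method names, and example data are assumptions introduced only for illustration and are not the actual implementation of the system 100.

```python
# Illustrative composition only (assumed names): a simplified data repository and
# scenario creator wired into a processing engine shell.

class DataRepository:
    def __init__(self, requirements, rules):
        self.requirements = requirements   # set of user requirements
        self.rules = rules                 # set of predefined rules

class ScenarioCreator:
    def create(self, repo):
        # simplified: one scenario per stored user requirement
        return [{"scenario": r, "tests": []} for r in repo.requirements]

class ProcessingEngine:
    def __init__(self, repo):
        self.repo = repo
        self.scenario_creator = ScenarioCreator()
        # the test creator, test task creator, test workflow creator, tab creator,
        # test run module, test case display and test report generator would be
        # wired here in the same manner.

repo = DataRepository(requirements=["user can log in"],
                      rules=["one test block per scenario"])
engine = ProcessingEngine(repo)
print(engine.scenario_creator.create(repo))
```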
The scenario creator 110 may be configured to create at least one scenario based on the set of pre-stored requirements. The test creator 112 may be configured to facilitate the tester to create a set of test cases based on each of the generated scenarios to generate a test block as shown in FIG. 2A. The test task creator 114 may be configured to create a set of test tasks for each of the test cases generated.
The test workflow creator 116 may be configured to facilitate a tester to design a test workflow pattern based on the requirements of the user. The test workflow may be a reusable component that can be reused and updated for other test cases. The test workflow creator 116 may be further configured to create at least one test, and tests can be designed with one workflow pattern or with multiple connected workflow patterns. These workflow patterns help users understand the flow from a high-level page perspective or from a detailed internal element-level perspective.
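The reuse of workflow patterns across test cases may be illustrated with the short Python sketch below; the pattern contents and the build_test_case helper are assumptions used only for illustration.

```python
# Illustrative sketch (assumed names): a workflow pattern is an ordered list of
# page-level steps that several test cases can reference, alone or chained.

LOGIN_FLOW = ["open login page", "enter credentials", "submit"]
CHECKOUT_FLOW = ["add item to cart", "open cart", "pay"]

def build_test_case(name, *patterns):
    # a test case may use one pattern or connect multiple patterns in sequence
    steps = [step for pattern in patterns for step in pattern]
    return {"test_case": name, "steps": steps}

print(build_test_case("smoke_login", LOGIN_FLOW))
print(build_test_case("purchase_end_to_end", LOGIN_FLOW, CHECKOUT_FLOW))
```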
The tab creator 118 may be configured to facilitate the tester to create at least one application tab in the test case based on the user requirement and test cases as shown in FIGS. 2B-2D. The application tab may be divided into two main blocks, namely:
• First block containing all Global Parameters; and
• Second block containing Function/Web elements to perform operations.
Each application tab may contain the below standard attributes:
o Element Name: A unique element name to identify the operation.
o Page Name: The page that contains a number of linked elements, traversed from left to right.
o Object Type:
▪ Global Parameters: Marks the provided element attribute as a global parameter, accessible everywhere.
▪ Page Description: A page-level description, used to provide page-specific operation details.
▪ Web Element: Indicates that the object is an element type and can be handled directly with existing actions such as Set, Click, and Select.
▪ Function: Indicates that the object type is a function block; the user needs to configure the function and then integrate it with the framework.
o Object Identification: The user can pass a unique object path in the provided format, and the framework then handles it accordingly.
o Action Taken: The user can either provide existing action blocks such as Set, Click, or Select, or can add a newly created function here.
o On Fail: Suggests the operation to be performed if the respective element fails, and helps with soft and hard assertions.
o Mark Object: Highlights the provided object with a yellow block if marked as Yes.
o Load In Report: A requirement-specific attribute that adds the respective element to an HTML report.
o Snapshot: Based on user input, the framework can capture snapshots and integrate them into the HTML report.
o Wait: A dynamic synchronization mechanism in which the user can provide an implicit or explicit wait, as required.
o Description: An element-level description, used to explain the operations performed on the provided objects or functions.
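The standard attributes listed above may be regarded as the columns of a single row of an application tab. A minimal Python sketch of such a row follows; the data class, field types, and example values are assumptions used only for illustration and do not represent the framework's actual data model.

```python
from dataclasses import dataclass

# Illustrative sketch of one application-tab row using the standard attributes
# listed above; all concrete values are assumed for illustration only.
@dataclass
class TabRow:
    element_name: str           # unique element name identifying the operation
    page_name: str              # page grouping the linked elements
    object_type: str            # "Global Parameters" | "Page Description" | "Web Element" | "Function"
    object_identification: str  # unique object path (for example, an XPath or id)
    action_taken: str           # existing action (Set, Click, Select) or a custom function
    on_fail: str                # suggested operation on failure (soft/hard assertion)
    mark_object: bool           # highlight the object with a yellow block when True
    load_in_report: bool        # include this element in the HTML report
    snapshot: bool              # capture a screen snapshot for the HTML report
    wait: int                   # synchronization wait, in seconds
    description: str            # element-level description of the operation

row = TabRow("login_button", "LoginPage", "Web Element", "//button[@id='login']",
             "Click", "hard assert", True, True, True, 5, "Clicks the login button")
print(row.element_name, row.action_taken)
```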
The test run module 120 may be configured to run the test tasks of each of the test cases for each scenario. The test case display 122 may be configured to display a live action preview of each of the test tasks. The result of a test case can either be a success, as shown in FIG. 2C, or a failure, as shown in FIG. 2D. The test case display 122 may be configured to display the test result, test case name, test case description, and the root cause of failure.
The test report generator 124 may be configured to generate at least one report based on each of the test cases executed, along with the log report and excel report as shown in FIGS. 2E, 2F, and 2G.
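The three report forms named above (the HTML report, the log report, and the excel report) may be illustrated with the following Python sketch, which uses only the standard library and writes the excel-style output as a CSV file for simplicity; the file names, columns, and result values are assumptions used only for illustration.

```python
import csv
import logging

# Illustrative report generation (assumed names and values).
results = [("TC_001", "Pass", ""), ("TC_002", "Fail", "element not found")]

# log report
logging.basicConfig(filename="run.log", level=logging.INFO)
for name, status, reason in results:
    logging.info("%s: %s %s", name, status, reason)

# excel-style report (written as CSV here for simplicity)
with open("report.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["Test Case", "Result", "Root Cause"])
    writer.writerows(results)

# HTML report
rows = "".join(f"<tr><td>{n}</td><td>{s}</td><td>{r}</td></tr>" for n, s, r in results)
with open("report.html", "w") as fh:
    fh.write("<table><tr><th>Test Case</th><th>Result</th>"
             f"<th>Root Cause</th></tr>{rows}</table>")
```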
In an exemplary aspect, referring to FIG. 2H, where a plurality of different web applications is to be triggered within a workflow that passes an output from the first application to the second, the second to the third, the third to the fourth, and the fourth to the fifth, the user can easily configure the scenario within a single application tab or by creating multiple application tabs, and can generate a single test execution report containing all the applications, pages, modules, and required details used, with the implementation of the test automation framework system 100 proposed in the present disclosure.
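The chaining of outputs between applications described above may be illustrated with the short Python sketch below; the application names and the run_application function are assumptions used only for illustration.

```python
# Illustrative sketch (assumed names): the output of each application becomes the
# input of the next application in the configured workflow.

def run_application(name, incoming=None):
    # stand-in for executing one application's portion of the workflow
    return f"{name}-result" if incoming is None else f"{name}({incoming})"

chain = ["App1", "App2", "App3", "App4", "App5"]
payload = None
for app in chain:
    payload = run_application(app, payload)   # 1st -> 2nd -> ... -> 5th
print(payload)   # a single execution report could aggregate each hop here
```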
In an aspect, the test automation framework system 100 may be configured to test the web applications, mobile applications, and application programming interfaces (APIs).
In an aspect, the test automation framework system 100 may be implemented as or by one or more processors.
Thus, with the implementation of the test automation framework system 100, minimal human interaction is required; multiple users can use the framework’s multiple tabs to add their respective test cases or can create an entirely separate file using the same framework source code. The test automation framework system 100 uses an executor to access different test case sets and trigger them separately on different systems.
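The executor-based dispatch of test case sets to different systems may be illustrated with the following Python sketch; the system identifiers, test-set names, and the trigger function are assumptions used only for illustration.

```python
# Illustrative sketch (assumed names): an executor assigns separate test-case
# sets to different systems and triggers them independently.

TEST_SETS = {
    "system-A": ["login_tests", "profile_tests"],
    "system-B": ["checkout_tests"],
}

def trigger(system, test_set):
    # in a real deployment this would start a remote run; here it only reports it
    return f"triggered {test_set} on {system}"

for system, sets in TEST_SETS.items():
    for test_set in sets:
        print(trigger(system, test_set))
```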
FIG. 3A illustrates the method 300 for implementing a test automation framework, in accordance with an embodiment of the present disclosure. The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any appropriate order to carry out the method 300 or an alternative method. Additionally, individual blocks may be deleted from the method 300 without departing from the scope of the subject matter described herein.
In step 302, the method 300 includes creating, by a scenario creator 110 of a test automation framework system 100, at least one scenario based on a set of user requirements retrieved from a data repository 108.
In step 304, the method 300 includes creating, by a test creator 112 of the system 100, a test block having a set of test cases generated based on the created scenario using the set of predefined rules, wherein each of the test cases includes a set of test tasks.
In step 306, the method 300 includes creating, by a test task creator 114 of the system 100, a set of test tasks for each of the created test cases.
In step 308, the method 300 includes processing, by a test workflow creator 116 of the system 100, the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns.
In step 310, the method 300 includes creating, by a tab creator 118 of the system 100, at least one application tab in each of the test cases based on the set of user requirements.
In step 312, the method 300 includes executing, by a test run module 120 of the system 100, the test tasks of each of the test cases of the scenario.
In step 314, the method 300 includes displaying, by a test case display 122 of the system 100, a live preview of execution of each of the test tasks.
In step 316, the method 300 includes generating, by a test report generator 124 of the system 100, at least one report, along with the log report and excel report, based on the execution of each of the test cases.
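The steps 302 to 316 may be read together as a single pipeline, as in the following minimal Python sketch; every function body is an illustrative stand-in and none of the names or data represent the claimed implementation.

```python
# Illustrative end-to-end pipeline (assumed names) condensing steps 302-316.

def create_scenarios(requirements):                    # step 302
    return [{"scenario": r} for r in requirements]

def create_test_block(scenarios, rules):               # step 304
    return [{"scenario": s, "tests": [{"name": f"test_{i}"}]}
            for i, s in enumerate(scenarios)]

def add_tasks(block):                                  # step 306
    for entry in block:
        for test in entry["tests"]:
            test["tasks"] = ["open page", "perform action", "verify result"]
    return block

def execute(block):                                    # steps 308-316, condensed
    report = []
    for entry in block:
        for test in entry["tests"]:
            for task in test["tasks"]:
                print("running", test["name"], "-", task)   # live-preview stand-in
            report.append((test["name"], "Pass"))
    return report                                       # basis of the generated report

scenarios = create_scenarios(["user can log in"])
print(execute(add_tasks(create_test_block(scenarios, rules=[]))))
```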
The foregoing description of the embodiments has been provided for purposes of illustration and is not intended to limit the scope of the present disclosure. Individual components of a particular embodiment are generally not limited to that particular embodiment, but, are interchangeable. Such variations are not to be regarded as a departure from the present disclosure, and all such modifications are considered to be within the scope of the present disclosure.
TECHNICAL ADVANCEMENTS AND ECONOMIC SIGNIFICANCE
The present disclosure described herein above has several technical advantages including, but not limited to, the realization of, a test automation framework system and a method thereof, that:
• provide accurate detection and evaluation of bugs and errors;
• integrate Behavior Driven Development (BDD) approach with Test-Driven Development (TDD) for testing software applications;
• solve real-time issues by enhancing and modifying the existing approaches with newly designed mechanisms;
• use an automated application test data processing system;
• facilitate multiple users to interact and produce test cases of different test scenarios;
• provide an efficient, fast, and user-friendly process that makes test creations, test updates, analysis, debugging, migration, and overall maintenance process easy and effective;
• provide a customized approach and help in effective configuration and execution of automation flows;
• support a dynamic test workflow creation approach with ease and reusability; and
• provide easy data handling, customized page features, multi-application support, multi-environment support, easy migration, improved time to market, multi-way execution, and multi-reporting formats.
The embodiments herein and the various features and advantageous details thereof are explained with reference to the non-limiting embodiments in the following description. Descriptions of well-known components and processing techniques are omitted to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The foregoing description of the specific embodiments so fully reveals the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
The use of the expression “at least” or “at least one” suggests the use of one or more elements or ingredients or quantities, as the use may be in the embodiment of the disclosure to achieve one or more of the desired objects or results.
Any discussion of documents, acts, materials, devices, articles, or the like that has been included in this specification is solely to provide a context for the disclosure. It is not to be taken as an admission that any or all of these matters form a part of the prior art base or were common general knowledge in the field relevant to the disclosure as it existed anywhere before the priority date of this application.
The numerical values mentioned for the various physical parameters, dimensions, or quantities are only approximations and it is envisaged that the values higher/lower than the numerical values assigned to the parameters, dimensions or quantities fall within the scope of the disclosure, unless there is a statement in the specification specific to the contrary.
While considerable emphasis has been placed herein on the components and parts of the preferred embodiments, it will be appreciated that many embodiments can be made and that many changes can be made in the preferred embodiments without departing from the principles of the disclosure. These and other changes in the preferred embodiment, as well as other embodiments of the disclosure, will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the disclosure and not as a limitation.
CLAIMS
WE CLAIM:
1. A test automation framework system (100) comprising:
a data repository (108) configured to store a set of user requirements and a set of predefined rules;
a scenario creator (110) configured to cooperate with the data repository (108) to create at least one scenario based on the set of user requirements;
a test creator (112) configured to cooperate with the scenario creator (110) to create a test block having a set of test cases generated based on the created scenario using the set of predefined rules;
a test task creator (114) configured to cooperate with the test creator (112) to create a set of test tasks for each of the created test cases;
a test workflow creator (116) configured to cooperate with the test creator (112) to process the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns;
a tab creator (118) configured to cooperate with the test workflow creator (116) to create at least one application tab in each of the test cases based on the set of user requirements;
a test run module (120) configured to cooperate with the tab creator (118) to execute the test tasks of each of the test cases of the scenario; and
a test case display (122) configured to cooperate with the test run module (120) to display a live preview of the execution of each of the test tasks.
2. The test automation framework system (100) as claimed in claim 1, wherein the system (100) includes a test report generator (124) configured to cooperate with the test case display (122) to generate at least one report, along with the log report and excel report, based on the execution of each of the test cases.
3. The test automation framework system (100) as claimed in claim 1, wherein the test automation framework system (100) is implemented by one or more processors.
4. The test automation framework system (100) as claimed in claim 1, wherein the application tab is divided into two main blocks:
• a first block containing all global parameters; and
• a second block containing functional or web elements to perform operations.
5. The test automation framework system (100) as claimed in claim 1, wherein the application tab contains standard attributes including one or more of an element name, a page name, an object type, an object identification, an action taken, a failure suggestion, a mark object, a load-in-report, a snapshot, a wait, and a description.
6. The test automation framework system (100) as claimed in claim 1, wherein the test case display (122) is further configured to display the test result, test case name, test case description, and the root cause of failure.
7. The test automation framework system (100) as claimed in claim 5, wherein the object type includes a global parameter, a page description, a web element, and a function block.
8. A method for implementing a test automation framework, comprising:
creating, by a scenario creator (110) of a test automation framework system (100), at least one scenario based on a set of user requirements retrieved from a data repository (108);
creating, by a test creator (112) of the system (100), a test block having a set of test cases generated based on the created scenario using the set of predefined rules, wherein each of the test cases includes a set of test tasks;
creating, by a test task creator (114) of the system (100), a set of test tasks for each of the created test cases;
processing, by a test workflow creator (116) of the system (100), the set of test cases to associate each test case with one workflow pattern or with connected multiple workflow patterns;
creating, by a tab creator (118) of the system (100), at least one application tab in each of the test cases based on the set of user requirements;
executing, by a test run module (120) of the system (100), the test tasks of each of the test cases of the scenario;
displaying, by a test case display (122) of the system (100), a live preview of the execution of each of the test tasks; and
generating, by a test report generator (124) of the system (100), at least one report, along with the log report and excel report, based on the execution of each of the test cases.

Dated this 18th day of August, 2022

_______________________________
MOHAN RAJKUMAR DEWAN, IN/PA – 25
of R.K.DEWAN & CO.
Authorized Agent of Applicant

Documents

Application Documents

# Name Date
1 202121037501-STATEMENT OF UNDERTAKING (FORM 3) [18-08-2021(online)].pdf 2021-08-18
2 202121037501-PROVISIONAL SPECIFICATION [18-08-2021(online)].pdf 2021-08-18
3 202121037501-PROOF OF RIGHT [18-08-2021(online)].pdf 2021-08-18
4 202121037501-POWER OF AUTHORITY [18-08-2021(online)].pdf 2021-08-18
5 202121037501-FORM 1 [18-08-2021(online)].pdf 2021-08-18
6 202121037501-DRAWINGS [18-08-2021(online)].pdf 2021-08-18
7 202121037501-DECLARATION OF INVENTORSHIP (FORM 5) [18-08-2021(online)].pdf 2021-08-18
8 202121037501-ENDORSEMENT BY INVENTORS [18-08-2022(online)].pdf 2022-08-18
9 202121037501-DRAWING [18-08-2022(online)].pdf 2022-08-18
10 202121037501-COMPLETE SPECIFICATION [18-08-2022(online)].pdf 2022-08-18
11 Abstract1.jpg 2022-08-30