
Systems And Methods For Executing Tests Based On Usability Style Guide Validations

Abstract: Systems and methods for executing tests based on usability style guide validations are provided. The system identifies UI elements in HTML files, assigns an identifier and a text description to each of the UI elements, categorizes the UI elements with reference to a style guide, and tags each UI element with a keyword, an expected result, and an execution flag. The system further derives verification check points as tests based on the tagged UI elements, and categorizes the tests into modular entities. The system then generates usability style guide validations for the HTML files by configuring the tests with interactive steps, which are identified based on the occurrence of the tagged UI elements in the HTML pages. A master configuration file is then configured with environment data prerequisites based on the generated usability style guide validations, and the categorized tests are executed along with the master configuration file.


Patent Information

Application #:
Filing Date: 19 July 2016
Publication Number: 04/2018
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: iprdel@lakshmisri.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-01-11
Renewal Date:

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India

Inventors

1. CHANGKAKOTI, Suryasikha
Tata Consultancy Services Limited, Kalinga Park- SEZ IT/ITES Special Economic Zone, Plot - 35, Chandaka Industrial Estate, Bhubaneswar - 751 024, Odisha, India

Specification

FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: SYSTEMS AND METHODS FOR EXECUTING TESTS BASED
ON USABILITY STYLE GUIDE VALIDATIONS
2. Applicant(s)
NAME | NATIONALITY | ADDRESS
TATA CONSULTANCY SERVICES LIMITED | Indian | Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The disclosure herein generally relates to usability style guides and test execution, and, more particularly, to systems and methods for executing tests based on usability style guide validations.
BACKGROUND
[0002] Usability testing is entirely dependent on manual resources. Although product design is completed at an earlier stage, no attempt can be made to standardize and automate the use of standardized Cascading Style Sheets (CSS) properties of Hypertext Markup Language (HTML) screens and elements as a benchmark, because each product design has different specification requirements. As a result, fundamental design flaws are detected at a much later stage of the Software Development Life Cycle (SDLC), impacting delivery timelines/schedules. HTML elements, such as a grid structure built with the fundamental table tag, include standard UI components (e.g., UI buttons, input text boxes, select drop-downs) that are derived from a basic template. Since components are derived from a common standardized template, the process impacts a huge percentage of the finished product.
[0003] The common classes and CSS applied to HTML elements are not standardized or implemented uniformly throughout an application. Moreover, the number of HTML pages in which a single component is likely to be re-utilized is uncertain and may vary across applications. Implementation of the standard checkpoints as such cannot be validated against all the screens or logical flows. Usage of the common classes/CSS ascertains that the standard checkpoints or user experience (UX) standards are being implemented. Any deviation from the implementation of the common classes and CSS must therefore be observed meticulously and extensively.
[0004] Accurate validation with scaling tools can be taxing and may fall short. The measurement provided by scaling tools is approximate and may produce inaccurate results during validation. This approximation may produce false positive statuses for the validation. In the case of percentage and unit conversion, manual calculation may produce an incorrect actual value, thereby adding to the false statuses. There may also be dependency on average tools. The interpretation of generated results through manual observation may be complex. This compromises the quality of the UX validation process, which also impacts the test coverage. Further, usage of multiple tools for different types of verification criteria may affect execution productivity and efficiency.
SUMMARY
[0005] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one aspect, a method is provided. The method comprising identifying, by one or more hardware processors, one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of the one or more HTML files comprises one or more style guides; assigning an identifier and text description for each of the one or more UI elements in the one or more HTML files; categorizing the one or more UI elements with reference to the one or more style guides; tagging each categorized UI element with a keyword, an expected result and an execution flag; deriving a set of verification check points as one or more tests based on one or more tagged UI elements; categorizing the one or more tests into one or more modular entities based on generalization of the one or more tagged UI elements or one or more HTML pages from the one or more HTML files; generating one or more usability style guide validations for at least a subset of the one or more HTML files by configuring the one or more tests being categorized into the one or more modular entities with one or more identified interactive steps, wherein the one or more interactive steps are identified based on occurrence of the one or more tagged UI elements in the one or more HTML pages; setting up a master configuration file with one or more environment data prerequisites based on the one or more generated usability style guide validations; and executing the one or more categorized tests with the master configuration file.

[0006] In an embodiment, the method may further include generating one or more test reports upon executing the one or more categorized tests with the master configuration file. In an embodiment, the one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots. In an embodiment, the one or more test reports further comprise a comparative analysis pertaining to occurrence of the one or more UI elements, one or more attributes being checked for the one or more UI elements.
[0007] In another aspect, a system is provided. The system comprising: a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory using the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to: identify one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of the one or more HTML files comprises one or more style guides, assign an identifier and text description for each of the one or more UI elements in the one or more HTML files, categorize the one or more UI elements with reference to the one or more style guides, tag each categorized UI element with a keyword, an expected result and an execution flag, derive a set of verification check points as one or more tests based on one or more tagged UI elements, categorize the one or more tests into one or more modular entities based on generalization of the one or more tagged UI elements or one or more HTML pages from the one or more HTML files, generate one or more usability style guide validations for at least a subset of the one or more HTML files by configuring the one or more tests being categorized into the one or more modular entities with one or more identified interactive steps, wherein the one or more interactive steps are identified based on occurrence of the one or more tagged UI elements in the one or more HTML pages, set up a master configuration file with one or more environment data prerequisites based on the one or more generated usability style guide validations, and execute the one or more categorized tests with the master configuration file.

[0008] In an embodiment, the one or more hardware processors are further configured by instructions to generate one or more test reports upon executing the one or more categorized tests with the master configuration file. In an embodiment, the one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots. In an embodiment, the one or more test reports further comprise a comparative analysis pertaining to occurrence of the one or more UI elements, one or more attributes being checked for the one or more UI elements.
[0009] In yet another aspect, one or more non-transitory machine readable information storage mediums comprising one or more instructions is provided. The one or more instructions which when executed by one or more hardware processors causes identifying one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of the one or more HTML files comprises one or more style guides; assigning an identifier and text description for each of the one or more UI elements in the one or more HTML files; categorizing the one or more UI elements with reference to the one or more style guides; tagging each categorized UI element with a keyword, an expected result and an execution flag; deriving a set of verification check points as one or more tests based on one or more tagged UI elements; categorizing the one or more tests into one or more modular entities based on generalization of the one or more tagged UI elements or one or more HTML pages from the one or more HTML files; generating one or more usability style guide validations for at least a subset of the one or more HTML files by configuring the one or more tests being categorized into the one or more modular entities with one or more identified interactive steps, wherein the one or more interactive steps are identified based on occurrence of the one or more tagged UI elements in the one or more HTML pages; setting up a master configuration file with one or more environment data prerequisites based on the one or more generated usability style guide validations; and executing the one or more categorized tests with the master configuration file.

[0010] In an embodiment, the instructions may further include generating one or more test reports upon executing the one or more categorized tests with the master configuration file. In an embodiment, the one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots. In an embodiment, the one or more test reports further comprise a comparative analysis pertaining to occurrence of the one or more UI elements, one or more attributes being checked for the one or more UI elements.
[0011] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[0013] FIG. 1 illustrates an exemplary block diagram of a system for usability style guide validations based execution of one or more tests in accordance with an embodiment of the present disclosure.
[0014] FIG. 2 is a flow diagram illustrating a processor implemented method using the system of FIG. 1 according to an embodiment of the present disclosure.
[0015] FIG. 3A is a user interface view of a HTML file according to an embodiment of the present disclosure.
[0016] FIG. 3B is a user interface view of a HTML file depicting one or more user interface elements identified based on properties in style guide according to an embodiment of the present disclosure.
[0017] FIG. 4A is a graphical representation illustrating a comparison of traditional system(s) and the proposed system of FIG. 1 with respect to processor usage in accordance with an embodiment of the present disclosure.

[0018] FIG. 4B is a graphical representation illustrating a comparison of traditional system(s) and the proposed system of FIG. 1 with respect to physical memory usage in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
[0019] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.
[0020] Systems and methods for executing tests based on usability style guide validations are provided. The embodiments of the present disclosure implement a system that is configured to act as a user experience (UX) framework for style-guide specification verification of Hypertext Markup Language (HTML) screens in applications. This data-driven framework is intended to be utilized so that style-guide check points and the specified values for attributes and locations of user interface (UI) elements can be parametrized as data and checked against the actual HTML developed, for comparison. This keyword-driven framework facilitates re-use of the same keyword to derive a variety of outputs by simply changing the combination of inputs.
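The keyword-driven, data-driven idea described above can be sketched as a small dispatch table in which the same keyword is re-used with different (data, element) inputs. This is an illustrative sketch only, not the framework's actual code; the keyword names follow the examples given later in this specification, and the dispatch mechanism is an assumption.

```python
# Illustrative keyword dispatch: one keyword, many outputs, driven by inputs.
def open_page(data, element):
    # data = target URL, element = screen name
    return f"opened {data} for screen '{element}'"

def find_attribute(data, element):
    # element carries "<ui-element>,<css-property-key>" (delimiter approach)
    ui_element, css_key = element.split(",")
    return f"check {css_key} of {ui_element} == {data}"

KEYWORDS = {
    "OPENPAGE": open_page,
    "FINDATTRIBUTE": find_attribute,
}

def run_step(keyword, data, element):
    return KEYWORDS[keyword](data, element)

print(run_step("FINDATTRIBUTE", "108 px", "Logo,Left_Margin"))
# -> check Left_Margin of Logo == 108 px
```

Re-using FINDATTRIBUTE with a different element/data combination (e.g., "Logo,Top_Margin" and "10 px") yields a different check without any new keyword code, which is the re-use property the paragraph describes.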
[0021] Referring now to the drawings, and more particularly to FIGS. 1 through 4B, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[0022] FIG. 1 illustrates an exemplary block diagram of a system for

usability style guide validations based execution of one or more tests in accordance with an embodiment of the present disclosure. The system 100 comprises a memory 102, a hardware processor 104, and an input/output (I/O) interface 106. Although the exemplary block diagram and the associated description refer to a memory, a hardware processor, and an input/output communication interface, it may be understood that one or more memory units, one or more hardware processors, and/or one or more communication interfaces may be comprised in the system 100. The memory 102 may further include one or more functional modules (not shown in FIG. 1). The memory 102, the hardware processor 104, the input/output (I/O) interface 106, and/or the modules may be coupled by a system bus or a similar mechanism. The system 100 executes one or more categorized tests with a master configuration file setup and a batch (.bat) file.
[0023] The memory 102 may store instructions, any number of pieces of information, and data used by a computer system, for example the system 100, to implement the functions of the system 100. The memory 102 may include, for example, volatile memory and/or non-volatile memory. Examples of volatile memory include, but are not limited to, volatile random access memory (RAM), dynamic random access memory, static random access memory, and the like. Examples of non-volatile memory include, but are not limited to, hard disks, magnetic tapes, optical disks, programmable read only memory, erasable programmable read only memory, electrically erasable programmable read only memory (EEPROM), flash memory, and the like. The memory 102 may be configured to store information, data, instructions, or the like for enabling the system 100 to carry out various functions in accordance with various example embodiments.
[0024] Additionally or alternatively, the memory 102 may be configured to store instructions which when executed by the hardware processor 104 causes the system 100 to behave in a manner as described in various embodiments. The

memory 102 stores the functional modules and information, for example, information (e.g., one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of the one or more HTML files comprises one or more style guides).
[0025] The hardware processor 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Further, the hardware processor 104 may comprise a multi-core architecture. Among other capabilities, the hardware processor 104 is configured to fetch and execute computer-readable instructions or modules stored in the memory 102. The hardware processor 104 may include circuitry implementing, among others, audio and logic functions associated with the communication. For example, the hardware processor 104 may include, but is not limited to, one or more digital signal processors (DSPs), one or more microprocessors, one or more special-purpose computer chips, one or more field-programmable gate arrays (FPGAs), one or more application-specific integrated circuits (ASICs), one or more computers, various analog-to-digital converters, digital-to-analog converters, and/or other support circuits.
[0026] The hardware processor 104 thus may also include the functionality to encode messages and/or data or information. The hardware processor 104 may include, among others a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the hardware processor 104. Further, the hardware processor 104 may include functionality to execute one or more software programs, which may be stored in the memory 102 or otherwise accessible to the hardware processor 104.
[0027] FIG. 2, with reference to FIG. 1, is a flow diagram illustrating a processor implemented method using the system 100 according to an embodiment of the present disclosure. The steps of the method of the present disclosure will now be explained with reference to the components of the system 100 as depicted in FIG. 1. The hardware processor 104 is configured by the instructions stored in the memory 102. The hardware processor 104 when

configured by the instructions executes one or more categorized tests as described hereinafter. In an embodiment, at step 202, the hardware processor 104 identifies one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, wherein each of the one or more HTML files comprises one or more style guides. In an embodiment, the one or more user interface elements are stored in a properties file. In an embodiment of the present disclosure, at step 204, the hardware processor 104 assigns an identifier and text description for each of the one or more UI elements in the one or more HTML files. In an embodiment of the present disclosure, the text description comprises a manual description for each step for understanding and traceability to the step objective. In an embodiment of the present disclosure, at step 206, the hardware processor 104 categorizes the one or more UI elements with reference to the one or more style guides comprised in the one or more HTML files.
[0028] In an embodiment of the present disclosure, at step 208, the hardware processor 104 tags each categorized UI element with a keyword, an expected result and an execution flag. In an embodiment of the present disclosure, the keyword refers to an action that needs to be performed. The keyword(s) is/are created in the keywords file and contains the code for the driver action. The expected result (also referred to hereinafter as data) is data/external input with which, or against which, keywords may be executed. The UI element(s) refer to an input element or parameters that are provided for the keyword function. In case of multiple parameters, a delimiter approach is used to separate the parameters. In this way, the same keyword can be re-used to derive a variety of outputs by simply changing the combination of inputs. The execution flag refers to a status or flag that provides the flexibility to set whether execution is required for a particular step or sequence of steps.
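One way to picture a tagged test step is as a record carrying the description, keyword, data (expected result), element parameters, and execution flag, with the flag gating whether the step runs. This is a hedged sketch under the assumption that steps are rows of such fields; the field names are illustrative, not the framework's actual schema.

```python
# Illustrative representation of a tagged test step and execution-flag gating.
from dataclasses import dataclass

@dataclass
class TestStep:
    description: str
    keyword: str
    data: str          # expected result / external input
    element: str       # parameters; comma-delimited when there are several
    execute: bool      # execution flag

    def params(self):
        # Delimiter approach: split multiple parameters on ','
        return self.element.split(",")

steps = [
    TestStep("Identify Left_Margin of Logo", "FINDATTRIBUTE",
             "108 px", "Logo,Left_Margin", True),
    TestStep("Step switched off via the flag", "CLOSE", "", "", False),
]

# Only steps whose execution flag is set are run.
runnable = [s for s in steps if s.execute]
print([s.keyword for s in runnable])   # -> ['FINDATTRIBUTE']
print(runnable[0].params())            # -> ['Logo', 'Left_Margin']
```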
[0029] At step 210, the hardware processor 104 derives a set of verification check points as one or more tests based on one or more tagged UI elements. At step 212, the hardware processor 104 categorizes the one or more tests into one or more modular entities by generalizing the one or more tagged UI elements or one or more HTML pages from the one or more HTML

files. Generalization of test cases (or UI elements) gives a unique set of validation points (e.g., the set of verification check points) which can be repeated for a specific UI component. These verification points are common to the UI component type and can be repeated/standardized for multiple screens and throughout the application. In an example embodiment of the present disclosure, for the UI component Grid, the header texts should always be bold. This test case is common to the Grid element and may be validated for each occurrence of the element throughout the application under test. At step 214, the hardware processor 104 generates one or more usability style guide validations for at least a subset of the one or more HTML files by configuring the one or more tests being categorized into the one or more modular entities with one or more identified interactive steps. In other words, the one or more usability style guide validations are generated for at least a subset of the one or more HTML files by configuring the one or more categorized tests with one or more identified interactive steps (e.g., mouse over, click, type, and the like). For example, the required interactive steps may be designed or re-used for navigation along with the one or more categorized tests to form the usability style guide validations (e.g., screen level validations) for 'x' number of HTMLs. In an embodiment of the present disclosure, the one or more interactive steps are identified based on occurrence of the one or more tagged UI elements in the one or more HTML pages.
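The generalized Grid-header check point described above can be sketched as a single reusable function applied to every occurrence of the element category. The computed styles below are stand-in data; in a real run they would be read from the rendered page.

```python
# Sketch of a generalized verification check point: "grid header texts are
# bold", applied uniformly across all header occurrences.
def check_grid_headers_bold(header_styles):
    """Return (status, failures) for the reusable Grid-header check point.

    header_styles maps header text -> dict of computed CSS properties.
    """
    failures = [h for h, style in header_styles.items()
                if style.get("font-weight") not in ("bold", "700")]
    return ("PASS" if not failures else "FAIL", failures)

styles = {
    "Name":   {"font-weight": "bold"},
    "Status": {"font-weight": "400"},   # deviates from the style guide
}
print(check_grid_headers_bold(styles))  # -> ('FAIL', ['Status'])
```

Because the check is written once against the element category rather than a particular screen, it can be repeated for every Grid occurrence throughout the application under test, which is the standardization the paragraph describes.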
[0030] At step 216, the hardware processor 104 sets up a master configuration file with one or more environment data prerequisites based on the one or more generated usability style guide validations. In other words, a master configuration file is set up with test execution and environment data prerequisites. For Selenium tests to run or to be executed, the one or more environment data prerequisites comprise identifying a browser in which the test cases may be run/executed, and providing the required source paths (properties file, test file) for the input test framework files, along with the destination paths for the output reports and screenshots. At step 218, the hardware processor 104 executes the one or more categorized tests with the master configuration file and a batch file. In an example embodiment, the batch file is an unformatted text file that contains

one or more commands and has a *.bat or *.cmd file name extension. When the file name is entered at the command prompt, Cmd.exe runs the commands sequentially as they appear in the file. In this *.bat file, the command for starting the jar is written for the execution start-up.
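A start-up batch file of the kind described might look as follows. This is an illustrative fragment only: the jar name, paths, and master-configuration file name are hypothetical, and the master configuration file passed on the command line would carry the prerequisites named in paragraph [0030] (browser choice, properties/test file source paths, report and screenshot destination paths).

```
@echo off
REM Hypothetical start-up command for the framework jar (names are illustrative).
java -jar C:\uxframework\ux-styleguide-runner.jar C:\uxframework\master-config.properties
```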
[0031] Upon executing the one or more categorized tests with the master configuration file and the batch file, the hardware processor 104 generates one or more test reports. In an embodiment, the one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots. In a further embodiment, the one or more test reports further comprise a comparative analysis pertaining to the occurrence of the one or more UI elements and the one or more attributes being checked for the one or more UI elements. The one or more attributes may be associated with a corresponding UI element from the one or more UI elements.
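The shape of such a report can be sketched as per-element status entries with snapshot paths, plus a summary supporting the comparative analysis. All field names and paths here are assumptions for illustration; the specification does not define a report schema.

```python
# Illustrative test-report shape: per-element execution status + snapshot,
# with a summary over occurrences and attributes checked (field names assumed).
report = [
    {"element": "Logo", "attribute": "Left_Margin",
     "status": "PASS", "snapshot": "out/logo_left_margin.png"},
    {"element": "Logo", "attribute": "Top_Margin",
     "status": "FAIL", "snapshot": "out/logo_top_margin.png"},
]

summary = {
    "occurrences_checked": len(report),
    "attributes_checked": sorted({r["attribute"] for r in report}),
    "failed": [r["element"] + "." + r["attribute"]
               for r in report if r["status"] == "FAIL"],
}
print(summary["failed"])  # -> ['Logo.Top_Margin']
```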
[0032] FIG. 3A is a user interface view of a HTML file according to an embodiment of the present disclosure. More particularly, FIG. 3A depicts one or more UI elements. FIG. 3A depicts an exemplary static HTML built with the style guide. In an embodiment of the present disclosure, using an HTML Cascading Style Sheets (CSS)/XPath generator, static HTMLs/dynamic web uniform resource locators (URLs) with style guides may be processed to generate and list all the elements in the properties file with the best possible representation of the locator value, using all HTML attributes of the element. For example, a UI element "XYZ systems" pertaining to the logo of an entity is identified from a HTML file. In an embodiment of the present disclosure, an img tag may be used to uniquely identify the logo in the Login page, as illustrated below by way of examples:
xpath = //img
tagName = img
[0033] It is to be understood that the above tags are not limited to identifying one or more UI elements. Any structural locators can be used for the Logo image. Below are the following values that may be used for the different

locator types (attribute-based/structural-based) for identifying the one or more UI elements:
id – Id attribute of the element
className – Class attribute of the element
name – Name attribute of the element
tagName – Tag name of the element
link – Full text inside an anchor tag element
partlink – Partial text inside an anchor tag element
css – CSS selector for the element
xpath – Absolute/relative XPath of the element
[0034] This gives a unique idea as to which element attributes can be ignored in forming the unique categorization of elements. These elements are then categorized based on HTML structures/attributes, and a description is provided for their unique keys. For example, there are input text boxes with the locator value combination input[type][id][class]. It may be determined that all input text boxes can be targeted by the simpler combination input[type], which can then be generalized into a category. This general value may then be used to define the text box element throughout the application. Any exception to this general value for an input text box, when detected, may indicate a failure. Similarly, all general categories of elements, such as grid rows (even, odd), grid headers, etc., may be recognized using their smallest but most common combination of attributes.
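The categorization step in paragraph [0034] amounts to finding the smallest attribute combination shared by every occurrence of a category (e.g., all input text boxes share input[type], so the varying [id][class] parts can be dropped). A minimal sketch, with made-up attribute data, under the assumption that occurrences are available as attribute dictionaries:

```python
# Sketch of generalizing element categories: keep only the attributes whose
# values are common to every occurrence of the category.
def common_attributes(occurrences):
    """occurrences: list of attribute dicts, one per element of a category."""
    shared = set(occurrences[0])
    for attrs in occurrences[1:]:
        shared &= {k for k in attrs if occurrences[0].get(k) == attrs[k]}
    return shared

text_boxes = [
    {"tag": "input", "type": "text", "id": "user", "class": "fld"},
    {"tag": "input", "type": "text", "id": "pwd",  "class": "fld small"},
]
# 'id' and 'class' vary across occurrences, so only tag/type generalize.
print(sorted(common_attributes(text_boxes)))  # -> ['tag', 'type']
```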
[0035] Following is an exemplary HTML text from which one or more UI elements (e.g., the logo for XYZ systems) are identified in the HTML file depicted in FIG. 3A.

TSFP-Login

window.history.forward();
function noBack() {
    window.history.forward();
}
function popNewWindow() {
    popupWin = window.open('orgReg.action', 'name', 'height=650,width=1000');
    if (window.focus) {
        popupWin.focus();
    }
    return false;
}
$(document).ready(function() {
    /* Forgot Password Modal */
    var dialog = $("#dialog-form").dialog({
        autoOpen: false,
        resizable: false,
        height: 500,
        width: 500,
        modal: true,
    });
    form = dialog.find("form").on("submit", function(event) {
        event.preventDefault();
        addUser();
    });
    $(".forgotPwdlink").on("click", function() {
        dialog.dialog("open");
        $("#forgotpwdForm")[0].reset();
        $(".errorlogin").html("");
        $(".successModallogin").html("");
        $("input").removeClass("error");
        $("#submitButtId").removeAttr('disabled');
        $("#submitButtId").css({
            'background-color': '#f67a00',
            'cursor': 'pointer'
        });
    });
    /* Reset Password Modal */
    var dialogReset = $("#dialog-form-reset").dialog({
        autoOpen: false,
        resizable: false,
        height: 500,
        width: 500,
        modal: true,
    });
    form = dialog.find("form").on("submit", function(event) {
        event.preventDefault();
        addUser();
    });
    // populateQuestionDropDown();
    $(".pwdrules").hover(
        function() {
            $(".rulebox").show();
        },
        function() {
            $(".rulebox").hide();
        }
    );
…

body{display:none !important;}

if (self === top) {
    var antiClickjack = document.getElementById("antiClickjack");
    antiClickjack.parentNode.removeChild(antiClickjack);
} else {
    top.location = self.location;
}

• Please Enter Username.
…
$("select").uniform();
[0036] Following is exemplary CSS Style tab text, where the highlighted text (in BOLD) is the particular element:
element.style {
}
#login #nav img {
    margin-left: 108px;
    margin-top: 10px;
    display: block;
}
Pseudo ::scrollbar element
::-webkit-scrollbar {
width: 12px;
}

Pseudo ::scrollbar-thumb element
::-webkit-scrollbar-thumb {
border-radius: 10px;
-webkit-box-shadow: inset 0 0 6px rgba(0, 0, 0, 0.5);
}
Pseudo ::scrollbar-track element
::-webkit-scrollbar-track {
-webkit-box-shadow: inset 0 0 6px rgba(0, 0, 0, 0.3);
border-radius: 10px;
}
[0037] Each identified UI element may be assigned an identifier and a text description. For example, in the properties file the identified UI element is added with a unique key and description, separated by the delimiter '=' as below:
Logo = Logo Image in Login Page = xpath = //img
[0038] Here, 'Logo' is the unique property key, 'Logo Image in Login Page' is the text description of the UI element, 'xpath' is the locator type, and '//img' is the locator value. In this way, each and every UI element under test is configured and stored in the properties file with an associated description, locator type, and value.
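Parsing such an entry is a straightforward split on the '=' delimiter into its four parts. A minimal sketch of that parsing, assuming the entry format shown above (key = description = locator type = locator value):

```python
# Sketch of parsing one properties-file entry of the form shown above.
def parse_property(line):
    key, description, locator_type, locator_value = \
        [part.strip() for part in line.split("=", 3)]
    return {"key": key, "description": description,
            "type": locator_type, "value": locator_value}

entry = parse_property("Logo = Logo Image in Login Page = xpath = //img")
print(entry["type"], entry["value"])  # -> xpath //img
```

Limiting the split to three '=' occurrences keeps any '=' characters inside the locator value itself intact.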
[0039] Further, for the UI element under test, it is determined whether there is any CSS property or attribute that needs to be verified. For example, for this element the following CSS properties may be observed:
#login #nav img {
margin-left: 108px;
margin-top: 10px;
display: block;
}
[0040] These properties are further associated with a unique key to be referenced in the scripts, for example, Left_Margin = margin-left. The key uses an underscore '_' so that when the system 100 reads it, the system 100 automatically converts the separator '_' to ' ' and uses the result as a description in the results. All the CSS properties whose values are required to be validated are associated with a key, in the described manner, in the properties file. Once the elements, the CSS properties, and the attributes under test are configured in the properties file, a test design file is opened to design the script, and the column headers (e.g., Sl. No, Description, Keyword, Data, Element, Execution Flag) are provided by way of example. The expressions 'CSS properties' and 'attributes' may be used interchangeably hereinafter. One or more required steps are identified by the system 100, for example, but not limited to: opening a browser and entering a target URL, identifying the CSS property Left_Margin for the Logo UI element, closing the browser, and the like.
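The key-to-description conversion just described can be sketched in a couple of lines. The key/property pairs below come from the examples in this specification; the function itself is illustrative.

```python
# Sketch of the described conversion: a property key like 'Left_Margin' is
# reported as 'Left Margin', while the key maps to the real CSS property name.
CSS_KEYS = {"Left_Margin": "margin-left", "Top_Margin": "margin-top"}

def report_description(key):
    # Replace the '_' separator with a space for the human-readable report.
    return key.replace("_", " ")

print(report_description("Left_Margin"), "->", CSS_KEYS["Left_Margin"])
# -> Left Margin -> margin-left
```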
[0041] Each of the categorized UI elements are then tagged with a keyword, an expected result and an execution flag as illustrated below by way of example in the Table 1:

STEP (DESCRIPTION) | KEYWORD | DATA (Expected Result) | ELEMENT
Open the browser and enter the target URL | OPENPAGE | Target URL | Screen Name
Identify the CSS property Left_Margin for the Logo UI element | FINDATTRIBUTE | 108 px | Logo,Left_Margin
Close the browser | CLOSE | The target path where the screen level HTML is to be stored. |
TABLE 1

[0042] As can be seen from Table 1 above, 108 px is the expected value for the left margin of the Logo; Logo is the unique property key used to define the element in the properties file; and Left_Margin is the unique property key used to define the CSS property margin-left in the properties file.
[0043] Similarly, the following may also be verified based on the style guides depicted in FIG. 3B:
[0044] The ‘margin-top’ CSS property value.
[0045] The dimensions of the Logo image, i.e., ‘Height’ and ‘Width’.
[0046] The absence of page scroll in both horizontal and vertical directions.
[0047] The 'src' attribute of the Logo image.
[0048] FIG. 3B, with reference to FIGS. 1 through 3A, is a user interface view of a HTML file depicting one or more user interface elements identified based on properties in style guide according to an embodiment of the present disclosure. Below is Table 2, which depicts UI elements being tagged with a keyword, an expected result, and an execution flag status:
STEP (DESCRIPTION) | KEYWORD | DATA (Expected Result) | ELEMENT
Open the browser and hit the target URL | OPENPAGE | Target URL | Screen Name
Find the CSS property Left_Margin for the Logo UI element. | FINDATTRIBUTE | 108 px | Logo,Left_Margin
Find the CSS property Top_Margin for the Logo UI element. | FINDATTRIBUTE | 10 px | Logo,Top_Margin
Find the height of the Logo Image | DIMENSION | 156 | Logo,Height
Find the width of the Logo Image | DIMENSION | 156 | Logo,Width
The page level scroll is absent in the horizontal direction | HORIZONTALSCROLL | |
The page level scroll is absent in the vertical direction | VERTICALSCROLL | |
Find the image src partial value, i.e., only the image name and format | IMAGESRC | ABC_Systems_Logo_circle.png | Logo
Close the browser | CLOSE | The target path where the screen level HTML is to be stored. |
TABLE 2
[0049] The one or more tests are categorized into one or more modular entities based on generalization of the one or more tagged UI elements or one or more HTML pages from the one or more HTML files, as shown in Tables 1 and 2 above. Once the UI elements are identified/categorized, usability rules may be set up for each category of UI elements. These rules may be in the form of constant values, a range of values, or an exception value set to the general rule. This facilitates the use of UI standards/guidelines with a set amount of tolerance/margin of error. For example, the dimension of a text box is X*Y, but if required or set, it can allow a tolerance level of ±1, i.e., (X+1)*(Y+1) or (X-1)*(Y-1). Similarly, the same can be set for color RGB codes with opacity, margin, padding and other attributes.
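The tolerance rule above amounts to a simple bounds check; a minimal sketch (hypothetical helper, for numeric pixel values only):

```python
def within_tolerance(actual, expected, tolerance=0):
    # A rule passes when the actual value lies in
    # [expected - tolerance, expected + tolerance].
    return abs(actual - expected) <= tolerance

# Width expected to be 156 px with a tolerance of ±1 px.
result = within_tolerance(155, 156, tolerance=1)
```

With `tolerance=0` (the default), the check degenerates to an exact match, matching the constant-value rule form.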
[0050] In an embodiment of the present disclosure, one or more usability style guide validations are generated for at least a subset of the one or more HTML files by configuring the one or more tests being categorized into the one or more modular entities with one or more identified interactive steps. Examples of usability style guide validations may include, but are not limited to, "the CSS property 'margin' of a Logo image should be 10 px" or "the width of a Logo image should be 156 px". In an embodiment of the present disclosure, these validation points are identified from the style guide designed for the Logo image. Interactive steps may comprise, for example, but are not limited to, the following: the URL may comprise one or more pages which include a standard Logo image; the one or more pages may be accessible through particular header links in the Home page; and for navigating through these pages, it is required to click on the links and navigate to the specific page under test. These are the interactive steps which may be added prior to the validation tests written for the Logo image. In an embodiment of the present disclosure, the element behavior and rules, with and without interactive steps, are set up in proper sequence. The required attributes, i.e., CSS properties (padding, left-padding, margin, border), may be referenced with the help of a key set up in the properties file, which enables usage of the same keyword by a simple change of the CSS attribute key, thereby not affecting the basic framework code of the system 100.
[0051] In an embodiment of the present disclosure, the UI element behavior and rules which can be re-utilized/re-iterated may be moved to a common sheet (e.g., named Common) and provided a unique name in the Setting column (e.g., header, element description, etc.). In case these common sets of modular tests need to be referenced in the required screens, a keyword may be used with a comma-separated combination of all the setting names provided to the modular tests. This facilitates the re-use of the same UI standards/guidelines for the categorized elements across all screens.
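Referencing modular tests by a comma-separated list of setting names can be sketched as follows (the setting names and step tuples are illustrative, not taken from the actual framework):

```python
# Hypothetical common sheet: setting name -> list of
# (keyword, element, expected result) steps.
COMMON_SHEET = {
    "header": [("FINDATTRIBUTE", "Logo,Left_Margin", "108 px")],
    "footer": [("FINDATTRIBUTE", "Copyright,Font_Size", "12 px")],
}

def expand_settings(setting_names):
    """Expand a reference like 'header,footer' into the
    concatenated modular test steps for a screen."""
    steps = []
    for name in setting_names.split(","):
        steps.extend(COMMON_SHEET[name.strip()])
    return steps
```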
[0052] In an embodiment of the present disclosure, a master configuration file is set up with one or more environment data prerequisites based on the one or more generated usability style guide validations. Below is an exemplary master configuration file setup with environment prerequisites, illustrated in Table 3:

Property Name | Value | Default value
object | *A path location for the object properties file where all the UI elements are stored with a unique name (or identifier) and text description | object.properties
inputfilepath | *A path location for the input test file sheet where the tests are designed for these UI elements | none
drivertype | A driver type for the browser in which the test cases will be run (webdriver.chrome.driver – browser 1; webdriver.ie.driver – browser 2) | If empty, points to browser 3
driverlocale | *A path for the driver server for the respective browser type (excluding browser 3) | Browser1Driver.exe
outputFile | *A path location for the output excel sheet/summary level HTMLs | Output_Results.xls
commonfilename | The name of the sheet containing the modular test groups like header, footer, input text box | common
logfilepath | *A folder location for the screen level log files giving an actual result for each and every test | Log Files\
screenshotfoldername | *A folder location where all the screenshots (or snapshots) are stored for the corresponding failing elements to be linked to the test results in the screen level HTML |
TABLE 3
[0053] The master configuration file is further set up with the following details:

object=TSFP.properties
inputfilepath=C:\\Users\\ABC\\Downloads\\UX Framework TSFP Scripts.xls
drivertype=webdriver.browser1.driver
driverlocale=C:\\Users\\ABC\\Downloads\\Driver\\browser1driver.exe
outputFile=D:\\XYZ\\reports\\Outputs.xls
logfilepath=Log Files\\
commonfilename=Common
screenshotfoldername=reports\\selenium\\test
data=data\\Test Data Excel.xls
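Loading such a master configuration file reduces to reading key=value lines over a table of defaults; a sketch under the assumption that a default applies whenever a property is omitted (the defaults dictionary is a hypothetical subset of the 'Default value' column):

```python
# Hypothetical defaults mirroring part of the 'Default value' column.
DEFAULTS = {
    "object": "object.properties",
    "outputFile": "Output_Results.xls",
    "commonfilename": "common",
}

def load_master_config(lines):
    """Overlay key=value lines from the file onto the defaults."""
    config = dict(DEFAULTS)
    for raw in lines:
        line = raw.strip()
        if line and "=" in line:
            key, value = line.split("=", 1)  # paths may contain '='-free text; split once
            config[key.strip()] = value.strip()
    return config

config = load_master_config(
    ["object=TSFP.properties", "drivertype=webdriver.browser1.driver"]
)
```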

[0054] Once the master configuration file setup is done, the bat file is executed, which opens the command prompt window, and a prompt for the location of the master configuration file is triggered for executing the one or more categorized tests. The command prompt window requests the Master Configuration file name with location, as illustrated by way of example below:

OS [Version 6.1.7601]
Copyright (c) 2009 ABC Corporation. All rights reserved.
D:\UX_Framework>call start.bat
D:\UX_Framework>java -jar UX_FrameWorkExecution.jar
Enter the Master Configuration file name with location:
[0055] An input pertaining to the master configuration file name with location may be received (e.g., from user) as illustrated below by way of example:

OS [Version 6.1.7601]
Copyright (c) 2009 ABC Corporation. All rights reserved.
D:\UX_Framework>call start.bat
D:\UX_Framework>java -jar UX_FrameWorkExecution.jar
Enter the Master Configuration file name with location: UI_Config.properties
[0056] Upon the location of the master configuration file being specified as an input, the jar reads the script steps (or the categorized tests) from the excel sheet one by one and executes the same. Each OPENPAGE and CLOSE block corresponds to a particular screen and generates the screen level HTML and log file simultaneously. An exemplary output is shown in a console during execution of the one or more categorized tests as below:

OS [Version 6.1.7601]
Copyright (c) 2009 ABC Corporation. All rights reserved.
D:\UX_Framework>call startUX
D:\UX_Framework>java -jar UX_FrameworkExecution.jar
Enter the Master Configuration file name with location:
UI_Config.properties
data\Test_Data_Excel.xls
Starting Browser1Driver 2.13.307647
(5a7d0541ebc58e69994a6fb2ed930f45261f3c29) on port 4888
Only local connections are allowed.
Jun 17, 2016 12:39:21 PM Engine.Keywords logging
INFO: Opening the URL for page http://172.18.228.210:8085/tsfp1.2/ in the

location Login Page is done
Jun 17, 2016 12:39:21 PM Engine.Keywords logging
INFO: The horizontal scroll bar is absent
Jun 17, 2016 12:39:22 PM Engine.Keywords logging
INFO: The vertical scroll bar is absent
Jun 17, 2016 12:39:22 PM Engine.Keywords logging
INFO: The image src of Logo Image in Login Page contains:
ABC_Systems_Logo_circle.png
Jun 17, 2016 12:39:23 PM Engine.Keywords logging
INFO: The Height of Logo Image in Login Page is: 156
Jun 17, 2016 12:39:24 PM Engine.Keywords logging
INFO: The Width of Logo Image in Login Page is: 156
Jun 17, 2016 12:39:25 PM Engine.Keywords logging
INFO: The Left Margin of Logo Image in Login Page is: 108px
Jun 17, 2016 12:39:25 PM Engine.Keywords logging
INFO: The Top Margin of Logo Image in Login Page is: 10px
D:\\Protex\\reports\\Sheet 1//
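The per-screen execution described in paragraph [0056] (each OPENPAGE...CLOSE block mapping to one screen) can be sketched as a grouping pass over the script rows (the row tuples below are illustrative):

```python
def split_into_screens(rows):
    """Group (keyword, data, element) rows into per-screen blocks,
    each delimited by an OPENPAGE/CLOSE pair."""
    screens, current = [], None
    for row in rows:
        keyword = row[0]
        if keyword == "OPENPAGE":
            current = []          # a new screen block begins
        if current is not None:
            current.append(row)
        if keyword == "CLOSE" and current is not None:
            screens.append(current)  # screen block ends; emit it
            current = None
    return screens

rows = [
    ("OPENPAGE", "http://example.test/login", "Login Page"),
    ("FINDATTRIBUTE", "108 px", "Logo,Left_Margin"),
    ("CLOSE", "", ""),
    ("OPENPAGE", "http://example.test/home", "Home Page"),
    ("CLOSE", "", ""),
]
screens = split_into_screens(rows)
```

Each emitted block would then drive one browser session and one screen-level HTML/log pair.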

[0057] Upon executing the one or more categorized tests, one or more test reports and HTMLs are generated. An exemplary output of a test report is illustrated in Table 4 below:

Sequence | Column Header | Description
1 | Screen Name | The name of the screen to which the test case belongs
2 | Actual Result | The actual result: the attribute value in string, the dimension/position in pixels
3 | Input Data/Expected Result | The expected result: the attribute value in string, the dimension/position in pixels that should be present for the element. These values can be extracted from style-guide inputs or standardized HTMLs.
4 | Test Case Status | The status of the test case (PASS, FAIL, SKIPPED)
TABLE 4
[0058] Using these screen-level reports, one or more summarized reports may be generated with different views, in one example embodiment. The following are different types of test reports:
1. Test Attribute Summary Report - The Elements and corresponding attributes are distributed with respect to PASS, FAIL and SKIPPED status.
2. Test Element Summary Report - The Elements are distributed with respect to PASS, FAIL and SKIPPED status.
3. Test Elements Report - The Screens and corresponding elements are distributed with respect to PASS, FAIL and SKIPPED status.
4. Test Sheet Report - The Screens are distributed with respect to PASS, FAIL and SKIPPED status.
5. Test Case Report - The summarized view with respect to PASS, FAIL and SKIPPED status.
[0059] Similarly, the following HTMLs may be generated, in one example embodiment:
1. Element Summary HTML provides summarized report for the element versus attribute-level-breakdown and is linked to Element HTML.
2. Element HTML provides summarized report for the element-level-breakdown at screen-level in corresponding sheets in the input test file and is linked to Screen HTML.
3. Screen HTML provides summarized report for the screens in corresponding sheets in the input test file and is linked to Sheet HTML.
4. Sheet HTML provides summarized report for the sheets in the input test file.
5. Folders are created with respect to the sheets in the input test file and in each sheet folder, the HTMLs for all the screens corresponding to the sheet are available which are linked to the Screen HTML.
6. All the summary and detailed level reports are grouped in one excel in multiple sheets. The summary level reports are illustrated as follows:
a. Test Attribute Summary Report: This report focuses on the distribution of failures of the application throughout all the CSS properties verified (e.g., padding, font-color) and shows the comparison as to which CSS property is failing the most.
b. Test Element Summary Report: This report focuses on the distribution of failures of the application throughout all the unique elements verified (e.g., input text box, select drop-downs) and shows the comparison as to which element is failing the most.
c. Test Elements Report: This report focuses on the distribution of failures of the application throughout all the unique elements corresponding to screens verified (e.g., input text box, select drop-downs) and shows the comparison as to which screen and corresponding elements are failing the most.
d. Test Sheet Report: This report focuses on the distribution of failures of the application based on the modular division of test cases as sheets in excel.
e. Test Case Report: This report gives an overall summary of all the test cases for the application.
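The Test Element Summary Report above is essentially a tally of statuses per element; a sketch (hypothetical result tuples, not the framework's actual data model):

```python
from collections import Counter

def element_summary(results):
    """Tally PASS/FAIL/SKIPPED counts per element from
    (element, status) result pairs."""
    summary = {}
    for element, status in results:
        summary.setdefault(element, Counter())[status] += 1
    return summary

summary = element_summary([
    ("Logo", "PASS"),
    ("Logo", "FAIL"),
    ("Input Text Box", "PASS"),
])
```

The same pattern, keyed on attribute, screen or sheet instead of element, would yield the other summary views listed above.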
[0060] The detailed level reports for each sheet may comprise, but are not limited to, the following information:
1. Screen name – The HTML name which is under test
2. Element name – The Element under test
3. Element/Attribute name – The Attribute of the element under test or the element to which the first element is compared.
4. Test case name – The auto-generated test case name describing the verification point.
5. Test case description – A user understandable description of the actual output.
6. Actual result – The actual result i.e., the value which is derived from the html during test.
7. Input data/Expected Result – The expected result i.e., the value which should be derived from the html during test.
8. Test case status – The status of the test case i.e. PASS, FAIL or SKIPPED.
[0061] In an embodiment of the present disclosure, all the summary level reports may be represented in HTML format with links to the lower level reports. The screen level HTML reports may be generated which are linked to the corresponding failure(s).
[0062] FIG. 4A, with reference to FIGS. 1 through 3B, is a graphical representation illustrating a comparison of traditional system(s) and the proposed system 100 with respect to processor usage in accordance with an embodiment of the present disclosure. FIG. 4B, with reference to FIGS. 1 through 4A, is a graphical representation illustrating a comparison of traditional system(s) and the proposed system 100 with respect to physical memory usage in accordance with an embodiment of the present disclosure. As can be seen from FIGS. 4A-4B, the proposed system 100, in comparison with the traditional systems, substantially reduces the processor usage and memory consumption when executing test cases. More particularly, FIG. 4A depicts approximately 40% processor usage by the traditional system(s) as compared to the proposed system 100, which utilizes approximately 17%. The 40% processor usage by the traditional system(s) is distributed as 20% utilized for inspection testing and 20% utilized for report generation. FIG. 4B depicts approximately 0.26 GB of physical memory used by the traditional system(s) as compared to the proposed system 100, which utilizes approximately 0.1 GB of physical memory. The physical memory usage by the traditional system(s) is distributed as 0.25 GB utilized for inspection testing and 0.1 GB utilized for report generation. FIGS. 4A-4B are clearly indicative of the proposed system 100 being more effective than the traditional system(s).
[0063] The embodiments of the present disclosure provide a test execution system (e.g., the system 100) that can be implemented in Usability Style-guide Verification Testing, which allows establishing a re-usable architecture for the application, enabling execution of uniform rules without any chance of inspection/detection error. The embodiments of the present disclosure further enable extending the coverage to 'X' number of screens implemented in a build with simple re-use of modular scripts/tests. The embodiments of the present disclosure further enable the functional tests (re-utilized from another keyword-driven excel framework) to be integrated in between to allow navigation to all the required number of screens. In scenarios where there are exceptions to the uniform rules, these can also be added in the tests.
[0064] The embodiments of the present disclosure provide the system 100 to be implemented in Usability Style-guide Verification during test case execution, where an exact number of test cases in terms of elements/CSS properties/screens can be measured without any additional analysis. Also, if a single test for an input text box is designed, the embodiments of the present disclosure enable that test to be repeated for each and every input text box available in the page, irrespective of how many, indexing the elements in the order of appearance and re-iterating the set of validation rules for all. In scenarios of exceptions, the embodiments of the present disclosure enable easy splitting of the indexes and thereby inclusion of the exceptions. This enables the system 100 to identify the exact number of occurrences of a UI element in a particular screen and the expected versus actual result comparison analysis, thereby producing a comprehensive screen-element level breakdown of tests.
[0065] In Usability Style-guide Verification, the uniform set of rules identified can be very specific or can vary within a specified range based on browser behavior. The embodiments of the present disclosure enable the system 100 to set up an exception rule for the test, thereby permitting an appropriate amount of fluctuation in the actual output due to browser effects. Further, in Usability Style-guide Verification, in case of failures, unlike traditional systems where there is no linkage, embodiments of the present disclosure enable the system 100 to display a snapshot of the required element with its parent vicinity, linking it to the detailed screen level HTML report.
[0066] The embodiments of the present disclosure further enable the system 100 to properly categorize elements based on HTML tags/class/type/value/title/alt/for and other element attributes, which allows uniform re-use of elements and the tied rules across all pages/screens of the application. Since the elements are generated from static HTMLs/dynamic web pages automatically, there is uniformity in the representation of the elements, thereby enabling categorization. The system 100 generates a console level report which gives the number of similar elements present in the page represented by the particular element value. This generalizes the categorization of elements. The screen level validation and execution of the one or more categorized tests by the system 100 enables analysis of the consecutively generated log files and HTML reports in parallel with execution, thereby facilitating failure analysis along with execution.
[0067] The embodiments of the present disclosure further enable the system 100 to set the modular element level scripts with standard rules as part of a common test case sheet with a unique identifier, which can then be referenced in the required screens for re-execution.
[0068] The interactive steps, like mouse over, click, type, and the like, are not part of the execution report but are merely set up as a controlled set of prerequisites for the consequent usability tests. Status reporting of interactive steps is not required, as the fundamental purpose of the UX reporting is covered by the usability style-guide verification points. As such, the execution reports omit the status of these steps to provide a clear and concise report from a usability perspective, and only the log files carry the status of these steps to facilitate analysis during failure scenarios, when compared to traditional systems.
[0069] The embodiments of the present disclosure enable the system 100 to perform style comparison of individual elements and comparison of spacing between two elements, like a header and sub-header. Since the usability framework needs to target unique CSS properties/dimensions (height/width) and compare spacing (vertical/horizontal/without padding, etc.), the system 100 has been technically enhanced to include the 'n' number of multiple dimensions required as inputs to a function in a single column. The embodiments of the present disclosure further enable the system 100 to execute the same set of tests in multiple browsers and track comparison of CSS property/dimension/spacing changes due to inherent browser behavior.
[0070] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[0071] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed, including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0072] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0073] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.

Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[0074] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[0075] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.

I/We Claim:
1. A method comprising:
identifying, by one or more hardware processors, one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of said one or more HTML files comprises one or more style guides;
assigning an identifier and text description for each of said one or more UI elements in said one or more HTML files;
categorizing said one or more UI elements with reference to said one or more style guides;
tagging each categorized UI element with a keyword, an expected result and an execution flag;
deriving a set of verification check points as one or more tests based on one or more tagged UI elements;
categorizing said one or more tests into one or more modular entities based on generalization of said one or more tagged UI elements or one or more HTML pages from said one or more HTML files;
generating one or more usability style guide validations for at least a subset of said one or more HTML files by configuring said one or more tests being categorized into said one or more modular entities with one or more identified interactive steps;
setting up a master configuration file with one or more environment data prerequisites based on said one or more generated usability style guide validations; and
executing said one or more categorized tests with said master configuration file.
2. The method of claim 1, further comprising generating one or more test reports upon executing said one or more categorized tests with said master configuration file.

3. The method of claim 2, wherein said one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots.
4. The method of claim 3, wherein said one or more test reports further comprise a comparative analysis pertaining to occurrence of said one or more UI elements, one or more attributes being checked for said one or more UI elements.
5. The method of claim 1, wherein said one or more interactive steps are identified based on occurrence of said one or more tagged UI elements in said one or more HTML pages.
6. A system comprising:
a memory storing instructions; one or more communication interfaces; and
one or more hardware processors coupled to said memory using said one or more communication interfaces, wherein said one or more hardware processors are configured by said instructions to:
identify one or more user interface (UI) elements in one or more Hypertext Markup Language (HTML) files, each of said one or more HTML files comprises one or more style guides,
assign an identifier and text description for each of said one or more UI elements in said one or more HTML files,
categorize said one or more UI elements with reference to said one or more style guides,
tag each categorized UI element with a keyword, an expected result and an execution flag,
derive a set of verification check points as one or more tests based on one or more tagged UI elements,
categorize said one or more tests into one or more modular entities based on generalization of said one or more tagged UI elements or one or more HTML pages from said one or more HTML files,
generate one or more usability style guide validations for at least a subset of said one or more HTML files by configuring said one or more tests being categorized into said one or more modular entities with one or more identified interactive steps,
set up a master configuration file with one or more environment data prerequisites based on said one or more generated usability style guide validations, and
execute said one or more categorized tests with said master configuration file.
7. The system of claim 6, wherein said one or more hardware processors are further configured by instructions to generate one or more test reports upon executing said one or more categorized tests with said master configuration file.
8. The system of claim 7, wherein said one or more test reports comprise at least one of one or more execution statuses pertaining to each UI element and one or more corresponding UI element snapshots.
9. The system of claim 8, wherein said one or more test reports further comprise a comparative analysis pertaining to occurrence of said one or more UI elements, one or more attributes being checked for said one or more UI elements.
10. The system of claim 6, wherein said one or more interactive steps are identified based on occurrence of said one or more tagged UI elements in said one or more HTML pages.

Documents

Application Documents

# Name Date
1 Form 5 [19-07-2016(online)].pdf 2016-07-19
2 Form 3 [19-07-2016(online)].pdf 2016-07-19
3 Form 18 [19-07-2016(online)].pdf 2016-07-19
4 Form 18 [19-07-2016(online)].pdf_96.pdf 2016-07-19
5 Drawing [19-07-2016(online)].pdf 2016-07-19
6 Description(Complete) [19-07-2016(online)].pdf 2016-07-19
7 Other Patent Document [05-08-2016(online)].pdf 2016-08-05
8 Form 26 [05-08-2016(online)].pdf 2016-08-05
9 ABSTRACT1.JPG 2018-08-11
10 201621024744-Power of Attorney-090816.pdf 2018-08-11
11 201621024744-Correspondence-090816.pdf 2018-08-11
12 201621024744-Correspondence-100816.pdf 2018-08-11
13 201621024744-Form 1-100816.pdf 2018-08-11
14 201621024744-FER.pdf 2020-03-23
15 201621024744-FER_SER_REPLY [22-09-2020(online)].pdf 2020-09-22
16 201621024744-COMPLETE SPECIFICATION [22-09-2020(online)].pdf 2020-09-22
17 201621024744-CLAIMS [22-09-2020(online)].pdf 2020-09-22
18 201621024744-US(14)-HearingNotice-(HearingDate-14-12-2023).pdf 2023-11-03
19 201621024744-Correspondence to notify the Controller [14-11-2023(online)].pdf 2023-11-14
20 201621024744-FORM-26 [13-12-2023(online)].pdf 2023-12-13
21 201621024744-Written submissions and relevant documents [29-12-2023(online)].pdf 2023-12-29
22 201621024744-PatentCertificate11-01-2024.pdf 2024-01-11
23 201621024744-IntimationOfGrant11-01-2024.pdf 2024-01-11

Search Strategy

1 2020-03-0417-32-44E_04-03-2020.pdf

ERegister / Renewals

3rd: 08 Feb 2024 (From 19/07/2018 To 19/07/2019)
4th: 08 Feb 2024 (From 19/07/2019 To 19/07/2020)
5th: 08 Feb 2024 (From 19/07/2020 To 19/07/2021)
6th: 08 Feb 2024 (From 19/07/2021 To 19/07/2022)
7th: 08 Feb 2024 (From 19/07/2022 To 19/07/2023)
8th: 08 Feb 2024 (From 19/07/2023 To 19/07/2024)
9th: 08 Feb 2024 (From 19/07/2024 To 19/07/2025)
10th: 09 Jul 2025 (From 19/07/2025 To 19/07/2026)