
Computer Implemented System For Dynamic Generation Of User Interfaces

Abstract: A computer-implemented system (100) for dynamic generation of user interfaces (UI) is disclosed. The system (100) comprises an input unit (102) to receive user input data and a processing unit (104) configured to: receive the user input data from the input unit (102); trigger a UI generation engine (106) to generate a base UI based on the received user input data; trigger an automated A/B testing engine (108) to generate multiple UI variants from the base UI; deploy the generated UI variants and collect real-time interaction metrics; generate performance scores for UI elements by analyzing the real-time interaction metrics using a heatmap analysis engine (110); and adjust the UI elements in real-time based on the performance scores of the UI elements using an adaptive accessibility engine (112). The system (100) collects and processes user interaction data continuously, enabling immediate updates and improvements to the UI without human intervention. Claims: 10, Figures: 3


Patent Information

Application #: 202541045851
Filing Date: 13 May 2025
Publication Number: 22/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

SR University
SR University, Ananthasagar, Hasanparthy, Warangal, Telangana- 506371, patent@sru.edu.in 08702818333

Inventors

1. Dr. V. Shobha Rani
Assistant Professor (CS&AI), SR University, Ananthasagar, Hasanparthy, Warangal, Telangana- 506371
2. Mr. Ashok Rachapalli
Assistant Professor (CS&AI), SR University, Ananthasagar, Hasanparthy, Warangal, Telangana- 506371
3. Mr. Nakka Nikhil
UG Student, SR University, Ananthasagar, Hasanparthy, Warangal, Telangana- 506371

Specification

Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to user interface development and particularly to a computer-implemented system for dynamic generation of user interfaces (UI).
Description of Related Art
[002] User interface (UI) and user experience (UX) design processes have seen considerable advancement due to increasing demand for intuitive and efficient digital interaction. Traditional design tools enable the creation of prototypes and structured layouts to approximate the intended user interface. These methods often depend on expert evaluation and subjective judgment to determine effectiveness. While the software landscape continues to improve in aesthetics and functionality, conventional workflows still lack mechanisms to reflect user interaction in real time.
[003] Separate methods exist to evaluate performance and guide optimization of interfaces. Designers often rely on comparative analysis between multiple interface versions to assess user preference. Visual interaction maps serve as a way to identify areas of high and low engagement across digital screens. Despite offering valuable insights, these processes demand manual effort for configuration, observation, and interpretation. The resulting data does not directly influence updates to interface design, which leads to delays and inefficiencies.
[004] Accessibility assessment frameworks help ensure that interfaces meet compliance and usability standards for users with diverse needs. These frameworks detect gaps in conformity but fall short of enabling automatic remediation based on individual user contexts. UI/UX development practices today often involve disjointed systems and tools that do not communicate with each other. This fragmented approach creates challenges in achieving real-time optimization, seamless adaptation, and inclusive design at scale.
[005] There is thus a need for an improved and advanced computer-implemented system for dynamic generation of user interfaces (UI) that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a computer-implemented system for dynamic generation of user interfaces (UI). The system comprises an input unit configured to receive user input data. The system further comprises a processing unit in communication with the input unit. The processing unit is configured to: receive the user input data from the input unit; trigger a UI generation engine to generate a base UI based on the received user input data; trigger an automated A/B testing engine to generate multiple UI variants from the base UI; deploy the generated UI variants and collect real-time interaction metrics, wherein the real-time interaction metrics are generated based on user interaction data selected from click-through rates, session durations, conversion rates, or a combination thereof; generate performance scores for UI elements by analysing the real-time interaction metrics using a heatmap analysis engine; and adjust the UI elements in real-time based on the performance scores of the UI elements using an adaptive accessibility engine.
[007] Embodiments in accordance with the present invention further provide a method for dynamic generation of user interfaces (UI). The method comprises the steps of: receiving user inputs through an input unit; generating a base user interface (UI) using a UI generation engine, based on the received user inputs; automatically generating multiple UI variants from a base UI using an automated A/B testing engine; deploying the generated UI variants and collecting real-time interaction metrics, wherein the real-time interaction metrics are generated based on user interaction data selected from click-through rates, session durations, conversion rates, or a combination thereof; analysing the real-time interaction metrics using a heatmap analysis engine to generate performance scores of UI elements; and optimizing the generated UI variants by adjusting the UI elements in real-time using an adaptive accessibility engine based on the performance score of the UI elements.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a computer-implemented system for dynamic generation of user interfaces (UI).
[009] Next, embodiments of the present application may provide a system for generation and optimization of user interfaces (UI) that integrates UI generation, automated A/B testing, heatmap analysis, and accessibility adaptation into a single seamless platform, removing the need for multiple disjointed tools.
[0010] Next, embodiments of the present application may provide a system for generation and optimization of user interfaces (UI) that collects and processes user interaction data continuously, enabling immediate updates and improvements to the UI without human intervention.
[0011] Next, embodiments of the present application may provide a system for generation and optimization of user interfaces (UI) that automatically adjusts UI elements such as font size, color contrast, and spacing based on each user’s accessibility needs, enhancing usability for all.
[0012] Next, embodiments of the present application may provide a system for generation and optimization of user interfaces (UI) that independently creates test variants, distributes them, and processes engagement metrics, reducing manual setup and analysis.
[0013] Next, embodiments of the present application may provide a system for generation and optimization of user interfaces (UI) that incorporates a closed feedback mechanism that uses real user data to evolve and refine interface designs over time, resulting in higher user engagement and satisfaction.
[0014] These and other advantages will be apparent from the present application of the embodiments described herein.
[0015] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0017] FIG. 1 illustrates a schematic block diagram of a system for generation and optimization of user interfaces (UI), according to an embodiment of the present invention;
[0018] FIG. 2 illustrates a block diagram of a processing unit, according to an embodiment of the present invention; and
[0019] FIG. 3 depicts a flowchart of a method for dynamic generation of user interfaces (UI), according to an embodiment of the present invention.
[0020] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0021] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0022] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0023] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0024] FIG. 1 illustrates a schematic block diagram of a system 100 for generation and optimization of user interfaces (UI), according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may be adapted to generate user interfaces (UI) and/or user experience pages (UX). The generated user interfaces (UI) and/or user experience pages (UX) may be displayed on an electronic device (not shown) for enabling a user to interact with and operate the electronic device. The system 100 may further adjust and optimize the generated user interfaces (UI) and/or user experience pages (UX) in accordance with the electronic device, for enabling a smooth and jitter-free interaction between the user and the electronic device.
[0025] The generated user interfaces (UI) and/or user experience pages (UX) may be applied on a system level platform. The system level platform may be, but not limited to, a homepage of an operating system, a communication page of a telephonic system, a viewfinder of a camera system, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the system level platform, including known, related art, and/or later developed technologies.
[0026] The generated user interfaces (UI) and/or user experience pages (UX) may be applied on an application-level platform. The application-level platform may be, but not limited to, a social media application, a gaming application, a content viewing application, an essential application, an educational application, an e-book application, a banking application, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the application-level platform, including known, related art, and/or later developed technologies.
[0027] The electronic device may be, but not limited to, a smartphone, a laptop, a tablet, a desktop, a wearable device, a refrigerator, a microwave oven, a kiosk, a handheld device, toys, a television, an air conditioner, a Virtual Reality (VR) device, an Augmented Reality (AR) device, a Mixed Reality (MR) device, a gaming console, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the electronic device, including known, related art, and/or later developed technologies.
[0028] According to the embodiments of the present invention, the system 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the system 100 may comprise an input unit 102, a processing unit 104, a UI generation engine 106, an automated A/B testing engine 108, a heatmap analysis engine 110, and an adaptive accessibility engine 112. In an embodiment of the present invention, the hardware components of the system 100 may be integrated with computer-executable instructions for overcoming the challenges and the limitations of the existing systems.
[0029] In an embodiment of the present invention, the input unit 102 may be the electronic device that may be used by the user and/or a UI developer. The input unit 102 may be configured to receive user input data. The received user input data may further enable the system 100 to generate, deploy, and optimize the UI. The user input data may be, but not limited to, a screen resolution of the electronic device, an aspect ratio of the electronic device, a targeted demography of the UI, a colour palette, a supported language of the UI, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user input data, including known, related art, and/or later developed technologies.
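By way of a non-limiting illustration, the user input data of paragraph [0029] could be captured in a simple structure such as the following Python sketch; the field names are hypothetical and are not prescribed by the specification.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical container for the user input data of paragraph [0029];
# the field names are illustrative assumptions only.
@dataclass
class UserInputData:
    screen_resolution: str          # e.g. "1920x1080"
    aspect_ratio: str               # e.g. "16:9"
    target_demography: str          # e.g. "children", "senior citizens"
    colour_palette: List[str]       # e.g. ["#FFFFFF", "#1A73E8"]
    supported_languages: List[str]  # e.g. ["en", "te"]

sample_input = UserInputData(
    screen_resolution="1920x1080",
    aspect_ratio="16:9",
    target_demography="children",
    colour_palette=["#FFFFFF", "#1A73E8"],
    supported_languages=["en", "te"],
)
```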
[0030] In an embodiment of the present invention, the processing unit 104 may be in communication with the input unit 102. The processing unit 104 may further be configured to execute computer-executable instructions to generate an output relating to the system 100. The processing unit 104 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. The processing unit 104 may be a Raspberry Pi. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 104, including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 104 may further be explained in conjunction with FIG. 2.
[0031] FIG. 2 illustrates a block diagram of the processing unit 104, according to an embodiment of the present invention. The processing unit 104 may comprise the computer-executable instructions in the form of programming modules such as a data receiving module 200, a data activation module 202, a data deployment module 204, a data generation module 206, and a data adjustment module 208.
[0032] In an embodiment of the present invention, the data receiving module 200 may be configured to receive the user input data from the input unit 102. The data receiving module 200 may be configured to transmit the user input data to the data activation module 202.
[0033] The data activation module 202 may be activated upon receipt of the user input data from the data receiving module 200. In an embodiment of the present invention, the data activation module 202 may be configured to trigger the UI generation engine 106 to generate a base UI based on the received user input data. The base UI may be a raw iteration of the final UI that may be generated for the user. The base UI may comprise essential elements of the UI. The essential elements of the UI may enable an overall functionality and workability of the electronic device. However, the base UI may lack ornamental elements, decorative elements, animated elements, and so forth. The essential elements of the UI may be, but not limited to, a startup page, an icon, a hamburger menu, sliders, selectors, checkboxes, radio buttons, togglers, and so forth.
[0034] The UI generation engine 106 may be configured to employ machine learning algorithms to generate the base UI based on historical user preferences, behaviour patterns, a demography, a region, and so forth of the user. The UI generation engine 106 may further be configured to incorporate user-specific accessibility preferences into the generated base UI. The user-specific accessibility preferences may be, but not limited to, translation of texts in the UI, generation of haptic feedback for enabling usability by visually impaired users, added support for hearing aid devices, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the user-specific accessibility preferences, including known, related art, and/or later developed technologies.
[0035] Further, the base UI may preferably be generated in the English language. Embodiments of the present invention are intended to include or otherwise cover any language for generation of the base UI, including known, related art, and/or later developed technologies.
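As a non-limiting illustration of paragraphs [0034]-[0035], the following Python sketch shows how user-specific accessibility preferences might be folded into a base UI configuration; the dictionary layout and preference keys are assumptions and are not part of the specification.

```python
# Assumed dictionary-based UI configuration; not prescribed by the specification.
DEFAULT_BASE_UI = {
    "language": "en",               # base UI preferably generated in English
    "font_size_pt": 14,
    "haptic_feedback": False,
    "hearing_aid_support": False,
    "elements": ["startup page", "icon", "hamburger menu",
                 "sliders", "checkboxes", "radio buttons", "togglers"],
}

def apply_accessibility_preferences(base_ui, prefs):
    """prefs example (hypothetical keys):
    {"language": "te", "visually_impaired": True, "uses_hearing_aid": True}"""
    ui = dict(base_ui)
    ui["language"] = prefs.get("language", ui["language"])
    if prefs.get("visually_impaired"):
        ui["haptic_feedback"] = True             # haptic cues for visually impaired users
        ui["font_size_pt"] = max(ui["font_size_pt"], 18)
    if prefs.get("uses_hearing_aid"):
        ui["hearing_aid_support"] = True         # support for hearing aid devices
    return ui
```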
[0036] In an embodiment of the present invention, the data activation module 202 may be configured to trigger the automated A/B testing engine 108 to generate multiple UI variants from the base UI. The multiple UI variants generated from the base UI may be a rearranged combination of the essential elements of the UI along with an adaptive combination of the ornamental elements, the decorative elements, the animated elements, and so forth.
[0037] For example, an exemplary UI variant ‘v1.0’ may be generated from the base UI that may comprise all the essential elements of the UI with a colored schema. Further, a ‘v1.1’ of said exemplary UI variant may be generated that may be in a portrait orientation favoring mobile devices. Similarly, a ‘v1.2’ of said exemplary UI variant may be generated that may be in a landscape orientation favoring tablets, laptops, televisions, and so forth. Furthermore, an exemplary UI variant ‘v2.0’ may be generated from the base UI that may comprise all the essential elements of the UI with a greyscale schema. Further, a ‘v2.1’ of said exemplary UI variant may be generated that may be in a portrait orientation favoring mobile devices. Similarly, a ‘v2.2’ of said exemplary UI variant may be generated that may be in a landscape orientation favoring tablets, laptops, televisions, and so forth.
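The variant scheme of the example above lends itself to simple enumeration. The following Python sketch is a non-limiting illustration of how the automated A/B testing engine 108 might label variants as combinations of colour schema and orientation; the data model is an assumption.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical variant model; the specification does not define one.
@dataclass
class UIVariant:
    variant_id: str
    colour_schema: str   # e.g. "colored" or "greyscale"
    orientation: str     # e.g. "base", "portrait", or "landscape"
    elements: List[str] = field(default_factory=list)

def generate_variants(base_elements):
    """Enumerate variants as combinations of colour schema and orientation,
    mirroring the v1.x / v2.x example above."""
    schemas = ["colored", "greyscale"]
    orientations = ["base", "portrait", "landscape"]
    variants = []
    for major, schema in enumerate(schemas, start=1):
        for minor, orientation in enumerate(orientations):
            variants.append(UIVariant(
                variant_id=f"v{major}.{minor}",
                colour_schema=schema,
                orientation=orientation,
                elements=list(base_elements),
            ))
    return variants

variants = generate_variants(["startup page", "icon", "hamburger menu", "slider"])
print([v.variant_id for v in variants])  # ['v1.0', 'v1.1', 'v1.2', 'v2.0', 'v2.1', 'v2.2']
```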
[0038] The data activation module 202 may be configured to transmit the generated multiple UI variants to the data deployment module 204.
[0039] The data deployment module 204 may be activated upon receipt of the generated multiple UI variants from the data activation module 202. In an embodiment of the present invention, the data deployment module 204 may be configured to deploy the generated UI variants on the electronic device(s). The data deployment module 204 may further be configured to collect real-time interaction metrics from the deployed generated UI variants. The real-time interaction metrics may be, but not limited to, click-through rates, session durations, conversion rates, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the real-time interaction metrics, including known, related art, and/or later developed technologies. The data deployment module 204 may be configured to transmit the real-time interaction metrics to the data generation module 206.
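As a non-limiting illustration of paragraph [0039], the sketch below aggregates raw interaction events into the click-through rate, session duration, and conversion rate metrics named in the specification; the per-session event format is an assumption.

```python
from collections import defaultdict

# Illustrative aggregation; the metric names follow paragraph [0039],
# the event format is an assumption.
def aggregate_metrics(events):
    """events: iterable of dicts such as
    {"variant_id": "v1.1", "clicks": 3, "impressions": 40,
     "session_seconds": 120, "converted": True}"""
    totals = defaultdict(lambda: {"clicks": 0, "impressions": 0,
                                  "session_seconds": 0, "sessions": 0,
                                  "conversions": 0})
    for event in events:
        t = totals[event["variant_id"]]
        t["clicks"] += event["clicks"]
        t["impressions"] += event["impressions"]
        t["session_seconds"] += event["session_seconds"]
        t["sessions"] += 1
        t["conversions"] += 1 if event["converted"] else 0
    metrics = {}
    for variant_id, t in totals.items():
        metrics[variant_id] = {
            "click_through_rate": t["clicks"] / max(t["impressions"], 1),
            "avg_session_duration": t["session_seconds"] / max(t["sessions"], 1),
            "conversion_rate": t["conversions"] / max(t["sessions"], 1),
        }
    return metrics
```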
[0040] The data generation module 206 may be activated upon receipt of the real-time interaction metrics from the data deployment module 204. In an embodiment of the present invention, the data generation module 206 may be configured to engage the heatmap analysis engine 110. The heatmap analysis engine 110 may be configured to generate performance scores for UI elements by analysing the real-time interaction metrics.
[0041] Further, the heatmap analysis engine 110 may be configured to visualize user interaction patterns to identify areas of high and low engagement within the UI. The areas of high and low engagement identified by the heatmap analysis engine 110 may further be stored in a heatmap repository. The heatmap repository may indicate the UI elements with high engagement; the UI generation engine 106 may then prefer the UI elements with high engagement and avoid the UI elements with low engagement, as stored in the heatmap repository.
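Paragraphs [0040]-[0041] do not prescribe a scoring formula. The Python sketch below is one possible, assumed weighting of the three metrics into a performance score, together with a simple high/low engagement classification of the kind a heatmap repository might record; the weights and thresholds are assumptions.

```python
# Assumed scoring formula: a weighted blend of the three metrics, with the
# session duration capped at an assumed 300-second reference value.
def performance_score(metrics, weights=(0.4, 0.3, 0.3)):
    w_ctr, w_dur, w_conv = weights
    return (w_ctr * metrics["click_through_rate"]
            + w_dur * min(metrics["avg_session_duration"] / 300.0, 1.0)
            + w_conv * metrics["conversion_rate"])

def classify_engagement(element_scores, high=0.6, low=0.3):
    """Split UI elements into high/low/neutral engagement bands, as a
    heatmap repository might record them (thresholds are assumptions)."""
    repository = {"high": [], "low": [], "neutral": []}
    for element, score in element_scores.items():
        if score >= high:
            repository["high"].append(element)
        elif score <= low:
            repository["low"].append(element)
        else:
            repository["neutral"].append(element)
    return repository
```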
[0042] For example, an exemplary ‘Log In’ button may be an exemplary UI element of one of the generated UI variants. However, a font colour of the ‘Log In’ text may be the same as a fill colour of the button. The invisibility of the ‘Log In’ text may reduce the click-through rates, thus reducing the performance scores for the exemplary ‘Log In’ button.
[0043] Similarly, an exemplary interactive kids’ application for learning alphabets and numerals may be paired with colourful UI elements along with a rhyming audio playback track. The immersive experience developed by the UI elements and the rhyming audio playback track may increase the session durations, thus increasing the performance scores for the exemplary interactive kids’ application. The data generation module 206 may be configured to transmit the generated performance scores to the data adjustment module 208.
[0044] The data adjustment module 208 may be activated upon receipt of the generated performance scores from the data generation module 206.
[0045] In an embodiment of the present invention, the data adjustment module 208 may be configured to compare the generated performance scores with a benchmark score. Upon comparison, if the generated performance scores are less than the benchmark score, then the data adjustment module 208 may be configured to engage the adaptive accessibility engine 112; else, the data adjustment module 208 may be configured to execute a feedback loop. The adaptive accessibility engine 112 may be configured to adjust the UI elements in real-time based on the performance scores of the UI elements. The adaptive accessibility engine 112 may further be configured to adjust the UI elements by modifying UI attributes such as, but not limited to, font size, colour contrast, inter-element spacing, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the UI attributes, including known, related art, and/or later developed technologies, that may be modified by the adaptive accessibility engine 112.
[0046] Further, upon adjustment of the UI elements, the data adjustment module 208 may be configured to reactivate the data generation module 206 to reevaluate and regenerate the performance scores. The data adjustment module 208 may be configured to compare the regenerated performance scores with the benchmark score. Upon comparison, if the regenerated performance scores are greater than the benchmark score, then the data adjustment module 208 may be configured to execute the feedback loop. Else, the data adjustment module 208 may be configured to reengage the adaptive accessibility engine 112 to adjust the UI elements.
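As a non-limiting illustration of the compare-adjust-re-evaluate cycle of paragraphs [0045]-[0046], the sketch below loops until the score meets the benchmark or an assumed round limit is reached; the callables stand in for the heatmap scoring step and the adaptive accessibility engine 112 and are not defined by the specification.

```python
# Sketch of the compare-adjust-re-evaluate cycle; score_fn and adjust_elements
# are placeholders for the heatmap scoring step and the adaptive accessibility
# engine 112, and the round limit is an assumption to guarantee termination.
def optimise_until_benchmark(ui, score_fn, adjust_elements, benchmark=0.6,
                             max_rounds=10):
    score = score_fn(ui)
    rounds = 0
    while score < benchmark and rounds < max_rounds:
        ui = adjust_elements(ui)   # e.g. larger font size, higher colour contrast
        score = score_fn(ui)       # re-evaluate after adjustment
        rounds += 1
    return ui, score
```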
[0047] In an embodiment of the present invention, the data adjustment module 208 may be configured to execute the feedback loop. The feedback loop may be configured to enable iterative refinement and optimization of the UI when the performance scores are greater than the benchmark score. The refinement and optimization of the UI elements may be carried out by dynamically adjusting the UI elements. The execution of the feedback loop enables the system 100 to operate in a standalone mode. Further, the data adjustment module 208, along with the feedback loop, may interact with a cloud-based artificial intelligence model with front-end libraries, real-time analytics tooling, auto-deployment pipelines, and so forth to enable continuous optimization with minimal human touch.
[0048] In an embodiment of the present invention, the data adjustment module 208 may further be configured to reengage the automated A/B testing engine 108. The automated A/B testing engine 108 may be configured to determine an optimal UI variant from among the generated UI variants, based on statistical analysis of the real-time interaction metrics and the performance scores of the UI elements. Further, the data adjustment module 208 may be configured to automatically update and deploy the optimal UI variant in the electronic device without manual intervention. The deployment of the optimal UI variant in the electronic device may be carried out by an Over-The-Air (OTA) update that may be pushed using a communication network (not shown) such as an Internet-based network.
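The specification does not name a particular statistical test. As one assumed possibility, the Python sketch below ranks variants by conversion rate and applies a two-proportion z-test before declaring an optimal UI variant.

```python
import math

# Illustrative only: a two-proportion z-test on conversion rates as one
# possible "statistical analysis"; the specification names no specific test.
def z_test_conversions(conv_a, n_a, conv_b, n_b):
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return ((conv_a / n_a) - (conv_b / n_b)) / se if se else 0.0

def pick_optimal_variant(stats):
    """stats: {"v1.1": {"conversions": 45, "sessions": 500}, ...};
    requires at least two variants."""
    ranked = sorted(stats.items(),
                    key=lambda kv: kv[1]["conversions"] / kv[1]["sessions"],
                    reverse=True)
    best, runner_up = ranked[0], ranked[1]
    z = z_test_conversions(best[1]["conversions"], best[1]["sessions"],
                           runner_up[1]["conversions"], runner_up[1]["sessions"])
    return best[0] if abs(z) >= 1.96 else None  # None: keep collecting data
```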
[0049] FIG. 3 depicts a flowchart of a method 300 for the dynamic generation of the user interfaces (UI), according to an embodiment of the present invention.
[0050] At step 302, the system 100 may receive the user input data from the input unit 102.
[0051] At step 304, the system 100 may trigger the UI generation engine 106 to generate the base UI based on the received user input data.
[0052] At step 306, the system 100 may trigger the automated A/B testing engine 108 to generate the multiple UI variants from the base UI.
[0053] At step 308, the system 100 may deploy the generated UI variants and collect the real-time interaction metrics.
[0054] At step 310, the system 100 may generate the performance scores for the UI elements by analysing the real-time interaction metrics using the heatmap analysis engine 110.
[0055] At step 312, the system 100 may compare the generated performance scores with the benchmark score. Upon comparison, if the generated performance scores are less than the benchmark score, then the method 300 may proceed to a step 314. Else, the method 300 may proceed to a step 318.
[0056] At step 314, the system 100 may adjust the UI elements in real-time based on the performance scores of the UI elements using the adaptive accessibility engine 112.
[0057] At step 316, the system 100 may compare the regenerated performance scores with the benchmark score. Upon comparison, if the regenerated performance scores are greater than or equal to the benchmark score, then the method 300 may proceed to the step 318. Else, the method 300 may revert to the step 314.
[0058] At step 318, the system 100 may generate the feedback loop to iteratively refine and optimize the UI based on the performance score of the UI elements upon dynamically adjusting the UI elements.
[0059] At step 320, the system 100 may update and deploy the optimal UI variant to the user without manual intervention.
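As a non-limiting, end-to-end illustration of the method 300 (steps 302 to 320), the sketch below wires the steps together; every callable supplied in the engines mapping is a placeholder for the corresponding engine and is an assumption, not a definition from the specification.

```python
# End-to-end sketch of the flow of FIG. 3; the callables in `engines`
# stand in for the UI generation, A/B testing, deployment, heatmap, and
# adaptive accessibility engines.
def run_method(user_input, engines, benchmark=0.6, max_rounds=5):
    base_ui = engines["generate_base_ui"](user_input)           # step 304
    variants = engines["generate_variants"](base_ui)            # step 306
    metrics = engines["deploy_and_collect"](variants)           # step 308
    scores = engines["score"](metrics)                          # step 310
    rounds = 0
    while min(scores.values()) < benchmark and rounds < max_rounds:  # steps 312/316
        variants = engines["adjust"](variants, scores)          # step 314
        metrics = engines["deploy_and_collect"](variants)
        scores = engines["score"](metrics)
        rounds += 1
    optimal = max(scores, key=scores.get)                       # steps 318-320
    engines["deploy_optimal"](optimal)
    return optimal
```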
[0060] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0061] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims: CLAIMS
I/We Claim:
1. A computer-implemented system (100) for dynamic generation of user interfaces (UI), comprising:
an input unit (102) configured to receive user input data; and
a processing unit (104) in communication with the input unit (102); characterized in that the processing unit (104) is configured to:
receive the user input data from the input unit (102);
trigger a UI generation engine (106) to generate a base UI based on the received user input data;
trigger an automated A/B testing engine (108) to generate multiple UI variants from the base UI;
deploy the generated UI variants and collect real-time interaction metrics, wherein the real-time interaction metrics are generated based on user interaction data selected from click-through rates, session durations, conversion rates, or a combination thereof;
generate performance scores for UI elements by analysing the real-time interaction metrics using a heatmap analysis engine (110); and
adjust the UI elements in real-time based on the performance scores of the UI elements using an adaptive accessibility engine (112).
2. The system (100) as claimed in claim 1, wherein the UI generation engine (106) is further configured to incorporate user-specific accessibility preferences into the base UI generation process.
3. The system (100) as claimed in claim 1, wherein the automated A/B testing engine (108) is configured to determine an optimal UI variant based on statistical analysis of the real-time interaction metrics.
4. The system (100) as claimed in claim 1, wherein the heatmap analysis engine (110) is further configured to visualize user interaction patterns to identify areas of high and low engagement within the UI.
5. The system (100) as claimed in claim 1, wherein the adaptive accessibility engine (112) is configured to adjust the UI elements by modifying UI attributes selected from font size, colour contrast, inter-element spacing, or a combination thereof.
6. The system (100) as claimed in claim 1, wherein the processing unit (104) is configured to automatically update and deploy an optimal UI variant without manual intervention.
7. The system (100) as claimed in claim 1, wherein the processing unit (104) is configured to employ machine learning algorithms to generate the base UI based on historical user preferences, behaviour patterns, or a combination thereof, of the user.
8. A computer-implemented method (300) for dynamic generation of user interfaces (UI), the method (300) being characterised by the steps of:
receiving user inputs through an input unit (102);
generating a base user interface (UI) using a UI generation engine (106), based on the received user inputs;
automatically generating multiple UI variants from a base UI using an automated A/B testing engine (108);
deploying the generated UI variants and collecting real-time interaction metrics, wherein the real-time interaction metrics are generated based on user interaction data selected from click-through rates, session durations, conversion rates, or a combination thereof;
analysing the real-time interaction metrics using a heatmap analysis engine (110) to generate a performance score of UI elements; and
optimizing the generated UI variants by adjusting the UI elements in real-time using an adaptive accessibility engine (112) based on the performance score of the UI elements.
9. The method (300) as claimed in claim 8, comprising a step of generating a feedback loop to iteratively refine and optimize the UI based on the performance score of the UI elements upon dynamically adjusting the UI elements.
10. The method (300) as claimed in claim 8, comprising a step of continuously updating and deploying an optimal UI variant to the user without manual intervention.
Date: May 12, 2025
Place: Noida

Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202541045851-STATEMENT OF UNDERTAKING (FORM 3) [13-05-2025(online)].pdf 2025-05-13
2 202541045851-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-05-2025(online)].pdf 2025-05-13
3 202541045851-POWER OF AUTHORITY [13-05-2025(online)].pdf 2025-05-13
4 202541045851-OTHERS [13-05-2025(online)].pdf 2025-05-13
5 202541045851-FORM-9 [13-05-2025(online)].pdf 2025-05-13
6 202541045851-FORM FOR SMALL ENTITY(FORM-28) [13-05-2025(online)].pdf 2025-05-13
7 202541045851-FORM 1 [13-05-2025(online)].pdf 2025-05-13
8 202541045851-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-05-2025(online)].pdf 2025-05-13
9 202541045851-EDUCATIONAL INSTITUTION(S) [13-05-2025(online)].pdf 2025-05-13
10 202541045851-DRAWINGS [13-05-2025(online)].pdf 2025-05-13
11 202541045851-DECLARATION OF INVENTORSHIP (FORM 5) [13-05-2025(online)].pdf 2025-05-13
12 202541045851-COMPLETE SPECIFICATION [13-05-2025(online)].pdf 2025-05-13