
Design Intelligence and Automation Framework (DIAF)

Abstract: A user interface design automation system comprises an intelligence module, an adaptation module, and a performance analytics-based module. The intelligence module is inputted with design data comprising design objectives and strategies associated with the user interface to generate a design for the user interface. The adaptation module builds the user interface design based on the design for the UI with different resolutions and languages based on requirements of different channels for the user interface design and launches the user interface design in the different channels. The performance analytics-based module monitors and controls the launched user interface design. A decision whether to scale up, scale down, or discontinue the launched user interface design is made based on monitoring parameters used by the performance analytics-based module.


Patent Information

Application #: 202221031138
Filing Date: 31 May 2022
Publication Number: 48/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

Play Games24x7 Private Limited
5th Floor, Central (B) Wing, Nesco IT Park, Tower 4, Western Express Highway, Goregaon (East), Mumbai - 400063, Maharashtra, India

Inventors

1. Manasa Kolla
TF 04, Sai Pearl Apartment, 9th Cross, Kalyan Nagar, Dharwad – 580007, India
2. Suman Pal
5B/2 Jagannath Ghosh road, 1st floor, Post office: Kasba, Kolkata- 700042, West Bengal, India
3. Sri Gowtham Nuthi
Dr no 41-4-58, Metla Bazar, Krishna lanka, Vijayawada, Andhra Pradesh pincode: 520013, India
4. Sethuraman T V
No. 46, G1, Greenwood apartments, Ramu street, Srinivasapuram, Guduvancheri, Chengalpattu, Tamilnadu - 603202, India
5. Sachin Kumar
Flat 5076, Prestige Misty Waters - Vista Tower, Kempapura Village, Hebbal - 560024, Bengaluru, Karnataka, India
6. Tridib Mukherjee
5057 Prestige Tranquility, Budigere Cross, Bangalore - 560049, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
AND
THE PATENTS RULES, 2003
(As Amended)
COMPLETE SPECIFICATION (See Section 10; Rule 13)
"DESIGN INTELLIGENCE AND AUTOMATION FRAMEWORK (DIAF)"
Play Games24x7 Private Limited, a corporation organized and existing under the laws of India, of 5th Floor, Central (B) Wing, Nesco IT Park, Tower 4, Western Express Highway, Goregaon (East), Mumbai - 400063, Maharashtra, India.
The following specification particularly describes the invention and the manner in which it is to be performed:

DESIGN INTELLIGENCE AND AUTOMATION FRAMEWORK (DIAF)
TECHNICAL FIELD
The present invention relates to automation of user interface (UI) design using advanced modules, and more specifically to a design intelligence and automation framework that improves the whole UI design process by removing subjective guesswork and iterative loops at the design and targeting modules using computer vision (CV) and artificial intelligence (AI), thereby enabling better prediction with causality/reasoning.
BACKGROUND
A UI design is an integral part of any user interface-based system. It is the first medium of communication with the end-user and hence must be the best version possible for better user familiarity and experience. A UI is designed based on a brief from the design team. Post design, the UIs are adapted to the requirements of a channel, which is again a very time-consuming and manual process. Finally, the design is hosted and the performance metrics in terms of user interaction with the platform, which can be quantified by metrics such as click-through rate, time spent by the user actively on the platform, etc., are tracked and monitored, and findings are drawn manually, which is again subjective and laborious. The whole process today is very manual and involves a lot of subjectivity, and hence results in inefficiency and lack of efficacy.
Therefore, there is a need for an automation system that automates and improves the whole UI design process by removing the subjective guesswork and the iterative loops at the design and targeting modules, thereby enabling better prediction with causality or reasoning. This UI design process needs to cover an end-to-end pipeline for design creation, smart insights generation, and optimal recommendations by leveraging AI and automation modules.
SUMMARY OF THE INVENTION
The following presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
A user interface (UI) design automation system disclosed here addresses the above-mentioned need for automation and improvement in the whole UI design process by removing the subjective guesswork and the iterative loops at the design and targeting modules, thereby enabling better prediction with causality or reasoning. This UI design process involves an end-to-end pipeline for design creation, smart insights generation, and optimal recommendations by leveraging AI and automation modules.
The user interface design automation system comprises at least one processor that operates under control of a stored program comprising a sequence of program instructions to control one or more components. The components comprise an intelligence module, an adaptation module, and a performance analytics-based module. The intelligence module is inputted with design data comprising design objectives and strategies associated with the user interface to generate a design for the user interface. The adaptation module builds the user interface design with different resolutions and languages based on the requirements of different channels and launches the user interface design in the different channels. The performance analytics-based module monitors and controls the launched user interface design. A decision whether to scale up, scale down, or discontinue the launched user interface design is made based on monitoring parameters used by the performance analytics-based module.
In an embodiment, the intelligence module is in communication with a design objective module, a design brief module, and a design module. The design objective module contains the design data comprising the design objectives and strategies associated with the user interface. The design objective module provides the design data to the intelligence module. The intelligence module thus generates design briefs based on the communication from the design objective module and the design data, and the design brief module receives the generated briefs from the intelligence module. In an embodiment, the intelligence module extracts various cognitive and design meta from historical data associated with the user interface design. This cognitive and design meta is combined with the corresponding performance metrics to provide recommendations with respect to the user interface design to the design module, and therefore a designer is enabled to tweak the design of the user interface. A quality check is performed after the design module to verify a selected user interface design.
In an embodiment, the adaptation module is in communication with an audience and channel selection module and a launch module. The audience and channel selection module receives recommendations on audience and channel from the intelligence module. The audience and channel selection module transfers information associated with the selected channel to the adaptation module. The launch module launches the selected user interface design after verifying the selected user interface design with the user selection data. In an embodiment, the performance analytics-based module is in communication with a monitoring and control module containing data associated with the monitoring parameters. The performance analytics-based module is in communication with the monitoring and control module to provide smart suggestions based on the performance of different launched user interface designs across different channels, the type of users being on-boarded, and the frequency of user visitations.
In an embodiment, the intelligence module comprises a database and an intelligence unit. The database comprises design data, user data, and channel data. The design data includes information associated with performance and meta extraction, user data includes information associated with user persona extraction, and channel data includes information associated with the channel. The information associated with the design meta is subdivided into cognitive meta and user interface (UI) meta. The intelligence unit is in communication with the database to receive and process information that includes the design data, the user data, and the channel data to generate a design recommendation, a meta recommendation, and an audience and channel recommendation.

In an embodiment, the adaptation module is configured to identify design elements associated with the user interface design, extract the design elements, and resize the design elements to a target proportion based on a check performed for predefined design rules. In an embodiment, the performance analytics-based module uses live data associated with the launched user interface design in multiple stages. In the first stage, the live data is pre-processed. In the second stage, the user interface design performance is benchmarked, provided with a design ranking, and tracked for fatigue experienced in channels via a fatigue tracker. In the third stage, alerts are raised, reports are generated, and recommendations are generated based on the user interface design live performance.
These and other objects, embodiments and advantages of the present invention will become readily apparent to those skilled in the art from the following detailed description of the embodiments having reference to the attached figures, the invention not being limited to any particular embodiments disclosed.
BRIEF DESCRIPTION OF FIGURES
The foregoing and further objects, features and advantages of the present subject matter will become apparent from the following description of exemplary embodiments with reference to the accompanying drawings, wherein like numerals are used to represent like elements.
It is to be noted, however, that the appended drawings along with the reference numerals illustrate only typical embodiments of the present subject matter, and are therefore, not to be considered for limiting its scope, for the subject matter may admit to other equally effective embodiments.
FIG. 1 illustrates a Design Intelligence and Automation Framework (DIAF), as an example embodiment of the present invention.

FIG. 2 illustrates an intelligence module of the design intelligence and automation framework, as an example embodiment of the present invention.
FIG. 3A illustrates an adaptation module of the design intelligence and automation framework, as an example embodiment of the present invention.
FIG. 3B illustrates an example embodiment of the adaptation module of the design intelligence and automation framework, showing resolution buckets having similar layouts, as an example embodiment of the present invention.
FIG. 4 illustrates a Performance Analytics module of the design intelligence and automation framework, as an example embodiment of the present invention.
FIG. 5 illustrates method flow associated with the working of the Design Intelligence and Automation Framework (DIAF), as an example embodiment of the present invention.
DETAILED DESCRIPTION
Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
It is to be noted, however, that the reference numerals used herein illustrate only typical embodiments of the present subject matter, and are therefore, not to be considered for limiting its scope, for the subject matter may admit to other equally effective embodiments.

The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include operatively connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures.

Also, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
As used herein, the term “channel” refers to any device through which a user can access the application, for example, mobile or cell phone, laptop, etc. The phrase “channel data” refers to any data regarding the device through which the application is being accessed. The term “channel analytics” refers to a module that works in communication with a processor to generate insights based on an input channel data. The phrase “audience and channel recommendation” refers to an intelligence unit that provides outputs regarding what kind of UI needs to be shown to the users and to which channel.
FIG. 1 illustrates a Design Intelligence and Automation Framework (DIAF) 100, as an example embodiment of the present invention. Also, FIG. 5 illustrates the method flow associated with the working of the Design Intelligence and Automation Framework (DIAF) 100, as an example embodiment of the present invention. The user interface design automation system 100 or the DIAF comprises at least one processor 102 that operates under control of a stored program comprising a sequence of program instructions to control one or more components. The components comprise an intelligence module 104, an adaptation module 106, and a performance analytics-based module 108. The intelligence module 104 is inputted 502 (FIG. 5) with design data comprising design objectives and strategies associated with the user interface to generate a design for the user interface. The adaptation module 106 builds 504 (FIG. 5) the user interface design based on the design for the UI with different resolutions and languages based on requirements of different channels 138 for each UI design, as also described in FIG. 2. The performance analytics-based module 108 monitors and controls 506 (FIG. 5) the launched user interface design. A decision whether to scale up, scale down, or discontinue the launched user interface design is made based on monitoring parameters used by the performance analytics-based module 108.
The intelligence module 104 is in communication with a design objective module 110, a design brief module 112, and a design module 114. The design objective module 110 contains the design data comprising the design objectives and strategies associated with the user interface. The design objective module 110 provides the design data to the intelligence module 104. The design brief module 112 communicates with the intelligence module 104 and the design objective module 110 based on the design data. The intelligence module 104 generates design briefs based on the communication from the design objective module 110 and the design data. The intelligence module 104 takes input from the design objective module 110 and the design data. Based on the details provided, the intelligence module 104 comes up with the different design briefs that are required. The intelligence module 104 defines who the target audience is and through which channel the design has to be shared with them. The intelligence module 104 also gives multiple inputs to designers with respect to the positioning of different elements, colors, etc.
The design module 114 communicates with the intelligence module 104 and the design brief module 112, where the intelligence module 104 generates the user interface design. A quality check 116 is performed after the design module 114 to verify a selected user interface design. If the design fails the quality check 116, the selected user interface design is fed back to the design module 114 for reworking. If the design passes the quality check 116, the selected user interface design is transferred to the audience and channel selection module 118 for review with the adaptation module 106.
The intelligence module 104 extracts various cognitive and design meta from historical data associated with the user interface design. This cognitive and design meta is combined with the corresponding performance metrics to provide recommendations with respect to the user interface design, and therefore a designer is enabled to tweak the design of the user interface. The adaptation module 106 is in communication with the audience and channel selection module 118 and a launch module 120. The audience and channel selection module 118 transfers the following to the adaptation module 106: a verified selected user interface design after the quality check 116, recommendations on audience and channel selection that are received from the intelligence module 104, user selection data associated with selection of the selected user interface design via the users, and a selected channel to launch the selected user interface design.
The launch module 120 launches the selected user interface design after verifying the selected user interface design with the user selection data. In other words, the adaptation module 106 processes designs with different resolutions and languages to meet the requirements of different channels 138 and saves a lot of time. The performance analytics-based module 108 is in communication with a monitoring and control module 122 containing data associated with the monitoring parameters. The performance analytics-based module 108 is in communication with the monitoring and control module 122 to provide smart suggestions based on performance of different launched user interface designs across different channels 138, type of users being on-boarded, and frequency of user visitations.
FIG. 2 illustrates an intelligence module 104 of the design intelligence and automation framework 100, as an example embodiment of the present invention. In an embodiment, the intelligence module 104 comprises a database 124 and an intelligence unit 132. The database 124 comprises design data 126, user data 128, and channel data 130. The design data 126 includes information associated with performance 134 and meta extraction 136, the user data 128 includes information associated with user persona extraction 137, and the channel data 130 includes information associated with the channel 138 or channel analytics 138. The information associated with the meta extraction 136 is subdivided into cognitive meta 140 and UI meta 142. The intelligence unit 132 is in communication with the database 124 to receive and process information that includes the design data 126, the user data 128, and the channel data 130 to generate a design recommendation 144, a meta recommendation 146, and an audience and channel recommendation 148 based on the user interface design.
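As a purely illustrative, non-limiting sketch (not the claimed schema), the kinds of records held in the database 124 and consumed by the intelligence unit 132 could be represented as follows; all field names and types are assumptions for illustration only.

```python
# Hypothetical sketch of the records the database 124 might hold for the
# intelligence unit 132: design data (performance + extracted meta), user data
# (persona), and channel data. Field names and types are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DesignRecord:
    design_id: str
    ui_meta: dict = field(default_factory=dict)          # explicit: elements, text, colors
    cognitive_meta: dict = field(default_factory=dict)   # implicit: attention map, cognitive load
    performance: dict = field(default_factory=dict)      # impressions, clicks, conversions, cost

@dataclass
class UserRecord:
    user_id: str
    persona: str     # output of user persona extraction
    cohort: int      # user cohort assignment

@dataclass
class ChannelRecord:
    channel_id: str
    audience_distribution: dict = field(default_factory=dict)
    traffic: int = 0
    cost_per_impression: float = 0.0

# The intelligence unit consumes these three record types to produce a design
# recommendation, a meta recommendation, and an audience and channel recommendation.
```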
The process of user persona extraction 137 involves profiling users to understand their emotional goals so that appropriate UIs can be shown to affect their emotions and maximize conversions. To make the most of the UIs, one has to truly know the target customer. Understanding what value they are looking for, what their goals are, what challenges they face, and what their objectives are is really important in nudging a customer to click a design. A design generally has a variety of features, but certain features excite one group of people more than others. Hence, highlighting the correct set of features to the correct set of customers goes a long way in turning an "ok" UI into a far more effective one. The user segmentation process is carried out by the following procedure (an illustrative sketch follows the list):
1. Use historical data on the platform to identify various cohorts of users based on their emotional response/goal on the platform, and identify the pre-conversion (external) features that define similar cohorts.
2. Map the cohorts of users to the UIs to identify the salient elements of UIs for the different user cohorts.
3. Identify the distribution of cohorts across various channels (such as Google, Facebook, etc.).
4. Recommend the appropriate UI on the appropriate platform to each cohort of users.
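As a purely illustrative, non-limiting sketch of this segmentation procedure (not the claimed implementation), the following clusters users on hypothetical behavioural features and tabulates the cohort distribution per channel; the feature names, the number of clusters, and the use of k-means are assumptions for illustration only.

```python
# Hypothetical sketch: cluster users into cohorts and profile them per channel.
# Feature names, k=4, and the use of k-means are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy user table: behavioural features plus the acquisition channel.
rng = np.random.default_rng(0)
users = pd.DataFrame({
    "sessions_per_week": rng.poisson(5, 500),
    "avg_session_minutes": rng.gamma(2.0, 10.0, 500),
    "days_since_signup": rng.integers(1, 365, 500),
    "channel": rng.choice(["google", "facebook", "organic"], 500),
})

features = ["sessions_per_week", "avg_session_minutes", "days_since_signup"]
X = StandardScaler().fit_transform(users[features])

# Step 1: identify cohorts from historical behaviour.
users["cohort"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Step 3: distribution of cohorts across channels, used to decide where each
# cohort-specific UI should be pushed (step 4).
cohort_by_channel = pd.crosstab(users["channel"], users["cohort"], normalize="index")
print(cohort_by_channel.round(2))
```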
With reference to Channel Analytics 138, a channel is a medium used to showcase designs and inform customers about promotions. As a part of channel analytics, key features/properties of channels are extracted and analyzed in the Intelligence Module 104 to give recommendations with respect to channel selection. Some of the features are: a) Audience distribution, b) Traffic, c) Cost, d) Coverage, e) Fraud, f) Past performance, and g) Optimization tools available and their efficacy, etc. Based on the target audience and the channel properties, budget allocation is done to optimize ROI, as sketched below.
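As a purely illustrative, non-limiting sketch (not the claimed optimization), budget allocation across channels could, for example, weight a fixed budget by an expected-ROI score per channel; the channel names, scores, minimum share, and proportional rule are assumptions for illustration only.

```python
# Hypothetical sketch: split a fixed budget across channels in proportion to an
# expected-ROI score; channel names, scores, and the proportional rule are
# illustrative assumptions, not the claimed optimization.
def allocate_budget(total_budget: float, expected_roi: dict[str, float],
                    min_share: float = 0.05) -> dict[str, float]:
    """Give every channel a minimum share, then split the rest by ROI weight."""
    n = len(expected_roi)
    reserved = total_budget * min_share * n
    remaining = total_budget - reserved
    roi_sum = sum(expected_roi.values())
    return {
        ch: total_budget * min_share + remaining * (roi / roi_sum)
        for ch, roi in expected_roi.items()
    }

print(allocate_budget(100_000, {"google": 1.8, "facebook": 1.2, "affiliate": 0.6}))
```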
In other words, the intelligence module 104 processes data about the users, UIs, and channels and gives recommendations on the brief, various design elements, and channel selection. The Intelligence module 104 is fed with design data comprising design objectives and strategy. The Intelligence module 104 leverages various data sources about user persona, channels, and historic designs along with their performance metrics. Various cognitive meta 140 and UI meta 142 are extracted from the historical data. These meta are combined with their performance, and thus recommendations with respect to design are given to the design module 114, thereby enabling a designer to tweak the design of the user interface.
With reference to the performance 134 module, once a UI is made live on a platform, all the necessary metrics, such as impressions, clicks, leads, conversions, and the cost incurred, are tracked to assess its performance 134. With reference to meta extraction 136, a design is launched in the form of a static image and a suitable set of measurable features is extracted from it. These are broadly divided into two categories: a) UI meta 142, which is mostly explicit in nature, and b) Cognitive meta 140, which is mostly implicit in nature. Some examples of cognitive meta 140 include attention map, cognitive load, etc., whereas UI meta 142 includes the number of elements, the types of elements, their positioning, communication text, etc.
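As a purely illustrative, non-limiting sketch (not the claimed implementation), the tracked counters can be turned into the usual derived metrics as follows; the field names and the dataclass structure are assumptions for illustration only.

```python
# Hypothetical sketch: derive the performance metrics tracked for a live UI
# (CTR, conversion rate, cost per conversion) from raw counters; the field
# names and dataclass are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class UiPerformance:
    impressions: int
    clicks: int
    leads: int
    conversions: int
    cost: float

    @property
    def ctr(self) -> float:
        return self.clicks / self.impressions if self.impressions else 0.0

    @property
    def conversion_rate(self) -> float:
        return self.conversions / self.clicks if self.clicks else 0.0

    @property
    def cost_per_conversion(self) -> float:
        return self.cost / self.conversions if self.conversions else float("inf")

perf = UiPerformance(impressions=120_000, clicks=2_400, leads=600, conversions=180, cost=9_000.0)
print(f"CTR={perf.ctr:.3%}, CVR={perf.conversion_rate:.2%}, CPA={perf.cost_per_conversion:.2f}")
```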
In order to explain UI meta 142 in detail, every media type has associated meta which is vital for conveying the design features and attracting users. A lot of intuitive effort is put in by a design team to create a UI. However, the result is still biased by the opinions and viewpoints of a few individuals. Multiple questions arise while designing a UI, for example: what should the messaging style, colors, and positions be? Therefore, the first step is to build a pipeline which makes the collection of such data points easier. Several methods including computer vision, deep learning, and NLP algorithms, along with the Photoshop Document (PSD) files, are used for metadata collection. Taking the example of a UI, a lot of information is present inside a UI which affects the emotion and attention of the viewer. This could be in the form of colors, message, model, product features, or overall cognitive aspects. Some are visible to the naked eye and some are not, and therefore these features are custom built using open source resources. In an example, more than 60 features are collected in total for various ad forms. These features can be bucketed into a few categories: a) Theme, b) Product features (elements/objects detected in relation to the product), c) Elements, d) Text (font style, size, alignment, orientation, emotion, etc.), e) Color features, f) Music, g) Emotion, and h) Cognitive features.
With reference to Cognitive Meta 140, any design content is created to gain viewers' attention, so it is essential to measure the salient regions on a banner or any UI medium to elicit any form of association or response. Hence, proactively estimating people's attention becomes instrumental. Thus, a model is developed based on the Design Intelligence and Automation Framework (DIAF) 100 described in the present disclosure to predict people's visual gaze patterns while they are viewing the UI. This model considers the semantic information of the content, which may be subjective, and the visual features that attract human attention irrespective of the content involved, which are objective.
In short, the Design Intelligence and Automation Framework (DIAF) 100: 1. provides a quality check for the UIs, 2. helps designers accelerate the design process, and 3. enhances the performance metrics. Typically, cognitive overload occurs when any design medium contains more information than the processing capacity of the human information-processing system. Since the information viewed by the user (the UI design) is under our control, it is proposed to modulate this content to vary the cognitive load experienced.
The whole visual cognition engine is a human-in-the-loop system. This engine is an AI assistant which provides intelligent recommendations considering various performance and scientific parameters. Among the several recommendations provided by the engine, humans can pick the best designs, which accelerates the UI generation process and at the same time does not hamper the creativity of the designers. Furthermore, the system checks for various other low-level features such as contrast, brightness, saturation, hue, RGB content, image energy, etc. The system evaluates all these metrics on two bases: 1. objectively, i.e., what is pleasing to the eye, and 2. subjectively, i.e., what gives a better performance.
The initial optimization is carried out on the objective scores, followed by the recommendation on the subjective features. This is due to the inherent stochastic nature of such subjective properties. Finally, the system understands the emotions elicited by the UI and supports the same with objective and subjective evidence. The system supports all the evidence with several other multivariate features. For example, increased values of "attention weighted contrast" on the image reinforce attention on the UI elements. This ensures both visual inspection and quantitative feedback. The most crucial element of the system is its intelligent way of providing recommendations in a multivariate fashion. There might be several constraints while designing a UI, and thus the system accounts for them and provides various alternatives to achieve the desired results. This ensures better flexibility and enhanced creativity.
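As a purely illustrative, non-limiting sketch (not the claimed implementation), a few of the low-level features mentioned above (brightness, contrast, saturation, hue, RGB content, image energy) could be computed from a banner image as follows; the exact formulas and the use of Pillow/NumPy are assumptions for illustration only.

```python
# Hypothetical sketch: compute a few low-level image features for a UI banner.
# The formulas (e.g. RMS contrast, gradient-based energy) are illustrative
# assumptions, not the framework's actual definitions.
import numpy as np
from PIL import Image

def low_level_features(path: str) -> dict[str, float]:
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=np.float32) / 255.0
    gray = rgb.mean(axis=2)
    # Image energy approximated as the mean squared gradient magnitude.
    gy, gx = np.gradient(gray)
    return {
        "brightness": float(gray.mean()),
        "contrast": float(gray.std()),           # RMS contrast
        "saturation": float(hsv[..., 1].mean()),
        "hue": float(hsv[..., 0].mean()),
        "mean_r": float(rgb[..., 0].mean()),
        "mean_g": float(rgb[..., 1].mean()),
        "mean_b": float(rgb[..., 2].mean()),
        "energy": float((gx ** 2 + gy ** 2).mean()),
    }

# Example usage with a placeholder file path:
# print(low_level_features("banner_1080x1920.png"))
```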
Furthermore, with reference to the intelligence module 104, its design tool performs two vital functions in the UI generation journey: recommendation and quality check. As a part of recommendation, the Intelligence Module 104 assists the arts and design team by recommending color combinations, text fonts and styles, their alignment, positioning, etc., given the set of design elements to be used as input. The UI generation process starts with these initial sets of recommendations. UI generation is an iterative process where the UIs being designed are fed to a quality check engine which approves the UIs as per the aesthetic scores. Along with the aesthetic quality, the intelligence module 104 looks into the aspects of image clarity and the attention map of the designs in order to ensure minimal clutter and a desirable attention distribution. This is an important step as, along with the UI content, its representation plays a crucial role in driving a desirable response from the users.
With respect to the STP module, an initial brief is composed from the design objective 110. The keywords indicating the objective of the design are extracted from the brief. Based on this, the useful metas/features are selected from the plethora of features available in the database and are provided as an input to the recommendation engine 144, along with the corresponding performance metrics and the target user data/persona/cohort under consideration. As a part of the STP analysis, two functionalities are provided:
a) Exploratory Engine: This is a dashboard providing a comprehensive historical analysis of UIs for a given time period and user segment of concern. In terms of univariate analysis, a better understanding is gained of the effect of design meta on various performance metrics. Necessary statistical tests are performed for individual selected design elements to understand their effect on performance metrics (an illustrative sketch of such a univariate test is given after the recommendation engine description below). There can be scenarios where two or more meta elements co-occur and the change in performance cannot be attributed to a single meta element, for example, design A is always used for theme A and so on, which demands a multivariate analysis. Hence, the dependency of meta elements on each other is checked, and the performance change is subsequently attributed to one or more elements based on process knowledge. In terms of multivariate analysis, the best combination of design elements (two or more) is identified in terms of their performance for the given user segment. This is very useful to the business team in coming up with a brief for a UI.
b) Recommendation Engine: This is developed keeping in mind the segmentation, targeting, positioning (STP) marketing framework. Appropriate recommendations are given to stakeholders with respect to segmentation, targeting, and positioning. There are three broad stages: (i) based on the design objectives, appropriate user segments are chosen as per their unique characteristics, (ii) the distribution of the concerned user segment is analyzed across different channels and hence spending across different channels is recommended, and (iii) based on the segment being targeted, design contents are recommended and a refined brief is generated for the design team for UI generation.
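As a purely illustrative, non-limiting sketch of the univariate check described under the Exploratory Engine above (not the claimed implementation), the following tests whether a single design element shifts CTR; the column names, the injected effect, and the choice of Welch's t-test are assumptions for illustration only.

```python
# Hypothetical sketch of a univariate check: test whether a single design
# element (e.g. "has_human_face") shifts CTR using a two-sample t-test.
# Column names and the choice of Welch's t-test are illustrative assumptions.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
history = pd.DataFrame({
    "has_human_face": rng.integers(0, 2, 400).astype(bool),
    "ctr": rng.normal(0.021, 0.004, 400),
})
# Inject a small effect so the example has something to detect.
history.loc[history["has_human_face"], "ctr"] += 0.002

with_elem = history.loc[history["has_human_face"], "ctr"]
without_elem = history.loc[~history["has_human_face"], "ctr"]
t_stat, p_value = stats.ttest_ind(with_elem, without_elem, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")  # small p -> the element plausibly matters
```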
FIG. 3A illustrates an adaptation module 106 of the design intelligence and automation framework 100, as an example embodiment of the present invention. In an embodiment, the adaptation module 106 is configured to: identify 302 design elements associated with the user interface design, extract 304 the design elements, and resize 306 the design elements to a target proportion based on a check 308 performed for predefined design rules. In other words, the adaptation module 106 is an automation tool that adapts the designs to different resolutions and languages. This process offloads the cognitively challenging task to artificial intelligence (AI), and hence the designer saves a lot of time and may focus on more productive tasks that require professional judgment and aesthetic sensibilities.
With reference to the adaptation module 106, after a UI is designed and finalized, it has to be adapted to different screen resolutions, which is a very laborious task. For any UI, there are more than 50 resolutions and around 8-9 languages. Adaptation for every resolution takes 1-2 hours, which consumes a lot of time that could otherwise be devoted to designing new UIs and increases the overall UI generation time. The resolutions are bucketed into 8-9 buckets, where each bucket consists of a set of resolutions for which the layout (that is, the relative position of the design elements of the UI) is similar. So the problem can be further subdivided into two parts:
1. Automatically generate UIs within a bucket having similar layouts, and
2. Move across buckets by generating layouts for individual buckets
In this way, all the resolutions and languages can be populated from a single benchmark UI.
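As a purely illustrative, non-limiting sketch (not the claimed implementation), target resolutions could be grouped into aspect-ratio buckets as follows; the resolution list, the canonical ratios, and the tolerance are assumptions for illustration only.

```python
# Hypothetical sketch: group target resolutions into buckets of similar aspect
# ratio so a single master layout can serve each bucket. The resolution list,
# bucket labels, and tolerance are illustrative assumptions.
from collections import defaultdict

def bucket_by_aspect_ratio(resolutions, tolerance=0.15):
    """Assign each (width, height) to the nearest canonical aspect-ratio bucket."""
    canonical = {"1:1": 1.0, "4:5": 0.8, "9:16": 9 / 16, "16:9": 16 / 9, "2:1": 2.0}
    buckets = defaultdict(list)
    for w, h in resolutions:
        ratio = w / h
        name, ref = min(canonical.items(), key=lambda kv: abs(kv[1] - ratio))
        # Fall back to an "other" bucket if nothing is close enough.
        buckets[name if abs(ref - ratio) <= tolerance * ref else "other"].append((w, h))
    return dict(buckets)

targets = [(1080, 1080), (1200, 1200), (1080, 1920), (720, 1280),
           (1920, 1080), (1200, 628), (1080, 1350)]
print(bucket_by_aspect_ratio(targets))
```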
FIG. 3B illustrates an example embodiment of the adaptation module of the design intelligence and automation framework, showing resolution buckets having similar layouts, as an example embodiment of the present invention. The process explained allows automatic generation of UIs within a bucket having similar layouts.
In the resolution bucket example shown in FIG. 3B, one can see that resolutions within a bucket have varied aspect ratios (except for the 1:1 bucket), and linear scaling of UIs into the target shape would result in distortion of individual elements, compromising the aesthetics of the UIs. When image retargeting is taken into consideration, the first job is to identify the salient contents of the image. This step is important irrespective of the content type (natural images or graphical content) and of the model used (CV techniques or CNN architectures). Raw files of banners are used to identify the salient contents of the image.

With reference to the rule-based semi-automated engine 308, individual elements are located and rescaled while maintaining their aspect ratio, given the design files and the target shape as input. Let us now discuss the methodology in detail. The rule-based engine 308 generates the UI for all the resolutions of a bucket given a benchmark UI (of the highest resolution) of that particular bucket: 1. The design elements of the UI are identified and extracted from the raw file; these are the salient contents of the UI whose aspect ratio has to be maintained while retargeting. 2. Based on the target shape, the background element is cropped and the design elements are placed in the cropped background element while maintaining their relative positions. 3. The design elements are individually resized, maintaining their aspect ratio, based on certain design rules.
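As a purely illustrative, non-limiting sketch of the three steps above (not the claimed implementation), the cropping, placement, and aspect-ratio-preserving resize could be prototyped with Pillow as follows; the element representation and the scaling rule are assumptions for illustration only.

```python
# Hypothetical sketch of the rule-based retargeting step: crop the background
# to the target shape and paste each design element back at its relative
# position, scaled without changing its aspect ratio. Pillow-based; the
# element representation and scaling rule are illustrative assumptions.
from PIL import Image

def retarget(background: Image.Image, elements, target_size, scale=0.9):
    """elements: list of (PIL image, (rel_x, rel_y)) with element centers in [0, 1]."""
    tw, th = target_size
    # Center-crop the background to the target aspect ratio, then resize it.
    bw, bh = background.size
    crop_w = min(bw, int(bh * tw / th))
    crop_h = min(bh, int(bw * th / tw))
    left, top = (bw - crop_w) // 2, (bh - crop_h) // 2
    canvas = background.crop((left, top, left + crop_w, top + crop_h)).resize((tw, th))
    for elem, (rel_x, rel_y) in elements:
        # Resize each element to fit the new canvas while keeping its aspect ratio.
        factor = min(tw * scale / elem.width, th * scale / elem.height, 1.0)
        resized = elem.convert("RGBA").resize(
            (int(elem.width * factor), int(elem.height * factor)))
        canvas.paste(resized, (int(rel_x * tw - resized.width / 2),
                               int(rel_y * th - resized.height / 2)), resized)
    return canvas

# Example usage (hypothetical file names):
# bg = Image.open("master_1920x1080_background.png")
# logo = Image.open("logo.png")
# out = retarget(bg, [(logo, (0.5, 0.15))], target_size=(1080, 1920))
```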
Additionally, the tool is provided with a few edit functionalities for post-generation processing. These are: 1. addition/deletion of elements of the UI, 2. changing the order of pasting elements, and 3. moving and resizing of individual elements. In the absence of data, the rule-based engine 308 discussed above is deployed for generating rough master layouts for buckets as well. As the rule-based engine 308 is used, the UI database becomes richer, which can be used to build more sophisticated layout generation models. Post adaptation, the UIs are launched into different channels as per the audience to be targeted, which is recommended by the Intelligence Module.
The database is the most crucial part of the system as it forms the linkage between the UI content and their respective performance metrics. User data is fed into the database as well. It comprises both internal (interaction with the product, for example, game play data) and external (features like their lifestyle, user behavior, etc.) user features. To summarize, the database (or the data lake) contains performance metrics, UI content and user data.
FIG. 4 illustrates a performance analytics module 108 of the design intelligence and automation framework 100, as an example embodiment of the present invention. In an embodiment, the performance analytics-based module 108 uses live data 124a associated with the launched user interface design in multiple stages. In a first stage 400, the live data 124a is pre-processed. In a second stage 402, the user interface design is benchmarked for performance, provided with a design ranking, and tracked for fatigue experienced in channels 138 via a fatigue tracker 408. In a third stage 404, alerts are raised, reports are generated, and recommendations are generated based on the live performance of the user interface design. In other words, the performance analytics module 108 provides smart insights on live user interface designs. The performance analytics module 108 analyses how different user interface designs are performing across different channels 138 and the type of users being on-boarded, such as high/low engaged users (shorter versus longer time span per session), high/low frequency visitors, etc.
With reference to the Performance Analytics and Dashboard 108, once the UIs go live, tracking is performed to see how they are performing and driving the performance metrics. The performance analytics and dashboard module 108 gives intelligence about the performance of the UI. Based on this intelligence, decisions are made whether to scale up/down or stop the UI. Its components include the following:
1. Design Benchmarking:
The dashboard provides a snapshot of comparative UI performance across different channels, audiences, and objectives, based on various KPIs. A benchmark is decided based on various metrics of user activity on the UI, and decisions on the remaining UIs are taken accordingly. It is scaled across different channels to get the best overall performance.
2. Design Rating:
All UI assets are scored across channels based on performance and then ranked based on that score, as sketched below. So, for a particular channel, the benchmark UI should have the highest score and hence rank highest. All other UIs are ranked relative to it to support various decisions.
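As a purely illustrative, non-limiting sketch of such scoring and ranking (not the claimed formula), a few KPIs could be combined into a single per-UI score within a channel as follows; the KPI names and weights are assumptions for illustration only.

```python
# Hypothetical sketch of design rating: combine a few KPIs into a single score
# per UI within a channel and rank them. KPI names and weights are illustrative
# assumptions, not the framework's scoring formula.
import pandas as pd

assets = pd.DataFrame({
    "ui_id": ["ui_a", "ui_b", "ui_c"],
    "channel": ["google", "google", "google"],
    "ctr": [0.021, 0.034, 0.028],
    "conversion_rate": [0.012, 0.010, 0.015],
    "cost_per_conversion": [210.0, 260.0, 180.0],
})

weights = {"ctr": 0.4, "conversion_rate": 0.4, "cost_per_conversion": -0.2}

def score(group: pd.DataFrame) -> pd.DataFrame:
    g = group.copy()
    # Normalize each KPI within the channel, then take a weighted sum.
    for col, w in weights.items():
        norm = (g[col] - g[col].min()) / (g[col].max() - g[col].min() + 1e-9)
        g[f"{col}_n"] = norm * w
    g["score"] = g[[f"{c}_n" for c in weights]].sum(axis=1)
    g["rank"] = g["score"].rank(ascending=False).astype(int)
    return g

ranked = score(assets)  # in practice this would be applied per channel
print(ranked[["ui_id", "score", "rank"]])
```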
3. Fatigue tracker:
UIs are meant to nudge potential customers towards acting on the call to action. Fatigue sets in when a particular audience is shown the same UI over such a long period of time that it stops having any impact on their desire to act on the call to action. This starts showing up in the upper-funnel metrics, and low user activity is an indication of fatigue. This can be solved with a UI refresh or an audience refresh.
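As a purely illustrative, non-limiting sketch of fatigue detection (not the claimed implementation), a UI could be flagged as fatigued when its recent CTR drops well below its launch-period CTR; the window sizes and the 30% drop threshold are assumptions for illustration only.

```python
# Hypothetical sketch of fatigue tracking: flag a UI as fatigued when its
# recent CTR drops well below its early-life CTR. The window sizes and the 30%
# drop threshold are illustrative assumptions.
import pandas as pd

def is_fatigued(daily_ctr: pd.Series, baseline_days: int = 7,
                recent_days: int = 7, drop_threshold: float = 0.3) -> bool:
    """daily_ctr: CTR per day, ordered oldest to newest."""
    if len(daily_ctr) < baseline_days + recent_days:
        return False  # not enough history to judge
    baseline = daily_ctr.iloc[:baseline_days].mean()
    recent = daily_ctr.iloc[-recent_days:].mean()
    return baseline > 0 and (baseline - recent) / baseline >= drop_threshold

ctr = pd.Series([0.030, 0.031, 0.029, 0.030, 0.028, 0.029, 0.030,   # launch week
                 0.025, 0.022, 0.020, 0.018, 0.017, 0.016, 0.015])  # later decline
print(is_fatigued(ctr))  # True -> consider a UI refresh or audience refresh
```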
All these tools together remove the subjective bias that arises during the UI design process, save time, generate actionable insights to improve the effectiveness of UI design, and manage the process more efficiently. Further, these tools do not hinder creativity and at the same time equip designers with insights which might be hard to obtain in the absence of the proposed intelligent and automated process.
As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system and apparatus. Accordingly, the present invention may take the form of an entirely hardware embodiment, a software embodiment or an embodiment combining software and hardware aspects.
It will be understood that each block of the block diagrams can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
In the drawings and specification, there have been disclosed exemplary embodiments of the invention. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation of the scope of the invention.

We Claim:
1. A user interface design automation system comprising:
at least one processor that operates under control of a stored program comprising a sequence of program instructions to control one or more components, wherein the components comprise:
an intelligence module that is inputted with design data comprising design objectives and strategies associated with the user interface (UI) to generate a design for the UI;
an adaptation module that builds the user interface design based on the design for the UI with different resolutions and languages based on requirements of different channels for the user interface design; and
a performance analytics-based module that monitors and controls the user interface design, wherein a decision whether to one of scale up, scale down, and discontinue the user interface design is made based on monitoring parameters used by the performance analytics-based module.
2. The user interface design automation system as claimed in claim 1, wherein the
intelligence module is in communication with:
a design objective module that contains the design data comprising the design objectives and strategies associated with the user interface, and wherein the design objective module provides the design data to the intelligence module;
a design brief module that communicates with the intelligence module and the design objective module based on the design data, wherein the intelligence module generates design briefs based on the communication from design objective module and the design data; and
a design module that communicates with the intelligence module and the design brief module, wherein the intelligence module generates the user interface design, and wherein a quality check is performed to verify a selected user interface design.

3. The user interface design automation system as claimed in claim 1, wherein the intelligence module extracts various cognitive and design meta from historical data associated with the user interface design, wherein this cognitive and design meta is combined with corresponding performance metrics respectively, to provide recommendations with respect to the user interface design, and wherein a designer is enabled to tweak the design of the user interface.
4. The user interface design automation system as claimed in claim 3, wherein the adaptation module is in communication with:
an audience and channel selection module that transfers to the adaptation module: a verified selected user interface design after the quality check, recommendations on audience and channel that are received from the intelligence module,
user selection data associated with selection of the selected user interface design via the users, and a launch module that launches the selected user interface design after verifying the selected user interface design with the selected channel.
5. The user interface design automation system as claimed in claim 4, further comprising a monitoring and control module containing data associated with the monitoring parameters, wherein the performance analytics-based module is in communication with the monitoring and control module to provide smart suggestions based on performance of different launched user interface designs across different channels, type of users being onboarded, and frequency of user visitations.
6. The user interface design automation system as claimed in claim 5, wherein the intelligence module comprises:
a database that comprises the design data, user data, and channel data, wherein the design data includes information associated with performance and meta extraction, user data includes information associated with user persona extraction, and channel data includes information associated with the channel, and wherein information associated with the meta extraction is subdivided into cognitive meta and user interface (UI) meta; and
an intelligence unit that is in communication with the database to receive and process information that includes the design data, the user data, and the channel data to generate a design recommendation, a meta recommendation, and an audience and channel recommendation based on the user interface design.
7. The user interface design automation system as claimed in claim 6, wherein the
adaptation module is configured to:
identify design elements associated with the user interface design; extract the design elements; and
resize the design elements to a target proportion based on a check performed for predefined design rules.
8. The user interface design automation system as claimed in claim 7, wherein the
performance analytics-based module uses live data associated with the launched user
interface design to:
in a first stage, pre-process the live data;
in a second stage, perform benchmarking of the user interface design, provide a design ranking of the user interface design, and track the fatigue experienced by the user interface design in channels via a fatigue tracker; and
in a third stage, raise alerts, generate reports, and generate recommendations based on the user interface design.
9. A method for user interface design automation comprising:
providing at least one processor that operates under control of a stored program comprising a sequence of program instructions to control one or more components, wherein the components comprising:
inputting design data comprising design objectives and strategies associated with the user interface (UI) to an intelligence module to generate a design for the UI;
building the user interface design, via an adaptation module, based on the design for the UI with different resolutions and languages based on requirements of different channels for the user interface design; and
monitoring and controlling the user interface design via a performance analytics-based module, wherein a decision whether to one of scale up, scale down, and discontinue the user interface design is made based on monitoring parameters used by the performance analytics-based module.
10. The method as claimed in claim 9, further comprising:
containing the design data comprising the design objectives and strategies associated with the user interface in a design objective module in communication with the intelligence module, and wherein the design objective module provides the design data to the intelligence module;
communicating, using a design brief module, with the intelligence module and the design objective module based on the design data, wherein the intelligence module generates design briefs based on the communication from design objective module and the design data; and
communicating, via a design module, with the intelligence module and the design brief module, wherein the intelligence module generates the user interface design, and wherein a quality check is performed after the design module to verify a selected user interface design.
11. The method as claimed in claim 9, further comprising extracting various cognitive and
design meta from historical data associated with the user interface design, via the
intelligence module, wherein this cognitive and design meta is combined with
corresponding performance metrics respectively, to provide recommendations with respect
to the user interface design, and wherein a designer is enabled to tweak the design of the
user interface.

12. The method as claimed in claim 11, wherein the adaptation module is in communication
with:
an audience and channel selection module that transfers to the adaptation module: a verified selected user interface design after the quality check, recommendations on audience and channel that are received from the intelligence module,
user selection data associated with selection of the selected user interface design via the users, and a selected channel to launch the selected user interface design; and
a launch module that launches the selected user interface design after verifying the selected user interface design with the user selection data.
13. The method as claimed in claim 12, further comprising containing data associated with the monitoring parameters in a monitoring and control module, wherein the performance analytics-based module is in communication with the monitoring and control module to provide smart suggestions based on performance of different launched user interface designs across different channels, type of users being on-boarded, and frequency of user visitations.
14. The method as claimed in claim 13, wherein the intelligence module comprises:
a database that comprises the design data, user data, and channel data, wherein the design data includes information associated with performance and meta extraction, user data includes information associated with user persona extraction, and channel data includes information associated with the channel, and wherein information associated with the meta extraction is subdivided into cognitive meta and user interface (UI) meta; and
an intelligence unit that is in communication with the database to receive and process information that includes the design data, the user data, and the channel data to generate a design recommendation, a meta recommendation, and an audience and channel recommendation based on the user interface design.

15. The method as claimed in claim 14, wherein the adaptation module performs:
identifying design elements associated with the user interface design; extracting the design elements; and
resizing the design elements to a target proportion based on a check performed for predefined design rules.
16. A computer program product comprising at least one non-transitory computer-readable
storage medium having computer-executable program code instructions stored therein, the
computer-executable program code instructions comprising program code instructions to:
input design data comprising design objectives and strategies associated with the user interface (UI) to an intelligence module to generate a design for the UI;
build the user interface design, via an adaptation module, based on the design for the UI with different resolutions and languages based on requirements of different channels for the user interface design; and
monitor and control the user interface design via a performance analytics-based module, wherein a decision whether to one of scale up, scale down, and discontinue the user interface design is made based on monitoring parameters used by the performance analytics-based module.

Documents

Application Documents

# Name Date
1 202221031138-STATEMENT OF UNDERTAKING (FORM 3) [31-05-2022(online)].pdf 2022-05-31
2 202221031138-FORM 1 [31-05-2022(online)].pdf 2022-05-31
3 202221031138-DRAWINGS [31-05-2022(online)].pdf 2022-05-31
4 202221031138-DECLARATION OF INVENTORSHIP (FORM 5) [31-05-2022(online)].pdf 2022-05-31
5 202221031138-COMPLETE SPECIFICATION [31-05-2022(online)].pdf 2022-05-31
6 Abstract1.jpg 2022-09-12
7 202221031138-FORM 18 [24-04-2025(online)].pdf 2025-04-24