Abstract: A method and system for detecting issues triggered during processing of a monolithic application is disclosed. In some embodiments, the method includes generating (302) a set of six evaluation dimensions corresponding to a monolithic application. It should be noted that the set of six evaluation dimensions comprises a flow unit level dimension, a method level dimension, a functional flow dimension, a data flow dimension, a technology impact dimension, and an environment and external impact dimension. The method further includes performing (304) an evaluation of the monolithic application corresponding to each of the set of six evaluation dimensions. The method further includes detecting (308) a plurality of issues triggered in the monolithic application, based on the evaluation of the monolithic application corresponding to each of the set of six evaluation dimensions.
[001] Generally, the invention relates to monolithic applications. More specifically, the invention relates to a method and system for detecting issues triggered during processing of a monolithic application.
BACKGROUND
[002] Monolithic application architecture has been successfully used for decades now. Monolithic applications are designed in a way such that they can handle multiple related tasks. In other words, monolithic applications are dynamic applications that may include many closely coupled functions. Since monolithic applications tend to have huge code bases, any change in the code base may require compiling and testing of the entire platform implemented with the monolithic application. This approach of compiling and testing the entire platform goes against the agile approach that today’s developers tend to follow.
[003] During the processing life cycle of monolithic applications, the monolithic applications may pass through many changes or stages. Many of these changes may impact the performance and functionality of these monolithic applications. Moreover, in case of a monolithic application, a root cause that may impact the performance and functionality based on the changes done is difficult to identify. Due to the above listed reasons, a user of these monolithic applications is forced to live with issues that may occur based on changes made during processing of the monolithic application. Some parameters associated with the monolithic application that may cause difficulty in identifying the root cause are the size of the monolithic application, the multiple technologies and frameworks used to develop the monolithic application, the old legacy technologies used to build the monolithic application, communication of the monolithic application with various external systems, the various dependent libraries used, the complex logic implemented, multiple database calls with many joins in SQL queries, and code redundancy.
[004] Therefore, there is a need for an efficient and reliable method and system for detecting issues triggered during processing of a monolithic application.
SUMMARY OF INVENTION
[005] In one embodiment, a method for detecting issues triggered during
processing of a monolithic application is disclosed. The method may include generating
a set of six evaluation dimensions corresponding to a monolithic application. It should be
noted that the set of six evaluation dimensions comprises a flow unit level dimension, a
method level dimension, a functional flow dimension, a data flow dimension, a technology
impact dimension, and an environment and external impact dimension. The method may
include performing evaluation of the monolithic application corresponding to each of the
set of six evaluation dimensions. The method may include detecting a plurality of issues
triggered in the monolithic application, based on evaluation of the monolithic application
corresponding to each of the set of six evaluation dimensions.
[006] In another embodiment, a system for detecting issues triggered during
processing of a monolithic application is disclosed. The system includes a processor and
a memory communicatively coupled to the processor. The memory may store processor-executable instructions, which, on execution, may cause the processor to generate a set of six evaluation dimensions corresponding to a monolithic application. It should be noted that the set of six evaluation dimensions comprises a flow unit level dimension, a method level dimension, a functional flow dimension, a data flow dimension, a technology impact dimension, and an environment and external impact dimension. The processor-executable instructions, on execution, may further cause the processor to perform
evaluation of the monolithic application corresponding to each of the set of six evaluation
dimensions. The processor-executable instructions, on execution, may further cause the
processor to detect a plurality of issues triggered in the monolithic application, based on
evaluation of the monolithic application corresponding to each of the set of six evaluation
dimensions.
[007] It is to be understood that both the foregoing general description and the
following detailed description are exemplary and explanatory only and are not restrictive
of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The present application can be best understood by reference to the
following description taken in conjunction with the accompanying drawing figures, in
which like parts may be referred to by like numerals.
[009] FIG. 1 illustrates a functional block diagram of a system for detecting issues
triggered during processing of a monolithic application, in accordance with an
embodiment.
[010] FIG. 2 illustrates a functional block diagram of a memory of an assessment
device used for detecting issues triggered during processing of a monolithic application,
in accordance with an embodiment.
[011] FIG. 3 illustrates a flowchart of a method for detecting issues triggered
during processing of a monolithic application, in accordance with an embodiment.
[012] FIG. 4 illustrates a flowchart of a method for analysing the monolithic
application corresponding to a flow unit level dimension, in accordance with an
embodiment.
[013] FIGs. 5A – 5D illustrate an exemplary representation of an analysis
performed corresponding to a flow unit level dimension of the monolithic application, in
accordance with an exemplary embodiment.
[014] FIG. 6 illustrates a flowchart of a method for analysing the monolithic
application corresponding to a method level dimension, in accordance with an
embodiment.
[015] FIGs. 7A and 7B illustrate an exemplary representation of an analysis
performed corresponding to a method level dimension of the monolithic application, in
accordance with an exemplary embodiment.
[016] FIG. 8 illustrates a flowchart of a method for analysing the monolithic
application corresponding to a functional flow dimension, in accordance with an
embodiment.
[017] FIGs. 9A and 9B illustrate a table representing an analysis performed
corresponding to a functional flow dimension of the monolithic application, in accordance
with an exemplary embodiment.
[018] FIG. 10 illustrates a flowchart of a method for analysing the monolithic
application corresponding to a data flow dimension, in accordance with an embodiment.
[019] FIG. 11 illustrates a flowchart of a method of analysing the monolithic
application corresponding to a technology impact dimension, in accordance with an
embodiment.
[020] FIG. 12 illustrates a flowchart of a method for analysing the monolithic
application corresponding to an environment and external impact dimension, in
accordance with an embodiment.
[021] FIGs. 13A and 13B illustrate an exemplary representation of an analysis performed corresponding to a technology impact dimension of the monolithic
application, in accordance with an exemplary embodiment.
DETAILED DESCRIPTION OF THE DRAWINGS
[022] The following description is presented to enable a person of ordinary skill
in the art to make and use the invention and is provided in the context of particular
applications and their requirements. Various modifications to the embodiments will be
readily apparent to those skilled in the art, and the generic principles defined herein may
be applied to other embodiments and applications without departing from the spirit and
scope of the invention. Moreover, in the following description, numerous details are set
forth for the purpose of explanation. However, one of ordinary skill in the art will realize
that the invention might be practiced without the use of these specific details. In other
instances, well-known structures and devices are shown in block diagram form in order
not to obscure the description of the invention with unnecessary detail. Thus, the
invention is not intended to be limited to the embodiments shown, but is to be accorded
the widest scope consistent with the principles and features disclosed herein.
[023] While the invention is described in terms of particular examples and
illustrative figures, those of ordinary skill in the art will recognize that the invention is not
limited to the examples or figures described. Those skilled in the art will recognize that
the operations of the various embodiments may be implemented using hardware,
software, firmware, or combinations thereof, as appropriate. For example, some
processes can be carried out using processors or other digital circuitry under the control
of software, firmware, or hard-wired logic. (The term “logic” herein refers to fixed
hardware, programmable logic and/or an appropriate combination thereof, as would be
recognized by one skilled in the art to carry out the recited functions.) Software and
firmware can be stored on computer-readable storage media. Some other processes can
be implemented using analog circuitry, as is well known to one of ordinary skill in the art.
Additionally, memory or other storage, as well as communication components, may be
employed in embodiments of the invention.
[024] A system 100 for detecting issues triggered during processing of a
monolithic application, is illustrated in FIG. 1. In particular, the system 100 may include
an assessment device 102 that detects issues triggered while processing the monolithic
application. The assessment device 102 may detect triggered issues based on each of
a set of six dimensions generated corresponding to the monolithic application. The set of
six dimensions may include a flow unit level dimension, a method level dimension, a
functional flow dimension, a data flow dimension, a technology impact dimension, and
an environment and external impact dimension. In order to detect issues triggered during
processing life cycle of the monolithic application, the assessment device 102 may
perform evaluation of the monolithic application. In order to perform evaluation, the
assessment device 102 may analyze the monolithic application based on each of the set
of six dimensions associated with the monolithic application. Once the monolithic
application is analyzed based on each of the set of six dimensions generated, the
assessment device 102 may detect issues triggered in the monolithic application. This is
further explained in detail in conjunction with FIG. 2 to FIG. 13.
[025] Examples of the assessment device 102 may include, but are not limited
to, a server, a desktop, a laptop, a notebook, a tablet, a smartphone, a mobile phone, an
application server, or the like. The assessment device 102 may include a memory 104,
a processor 106, and a display 108. The display 108 may further include a user interface 110. A user or an administrator may interact with the assessment device 102, and vice versa, through the display 108.
[026] By way of an example, the display 108 may be used to display results (i.e.,
issues identified based on analysis of the monolithic application, recommendations
provided based on the identified issues) based on the actions performed by the
assessment device 102, to a user (i.e., a customer or administrator of the monolithic
application). In addition, the display 108 may be used to display changes done in the
monolithic application. The changes may be done in order to modify the monolithic
application based on a requirement of a new user or a new technology introduced. By way
of another example, the user interface 110 may be used by the user to provide inputs to
the assessment device 102. Thus, for example, in some embodiments, the assessment device 102 may ingest an input for persistent data or non-persistent data associated
with the monolithic application. Further, for example, in some embodiments, the
assessment device 102 may render intermediate results (e.g., the analysis performed for
each of the set of six dimensions) or final results (e.g., the issues detected, or
recommendations) to the user via the user interface 110.
[027] The memory 104 may store instructions that, when executed by the
processor 106, may cause the processor 106 to detect issues that may occur during
processing of the monolithic application. The processor 106 may detect issues based on
analysis of each of the set of six dimensions, in accordance with some embodiments.
As will be described in greater detail in conjunction with FIG. 2 to FIG. 13, in order to
detect issues in the monolithic application, the processor 106 in conjunction with the
memory 104 may perform various functions including generation of each of the set of six
dimensions, evaluation of each of the set of six dimensions, scoring of the monolithic
application for some dimensions, detection of issues triggered, etc.
[028] The memory 104 may also store various data (e.g., the set of six dimensions
generated, results of evaluation of each of the set of six dimensions, score provided,
effective performance score, the predefined acceptance threshold, etc.) that may be
captured, processed, and/or required by the assessment device 102. The memory 104
may be a non-volatile memory (e.g., flash memory, Read Only Memory (ROM),
Programmable ROM (PROM), Erasable PROM (EPROM), Electrically EPROM
(EEPROM) memory, etc.) or a volatile memory (e.g., Dynamic Random Access Memory
(DRAM), Static Random-Access memory (SRAM), etc.).
[029] The assessment device 102 may be connected to a database 112. The
database 112 may be used to store a set of dimensions generated corresponding to the
monolithic application. In addition, the database 112 may store results generated based
on analysis of each of the set of dimensions generated for the monolithic application.
Additionally, the database 112 may be periodically updated based on issues detected
during analysis of each of the set of dimensions associated with the monolithic
application.
[030] Further, the assessment device 102 may interact with a server 114 or
external devices 120 over a network 118 for sending and receiving various data. The
external devices 120 may be used by a plurality of users to provide their selection for each function associated with each of the set of persistent data and the set of non-persistent data, for verifying an output associated with the functional requirement of each function of the monolithic application by the assessment device 102. The external devices 120 may include, but may not be limited to, a desktop, a laptop, a
notebook, a netbook, a tablet, a smartphone, a remote server, a mobile phone, or another
computing system/device. The network 118, for example, may be any wired or wireless
communication network and the examples may include, but may not be limited to, the
Internet, Wireless Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE),
Worldwide Interoperability for Microwave Access (WiMAX), and General Packet Radio
Service (GPRS).
[031] In some embodiments, the assessment device 102 may fetch information associated with the monolithic application from the server 114. In addition, the server 114 may provide access to the monolithic application to the plurality of users. The server 114 may further include a database 116. The database 116 may store information associated with the monolithic application. By way of an example, the database 116 may store the set of dimensions associated with the monolithic application. The database 116 may be periodically updated with a set of dimensions associated with a new monolithic application. Alternatively, the assessment device 102 may receive a user input for one of the set of six dimensions from one of the external devices 120.
[032] Referring now to FIG. 2, a functional block diagram of a memory 104 of an
assessment device 102 configured to detect issues triggered during processing of a
monolithic application is illustrated, in accordance with an embodiment. Initially, a
monolithic application for which issues need to be identified may be received from at
least one of the external devices 120 by the memory 104. In an embodiment, the memory
104 may include a generation module 202, an evaluation module 204, an analysis
module 206, and an issue detection module 214. The analysis module 206 may further
include an identification module 208, a scoring module 210, and a testing module 212.
The testing module 212 may receive a user input 216 from a plurality of users of the
assessment device 102. The modules 202-214 may include routines, programs, objects,
components, data structures, etc., which perform particular tasks or implement particular
abstract data types. The modules 202-214 described herein may be implemented as
software modules that may be executed in a cloud-based computing environment of the
assessment device 102.
[033] Upon receiving the monolithic application, the generation module 202 may
be configured to generate a set of six dimensions corresponding to the monolithic
application. The set of six dimensions may include, but is not limited to, a flow unit level
dimension, a method level dimension, a functional flow dimension, a data flow dimension,
a technology impact dimension, and an environment and external impact dimension.
Once the set of six dimensions corresponding to the monolithic application are
generated, the generation module 202 may be configured to send each of the set of six
dimensions to the evaluation module 204.
[034] Upon receiving the set of six dimensions, the evaluation module 204 may
be configured to perform evaluation of the monolithic application corresponding to each
of the set of six dimensions received. In order to perform evaluation, the evaluation
module 204 may send the monolithic application along with each of the set of six
dimensions to the analysis module 206. In addition, the evaluation module 204 may be
configured to receive results of analysis performed from the analysis module 206.
Moreover, based on results received from the analysis module 206, the evaluation
module 204 may perform evaluation of the monolithic application. Further, the evaluation
module 204 may be configured to send a result of evaluation performed for the monolithic
application to the issue detection module 214. It should be noted that the evaluation module 204 and the analysis module 206 may work together to perform evaluation
of the monolithic application corresponding to each of the set of six dimensions. In other
words, there will be iterative communication between the evaluation module 204 and the
analysis module 206.
[035] The analysis module 206 may be configured to receive the monolithic
application along with each of the set of six dimensions from the evaluation module 204.
The analysis module 206 may be configured to individually analyze the monolithic application
corresponding to each of the set of six dimensions. In order to perform analysis, the
analysis module 206 may send at least one dimension from the set of six dimensions
associated with the monolithic application to the identification module 208.
[036] The identification module 208 may be configured to identify contribution of
each flow unit for each of a plurality of methods in the method level analysis. In an
embodiment, the contribution of each flow unit in the method level analysis may be
identified based on an occurrence count and a performance score associated with each of the set of flow units. In addition, the identification module 208 may be configured to identify a possible contribution of environment and external elements
influencing performance of the monolithic application for the environment and external
impact dimension. Further, the identification module 208 may be configured to send the
results of identification corresponding to at least one dimension, to the scoring module
210.
[037] The scoring module 210 may be configured to receive results of identification
from the identification module 208. Further, the scoring module 210 may be configured
to provide an effective performance score for each of the plurality of methods based on
identified contribution of each flow unit. In an embodiment, the effective performance
score may be provided in order to perform evaluation of the monolithic application for the
method level dimension. In addition, the scoring module 210 may be configured to
provide an effective performance score for each function associated with each of the set
of persistent data and the set of non-persistent data while performing analysis of the
monolithic application for the functional flow dimension.
[038] The testing module 212 may be configured to perform a functional test for
analysis of the monolithic application corresponding to the functional flow dimension. In
order to perform the functional test, the testing module 212 may be configured to receive
the user input 216 for each function associated with each of the set of persistent data
and the set of non-persistent data, from the plurality of users. In an embodiment, the user
input 216 may be received by the testing module 212 to verify an output associated with
functional requirement of each function. Further, based on results generated by the
identification module 208, the scoring module 210, and the testing module 212, the
analysis module 206 may be configured to analyze the monolithic application
corresponding to each of the set of six dimensions. Further, based on results of analysis
performed by the analysis module 206, the evaluation module 204 may be configured to
evaluate the performance of the monolithic application corresponding to each of the set
of six dimensions.
[039] Once the evaluation of the monolithic application is performed, the issue
detection module 214 may be configured to receive results of performed evaluation from
the evaluation module 204. Further, the issue detection module 214 may be configured
to detect a plurality of issues triggered in the monolithic application. It should be noted that the plurality of triggered issues may be detected based on the evaluation performed for
the monolithic application by the evaluation module 204. In order to detect the plurality
of issues, the issue detection module 214 may be configured to analyze combinations of
each of the set of six evaluation dimensions associated with the monolithic application
against a specific weightage. The specific weightage may be associated with the
monolithic application.
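By way of an illustrative sketch only, the weighted combination described above may be expressed in code. The following Java fragment is not part of the disclosed system; it merely assumes that each of the six dimensions yields a numeric evaluation result and that a per-dimension weightage specific to the monolithic application is available (all names, values, and the map-based representation are hypothetical):

    import java.util.Map;

    public class IssueWeighting {

        // Illustrative weighted combination: each dimension's evaluation result
        // is scaled by a weightage specific to the monolithic application.
        static double weightedScore(Map<String, Double> results,
                                    Map<String, Double> weightages) {
            return results.entrySet().stream()
                    .mapToDouble(e -> e.getValue() * weightages.getOrDefault(e.getKey(), 1.0))
                    .sum();
        }

        public static void main(String[] args) {
            // Hypothetical evaluation results for the six dimensions.
            Map<String, Double> results = Map.of(
                    "flowUnitLevel", 0.8, "methodLevel", 0.6, "functionalFlow", 0.9,
                    "dataFlow", 0.7, "technologyImpact", 0.5, "environmentExternal", 0.4);
            // Hypothetical application-specific weightages.
            Map<String, Double> weightages = Map.of(
                    "flowUnitLevel", 2.0, "methodLevel", 1.5, "functionalFlow", 2.0,
                    "dataFlow", 1.0, "technologyImpact", 1.0, "environmentExternal", 0.5);
            System.out.println("Combined weighted score: " + weightedScore(results, weightages));
        }
    }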
[040] In particular, as will be appreciated by those of ordinary skill in the art,
various modules 202-214 for performing the techniques and steps described herein may
be implemented in the assessment device 102, either by hardware, software, or
combinations of hardware and software. For example, suitable code may be accessed
and executed by the one or more processors on the assessment device 102 to perform
some or all of the techniques described herein. Similarly, application specific integrated
circuits (ASICs) configured to perform some or all of the processes described herein may
be included in the one or more processors on the host computing system. Even though
FIGs. 1-2 describe the assessment device 102, the functionality of the components of
the assessment device 102 may be implemented in other computing devices.
[041] Referring now to FIG. 3, a flowchart of a method for detecting issues
triggered during processing of a monolithic application is illustrated, in accordance with
an embodiment. At step 302, a set of six dimensions may be generated. The set of six
dimensions may be generated corresponding to the monolithic application. In an
embodiment, the set of six dimensions generated may include a flow unit level dimension,
a method level dimension, a functional flow dimension, a data flow dimension, a
technology impact dimension, and an environment and external impact dimension. Once
the set of six dimensions are generated, at step 304, an evaluation of the monolithic
application may be performed. The evaluation of the monolithic application may be
performed corresponding to each of the set of six dimensions.
[042] In order to perform evaluation, at step 306, each of the set of six dimensions
associated with the monolithic application may be analyzed. The process of analysing
the monolithic application corresponding to each of the set of six dimensions is further
explained in detail in conjunction with FIG. 4 to FIG. 13. Once the analysis is performed,
at step 308, a plurality of issues triggered in the monolithic application may be detected.
Examples of the plurality of issues may include, but are not limited to, continuous
deployment of the monolithic application, bugs in various modules of the monolithic
application, reliability, ensuring modularity, enabling innovations through composing
services, management of growing data, incorporation of latest technologies, etc.
[043] In order to detect each of the plurality of issues, at step 310, combinations
of each of the set of six dimensions may be analyzed. The combination of each of the
set of six dimensions may be analyzed against a specific weightage associated with the
monolithic application. In other words, impact of each of the set of six dimensions
associated with the monolithic application may be analyzed against specific weightage.
In an embodiment, the impact of each of the set of six dimensions may be analyzed in
order to revise and fine-tune results of earlier findings associated with the monolithic
application.
[044] Referring now to FIG. 4, a flowchart of a method for analyzing the
monolithic application corresponding to the flow unit level dimension is illustrated, in
accordance with an embodiment. With reference to FIG. 3, in order to analyze the
monolithic application corresponding to the flow unit level dimension as mentioned in step
402, at step 404, each of a set of flow units in the monolithic application may be
evaluated. In an embodiment, each of the set of flow units may correspond to an atomic
code block through which a functional control flow of the monolithic application may pass. In other words, each of the set of flow units may correspond to an atomic code
block that may cause pain points or issues in the monolithic application. In an
embodiment, each of the set of flow units may be evaluated against a predefined
threshold. Moreover, each of the set of flow units in the monolithic application may be
evaluated based on a plurality of factors. The plurality of factors may include, but are not
limited to, a performance score, an occurrence count, a cost of failure, and a failure rate.
[045] In an embodiment, the performance score of each of the set of flow units
may depict performance of each of the set of flow units in the monolithic application.
Further, the occurrence count of each of the flow units may depict a number of times a
flow unit from the set of flow units occurs in various functional flows. The cost of failure
associated with each of the set of flow units may be calculated based on various
parameters. Examples of various parameters may include, but are not limited to, failure
of the flow unit, crashing of the monolithic application, loss of data, etc. Moreover, each
of these parameters may be calculated based on different scales. For example, the cost of failure of the flow unit may be zero. Similarly, the cost of crashing of the monolithic application
may be determined to be ‘50’. In addition, loss of data may be determined to be ‘-500’.
Further, the failure rate associated with each of the set of flow units may be determined
based on percentage of failure occurrence of each of the set of flow units.
[046] Once each of the set of flow units are evaluated, at step 406, an impact of
each of the set of flow units in each of a plurality of functional flows of the monolithic
application may be analyzed. Further, in order to analyze impact of each of the set of
flow units, a normal flow and a deviation flow associated with each of the set of flow units
may be analyzed. Moreover, the impact of each of the set of flow units may be analyzed
in order to determine impact of each flow unit from the set of flow units in overall pain
points of the monolithic application. A method for analyzing the impact of each of the set
of flow units in the monolithic applications is further explained in detail in conjunction with
FIGs. 5A – 5D.
[047] Referring now to FIGs. 5A-5D, an exemplary representation of an analysis
performed corresponding to a flow unit level dimension of the monolithic application is
illustrated, in accordance with an exemplary embodiment. In FIG. 5A, a set of four
methods comprising a set of flow units is depicted. In an embodiment, each of the set of
four methods of the monolithic application may be depicted as ‘M1’, ‘M2’, ‘M3’, and ‘M4’.
It should be noted that, for ease of explanation, the set of four methods is considered.
However, the monolithic application may include any number of methods that may be
required to build the monolithic application. Further, the set of flow units present in the
method ‘M1’ may be depicted as ‘F1.1’, ‘F1.2’, and ‘F1.3’. The set of flow units present in the
method ‘M2’ may be depicted as ‘F2.1’, ‘F2.2’, and ‘F2.3’. Similarly, the set of flow units
in the method ‘M3’ may be depicted as ‘F3.1’, and ‘F3.2’. In addition, the set of flow units
in the method ‘M4’ may be depicted as ‘F4.1’, ‘F4.2’, and ‘F4.3’.
[048] In an embodiment, impact of each of the set of flow units in different
functional flows may be analyzed in different scenarios. In one scenario, in FIG. 5B, the
functional flow of flow units corresponding to the set of four methods is depicted via a
functional flow ‘FF1’. Moreover, the functional flow ‘FF1’ represented in FIG. 5B
represents a normal functional flow of some of the set of flow units associated with the set of four methods. As represented in FIG. 5B, the functional flow ‘FF1’ may pass through
and may include the following flow units: ‘F1.1’, ‘F2.1’, ‘F3.2’, and ‘F4.2’. Further, based
on the functional flow, an impact of each of the set of flow units on the functional flow
may be determined. The impact of each of the set of flow units on the functional flow may
be determined based on an equation (1) represented below:
(performance score * success rate * occurrence count) + (cost of failure * failure rate * occurrence count) … (1)
[049] Moreover, analysis of each of the set of flow units corresponding to the flow
unit level dimension may be performed based on the above equation (1). Consider a
scenario, where the impact of the flow unit ‘F3.2’ on the functional flow ‘FF1’ may be determined. In order to determine the impact of the flow unit ‘F3.2’, a value for each of the plurality of parameters for the flow unit ‘F3.2’ may be determined. By way of an example, suppose a performance score for the flow unit ‘F3.2’ may be ‘5’. Similarly, a success rate for the flow unit ‘F3.2’ may be ‘60%’. The occurrence count for the flow unit ‘F3.2’ may be ‘15’. The cost of failure for the flow unit ‘F3.2’ may be determined to be ‘-500’. Further, the failure rate for the flow unit ‘F3.2’ may be determined to be ‘40%’. Based on the above determined values, the impact of the flow unit ‘F3.2’ on the functional flow ‘FF1’ may be determined as depicted via equation (2) represented below:
(5*60%*15) + (-500*40%*15) = -2955 ….. (2)
[050] In the above equation (2), each parameter depicted in equation (1) may be replaced with the value determined for that parameter. The final result ‘-2955’
may represent the impact of the flow unit ‘F3.2’ on the functional flow ‘FF1’. In FIG. 5C, a deviation
flow for each of the set of flow units corresponding to the set of four methods is depicted.
In the deviation flow, the functional flow ‘FF1’ may pass through the flow unit ‘F3.1’ in the method ‘M3’, instead of the flow unit ‘F3.2’. As represented in FIG. 5D, the functional flow ‘FF1’ in this case may include and may pass through ‘F1.1’, ‘F2.1’, ‘F3.1’, and ‘F4.2’. Further, the impact of the flow unit ‘F3.1’ on the functional flow ‘FF1’ may be identified. In order to identify the impact of the flow unit ‘F3.1’, a value for each of the plurality of parameters associated with the flow unit ‘F3.1’ may be determined. By way of an example, suppose a performance score for the flow unit ‘F3.1’ may be ‘3’. Similarly, a success rate for the flow unit ‘F3.1’ may be ‘80%’. The occurrence count for the flow unit ‘F3.1’ may be ‘5’. The cost of failure for the flow unit ‘F3.1’ may be determined to be ‘-1’. Further, the failure rate for the flow unit ‘F3.1’ may be determined to be ‘20%’. Based on the above
determined values, the impact of flow unit ‘F3.1’ may be determined on the functional
flow ‘FF1’ as depicted via equation (3) represented below:
(3*80%*5) + (-1*20%*5) = 11 ….. (3)
[051] Further, based on the above determined impact of the flow units ‘F3.1’ and ‘F3.2’, an analysis of the flow units (‘F3.1’, ‘F3.2’) on the functional flow ‘FF1’ may be performed. In one embodiment, when the flow unit ‘F2.3’ is functioning normally, then the
functional flow ‘FF1’ may pass through the flow unit ‘F3.2’. Further, when the functional
flow ‘FF1’ may pass through the flow unit ‘F3.2’, then the impact may be ‘-2955’. In
another embodiment, when an exception happens on the flow unit ‘F2.3’, then the
functional flow ‘FF1’ may pass through the flow unit ‘F3.1’. When the functional flow ‘FF1’
may pass through the flow unit ‘F3.1’, then the impact may be ‘11’.
[052] Hence, in the current state, occurrence of an exception in the flow unit ‘F2.3’ may be good for the functional flow ‘FF1’. Moreover, the exception in the flow unit ‘F2.3’ may be addressed only after addressing issues in the flow unit ‘F3.2’. It should be noted that the analysis of the impact of each of the set of flow units in the plurality of functional flows may be performed by considering multiple combinations of flow units in various functional flows of the monolithic application. This process may continue until all issues in the flow units of the monolithic application are determined. Moreover, the analysis of the impact of each of
the set of flow units in the monolithic application may be done to determine all pain points
of the monolithic application.
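By way of an illustrative sketch only, the impact computation of equation (1) may be expressed in code. The following Java fragment assumes a simple record holding the five factors described above; the class and field names are hypothetical and not part of the disclosed system:

    public class FlowUnitImpact {

        // Illustrative holder for the per-flow-unit factors of equation (1).
        record FlowUnit(String name, double performanceScore, double successRate,
                        double failureRate, double costOfFailure, int occurrenceCount) {}

        // Equation (1): (performance score * success rate * occurrence count)
        //             + (cost of failure * failure rate * occurrence count).
        static double impact(FlowUnit f) {
            return (f.performanceScore() * f.successRate() * f.occurrenceCount())
                 + (f.costOfFailure() * f.failureRate() * f.occurrenceCount());
        }

        public static void main(String[] args) {
            // Values from the example above for flow units 'F3.2' and 'F3.1'.
            FlowUnit f32 = new FlowUnit("F3.2", 5, 0.60, 0.40, -500, 15);
            FlowUnit f31 = new FlowUnit("F3.1", 3, 0.80, 0.20, -1, 5);
            System.out.println("Impact of F3.2 on FF1: " + impact(f32)); // -2955.0
            System.out.println("Impact of F3.1 on FF1: " + impact(f31)); // 11.0
        }
    }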
[053] Referring now to FIG. 6, a flowchart of a method of analysing the monolithic
application corresponding to a method level dimension is illustrated, in accordance with
an embodiment. In order to analyze the monolithic application corresponding to the
method level dimension as depicted via step 602, at step 604, contribution of each flow
unit for each of a plurality of methods may be identified. In an embodiment, the
contribution of each flow unit may be identified to determine overall contribution of each
of the plurality of methods in the monolithic application. Moreover, contribution of each
flow unit for each of the plurality of methods may be identified based on an occurrence
count and a performance score associated with each of the set of flow units. The
occurrence count of each of the set of flow units may depict a number of functional flows
in which each of the set of flow units is involved. In addition, the performance score of
each of the set of flow units may depict impact of each flow unit on overall functional flow.
[054] Once contribution of each of the set of flow units is determined, at step 606,
an effective performance score corresponding to each of the plurality of methods may be
calculated. The effective performance score may be calculated based on the identified
contribution of each flow unit. Upon determining the effective performance score, at step
608, each of the plurality of methods may be evaluated against a predefined acceptance
threshold. Each of the plurality of methods may be evaluated based on the calculated
effective performance score. A technique for evaluating each of the plurality of methods
is further explained in detail in conjunction with FIGs. 7A – 7B.
[055] Referring now to FIGs. 7A -7B, an exemplary representation of an analysis
performed corresponding to a method level dimension of the monolithic application is
illustrated, in accordance with an exemplary embodiment. In FIG. 7A, functional flow of
each of a set of flow units is depicted. Moreover, for ease of explanation, in the present
FIG. 7A, an evaluation of a set of flow units in a method ‘M2’, may be performed.
However, the evaluation may be performed for each of the set of flow units in the plurality
of methods in the monolithic application. Further, each of the set of flow units of method
‘M2’ may correspond to flow unit ‘F2.1’, flow unit ‘F2.2’, and flow unit ‘F2.3’.
[056] By way of an example, consider a monolithic application in which total
number of methods may be ‘100’. In addition, each method of the monolithic application
may include a set of flow units. In an embodiment, an effective performance score of
each of the set of flow units in the total number of methods may be calculated. The
effective performance score may be calculated based on equation (4) represented below:
Effective Performance score = Loop Count * Performance score … (4)
[057] For example, the effective performance score of flow unit ‘1’ of the method ‘M1’ represented in a table 700B may be determined based on the above equation (4). In order
to determine the effective performance score of flow unit ‘1’ of method ‘M1’, product of
values of loop count and performance score associated with flow unit ‘1’ may be
determined as represented below by equation (5):
Effective performance score of flow unit ‘1’= 4*1 = 4 …. (5)
[058] Similarly, an effective performance score of flow unit ‘2’ of method ‘M1’ may
be determined as represented below by equation (6):
Effective performance score of flow unit ‘2’= 1*3 = 3 …. (6)
[059] Similarly, an effective performance score of flow unit ‘3’ of method ‘M1’ may
be determined as represented below by equation (7):
Effective performance score of flow unit ‘3’= 5*2 = 10 …. (7)
[060] Similarly, the effective performance score may be calculated for each of the
set of flow units present in each of the total number of methods. Further, based on the
effective performance score calculated for each of the set of flow units in the total number
methods, a total effective performance score associated with each of the set of flow unit
for each of the total number of methods may be determined. By way of an example, the
total effective performance score calculated for each of the set of flow unit of the method
‘M1’ may be determined. In an embodiment, the total effective performance score
associated with the method ‘M1’ may be calculated based on equation (8) represented
below:
Total Effective Performance score = (Summation of effective performance scores of the set of flow units) / (Summation of loop counts of the set of flow units) … (8)
[061] The total effective performance score calculated for the method ‘M1’, based
on each of the set of flow units present in method ‘M1’ may be depicted via equation (9)
represented below:
Total Effective Performance score = (4+3+10) / (4+1+5) = 1.7 … (9)
[062] It should be noted that, the total effective performance score may be
calculated for each of the total number of methods, i.e., ‘100’ of the monolithic application.
Further, each of the total number of methods may be evaluated against the predefined
acceptance threshold. In an embodiment, the predefined acceptance threshold limit may
be specific for the monolithic application. In other words, the predefined acceptance limit
may be defined for a specific monolithic application. In addition, the predefined
acceptance limit may be defined based on one of two approaches, i.e., an industry
standard acceptance limit and an acceptance limit calculated based on comparison of
the plurality of methods.
[063] Based on evaluation of each of the total number of methods, a
recommendation may be provided to the user of the monolithic application. As will be
appreciated, the recommendations provided may differ based on the approach used to
calculate the predefined acceptance threshold limit. In one embodiment, when the
predefined acceptance limit is defined based on industry standard acceptance limit, then
an actual performance score of each of the total number of methods (i.e., 100) may be calculated against the predefined acceptance limit, i.e., defined based on the industry standard acceptance limit for a method of the same type. Further, a deviation in each of the total number of methods may be calculated. Based on calculation of the deviation, when the deviation is determined to be high in most of the total number of methods, then the root cause of the deviation may lie in the overall architecture and design of the monolithic application. Hence, a structural change of the monolithic application may be
recommended to the user.
[064] In another embodiment, in order to determine the predefined acceptance
threshold based on the comparison of the plurality of methods, the total effective
performance score calculated for each of the plurality of methods may be arranged in
ascending order. Further, the total effective performance score calculated may be
arranged in ascending order to determine middle value from the total effective
performance score calculated for each of the plurality of methods. Moreover, when the number of total effective performance scores calculated for the plurality of methods is even, then a mean of the middle two values may be calculated. By way of an example,
suppose the total effective performance score calculated for a set of methods may be
‘1.5’, ‘1.7’, ‘5’, ’15.16’. Then in order to determine the predefined acceptance threshold,
the mean of ‘1.7’, and ‘5’ may be calculated. The mean of ‘1.7’, and ‘5’ may be calculated
as depicted via equation (10) below:
(1.7+5)/2 = 3.35 … (10)
[065] Each of the set of flow units present in the plurality of methods that may
have total effective performance score greater than ‘3.35’ (i.e., the predefined
acceptance threshold) may be considered for optimization.
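By way of an illustrative sketch only, the method level scoring of equations (4) and (8) and the median based acceptance threshold of equation (10) may be expressed in code. The following Java fragment assumes the loop count and performance score of each flow unit are already known; all names are hypothetical:

    import java.util.Arrays;
    import java.util.List;

    public class MethodLevelScoring {

        // Illustrative holder for a flow unit's loop count and performance score.
        record FlowUnit(int loopCount, double performanceScore) {}

        // Equation (4): effective performance score = loop count * performance score.
        static double effectiveScore(FlowUnit f) {
            return f.loopCount() * f.performanceScore();
        }

        // Equation (8): total effective performance score of a method =
        // (sum of effective scores of its flow units) / (sum of their loop counts).
        static double totalEffectiveScore(List<FlowUnit> method) {
            double sumScores = method.stream()
                    .mapToDouble(MethodLevelScoring::effectiveScore).sum();
            int sumLoops = method.stream().mapToInt(FlowUnit::loopCount).sum();
            return sumScores / sumLoops;
        }

        // Median of the per-method totals, used as the predefined acceptance
        // threshold when it is derived by comparing the plurality of methods.
        static double acceptanceThreshold(double[] totals) {
            double[] sorted = totals.clone();
            Arrays.sort(sorted);
            int n = sorted.length;
            return (n % 2 == 0) ? (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0 : sorted[n / 2];
        }

        public static void main(String[] args) {
            // Method 'M1' from the example: (loop count, performance score) per flow unit.
            List<FlowUnit> m1 = List.of(
                    new FlowUnit(4, 1), new FlowUnit(1, 3), new FlowUnit(5, 2));
            System.out.println("Total effective score of M1: " + totalEffectiveScore(m1)); // 1.7
            System.out.println("Acceptance threshold: "
                    + acceptanceThreshold(new double[]{1.5, 1.7, 5, 15.16})); // 3.35
        }
    }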
[066] Referring now to FIG. 8, a flowchart of a method of analysing the monolithic
application corresponding to a functional flow dimension is illustrated, in accordance with
an embodiment. At step 802, the monolithic application may be analyzed corresponding
to the functional flow dimension associated with the monolithic application. In order to
analyze the monolithic application corresponding to the functional flow dimension, at step
804, a functional test corresponding to each of a set of persistent data and each of a set of non-persistent data of the monolithic application may be performed. In order to perform the functional test, at step 804, an input may be provided for each function associated with each of the set of persistent data and the set of non-persistent data. In an embodiment, the input may be provided for verifying an output associated with the functional requirement of each function.
[067] Once the functional requirement of each function is verified, at step 806,
an effective performance score may be calculated for each function associated with each
of the set of persistent data and the set of non-persistent data. In an embodiment, the
effective performance score for each function may be calculated based on the
occurrence count and the performance score associated with each of the set of flow units.
Further, at step 808, each function may be evaluated against the predefined acceptance
threshold. Moreover, each function may be evaluated based on the calculated effective
performance score. In an embodiment, the predefined acceptance threshold may be
determined based on one of two approaches, i.e., an industry standard acceptance limit
and an acceptance limit calculated based on comparison of all functional flows in the
monolithic application. This has been explained in greater detail in conjunction with FIGs.
9A and 9B.
[068] Referring now to FIGs. 9A and 9B, tables representing an analysis
performed corresponding to a functional flow dimension of the monolithic application are
illustrated, in accordance with an exemplary embodiment. In an embodiment, in order to
analyze the monolithic application corresponding to the functional flow, a functional test
may be performed. The functional test may be performed corresponding to each of a set
of persistent dataset and each of a set of non-persistent dataset of the monolithic
application. The functional test may be performed to test each function of the monolithic
application. The functional test may be performed by providing an appropriate input in
order to verify an associated output against the functional requirement of the monolithic application. Moreover, the functional test may help to check application programming interfaces (APIs), databases, and other functionality of the monolithic application.
[069] In order to perform the functional test for the set of persistent data as
depicted via table 900A, total effective performance score for a plurality of methods of
the monolithic application may be calculated. The total effective performance score may
be calculated based on equation (11) represented below:
Total effective performance score = summation of the effective performance score of each of the plurality of methods … (11)
[070] Further, the effective performance score for each of the plurality of methods
may be predicted based on equation (12) represented below:
Effective performance score of a function = sum of (occurrence count (902a) * performance score (904a)) over the data queries of that function … (12)
[071] In the table 900A, ‘fun 1’, ‘fun 2’, ‘fun 3’, and ‘fun 50’ may represent the total number of functions of the monolithic application. Further, ‘Q1’, ‘Q2’, ‘Q3’, and ‘Q4’ may represent each of a plurality of data queries of the monolithic application. In addition, an occurrence count 902a may represent a number of occurrences of a data query in the function. Moreover, a performance score 904a may represent a performance score of each of the plurality of data queries based on an associated function. In an embodiment, based on the above equation (12), the effective performance score of each of the plurality of data queries may be calculated. By way of an example, the effective performance score for a set of three data queries, i.e., ‘Q1’, ‘Q2’, and ‘Q3’, which may be associated with ‘fun 1’, may be calculated as the effective performance score for ‘Q1’ = 14*10 = 140, the
effective performance score for ‘Q2’ = 11*30 = 330, and the effective performance score
for ‘Q3’ = 15*20 = 300.
[072] Based on the effective performance score calculated for the set of three
data queries, i.e., ‘Q1’, ‘Q2’, and ‘Q3’ associated with ‘fun 1’, the total effective performance
score may be calculated based on equation (11) above. By way of an example, the total
effective performance score for the set of data queries, i.e., Q1, Q2, and Q3 may be
calculated as ‘140 + 330 + 300’, that is ‘770’. Similarly, the total effective performance
score may be calculated for each function, i.e., ‘50’ of the monolithic application. Once,
the total effective performance score is calculated for each function, then based on the
predefined acceptance threshold, recommendations to modify the monolithic application
may be derived accordingly. It should be noted that the recommendations may be different based on the approach used to define the predefined acceptance threshold.
[073] In an embodiment, the predefined acceptance threshold may be defined
based on one of two approaches, i.e., an industry standard acceptance limit and an
acceptance limit calculated based on comparison of all functional flows in the monolithic
application. In one embodiment, when the predefined acceptance threshold is defined
based on industry standard acceptance limit, then based on analysis of each function, a
deviation in each of the plurality of methods may be calculated. Further, when the
calculated deviation is high, then a structural change of the monolithic application for data management may be recommended. The structural change may be recommended when the root cause lies in the overall architecture and design of the monolithic application.
[074] In another embodiment, for non-performing methods, when the predefined acceptance threshold is defined based on the comparison of each of the plurality of methods, a comparison of the total effective performance scores may be done. The comparison of the total effective performance score may be done for each function of the monolithic application. In order to compare the total effective performance score, the total effective performance score calculated for each function (i.e., 50) may be arranged in ascending order. Further, the total effective performance score calculated may be arranged in ascending order to determine the middle value from the total effective performance score calculated for each of the plurality of methods. Moreover, when the number of total effective performance scores calculated is even, then a mean of the middle two values may be calculated. By way of an example,
suppose the total effective performance score calculated for a set of four functions may
be ‘770’, ‘920’, ‘4080’, and ‘5400’. Then, in order to determine the predefined acceptance
threshold, the mean of ‘920’, and ‘4080’ may be calculated. The mean of ‘920’, and ‘4080’
may be calculated as depicted via equation (13) below:
(920+4080)/2 = 2500 … (13)
[075] Further, each function (i.e., 50) of the monolithic application that may have
the total effective performance score greater than ‘2500’ (i.e., the predefined acceptance
threshold) may be considered for optimization.
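By way of an illustrative sketch only, the functional flow scoring of equations (11) and (12) may be expressed in code. The following Java fragment reuses the ‘fun 1’ example with data queries ‘Q1’, ‘Q2’, and ‘Q3’; all names are hypothetical:

    import java.util.List;

    public class FunctionalFlowScoring {

        // Illustrative holder for a data query's occurrence count (902a)
        // and performance score (904a) within a function.
        record DataQuery(String name, int occurrenceCount, double performanceScore) {}

        // Equations (11) and (12): total effective performance score of a function =
        // sum over its data queries of (occurrence count * performance score).
        static double totalEffectiveScore(List<DataQuery> function) {
            return function.stream()
                    .mapToDouble(q -> q.occurrenceCount() * q.performanceScore())
                    .sum();
        }

        public static void main(String[] args) {
            // Data queries of 'fun 1' from table 900A.
            List<DataQuery> fun1 = List.of(
                    new DataQuery("Q1", 14, 10),
                    new DataQuery("Q2", 11, 30),
                    new DataQuery("Q3", 15, 20));
            System.out.println("Total effective score of fun 1: "
                    + totalEffectiveScore(fun1)); // 770.0
        }
    }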
[076] In an embodiment, in order to perform the functional test for the set of non-persistent data, the total effective performance score for each function that is independent of persistent data and external dependencies may be calculated. The total effective performance score calculated for each of the set of non-persistent data may be depicted via the effective performance score 902b of a table 900B. Once the total effective performance score is calculated for each function for each of the set of non-persistent data, then based on the predefined acceptance threshold, recommendations to modify the monolithic application may be derived accordingly. It should be noted that the recommendations may be different based on the approach used to define the predefined acceptance threshold.
[077] In an embodiment, the predefined acceptance threshold may be defined
based on one of two approaches, i.e., an industry standard acceptance limit and an
acceptance limit calculated based on comparison of all functional flows in the monolithic
application. In one embodiment, when the predefined acceptance threshold is defined
based on industry standard acceptance limit, then based on analysis of each function, a
deviation in each of the plurality of methods may be determined. Further, when the
determined deviation is high, then a structural change of the monolithic application for data management may be recommended. The structural change may be recommended when the root cause lies in the overall architecture and design of the monolithic application.
[078] In another embodiment, for non-performing methods, when the predefined acceptance threshold is defined based on the comparison of each of the plurality of methods, a comparison of the total effective performance scores may be done. The comparison of the total effective performance score may be done by comparing the total effective performance score calculated for each function of the monolithic application. In order to compare the total effective performance score, the total effective performance score calculated for each function (i.e., 50) may be arranged in ascending order. Further, the total effective performance score calculated may be arranged in ascending order to determine the middle value from the total effective performance score calculated for each of the plurality of methods. Moreover, when the number of total effective performance scores calculated is even, then a mean of the middle two values
may be calculated. By way of an example, suppose the total effective performance score
calculated for a set of eight functions may be ‘54’, ‘60’, ‘80’, ‘120’, ‘150’, ‘270’, ‘300’ and
‘230’. Then, in order to determine the predefined acceptance threshold, the mean of
‘120’ and ‘150’ may be calculated. The mean of ‘120’ and ‘150’ may be calculated as depicted via equation (14) below:
(120+150)/2 = 135 … (14)
[079] Further, each function (i.e., 50) of the monolithic application that may have
the total effective performance score greater than ‘135’ (i.e., the predefined acceptance
threshold) may be considered for optimization. Hence Fun 3, Fun 4, Fun 6, and Fun 50
may be identified for optimization.
[080] Referring now to FIG. 10, a flowchart of a method of analysing the monolithic
application corresponding to a data flow dimension is illustrated, in accordance with an
embodiment. At step 1002, the monolithic application may be analyzed corresponding to
the data flow dimension. In order to analyze the monolithic application corresponding to
the data flow dimension, at step 1004, data inflows and data outflows may be analyzed.
In an embodiment, the data inflows and data outflows to be analyzed may be associated with the monolithic application. Moreover, the data inflows and data
outflows may be analyzed to identify impact of at least one of a new data being added
and an existing data being removed in association with the monolithic application.
[081] Referring now to FIG. 11, a flowchart of a method of analysing the
monolithic application corresponding to a technology impact dimension is illustrated, in
accordance with an embodiment. At step 1102, the monolithic application may be
analyzed corresponding to the technology impact dimension. Moreover, in order to
perform analysis of the monolithic application corresponding to the technology impact dimension, each of a set of technologies used in the monolithic application may be
determined. In an embodiment, the set of technologies used may correspond to
frameworks and tools used to develop the monolithic application. In order to analyze the
monolithic application corresponding to the technology impact dimension, at step 1104,
an impact of technology associated with the monolithic application may be determined.
In one embodiment, the associated technology may correspond to a technology being
selected to develop the monolithic application. In another embodiment, the associated
technology may correspond to a technology being used to modify the monolithic
application.
[082] By way of an example, during initial development stage, a set of
frameworks and tools required to develop the monolithic application may be defined.
However, during later stages of the monolithic application, if the user of the monolithic application wants to implement new features or adopt new or emerging technologies or platforms, then the technology impact dimension analysis may be beneficial. Based on the technology impact analysis, the user may get suggestions about the best possible combination of frameworks and tools that may be required to modify
features of the monolithic application. In addition, based on analysis of the monolithic
application corresponding to the technology impact dimension, any challenges in the
monolithic application may be addressed. This in turn may prevent any bottleneck that
may occur during later stages of the monolithic application. Consider a scenario, where
the monolithic application may be initially developed with Java Database Connectivity
(JDBC) to address database connectivity to a client (i.e., the user). Further, based on
enhancement different database, the monolithic application may need to adopt a
hibernate kind of framework that is flexible enough to support different databases. In this
scenario, the technology impact dimension analysis may provide a recommendation of
a framework for the monolithic application to support different databases.
[083] Consider another scenario where the monolithic application may be initially
developed with a legacy User Interface (UI) technology. The legacy technology used may
be good enough for the monolithic application for a certain period. However, later the user of this
monolithic application may want to display a set of complex data in the form of a
graph or chart. In this case, the legacy UI technology that is initially used to develop the
monolithic application may become a barrier. Hence, based on the technology impact
dimension analysis, the initially adopted legacy UI technology may be switched to a
modern UI technology (for example, Angular or React JavaScript). Moreover, the switching
of the legacy UI technology to the modern UI technology may be addressed based on
the requirement of the right combination of backend technology and framework required to
implement the modern UI technology. This has been explained in greater detail in
conjunction with FIGs. 13A and 13B.
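A toy sketch of such a recommendation step follows; the rules table simply mirrors the two scenarios above and is an assumption made for illustration, not the analysis logic of the disclosed system:

```python
# Illustrative mapping from (concern, new requirement) to a candidate
# replacement technology, mirroring the JDBC and legacy-UI scenarios.
RULES = {
    ("persistence", "support multiple databases"): "Hibernate-like ORM",
    ("ui", "complex graphs/charts"): "modern UI technology (e.g., Angular/React)",
}

def recommend(concern: str, requirement: str, current: str) -> str:
    """Suggest a replacement when the current technology is a barrier."""
    candidate = RULES.get((concern, requirement))
    return f"replace {current} with {candidate}" if candidate else f"keep {current}"

print(recommend("persistence", "support multiple databases", "JDBC"))
print(recommend("ui", "complex graphs/charts", "legacy UI framework"))
```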
[084] Referring now to FIG. 12, a flowchart of a method of analysing the
monolithic application corresponding to an environment and external impact dimension
is illustrated, in accordance with an embodiment. At step 1202, the monolithic application
may be analyzed corresponding to the environment and external impact dimension. In
order to analyze the monolithic application, at step 1204, possible contribution of
environment and external elements influencing performance of the monolithic application
may be identified. In other words, in order to perform the environment and external impact
dimension analysis, external libraries and the context associated with the monolithic
application may be analysed. The external libraries and context may be analysed to
identify security vulnerabilities, as well as performance and functional issues, in the monolithic
application. In an embodiment, the environment and external impact dimension analysis
may be performed by analysing obsolete libraries. Further, based on the analysis of obsolete
libraries, recommendations for upgradation to new versions of the libraries may be provided.
In an embodiment, recommendations for upgradation may be based on a predefined
threshold limit set by the user of the monolithic application.
[085] By way of an example, suppose a monolithic application includes ‘100’
libraries. In addition, out of the ‘100’ libraries used, suppose the set of latest libraries
is ‘60’, the set of obsolete libraries is ‘30’, and the set of unused libraries is ‘10’.
Further, based on the above determined values, a set
of impacted libraries may be determined using equation (14) represented below:
Impacted libraries = ((Obsolete + Unused) / Total libraries) * 100 … (14)
[086] By substituting the determined values in the above equation (14), the set of
impacted libraries may be determined to be ‘40%’. Now suppose the user of the
monolithic application has defined the predefined threshold limit for upgradation of the
monolithic application to be ‘10%’. Then, since the value determined for the set of
impacted libraries (i.e., 40%) exceeds the predefined threshold limit, the total libraries used
(i.e., 100) in the monolithic application may be sent for upgradation.
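For clarity, the computation above, equation (14) followed by the threshold comparison, can be sketched as follows (variable names are illustrative):

```python
def impacted_library_percentage(total: int, obsolete: int, unused: int) -> float:
    """Equation (14): share of libraries that are obsolete or unused."""
    return (obsolete + unused) / total * 100

total, obsolete, unused = 100, 30, 10
impacted = impacted_library_percentage(total, obsolete, unused)  # 40.0
threshold = 10  # user-defined predefined threshold limit, in percent

if impacted > threshold:
    print(f"{impacted:.0f}% impacted exceeds {threshold}%: "
          "send libraries for upgradation")
```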
[087] Referring now to FIGs. 13A and 13B, an exemplary representation of an
analysis performed corresponding to the technology impact dimension of the monolithic
application is illustrated, in accordance with an exemplary embodiment. In an
embodiment, initially a monolithic application may be built using a set of technologies that
may be required to build the monolithic application. The set of technologies may include
a framework, or a toolset required to implement a set of services. As represented in
FIG. 13A, suppose a monolithic application is implemented with a set of hundred
services. In order to implement the set of hundred services, a set of four frameworks may
be utilized. In FIG. 13A, S1 to S100 represent each of the set of hundred services of the
monolithic application 1302A. Further, the old set of frameworks/toolsets 1304A represents
each of the set of four frameworks used to implement the monolithic application 1302A.
The set of four frameworks may include framework 1, framework 2,
framework 3, and framework 4. Over a period of time, when complexity increases or
a move to the cloud is planned, some frameworks from the set of four frameworks may not be best
suited.
[088] Further, when the user (i.e., a customer) of the monolithic application
1302A wants to implement a new service in the monolithic application 1302A, then at
least one of the set of four frameworks/toolsets may be changed. Based on the analysis of the monolithic
application 1302A corresponding to the technology impact dimension, the at least one
toolset may be changed. As represented by FIG. 13B, the new service that may be
implemented in the monolithic application 1302A may require a set of two new
frameworks. In an embodiment, in order to implement the new service, framework 2 and
framework 4 may be replaced with the set of two new frameworks. The replacement set of two
frameworks may include framework A and framework B, as depicted via FIG. 13B. Based
on the replaced set of two frameworks, the new service may be implemented in the
monolithic application 1302A.
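A minimal sketch of the replacement illustrated in FIGs. 13A and 13B follows; the assignment of services S1 to S100 across the four frameworks is an assumption made only so the example runs:

```python
# Hypothetical service-to-framework mapping for monolith 1302A (FIG. 13A).
frameworks = ["framework 1", "framework 2", "framework 3", "framework 4"]
services = {f"S{i}": frameworks[(i - 1) % 4] for i in range(1, 101)}

def apply_replacements(services: dict, replacements: dict) -> dict:
    """Swap frameworks per the technology impact dimension analysis."""
    return {svc: replacements.get(fw, fw) for svc, fw in services.items()}

# Per FIG. 13B: framework 2 -> framework A, framework 4 -> framework B.
services = apply_replacements(
    services, {"framework 2": "framework A", "framework 4": "framework B"})
services["S101"] = "framework A"  # the new service built on a new framework
print(services["S2"], services["S101"])  # framework A framework A
```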
[089] Various embodiments provide method and system for detecting issues
triggered during processing of a monolithic application. The disclosed method and
system may generate a set of six evaluation dimensions corresponding to a monolithic
application. The set of six dimensions may include a flow unit level dimension, a method
level dimension, a functional flow dimension, a data flow dimension, a technology impact
dimension, and an environment and external impact dimension. Further, the system and
method may perform evaluation of the monolithic application corresponding to each of
the set of six evaluation dimensions. Thereafter, the system and method may detect a plurality
of issues triggered in the monolithic application, based on the evaluation of the monolithic application
corresponding to each of the set of six evaluation dimensions.
[090] The system and method provide several advantages. For example, the system and the
method may help in analyzing the monolithic application during its processing lifecycle.
Based on the analysis of the monolithic application, an error that occurred in the monolithic
application may be rectified. In addition, the system and the method may implement any
new service in the monolithic application based on a requirement of a user (i.e., the
customer).
[091] It will be appreciated that, for clarity purposes, the above description has
described embodiments of the invention with reference to different functional units and
processors. However, it will be apparent that any suitable distribution of functionality
between different functional units, processors or domains may be used without detracting
from the invention. For example, functionality illustrated to be performed by separate
processors or controllers may be performed by the same processor or controller. Hence,
references to specific functional units are only to be seen as references to suitable means
for providing the described functionality, rather than indicative of a strict logical or
physical structure or organization.
[092] Although the present invention has been described in connection with some
embodiments, it is not intended to be limited to the specific form set forth herein. Rather,
the scope of the present invention is limited only by the claims. Additionally, although a
feature may appear to be described in connection with particular embodiments, one
skilled in the art would recognize that various features of the described embodiments
may be combined in accordance with the invention.
[093] Furthermore, although individually listed, a plurality of means, elements or
process steps may be implemented by, for example, a single unit or processor.
Additionally, although individual features may be included in different claims, these may
possibly be advantageously combined, and the inclusion in different claims does not
imply that a combination of features is not feasible and/or advantageous. Also, the
inclusion of a feature in one category of claims does not imply a limitation to this category,
but rather the feature may be equally applicable to other claim categories, as appropriate.
CLAIMS
WHAT IS CLAIMED IS:
1. A method (300) for detecting issues triggered during processing of a monolithic
application, the method comprising:
generating (302), by an assessment device (102), a set of six evaluation
dimensions corresponding to a monolithic application, wherein the set of six evaluation
dimensions comprises a flow unit level dimension, a method level dimension, a functional
flow dimension, a data flow dimension, a technology impact dimension, and an
environment and external impact dimension;
performing (304), by the assessment device (102), evaluation of the monolithic
application corresponding to each of the set of six evaluation dimensions; and
detecting (308), by the assessment device (102), a plurality of issues triggered in
the monolithic application, based on evaluation of the monolithic application
corresponding to each of the set of six evaluation dimensions.
2. The method of claim 1, wherein performing evaluation of the monolithic application
further comprises analysing (306) the monolithic application corresponding to each of the
set of six evaluation dimensions, and wherein detecting the plurality of issues triggered
in the monolithic application comprises analysing (310) combinations of each of the set
of six evaluation dimensions associated with the monolithic application against a specific
weightage associated with the monolithic application.
3. The method of claim 2, wherein:
analysing (402) the monolithic application corresponding to the flow unit level dimension
comprises:
evaluating (404) each of a set of flow units in the monolithic application based on
a plurality of factors, against a predefined threshold, wherein the plurality of factors
comprises a performance score, an occurrence count, a cost of failure, and a failure rate;
and
analysing (406) impact of each of the set of flow units in each of a plurality of
functional flows of the monolithic application;
analysing (602) the monolithic application corresponding to the method level dimension
comprises:
identifying (604) contribution of each flow unit for each of a plurality of methods in
the monolithic application, wherein contribution of each flow unit for each of the plurality
of methods is identified based on an occurrence count and a performance score
associated with each of the set of flow units;
calculating (606) an effective performance score corresponding to each of the
plurality of methods based on the identified contribution of each flow unit; and
evaluating (608) each of the plurality of methods against a predefined acceptance
threshold, wherein each of the plurality of methods is evaluated based on the calculated
effective performance score; and
analysing (802) the monolithic application corresponding to the functional flow dimension
comprises:
performing (804) a functional test corresponding to each of a set of persistent
data and each of a set of non-persistent data of the monolithic application, and
wherein performing the functional test further comprises:
providing (806) an input for each function associated with each of the set
of persistent data and the set of non-persistent data for verifying an output
associated with functional requirement of each function;
calculating (808) an effective performance score for each function
associated with each of the set of persistent data and the set of non-persistent
data, wherein the effective performance score for each function is calculated
based on the occurrence count and the performance score associated with each
of the set of flow units; and
evaluating (810) each function against the predefined acceptance
threshold, wherein each function is evaluated based on the calculated effective
performance score.
4. The method of claim 2, wherein analysing (1002) the monolithic application
corresponding to the data flow dimension comprises analysing (1004) data inflows and
data outflows associated with the monolithic application, and wherein the data inflows
and the data outflows are analysed to identify impact of at least one of a new data being
added and an existing data being removed in association with the monolithic application.
5. The method of claim 2, wherein analysing (1102) the monolithic application
corresponding to the technology impact dimension comprises determining (1104) impact
of a technology being selected for developing or modifying the monolithic application,
and wherein analysing (1202) the monolithic application corresponding to the
environment and external impact dimension comprises identifying (1204) possible
contribution of environment and external elements influencing performance of the
monolithic application.
6. A system (100) for detecting issues triggered during processing of a monolithic
application, the system (100) comprising:
a processor (106); and
a memory (104) communicatively coupled to the processor (106), wherein the
memory (104) stores processor executable instructions, which, on execution, cause the
processor (106) to:
generate (302) a set of six evaluation dimensions corresponding to a monolithic
application, wherein the set of six evaluation dimensions comprises a flow unit level
dimension, a method level dimension, a functional flow dimension, a data flow dimension,
a technology impact dimension, and an environment and external impact dimension;
perform (304) evaluation of the monolithic application corresponding to each of
the set of six evaluation dimensions; and
detect (308) a plurality of issues triggered in the monolithic application, based on
evaluation of the monolithic application corresponding to each of the set of six evaluation
dimensions.
7. The system of claim 6, wherein the processor executable instructions cause the
processor to perform evaluation of the monolithic application by analysing (306) the
monolithic application corresponding to each of the set of six evaluation dimensions, and
wherein the processor executable instructions cause the processor to detect the plurality
of issues triggered in the monolithic application by analysing (310) combinations of each
of the set of six evaluation dimensions associated with the monolithic application against
a specific weightage associated with the monolithic application.
8. The system of claim 7, wherein the processor executable instructions cause the
processor to:
analyse (402) the monolithic application corresponding to the flow unit level dimension
by:
evaluating (404) each of a set of flow units in the monolithic application based on
a plurality of factors, against a predefined threshold, wherein the plurality of factors
comprises a performance score, an occurrence count, a cost of failure, and a failure rate;
and
analysing (406) impact of each of the set of flow units in each of a plurality of
functional flows of the monolithic application;
analyse (602) the monolithic application corresponding to the method level dimension by:
identifying (604) contribution of each flow unit for each of a plurality of methods in
the monolithic application, wherein contribution of each flow unit for each of the plurality
of methods is identified based on an occurrence count and a performance score
associated with each of the set of flow units;
calculating (606) an effective performance score corresponding to each of the
plurality of methods based on the identified contribution of each flow unit; and
evaluating (608) each of the plurality of methods against a predefined acceptance
threshold, wherein each of the plurality of methods is evaluated based on the calculated
effective performance score; and
analyse (802) the monolithic application corresponding to the functional flow dimension
by:
performing (804) a functional test corresponding to each of a set of persistent
data and each of a set of non-persistent data of the monolithic application, and
wherein performing the functional test further comprises:
providing (806) an input for each function associated with each of the set
of persistent data and the set of non-persistent data for verifying an output
associated with functional requirement of each function;
calculating (808) an effective performance score for each function
associated with each of the set of persistent data and the set of non-persistent
data, wherein the effective performance score for each function is calculated
based on the occurrence count and the performance score associated with each
of the set of flow units; and
evaluating (810) each function against the predefined acceptance
threshold, wherein each function is evaluated based on the calculated effective
performance score.
9. The system of claim 7, wherein the processor executable instructions cause the
processor to analyse (1002) the monolithic application corresponding to the data flow
dimension by analysing (1004) data inflows and data outflows associated with the
monolithic application, and wherein the data inflows and the data outflows are analysed
to identify impact of at least one of a new data being added and an existing data being
removed in association with the monolithic application.
10. The system of claim 7, wherein the processor executable instructions cause the
processor to analyse (1102) the monolithic application corresponding to the technology
impact dimension by determining (1104) impact of a technology being selected for
developing or modifying the monolithic application, and wherein the processor
executable instructions cause the processor to analyse (1202) the monolithic application
corresponding to the environment and external impact dimension by identifying (1204)
possible contribution of environment and external elements influencing performance of
the monolithic application.