
Assessing Process Deployment

Abstract: System and methods for assessing process deployment are described. In one implementation, the method includes collecting data related to different processes and computation of one or more metrics. The metrics are analyzed, and an index, indicating the extent of deployment of the one or more processes, is obtained based on the analysis. In another implementation, the result of the analysis is displayed.


Patent Information

Application #
2814/MUM/2010
Filing Date
11 October 2010
Publication Number
46/2012
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2019-08-29
Renewal Date

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI MAHARASHTRA-400021, INDIA

Inventors

1. CHANDRA, ARUNAVA
TATA CONSULTANCY SERVICES, PLOT C, BLOCK EP, SECTOR V, SALT LAKE ELECTRONICS COMPLEX, SALT LAKE CITY, KOLKATA - 700 091, WEST BENGAL, INDIA
2. SUBRAMANI, BALAKRISHNAN
TATA CONSULTANCY SERVICES, PLOT C, BLOCK EP, SECTOR V, SALT LAKE ELECTRONICS COMPLEX, SALT LAKE CITY, KOLKATA - 700 091, WEST BENGAL, INDIA.
3. TYAGI, KAMNA
TATA CONSULTANCY SERVICES, TCS AWADH PARK, VIBHUTI KHAND, GOMTI NAGAR, LUCKNOW - 226 010, UTTAR PRADESH, INDIA.
4. PRADHAN, PRADIP
TATA CONSULTANCY SERVICES, PLOT C, BLOCK EP, SECTOR V, SALT LAKE ELECTRONICS COMPLEX, SALT LAKE CITY, KOLKATA - 700 091, WEST BENGAL, INDIA
5. MODI, NINA
TATA CONSULTANCY SERVICES, MAKER TOWERS, E BLOCK 11TH FLOOR, CUFFE PARADE, COLABA, MUMBAI - 400 005, MAHARASHTRA, INDIA.
6. MOHILE, JYOTI
TATA CONSULTANCY SERVICES, MAKER TOWERS, E BLOCK 11TH FLOOR, CUFFE PARADE, COLABA, MUMBAI - 400 005, MAHARASHTRA, INDIA.
7. CHAWLA, ALKA
TATA CONSULTANCY SERVICES, LTD. 5TH FLOOR, PTI BUILDING, 4, PARLIAMENT STREET, NEW DELHI - 110 001, INDIA.
8. REKHI, SANDEEP
TATA CONSULTANCY SERVICES, LTD. 5TH FLOOR, PTI BUILDING, 4, PARLIAMENT STREET, NEW DELHI - 110 001, INDIA.
9. KAKKAR, SANDHYA
TATA CONSULTANCY SERVICES, LTD. 5TH FLOOR, PTI BUILDING, 4, PARLIAMENT STREET, NEW DELHI - 110 001, INDIA.
10. PADMANABHAN, VASU
TATA CONSULTANCY SERVICES, PLOT NO. G1 SIPCOT INFORMATION TECHNOLOGY PARK, NAVALUR POST, CHENNAI - 603 103, TAMIL NADU, INDIA

Specification

FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention:
ASSESSING PROCESS DEPLOYMENT
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra-400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The present subject matter relates, in general, to systems and methods for
assessing deployment of a process in an organization.

BACKGROUND
[0002] An organization typically has multiple operating units, each having a specific set
of responsibilities, and a business objective. The operating units deploy different processes to meet their specific business objectives. A process is generally a series of steps or acts followed to perform a task. Some processes may be common to some or all operating units, while some processes may be unique to a particular operating unit depending on the functioning of the unit. Processes may also be provided for different functional areas like Sales & Customer Relationship, Delivery, Leadership & Governance, Information Security, Knowledge Management, and so on. In an organization, the use of a standard set of processes helps in streamlining activities and ensures a consistent way of performing different functions, thereby reducing risk and generating predictable outcomes. Furthermore, such processes may also facilitate performing functions of different roles across the organization to generate one or more predictable outcomes.
[0003] In order to assess the rigor of deployment and compliance of the processes,
organizations may conduct regular audits of the organizational entities and detect the
deviations. This can be accomplished by various systems that implement process audit
mechanisms for checking compliance with one or more organizational policies.
[0004] The deployment of a process in an organization generally refers to the extent to
which the process is implemented and adhered to during the normal course of working of the organization. Deployment of processes in an organization is typically impacted by different factors, such as the structure of the organization, different types of operating units, project life-cycles, and project locations. There are various tracking or review mechanisms available to assess the extent and rigor of deployment of processes. Though these mechanisms are able to identify areas of strength and weakness, they are not very effective in clearly indicating the extent of deployment of one or more processes within the organization.

SUMMARY
[0005] This summary is provided to introduce concepts related to assessment of
deployment of processes in an organization, which are further described below in the detailed
description. This summary is not intended to identify essential features of the claimed subject
matter nor is it intended for use in determining or limiting the scope of the claimed subject
matter.
[0006] In one implementation, the method includes collecting data related to different
processes and computation of one or more metrics. The metrics are analyzed, and an index,
indicating the extent of deployment of one or more processes, is obtained based on the
analysis. In another implementation, the result of the analysis is displayed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The detailed description is provided with reference to the accompanying figures.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the
reference number first appears. The same numbers are used throughout the drawings to
reference like features and components.
[0008] Fig. 1 illustrates an exemplary computing environment implementing a process
evaluation system for assessment of process deployment, in accordance with an
implementation of the present subject matter.
[0009] Fig. 2 illustrates exemplary components of a process evaluation system, in
accordance with an implementation of the present subject matter.
[00010] Fig. 3 illustrates an exemplary method to assess the deployment of processes in
an organization, in accordance with an implementation of the present subject matter.
[00011] Fig. 4 illustrates an implementation of a process deployment index (PDI)
Dashboard, in accordance with an implementation of the present subject matter.
[00012] Fig. 5 illustrates another implementation of the PDI Dashboard, in accordance
with an implementation of the present subject matter.
[00013] Fig. 6 illustrates an exemplary method to evaluate a process readiness index in
an organization, in accordance with an implementation of the present subject matter.

DETAILED DESCRIPTION
[00014] A process is typically a series of steps that are planned to be performed so as to
achieve one or more identified business objectives. An organization generally deploys
multiple processes to achieve the business objectives in a consistent and efficient manner. The
efficiency and profitability of the organization, in most cases, depend on the maturity and
deployment of the processes. Process deployment takes into consideration various aspects
including readiness, coverage, rigor and effectiveness of a process. For example, readiness of
a process deployment can be indicated by an assessment of whether the process is ready to be
deployed, and is dependent on multiple factors. Coverage of process deployment refers to an extent to which the process is rolled out in the organization. This can include, for example, the number of people using the process and the number of people aware of the process. Rigor
of a process deployment refers to an extent to which the process is institutionalized and has
become a part of routine activities. Effectiveness of deployment of a process refers to an
extent to which the process is being followed so that it meets the intended business objective.
[00015] In conventional systems, to assess process deployment, different parameters or
metrics are evaluated for different processes. Since the metrics are composed of different
variables of a process, the scale of assessment or the unit of measurement of these metrics
also varies for different metrics. As a result, the process deployment status for each process
would be assessed and reported differently, and a meaningful comparison of deployment
across various processes becomes difficult. Further, the assessment carried out for the
different processes is typically specific to a process area; it is therefore not fully reliable and is unable to provide an overall status of deployment across different process areas.
[00016] To this end, systems and methods for assessing process deployment are
described. In one implementation, for the harmonized assessment and representation of the deployment of different processes in an organization, a process deployment index (PDI) may be used. Such representations facilitate identification of areas, where improvements may be required. Once such areas are identified, necessary corrective or preventive actions can be taken. The PDI can be computed for a metric, for a process area, or an operating unit or the entire organization from the different metrics corresponding to the different processes. These metrics have different units of representation. For example, different measures for a particular

process area can be percentage of projects completed, number of trained employees, etc. Also,
measures for processes of a particular process area may or may not be applicable to all
operating units. In one implementation, a matrix may be prepared listing different measures
for the different processes and applicability of these measures to different operating units.
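As a rough illustration of such an applicability matrix, the sketch below (Python) models it as a mapping from (process area, measure) pairs to the operating units that report them. The specification does not prescribe a data structure; the measure names are hypothetical, and the operating-unit labels borrow from the examples given later in paragraph [00049].

```python
# Hypothetical applicability matrix: which operating units report which measure.
# Keys are (process area, measure); values are the units the measure applies to.
applicability = {
    ("Delivery", "% of projects completed"): {"BFS", "Insurance", "Telecom"},
    ("Knowledge Management", "number of trained employees"): {"BFS", "Telecom"},
    ("Audit and Compliance", "% of auditors compared to auditable entities"):
        {"BFS", "Insurance", "Manufacturing", "Telecom"},
}

def measures_for(unit):
    """List the (process area, measure) pairs applicable to one operating unit."""
    return sorted(key for key, units in applicability.items() if unit in units)

print(measures_for("Telecom"))  # all three example measures apply to Telecom here
```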
[00017] In an embodiment, an operating unit may be a logical or a functional group responsible for providing services to customers of different domains, for example, an industry domain, a major market segment, a strategic market segment, a distinct service sector, or a technology solution domain. The industry domain includes, for example, banking, finance, manufacturing, and retail. The major market segment may include different countries such as the USA, the UK, and Europe, while the strategic market segment includes new growth markets and emerging markets. The distinct service sector may include BPO, Consulting, and Platform BPO, and the technology solution domain includes SAP, BI, Oracle Applications, etc. Once the
metrics are defined for different processes, the metrics are collected from the different
operating units. As discussed earlier, the metrics may have different units of measure, e.g.,
percentage, absolute value, etc. Once collected, the values of different metrics can be
normalized to a common scale without affecting the significance of the original values of the
metrics. The metrics are then analyzed to calculate the PDI, which can be analyzed to indicate
the extent to which the processes have been deployed in the organization.
[00018] It would be noted that the PDI indicates an overall status of the deployment of
the processes across the organization. As discussed, the PDI can be computed for the entire
organization, for different operating units, different process areas, and metrics for specific
time periods. In one implementation, the PDI can be displayed through a common dashboard
in the form of values, color codes indicating the state, graphs, trends, etc. Thus, process
deployment across various operating units can be effectively collated and compared in a
harmonized manner, thereby making the assessment reliable, informative and efficient.
[00019] In another implementation, before an operating unit can be included for
reporting the metrics and for determination of PDI, a readiness index can be calculated, which indicates the level of readiness of the newly included operating unit. In one implementation, this would include determining conformance of the newly included operating units with one or more basic readiness parameters.

[00020] While aspects of described systems and methods for assessing the status of processes can be implemented in any number of different computing systems, environments, and/or configurations, the implementations are described in the context of the following exemplary system(s).

EXEMPLARY SYSTEMS
[00021] Fig. 1 shows an exemplary computing environment 100 for implementing a
process evaluation system to assess process deployment in an organization. To this end, the computing environment 100 includes a process evaluation system 102 communicating, through a network 104, with client devices 106-1, ..., 106-N (collectively referred to as client devices 106). The client devices 106 include one or more entities, which can be individuals or a group of individuals working in different operating units within the organization to meet their business objectives.
[00022] The network 104 may be a wireless or a wired network, or a combination
thereof. The network 104 can be a collection of individual networks, interconnected with each other and functioning as a single large network (e.g., the internet or an intranet). Examples of such individual networks include, but are not limited to, Local Area Networks (LANs), Wide Area Networks (WANs), and Metropolitan Area Networks (MANs).
[00023] It would be appreciated that the client devices 106 may be implemented as any
of a variety of conventional computing devices, including, for example, a server, a desktop PC, a notebook or portable computer, a workstation, a mainframe computer, a mobile computing device, an entertainment device, an internet appliance, etc. For example, in one implementation, the computing environment 100 can be an organization's computing network in which different operating units use one or more client devices 106.
[00024] For analysis of different processes implemented by the different operating
units, the process evaluation system 102 collects various data or metrics from the client devices 106. In one implementation, analysis of different processes means checking the deployment status of different processes in the organization. In one implementation, each of the client devices 106 may be provided with a collection agent 108-1, 108-2, ..., 108-N, respectively. The collection agents 108-1, 108-2, ..., 108-N (collectively referred to as collection

agents 108) collect the data or metrics related to different processes deployed through the computing environment 100.
[00025] The collection agents 108 can be configured to collect the metrics related to
different processes automatically. In one implementation, one or more users can upload the metrics manually. In one implementation, a user may directly enter data related to the different processes through a user interface of the client devices 106, and the data may then be processed to obtain the metrics. The processing of the data may be performed at any of the client devices 106 or at the process evaluation system 102. In such a case, one or more of the client devices 106 may not include the collection agent 108.
[00026] In yet another implementation, the metrics related to the different processes
may be collected through a combination of automatic collection, i.e., implemented in part by one or more collection agents 108, and entry by a user.
[00027] Once collected, the metrics can be verified for completeness and correctness.
For example, metric values reported incorrectly by accident can be identified and corrected. In
one implementation, the metrics are verified by the process evaluation system 102. The
verification of the metrics collected from the client devices 106 can either be based on rules
that are defined at the process evaluation system 102 or can be performed manually.
[00028] Once the metrics are verified, the process evaluation system 102 analyzes the
metrics to compute a process deployment index, also referred to as PDI, as described
hereinafter. To this end, the process evaluation system 102 includes an analysis module 110,
which analyzes the metrics of different process areas. In one implementation, the analysis
module 110 analyzes the metrics based on one or more specific rules. In another
implementation, the analysis module 110 analyzes the metrics based on historical data. The
PDI can then be calculated for the assessment of the deployed processes. In another
implementation, various rules can be applied to the PDI for further analysis. For example, the
analysis of the PDI can be performed using a business intelligence tool.
[00029] Once calculated, the PDI of different metrics, process areas, operating units,
and the entire organization, and the associated analysis, can be displayed on a display device (not shown) associated with the process evaluation system 102. In one embodiment, the analysis can be displayed through a dashboard, referred to as the PDI Dashboard. The PDI Dashboard and

the analytics can be collectively displayed on the display device as a visual dashboard using visual indicators, such as bar graphs, pie charts, color indications, etc. Displaying the PDI associated with the different processes being implemented in an organization, along with the analysis, objectively portrays the overall status of deployment of one or more processes in a consolidated and standardized manner. The manner in which the PDI is calculated is further explained in detail in conjunction with Fig. 2.
[00030] The present description has been provided based on components of the
exemplary computing environment 100 illustrated in Fig. 1. However, the components can be present on a single computing device, wherein the computing device can be used for assessing the processes deployed in the organization, and such an arrangement would still be within the scope of the present subject matter.
[00031] Fig. 2 illustrates a process evaluation system 102, in accordance with an
implementation of the present subject matter. The process evaluation system 102 includes processor(s) 202, interface(s) 204, and a memory 206. The processor(s) 202 are coupled to the memory 206. The processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 202 are configured to fetch and execute computer-readable instructions stored in the memory 206.
[00032] The interface(s) 204 may include a variety of software and hardware
interfaces, for example, a web interface allowing the process evaluation system 102 to interact with a user. Further, the interface(s) 204 may enable the process evaluation system 102 to communicate with other computing devices, such as the client devices 106, web servers and external repositories. The interface(s) 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example LAN, cable, etc., and wireless networks such as WLAN, cellular, or satellite. The interface(s) 204 may include one or more ports for connecting a number of computing devices to each other or to another server.
[00033] The memory 206 can include any computer-readable medium known in the art
including, for example, volatile memory (e.g., RAM), and/or non-volatile memory (e.g.,

EPROM, flash memory, etc.). In one implementation, the memory 206 includes module(s) 208 and data 210. The module(s) 208 further include a conversion module 212, the analysis module 110, and other module(s) 216. Additionally, the memory 206 further includes data 210 that serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the module(s) 208. The data 210 includes, for example, metrics 218, historical data 220, analyzed data 222, and other data 224. In one implementation, the metrics 218, the historical data 220, and the analyzed data 222 may be stored in the memory 206 in the form of data structures. In one implementation, the metrics received or generated by the process evaluation system 102 are stored as the metrics 218.
[00034] The process evaluation system 102 assesses the status of deployment of
processes in an organization or an enterprise by analyzing the metrics 218. The different processes implemented in the organization may relate to various process areas, examples of which include but are not limited to, Sales and Customer Relationship, Leadership and Governance, Delivery, Information Security, Knowledge Management, Process Improvement, Audit and Compliance, etc. The metrics 218 associated with the different processes may therefore have a variety of units of assessment or scales. For example, in one case, the metric 218 may be in the form of an absolute numerical value. In another case, the metric 218 may be in the form of a percentage. Once collected, the metrics 218 can be verified for completeness and correctness by the analysis module 110. For example, metric values reported incorrectly by accident can be identified and corrected. The metrics 218 can be verified by the analysis module 110 based on one or more rules, such as rules defined by a system administrator. The analysis module 110, in such a case, can verify the completeness and consistency of the metrics 218 reported by the client devices 106. Consider an example where one of the metrics 218 was incorrectly reported as 5% as opposed to 55% that was intended to be reported through the client device 106. In such a case, the analysis module 110 can measure the deviation of the reported metrics 218 from the trend of previously reported metrics, stored in the historical data 220. If the deviation exceeds a predefined threshold, the analysis module 110 can identify the 5% reported as a probable incorrect data. In one implementation, the analysis module 110 can be configured to prompt the user to either confirm the value of the metric reported or can request the metrics 218 to be provided again.

It would be appreciated that other forms of verification can further be implemented which
would still be within the scope of the present subject matter. In another implementation, the
verification of the metrics collected from the client devices 106 can be performed manually.
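A minimal sketch of this trend-based check follows. The specification does not state the deviation rule or threshold, so a z-score test against the previously reported values is assumed here; all function and variable names are illustrative.

```python
from statistics import mean, stdev

def flag_probable_errors(reported, history, z_threshold=3.0):
    """Flag reported metric values that deviate sharply from their history.

    Sketch of the verification step in [00034]. `reported` maps metric
    name -> newly reported value; `history` maps metric name -> list of
    previously reported values (the historical data 220).
    """
    flagged = {}
    for name, value in reported.items():
        past = history.get(name, [])
        if len(past) < 2:
            continue  # not enough history to judge a trend
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            deviates = value != mu
        else:
            deviates = abs(value - mu) / sigma > z_threshold
        if deviates:
            flagged[name] = value  # candidate for user confirmation or re-entry
    return flagged

# The 5% vs. 55% example from the text: 5 deviates strongly from the trend.
history = {"% compliant projects": [52, 54, 55, 53, 56]}
print(flag_probable_errors({"% compliant projects": 5}, history))
```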
[00035] In order to analyze the different processes, the conversion module 212
normalizes the metrics 218 for different processes. In one implementation, the conversion
module 212 normalizes the metrics 218 based on a common scale, such as a scale of 1-10
where values from 1 to 4 represent a RED performance band, 4 to 8 represent an AMBER band, and 8 to 10 a GREEN band. In one implementation, the metrics 218 may be converted to the
common scale by dividing an original scale of the metrics into multiple ranges and mapping
these ranges to corresponding ranges of the common scale so that performance bands of both
the scales map with each other. For example, a metric that is originally in the percentage scale
can be converted to a common scale by mapping an original value between 80% -100% to
values in the range of 8-10 of the common scale. Similarly, original values between 40% -
80% can be associated to values in the range of 4-8 and original values less than 40% can be
mapped to values less than 4. In another example, where a metric value is a number ranging between 0 and 5, values between 0 and 2 can be mapped to 1-4 of the common scale, values greater than 2 and up to 4 can be mapped to 5-8 of the common scale, and values greater than 4 can be mapped to the common scale's values 9-10. Similarly, other scales of the metrics 218 can also be converted to a common unit of measurement. In one implementation, the normalized metric values are stored in the metrics 218.
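The range mapping described above might be sketched as follows. Linear interpolation within each band is an assumption; the text only requires that the performance bands of the original and common scales line up. The band boundaries below reproduce the two examples from this paragraph.

```python
def normalize(value, bands):
    """Map a raw metric value onto the common 1-10 scale.

    `bands` lists (raw_low, raw_high, common_low, common_high) tuples whose
    performance bands line up with the RED/AMBER/GREEN bands of the common
    scale. Linear interpolation inside each band is assumed.
    """
    for raw_lo, raw_hi, c_lo, c_hi in bands:
        if raw_lo <= value <= raw_hi:
            frac = (value - raw_lo) / (raw_hi - raw_lo)
            return c_lo + frac * (c_hi - c_lo)
    raise ValueError("value outside the defined ranges")

# Percentage metric: <40% -> below 4, 40-80% -> 4-8, 80-100% -> 8-10.
pct_bands = [(0, 40, 1, 4), (40, 80, 4, 8), (80, 100, 8, 10)]
# 0-5 numeric metric: 0-2 -> 1-4, 2-4 -> 5-8, 4-5 -> 9-10, as in the second example.
num_bands = [(0, 2, 1, 4), (2, 4, 5, 8), (4, 5, 9, 10)]

print(round(normalize(90, pct_bands), 2))  # 9.0, lands in the GREEN band
print(round(normalize(3, num_bands), 2))   # 6.5, lands in the AMBER band
```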
[00036] Once the scales of the metrics 218 have been obtained, the different ranges
within the common scale of 1-10 can be associated with different visual indicators to display
the status of deployment of a certain process, say within an operating unit or for a process
area or for the entire organization. For example, the values 8-10 may be represented by a
GREEN colored indicator indicating an above average or desirable extent of deployment for a
process under consideration, values between 4-8 may be represented by an AMBER colored
indicator indicating an average extent of deployment, and values below 4 may be represented by a RED colored indicator indicating a below average deployment of the process.
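A sketch of this indicator mapping follows, assuming the boundary values 4 and 8 fall in the upper band (the text does not say which band a boundary value belongs to):

```python
def indicator(value):
    """Map a common-scale (1-10) value to its color indicator per [00036]."""
    if value >= 8:
        return "GREEN"  # above average / desirable extent of deployment
    if value >= 4:
        return "AMBER"  # average extent of deployment
    return "RED"        # below average deployment

assert indicator(9.0) == "GREEN" and indicator(6.5) == "AMBER" and indicator(3.0) == "RED"
```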
[00037] Once the metrics 218 are converted by the conversion module 212, the analysis
module 110 receives the converted metrics from the conversion module 212. The analysis module

110 analyzes the converted metrics to calculate the process deployment index (PDI) for a process, an operating unit, a process area, or for the organization. As described previously, the PDI indicates the extent of the deployment of one or more processes in an organization. In one implementation, the PDI is calculated using the following formula:

\[ \text{PDI} = \frac{\sum_{i=1}^{n} X_i}{n} \]

where X_i is the value of the metric 'i' and n is the number of metrics considered.
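Under the averaging formula reconstructed above, the PDI computation reduces to a mean over the normalized metric values. The sketch below assumes equal weights; any weighting or rescaling in the original formula (for example, to the 0.00-1.00 dashboard scale shown in Fig. 5) is not recoverable from the text.

```python
def process_deployment_index(normalized_metrics):
    """Compute the PDI from normalized metric values X_i (common 1-10 scale).

    Assumes the unweighted mean reconstructed above.
    """
    if not normalized_metrics:
        raise ValueError("at least one metric value is required")
    return sum(normalized_metrics) / len(normalized_metrics)

# PDI for a process area from three metrics already on the common scale.
print(round(process_deployment_index([9.0, 6.5, 7.2]), 2))  # 7.57
```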
[00038] The PDI can be calculated for a particular process, a particular operating unit, a
particular process area, or for the organization for a particular time period. In one implementation, the analysis module 110 displays the PDI through a dashboard in a variety of visual formats. For example, in one implementation, the PDI is represented as a value on the scale of 1-10. In another implementation, the PDI may be displayed in the form of colored ranges having a GREEN, AMBER, or RED color. In one implementation, the analysis module 110 may further analyze the obtained PDI. For example, the analysis module 110 may represent the PDI in terms of statistical analysis of data, such as variations and mean trends. The representation of the PDI in such a manner can be based on one or more analysis rules. The PDI value provides information on the extent to which a process is deployed in the organization and can also be used to assess the areas of improvement.
[00039] In another implementation, the analysis module 110 can further analyze the
PDI obtained based on the historical data 220. In such a case, the analysis module 110 can be further configured to provide a comparative analysis between PDI values calculated over a period of time. It would be appreciated that such an analysis can provide further insights into the trend of the extent of deployment of one or more processes and their improvement over a period of time.
[00040] In another implementation, the metrics 218 associated with various processes
being implemented in the organization can be reported by a group of individuals or practitioners within an operating unit that is implementing one or more processes under consideration. In another implementation, the metrics 218 can be reported to a group of individuals responsible for the process deployment and for providing support to the operating units towards effective process deployment. In one implementation, the PDI is displayed to

relevant stakeholders at the organizational level for assessing the extent of deployment of processes across different operating units and to identify generic as well as specific opportunities of improvement.
[00041] In another implementation, before an operating unit can be included for
reporting the metrics and for determination of PDI, a readiness index can be evaluated which indicates the level of maturity of a newly included operating unit. In one implementation, this would include determining conformance of the newly included operating units with one or more basic compliance parameters related to readiness check. For example, a readiness index, or a process readiness index (hereinafter referred to as PRI) can also be evaluated by the analysis module 110.
[00042] To this end, the analysis module 110 can calculate the PRI based on the
metrics 218. In one implementation, the PRI can be calculated based on the following equation:

\[ \text{PRI} = \frac{\sum_{i=1}^{m} X_i}{m} \]

where X_i is the value of the readiness metric 'i' and m is the number of readiness metrics considered.
[00043] Once the PRI is determined, the analysis module 110 can compare the
calculated PRI with one or more threshold parameters. In one implementation, a threshold parameter may have GREEN, AMBER, and RED ranges indicating good, fair, and poor status
respectively. If the analysis module 110 determines that the PRI is within the limits defined
by the threshold parameters and the unit stabilizes on that PRI for some period of time, it may
subsequently consider evaluating PDI for the newly added operating unit.
[00044] Fig. 3 illustrates an exemplary method 300 for calculating the process
deployment index of an organization. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternative method. Additionally, individual blocks may be added to or removed from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the methods can be implemented in any suitable hardware, software, firmware, or combination thereof.

[00045] At block 302, process indicators or metrics associated with one or more
processes are collected. For example, the process evaluation system 102 collects the metrics from collection agents 108 within one or more client devices 106. The collection agents 108 can either report the metrics related to different processes in a predefined automated manner, or can be configured to allow one or more users to upload the metrics manually, say through user-interfaces, templates, etc.
[00046] At block 304, the reported metrics are verified. For example, the analysis
module 110 can verify the metrics 218 provided, say by the client devices 106, or as collected by the collection agents 108 based on one or more rules. In one implementation, the analysis module 110 can be configured to prompt the user to either confirm the value of the metrics 218 reported or correct the metric 218 reported, as required. It would be appreciated that other forms of verification can also be contemplated which would still be within the scope of the present subject matter. In another implementation, the verification of the metric collected from the client devices 106 can be performed manually. In another implementation, a value that is not reported is provided a default score.
[00047] At block 306, the metrics are normalized. For example, the metrics 218 can be
normalized to a common scale by the conversion module 212. In one implementation, the
metrics 218 may be converted to the common scale by logically dividing an original scale of
the metrics into multiple ranges and associating the different ranges of the original scale with
a corresponding range of the common scale. Furthermore, different ranges within the scale of
1-10 can be associated with different visual indicators, such as color GREEN, AMBER, and
RED, to display the performance status of deployment of a certain process.
[00048] At block 308, a process deployment index or PDI is calculated based on the normalized metrics. For example, the analysis module 110 calculates the PDI based on the metrics 218 normalized by the conversion module 212. In one implementation, the PDI is calculated using the formula given above, where X_i is the value of the metric 'i'.
[00049] In one implementation, the PDI is calculated by the analysis module 110 on
periodic basis. For example, the analysis module 110 can be configured to provide the PDI on a monthly, weekly, quarterly, or any other time interval. Furthermore, the PDI can be calculated for one, more, or all process metrics or process areas, or operating units, or the entire organization. For example, the analysis module 110 can be configured to calculate the PDI for different process areas like sales and customer relationship, delivery, and leadership and governance, and for different operating units like Banking and Financial Services (BFS), insurance, manufacturing, and telecom. In one implementation, the metrics related to processes considered for the PDI may undergo additions or deletions in view of the business objectives of the organization. Similarly, a process area may be added to or deleted from the purview of the PDI if the situation demands.
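A sketch of this periodic roll-up follows. The record layout (period, group, normalized value) is an assumption, with `group` standing for a process area, an operating unit, or the organization as a whole; the per-group PDI again uses the averaging formula reconstructed earlier.

```python
from collections import defaultdict

def pdi_by_group(records):
    """Aggregate normalized metric values into one PDI per (period, group).

    `records` is assumed to be an iterable of (period, group, normalized_value)
    tuples, as described in the lead-in.
    """
    buckets = defaultdict(list)
    for period, group, value in records:
        buckets[(period, group)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

records = [
    ("2009-01", "Delivery", 8.0), ("2009-01", "Delivery", 6.0),
    ("2009-01", "A&C", 5.5), ("2009-02", "Delivery", 7.0),
]
print(pdi_by_group(records))
# {('2009-01', 'Delivery'): 7.0, ('2009-01', 'A&C'): 5.5, ('2009-02', 'Delivery'): 7.0}
```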
[00050] At block 310, the calculated PDI is further analyzed. For example, the PDI is
displayed using a visual dashboard with statistical formats indicating trends, distributions, and variations depicting the extent of process deployment over a period of time. The representation of the PDI in such a manner can be based on one or more analysis rules. Furthermore, the process evaluation system 102 can be configured to allow a viewer to drill down to the underlying data by clicking on one or more of the visual elements being displayed on the dashboard. In one implementation, the analysis module 110 can further analyze the PDI obtained based on the historical data 220 to provide a comparative analysis between the PDI calculated for more than one operating unit over a period of time, provide one or more alerts associated with the PDI, etc. In one implementation, the system can add additional analytics based on requirements.
[00051] Fig. 4 illustrates an exemplary PDI Dashboard 400, as per one implementation of the present subject matter. As can be seen, the dashboard 400 includes different fields, such as the process area field 402, the measures field 404 associated with the process area field 402, and the frequency field 406. The frequency field 406 depicts the duration or the interval, e.g., monthly, at which the data or metrics 218 are collected and published.
[00052] The dashboard 400 further includes a period field 408, which indicates the period of metric collection. The unit column 410 displays the unit of measurement for the various metrics 218 that have been reported by one or more of the client devices 106. The field

current value 412 indicates the value of the particular metric that has been reported for the
period 408. Furthermore, the PDI field 414 indicates the PDI that has been calculated by the
analysis module 110 for the metric or process area of that corresponding row.
[00053] The dashboard 400 also includes four other fields 416, such as a GREEN target column, which indicates the target values to be achieved by the corresponding metric in
column 404. The status field shows the performance status of the processes under
consideration using one or more visual elements such as RED, AMBER, and GREEN. In
addition, the previous value field and the % change field indicate the last collected value of
the metric 218 and the change in the current value as compared to the previous value,
respectively. For example, for the process area A&C (Audit and Compliance), the frequency of
collection of the last two metrics 218 namely '% of auditors compared to auditable entities'
and 'Number of Overdue NCR's and OFI's per 100 auditable entities' are shown as monthly.
The PDI trend for '% of auditors compared to auditable entities', the second last metric, is
downward and that for 'Number of Overdue NCR's and OFI's per 100 auditable entities' is
upward. The cumulative PDI for the entire process area, i.e., A&C is shown as 0.65.
[00054] Fig. 5 illustrates an exemplary graph displaying PDI for various process areas,
as per an implementation. As illustrated, the graph displays variation in the PDI for processes in one or more process areas for a period of six months. It would be appreciated that the trends can be generated for any time period, based on the preference of a user. As can be seen, different process areas are plotted on the X-axis and their corresponding PDI values are provided along the Y-axis. The values of the PDI are based on a scale of 0.00-1.00. In a similar way, a different scale for indicating the PDI can be used.
[00055] As illustrated, the different processes that are plotted include Sales and Customer Relationship (S&R), Audit and Compliance (AC), Delivery (DEL), Information Security (SEC), Process Improvement (PI), Knowledge Management (KM), and Leadership and Governance (LG). PDI values for the period of six months are plotted starting from Jan-09 to Jun-09. PDI values for Jan-09, Feb-09, May-09, and Jun-09 are plotted in the form of bars, whereas PDI values for the months of Mar-09 and Apr-09 are plotted in the form of solid and dashed lines, respectively. By plotting this graph, a comparison of PDI values of one or more process areas over a period of time can be displayed. In one implementation, instead of

months, PDI values can be plotted on a quarterly or yearly basis. In another implementation,
instead of plotting process areas on the X-axis, similar plots can also be generated for selective
metrics or operating units.
[00056] Fig. 6 illustrates an exemplary method 600 for calculating the process
readiness index (PRI). The order in which the method is described is not intended to be
construed as a limitation, and any number of the described method blocks can be combined in
any order to implement the method, or an alternative method. Additionally, individual blocks
may be added to or deleted from the method without departing from the spirit and scope of the
subject matter described herein. Furthermore, the methods can be implemented in any suitable
hardware, software, or combination thereof.
[00057] As indicated previously, PRI is calculated whenever a new operating unit is
included within an organization. A favorable value of the PRI would indicate that the
operating unit has reached a certain minimum level of readiness to be considered for
computation of PDI for one or more processes deployed by the unit along with other operating
units already reporting PDI.
[00058] At block 602, the metrics are collected from operating units that have been
newly added to an organization. For example, for a newly created operating unit, metrics
218 can be collected using collection agents 108 at each of the client devices 106. In one
implementation, the metrics 218 can be collected periodically, such as on a weekly, monthly,
quarterly basis or any other time interval.
[00059] At block 604, the metrics are analyzed. In one implementation, the analysis
module 110 analyzes the metrics 218
associated with the newly added operating unit based on one or more rules and with respect to
data stored in historical data 220.
[00060] At block 606, the PRI of the newly added operating unit is calculated. After
analyzing the metrics 218 of the newly added operating unit, the analysis module 110 calculates the
PRI associated with one or more newly added operating units, and the processes deployed
within the operating units. The calculated PRI value can lie in the range 1-10.
[00061] At block 608, a determination is made to check whether the calculated PRI is
within threshold limits. For example, the analysis module 110 determines whether the PRI

value of the newly added operating unit lies within a threshold limit. In one implementation,
the threshold limits are defined in other data 224. In another implementation, the analysis
module 110 can further associate the PRI with one or more visual indicators, such as color
codes, etc. For example, a value of the PRI less than 4 can be depicted by color RED
indicating a critical condition. Similarly, values between 4-8 and 9-10 can be depicted by
colors AMBER and GREEN, respectively, to indicate average and acceptable conditions.
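The readiness gate of blocks 606-612, elaborated in the next two paragraphs, might be sketched as below. The acceptable lower limit of 4 follows the RED band boundary given above, and the stabilization window is an assumption, since the text only requires that the unit stabilizes on its PRI "for some period of time".

```python
def admit_operating_unit(pri_series, lower=4.0, stable_periods=3):
    """Decide whether a newly added operating unit is ready for PDI reporting.

    Sketch of blocks 606-612: the unit keeps reporting its PRI until the value
    has stayed within the acceptable limits for `stable_periods` consecutive
    periods. Both parameters are assumptions; `pri_series` is the unit's PRI
    history, most recent value last.
    """
    recent = pri_series[-stable_periods:]
    if len(recent) < stable_periods:
        return "keep-reporting-PRI"          # not enough history yet
    if all(pri >= lower for pri in recent):
        return "start-PDI"                   # block 612
    return "propose-improvement-practices"   # block 610, then back to block 606

print(admit_operating_unit([3.5, 4.2, 5.0, 5.1]))  # 'start-PDI'
```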
[00062] If the calculated PRI is not within the acceptable limits ('No' path from block
608), one or more suggestive practices may be proposed for the newly added operating unit (block 610) to improve its performance. Subsequently, the method proceeds to block 606, which means that the unit continues to report PRI for some more time. For example, if a critical condition exists, individuals responsible for making management decisions may propose working practices to improve the PRI.
[00063] If the calculated PRI is within the acceptable limits ('Yes' path from block
608), the process for calculating the PDI is initiated (block 612). In one implementation, the analysis module 110 identifies the metrics 218 for the newly added operating unit, based on which the PDI would be evaluated. Once the process is initiated, the analysis module 110 also evaluates the PDI based on the identified metrics 218 for the newly added unit.

CONCLUSION
[00064] Although embodiments for evaluating deployment of a process in an
organization have been described in language specific to structural features and/or methods, it is to be understood that the invention is not necessarily limited to the specific features or methods described. Rather, the specific features and methods for evaluating deployment of a process are disclosed as exemplary implementations of the present invention.

I/We Claim:
1. A computer implemented method for calculating a process deployment index, the method
comprising:
collecting at least one metric value associated with at least one operating unit within an organization;
normalizing the at least one collected metric value to a common scale to obtain normalized metric values; and
calculating the process deployment index based on the normalized metric values, wherein the process deployment index is indicative of the extent of deployment of different processes within the organization.
2. The method as claimed in claim 1, wherein the at least one metric value is associated with at least one process area implemented within the organization.
3. The method as claimed in claim 1, wherein the at least one metric value is associated with at least one operating unit type.
4. The method as claimed in claim 1, wherein the process deployment index is displayed to one or more stakeholders associated with the organization.
5. The method as claimed in claim 1, further comprises verifying the correctness of the collected metric values based on a set of predefined rules.
6. The method as claimed in claim 5, wherein the verifying comprises generating a request to re-enter at least one of the collected metric values.
7. The method as claimed in claim 1, further comprises comparing the process deployment index with one from a group consisting of pre-defined threshold limits and historically collected data.
8. The method as claimed in claim 1, further comprises associating the process deployment index with visual indicators to represent poor, average, and acceptable performance of an underlying process.
9. The method as claimed in claim 8, further comprises generating a critical indication using the visual indicator when the process deployment index exceeds at least one threshold limit.

10. The method as claimed in claim 8, further comprises providing an indication when the process deployment index of a current reporting period varies with respect to process deployment index of a previous reporting period.
11. The method as claimed in claim 8, further comprises generating comparative analytics of the process deployment index for the at least one metric value over a predetermined time period based on statistical techniques.
12. A system (102) for evaluating different processes, comprising:
a processor (202);
a memory (206) coupled to the processor (202), wherein the memory (206) comprises: a conversion module (212) configured to convert metrics (218), associated
with at least one operating unit within an organization, to a standard unit of
measurement; and
an analysis module (110) configured to analyze the metrics (218) based on a
set of rules.
13. The system (102) as claimed in claim 12, wherein the conversion module (212) is further configured to convert the metrics (218) to a scale of 1-10.
14. The system (102) as claimed in claim 13, wherein the analysis module (110) is further configured to determine, based on rules and historical data (220), a process deployment index.
15. The system (102) as claimed in claim 13, wherein the analysis module (110) is further configured to display the process deployment index as one from a group consisting of bar graphs, pie charts, and color indications.
16. The system (102) as claimed in claim 13, wherein the analysis module (110) is configured to calculate the process deployment index value for a predetermined time period.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2814-MUM-2010-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26
2 2814-MUM-2010-RELEVANT DOCUMENTS [27-09-2022(online)].pdf 2022-09-27
3 2814-MUM-2010-RELEVANT DOCUMENTS [28-09-2021(online)].pdf 2021-09-28
4 2814-MUM-2010-RELEVANT DOCUMENTS [29-03-2020(online)].pdf 2020-03-29
5 2814-MUM-2010-IntimationOfGrant29-08-2019.pdf 2019-08-29
6 2814-MUM-2010-PatentCertificate29-08-2019.pdf 2019-08-29
7 2814-MUM-2010-Written submissions and relevant documents (MANDATORY) [07-08-2019(online)].pdf 2019-08-07
8 2814-MUM-2010-ORIGINAL UR 6(1A) FORM 26-300719.pdf 2019-08-06
9 2814-MUM-2010-AMMENDED DOCUMENTS [05-08-2019(online)].pdf 2019-08-05
10 2814-MUM-2010-FORM 13 [05-08-2019(online)].pdf 2019-08-05
11 2814-MUM-2010-MARKED COPIES OF AMENDEMENTS [05-08-2019(online)].pdf 2019-08-05
12 2814-MUM-2010-ExtendedHearingNoticeLetter_24-07-2019.pdf 2019-07-24
13 2814-MUM-2010-FORM-26 [23-07-2019(online)].pdf 2019-07-23
14 2814-MUM-2010-Correspondence to notify the Controller (Mandatory) [17-07-2019(online)].pdf 2019-07-17
15 2814-mum-2010-ExtendedHearingNoticeLetter_24Jul2019.pdf 2019-06-24
16 2814-MUM-2010-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [21-06-2019(online)].pdf 2019-06-21
17 2814-MUM-2010-HearingNoticeLetter.pdf 2019-05-24
18 2814-MUM-2010-FORM 18(18-8-2011).pdf 2018-08-10
19 2814-mum-2010-form 5.pdf 2018-08-10
20 2814-mum-2010-form 3.pdf 2018-08-10
21 2814-mum-2010-form 2.pdf 2018-08-10
22 2814-mum-2010-form 2(title page).pdf 2018-08-10
23 2814-mum-2010-form 1.pdf 2018-08-10
24 2814-MUM-2010-FER.pdf 2018-08-10
25 2814-mum-2010-abstract.pdf 2018-08-10
26 abstract1.jpg 2018-08-10
27 2814-mum-2010-claims.pdf 2018-08-10
28 2814-mum-2010-drawing.pdf 2018-08-10
29 2814-mum-2010-description(complete).pdf 2018-08-10
30 2814-MUM-2010-CORRESPONDENCE(18-8-2011).pdf 2018-08-10
31 2814-mum-2010-correspondence.pdf 2018-08-10
32 2814-MUM-2010-CLAIMS [10-04-2018(online)].pdf 2018-04-10
33 2814-MUM-2010-COMPLETE SPECIFICATION [10-04-2018(online)].pdf 2018-04-10
34 2814-MUM-2010-FER_SER_REPLY [10-04-2018(online)].pdf 2018-04-10
35 2814-MUM-2010-CORRESPONDENCE [10-04-2018(online)].pdf 2018-04-10
36 2814-MUM-2010-OTHERS [10-04-2018(online)].pdf 2018-04-10
37 2814-MUM-2010-FORM 3 [29-03-2018(online)].pdf 2018-03-29
38 2814-MUM-2010-Information under section 8(2) (MANDATORY) [29-03-2018(online)].pdf 2018-03-29
39 2814-MUM-2010-CORRESPONDENCE(29-11-2010).pdf 2010-11-29
40 2814-MUM-2010-FORM 1(29-11-2010).pdf 2010-11-29
41 2814-MUM-2010-POWER OF ATTORNEY(29-11-2010).pdf 2010-11-29

Search Strategy

1 2814_MUM_2010_search_01-09-2017.pdf

ERegister / Renewals

3rd: 04 Oct 2019 (11/10/2012 to 11/10/2013)
4th: 04 Oct 2019 (11/10/2013 to 11/10/2014)
5th: 04 Oct 2019 (11/10/2014 to 11/10/2015)
6th: 04 Oct 2019 (11/10/2015 to 11/10/2016)
7th: 04 Oct 2019 (11/10/2016 to 11/10/2017)
8th: 04 Oct 2019 (11/10/2017 to 11/10/2018)
9th: 04 Oct 2019 (11/10/2018 to 11/10/2019)
10th: 04 Oct 2019 (11/10/2019 to 11/10/2020)
11th: 30 Sep 2020 (11/10/2020 to 11/10/2021)
12th: 28 Sep 2021 (11/10/2021 to 11/10/2022)
13th: 07 Oct 2022 (11/10/2022 to 11/10/2023)
14th: 10 Oct 2023 (11/10/2023 to 11/10/2024)
15th: 08 Oct 2024 (11/10/2024 to 11/10/2025)
16th: 09 Oct 2025 (11/10/2025 to 11/10/2026)