
Information Technology Infrastructure Services Transition Quality Management

Abstract: Systems and methods for information technology infrastructure services transition quality management are described. According to the present subject matter, the system(s) implements the described method(s) for information technology infrastructure services transition quality management. The method includes generating a transition quality kit for each of a plurality of technology teams based on one or more service quality parameters, where the transition quality kit includes actual team data of the corresponding technology teams, and where the one or more service quality parameters include customer preferences and program requirements. The method further includes ascertaining a team transition quality rating of each of the plurality of technology teams based on the corresponding transition quality kit and team target data corresponding to the each of the plurality of technology teams, where the team transition quality rating indicates a team status update of the corresponding technology teams.


Patent Information

Application #:
Filing Date: 04 April 2013
Publication Number: 15/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: iprdel@lakshmisri.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-12-30
Renewal Date:

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021

Inventors

1. OMONISHI, Ms. Toshiko
2715 Melcombe Circle #308, Troy Michigan 48084
2. BEYLIN, Mr. Alex
4307 Stoddard Road, West Bloomfield, Michigan 48323
3. LUKIBANOV, Dr. Oleg
960 River Mist Drive, Rochester Michigan 48307
4. CHELLAPPA, Mr. Bhaskar Kanthadai
Tata Consultancy Services Plot No.G1, SIPCOT INFORMATION TECHNOLOGY PARK, NAVALUR POST, Chennai Tamil Nadu 603 103
5. PARAMASIVAM, Mr. Saravanan
1901 Somerset Blvd Apt #101, Troy Michigan 48084

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: INFORMATION TECHNOLOGY INFRASTRUCTURE SERVICES
TRANSITION QUALITY MANAGEMENT
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra 400021, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it
is to be performed.

TECHNICAL FIELD
[0001] The present subject matter relates, in general, to information technology
infrastructure service transition and, in particular, to information technology infrastructure service transition quality management.
BACKGROUND
[0002] Information technology infrastructure is generally understood as a combination of
computer hardware, software, data, storage technology, and networks that provides a portfolio of shared information technology resources to an organization. Typically, most organizations rely upon third-party information technology infrastructure service providers that specialize in providing information technology infrastructure services for satisfying their information technology infrastructure requirements.
[0003] Information technology infrastructure as a service refers to the implementation
and management of information technology infrastructure to meet the business needs of an organization. The information technology infrastructure services are typically designed by a third-party information technology service provider to make the infrastructure of the organization more capable of supporting expanded business capabilities, improving availability and integrity, and optimizing the use of information technology resources to support the business. Further, providing such information technology infrastructure services typically involves transition of the information technology infrastructure from the organization to the third-party information technology infrastructure service provider. Such transition involves the transfer of knowledge regarding the current state of the information technology infrastructure owned by the organization, the organization's business information, and standard operating procedures to the third-party information technology infrastructure service provider.
SUMMARY
[0004] This summary is provided to introduce concepts related to information technology
transition quality management. This summary is neither intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.

[0005] In an embodiment, a method for information technology transition quality
management is described. The method includes generating a transition quality kit for each of a plurality of technology teams based on one or more service quality parameters, where the transition quality kit includes actual team data of the corresponding technology teams, and where the one or more service quality parameters include customer preferences and program requirements. The method further includes ascertaining a team transition quality rating of each of the plurality of technology teams based on the corresponding transition quality kit and team target data corresponding to the each of the plurality of technology teams, where the team transition quality rating indicates a team status update of the corresponding technology teams.
BRIEF DESCRIPTION OF THE FIGURES
[0006] The detailed description is described with reference to the accompanying figures.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0007] Fig. 1(a) illustrates a transition quality management method, in accordance with
an embodiment of the present subject matter;
[0008] Fig. 1(b) illustrates a relationship between team status interface, a sub-platform of
the transition quality kit, and other sub-platforms of the transition quality kit for the transition quality management, in accordance with an embodiment of the present subject matter;
[0009] Fig. 1(c) to Fig. 1(j) illustrate user interface screens of transition quality kit for the
transition quality management method, in accordance with an embodiment of the present subject matter; and
[0010] Fig. 2 illustrates a communication environment implementing a transition quality
management system, in accordance with an embodiment of the present subject matter.
[0011] In the present document, the word "exemplary" is used herein to mean "serving as
an example or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.

[0012] It should be appreciated by those skilled in the art that any block diagrams herein
represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0013] System and method for information technology infrastructure service transition
quality management are described herein. Information technology infrastructure services may be described as the implementation and management of the information technology infrastructure of an organization by third-party information technology service providers. Such information technology infrastructure services may have to be transitioned from a current service provider to the third-party information technology service providers. Generally, the objective of the third-party information technology service providers, during the transition of the information technology infrastructure services, is to acquire knowledge of the infrastructure services for taking over responsibility for such operations from the current service provider(s). The current service provider(s) may include an in-house team of the organization, where the in-house team provides the infrastructure services within the organization, or a third-party service provider contracted by the organization for providing the infrastructure services. The information technology infrastructure service transition is typically completed over two phases: a transition planning phase and a transition execution phase. The transition planning phase generally involves validating the scope of services, detailing out the project plan for the transition, and defining customer preferences. The transition execution phase typically involves execution of the project plan for the transition and is further divided into three phases: a Knowledge Transfer (KT) phase, a Secondary Support (SS) phase, and a Primary Support (PS) phase.
[0014] Typically, the KT phase involves acquiring, from the current service provider,
infrastructure and application knowledge and available documentation that are necessary in order to provide the in-scope infrastructure services. In the SS phase, initiated after the KT phase, the technology teams of the third-party information technology service provider shadow the current service provider's teams to gain practical knowledge of the customer's information technology

infrastructure environment. Further, in the PS phase the technology teams begin taking responsibility for daily operational tasks. At the end of the transition execution phase, the technology team executes a formal sign-off to move into a steady state, indicating completion of the information technology infrastructure transition program.
[0015] Typically, information technology infrastructure services transition is a complex
process which involves multiple technology teams. Further, the technology teams may be located in geographically distant locations, and may thus be collaborating over a prolonged period of time. Because of such complexity, many issues, such as uncertainty about the current state and progress of the transition program, may arise. Further, there is difficulty in identifying the critical path and establishing an appropriate governance structure for the transition program. As a result, the quality of the transition program may suffer due to a lack of coordination among technology teams. For example, lack of coordination among technology teams may result in insufficient knowledge transfer and in critical tasks being missed. Conventional quality management techniques used in project management are generic and unable to effectively manage the information technology infrastructure transition and the quality of the information technology infrastructure transition.
[0016] The present subject matter describes a system and a method for information
technology infrastructure service transition quality management. In accordance with the present subject matter, the transition quality management is performed based on transition quality assessment and program transition quality evaluation. The quality assessment of the transition of the technology teams performing the transition, i.e., the assessment of technology team transition quality, is performed using a systematic assessment platform, referred to as a transition quality kit (TQK). Further, the quality of transition at a program level is evaluated based on a program status report generated utilizing the progress and quality data submitted through the various TQKs.
[0017] In accordance with an embodiment of the present subject matter, service
parameters are obtained for transition quality management, where the service parameters include program requirements and customer preferences. The program requirements include requirements, such as the number of resources, the number of technology teams, and a task list for effective information technology infrastructure transition. The customer preferences define the customer's expectations from the information technology infrastructure transition program and include preferences, such as the target date of program completion and the cost of the program. Further,

based on the obtained service quality parameters, one or more TQKs are generated for the technology teams. A TQK may be defined as a systematic assessment platform having multiple tracking sub-platforms, such as a task list track, a question and answer track, a standard operating procedure track, an access track, a playback evaluation track, and a validation track, for assessment of the technology teams' transition quality, i.e., the quality of transition of the technology teams performing the transition.
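For illustration only, one way to represent a TQK and its tracking sub-platforms as a data structure is sketched below in Python; the class and field names are assumptions made for the sketch and do not appear in the present subject matter.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List


@dataclass
class TaskItem:
    """A single entry in the task list track of a TQK (illustrative)."""
    name: str
    due: date
    completed: bool = False


@dataclass
class TransitionQualityKit:
    """Illustrative container for one technology team's tracking sub-platforms."""
    team: str                                                         # e.g. "UNIX", "Windows"
    task_list: List[TaskItem] = field(default_factory=list)           # task list track
    qa_items: List[str] = field(default_factory=list)                 # question and answer track
    sop_topics: Dict[str, bool] = field(default_factory=dict)         # SOP topic -> completed?
    accesses: Dict[str, bool] = field(default_factory=dict)           # system access -> acquired?
    playback_scores: List[int] = field(default_factory=list)          # playback evaluation track
    validation_checks: Dict[str, bool] = field(default_factory=dict)  # validation checklist track
```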
[0018] Further, the team transition quality is assessed for the technology teams based on
the TQKs and target data for the teams, i.e., the team target data. The team target data indicates the number of tasks that should be completed and the dates by which the tasks should be completed. Further, quality ratings are determined based on a comparison of the actual data included in the TQKs with the team target data. The rating may be in the form of good, average, and poor, or satisfactory and unsatisfactory.
[0019] Subsequently, a program status report is generated based on the TQKs for
program transition quality evaluation. In one implementation, the program status report may be defined as a systematic assessment platform having multiple tracking sub-platforms, such as a transition schedule track. Subsequently, the program transition quality is evaluated based on the program status report and target data of the program, i.e., the program target data. Further, the evaluation includes a comparison of the actual data in the program status report with the program target data to obtain a program status update and a program transition quality grade.
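As one way to picture the team-level assessment and the program-level consolidation described in the two preceding paragraphs, the following minimal sketch compares actual data with target data and rolls the results into a simple program status report; the function names, the satisfactory cut-off, and the sample numbers are illustrative assumptions, not taken from the specification.

```python
def team_status(actual_done: int, target_done: int) -> dict:
    """Compare a team's actual completed tasks with its team target data."""
    pct = 100.0 * actual_done / target_done if target_done else 0.0
    rating = "satisfactory" if pct >= 100.0 else "unsatisfactory"   # assumed cut-off
    return {"percent_complete": round(pct), "rating": rating}


def program_status_report(team_results: dict) -> dict:
    """Consolidate per-team completion percentages into a program-level status update."""
    overall = sum(r["percent_complete"] for r in team_results.values()) / len(team_results)
    return {"teams": team_results, "program_percent_complete": round(overall)}


# Hypothetical actual/target task counts for two teams.
teams = {"Team A": team_status(50, 50), "Team B": team_status(40, 50)}
print(program_status_report(teams))
```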
[0020] The present subject matter thus provides an efficient method and system for
information technology infrastructure service transition quality management. Implementing the transition quality management using a bottom-up approach, i.e., beginning from the team transition quality management and moving towards the program transition quality management, helps in achieving increased predictability of the status and reduced uncertainty about the progress of the information technology infrastructure transition. Further, the comparison of target data and actual data, and the systematic way of identifying and tracking risks and issues at the team level and the program level, facilitate effective assessment and transparency in the transition quality management. Furthermore, the transition quality management technique is customizable for infrastructure transition programs and thus can be repeated for multiple infrastructure transition programs of various sizes, scopes, and complexities without involving any substantial

additional costs and efforts. These and other advantages of the present subject matter would be described in greater detail in conjunction with the following figures.
[0021] The described methodologies can be implemented in hardware, firmware,
software, or a combination thereof. Herein, the term "system" encompasses logic implemented by software, hardware, firmware, or a combination thereof. For a firmware and/or software implementation, the methodologies can be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine readable medium tangibly embodying instructions can be used in implementing the methodologies described herein. For example, software codes and programs can be stored in a memory and executed by a processing unit. Memory can be implemented within the processing unit or may be external to the processing unit. As used herein the term "memory" refers to any type of long term, short term, volatile, non-volatile, or other storage devices and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
[0022] In another firmware and/or software implementation, the functions may be stored
as one or more instructions or code on a non-transitory computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media may take the form of an article of manufacture. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer; disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), and Blu-ray disc. Combinations of the above should also be included within the scope of computer-readable media.
[0023] The manner in which the systems and methods for information technology
infrastructure service transition quality management may be implemented has been explained in detail with respect to Figs. 1 and 2. While aspects of the described systems and methods can be implemented in any number of different computing systems, transmission environments,

and/or configurations, the embodiments are described in context of the following exemplary system(s).
[0024] Fig. 1(a) illustrates exemplary transition quality management method 100 for
transition quality management of information technology infrastructure services, according to an embodiment of the present subject matter. Fig. 1(b) illustrates a relationship between the team status interface, a sub-platform of the transition quality kit, and other sub-platforms of the transition quality kit for the transition quality management, in accordance with an embodiment of the present subject matter. Further, Fig. 1(c)-Fig. 1(j) illustrate user interface screens of sub-platforms of transition quality kits for the transition quality management method, in accordance with the example. In particular, Fig. 1(c) and Fig. 1(d) illustrate exploded views of the user interface screen of the technology team status track, a sub-platform of the transition quality kit. Fig. 1(e) illustrates a user interface screen of the task list track, a sub-platform of the transition quality kit. Fig. 1(f) illustrates a user interface screen of the question and answer track, a sub-platform of the transition quality kit. Fig. 1(g) illustrates a user interface screen of the standard operating procedure track, a sub-platform of the transition quality kit. Fig. 1(h) illustrates a user interface screen of the access track, a sub-platform of the transition quality kit. Fig. 1(i) illustrates a user interface screen of the playback track, a sub-platform of the transition quality kit. Fig. 1(j) illustrates a user interface screen of the validation track, a sub-platform of the transition quality kit. For the sake of clarity and to avoid repetition, Fig. 1(a)-Fig. 1(j) are further explained in conjunction with one another.
[0025] The method 100 may be described in the general context of computer executable
instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions that perform particular functions or implement particular abstract data types. The method 100 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communication network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0026] The order in which the methods are described is not intended to be construed as a
limitation, and any number of the described method blocks can be combined in any order to implement the methods or any alternative methods. Additionally, individual blocks may be

deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the methods can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0027] For the sake of clarity, and not as a limitation, the method 100 is explained using
an example of a banking company offering a variety of services. Examples of such services include, but are not limited to, vehicle loans, savings accounts, house loans, etc. It should be appreciated that the present example is provided for the sake of illustration only and should not be construed as a limitation on the scope of the present subject matter. The banking company may have offices in multiple countries and have an integrated network, multiple servers, and databases for supporting the day-to-day business operations. The banking company may further decide to outsource its infrastructure services to a third party information technology infrastructure service provider in order to achieve cost efficiency, operational efficiency, and uniformity. The third party service provider may employ the transition quality management system and method for effective quality management of the information technology infrastructure service transition.
[0028] A team performing the transition typically includes multiple technology teams.
These technology teams are generally classified into towers according to the associated technology. In larger service contract engagements, towers may be further divided to the sub-tower level. In the described banking company example, the outsourcing contract scope includes two towers: a computing services (CS) tower and a network services (NS) tower. The CS tower has sub-towers for three technology streams: UNIX, Windows, and Databases. Similarly, the NS tower has data and voice sub-towers.
[0029] According to the present example, the banking company may be using UNIX
servers in one region, say, the North American region, and Windows servers in another region, say, the European region. In this case, the UNIX team may be located in the U.S.A. to cover the North American region while the Windows team may be located in London to cover the European region.
[0030] Referring to the method 100 in Fig. 1(a), at block 102, service quality parameters
are obtained for transition quality management of an organization, for example, the banking company described in the above example. In one implementation, the service quality parameters may include program requirements and customer preferences. The program requirements may

include the program timeline, technology team structure, number of resources of the organization, scope of services expected from the contracted third-party service provider, tasks to be performed by the technology teams, and standard operating procedure (SOP) documents to be updated or created by the technology teams. The customer preferences may include the program completion target date, an effective governance model for the transition, requisite resource skills, required resource knowledge levels, and expected service performance levels. As will be understood by a person skilled in the art, the service quality parameters are based on the technology for which the transition is being performed and may vary with variation in the technology.
[0031] For instance, in the previous example of the banking company, the program
requirements may include the number of technology teams of the banking company, such as five technology teams, broadly divided into two towers, the CS tower and the NS tower, and further divided into five sub-towers: UNIX, Windows, Databases, Network Data, and Voice. Further, the program requirements may include task lists for the technology teams; for example, the UNIX team's task list may include Standard Server Deployment, Hardening and Decommissioning Procedure, whereas the Network Data task list may include LAN configuration and WAN configuration. The customer preferences may comprise milestone dates, which may vary per technology team. In the said example, the UNIX team's knowledge transfer phase end milestone may be 16 March 2012, and the Database team's KT phase end milestone date may be 30 March 2012. Other customer preferences may be at the project level, such as a transition completion target date of 1 July 2012.
[0032] At block 104, transition quality kits (TQKs) are generated for each of the technology
teams based on the service quality parameters. The TQKs may be defined as a systematic assessment platform for obtaining project management related information, such as team status, issues and risks that are relevant to the team, and team status updates with respect to progress and quality data. Further, the TQKs contain multiple sub-assessment platforms for determination of team transition quality, such as a task list track, a questions and answers track, a standard operating procedure topics track, an access acquisition track, a playback topics and evaluation track, and a validation checklist track.
[0033] At block 106, the progress data and quality data in TQKs are ascertained for each
technology team, and the progress and quality of transition at the team level are assessed. During the

transition execution phase, the TQKs are updated with actual progress and quality data by the technology team leads at predefined intervals, for example, weekly, as shown in Fig. 1(c) and Fig. 1(d). The actual progress and quality data may include the number of tasks completed, the number of system accesses acquired by the team members, the number of SOP topics completed, the Playback sessions conducted and the quality ratings received from the customer, etc., as shown in Fig. 1(e) through Fig. 1(j). Further, the actual data supplied by each of the technology teams through the TQKs is assessed against the target data for the technology team's transition quality, i.e., the team target data. Based on the assessment, progress status information is determined in the form of percentages, such as % tasks complete, % access obtained, etc., and quality is determined based on satisfactory and unsatisfactory ratings, as shown in Fig. 1(b). The determination signifies ascertaining the quality and progress of the transition performed by the technology teams. Further, a grading may be obtained based on the grading criteria for the transition quality, using colour codes such as green, yellow, and red.
[0034] For instance, in the previous example of the banking company, the service quality
parameter “Task Completion” for the UNIX Team comprises target data, such as 75 tasks to be completed by 25 May 2012. Furthermore, quality grading ranges are set as green if at 95% of target or above, yellow if between 80% and 95% of target, and red if below 80% of target. As of 25 May 2012, the actual data in the TQK of the UNIX team shows that 71 KT tasks are completed. Further, based on the TQK and the team target data, the UNIX team's task completion progress of 95% and a grading of green are determined, as shown in Fig. 1(d). Further, the TQKs may indicate that the UNIX team was scheduled to perform a Playback session on 25 May 2012 on the topic of Zone Management (Solaris). A Playback session may be described as a review meeting between the third party service provider and the customer in which the service provider demonstrates the knowledge acquired on a particular topic.
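Using the grading thresholds of this example (green at 95% of target or above, yellow between 80% and 95%, red below 80%), the grading step could be sketched as follows; the function name is illustrative, and rounding the percentage to the nearest whole percent before grading is an assumption made here so that the 71-of-75 case grades green, as it does in the example.

```python
def grade(actual: float, target: float):
    """Return (percent of target achieved, colour grade) per the example's thresholds."""
    pct = round(100.0 * actual / target)   # rounded to the nearest percent (assumption)
    if pct >= 95:
        colour = "green"
    elif pct >= 80:
        colour = "yellow"
    else:
        colour = "red"
    return pct, colour


# UNIX team in the example: 71 KT tasks completed against a target of 75.
print(grade(71, 75))   # (95, 'green')
```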
[0035] At the end of Playback session, customers are requested to evaluate the third party
service provider’s knowledge level and give them a quality rating. In the said example, the TQK may show that the team indeed conducted a Playback session on 25 May 2012, but the customer gave an unsatisfactory score of 1, which degraded the cumulative score of the parameter. In this case, the service quality parameter “KT topics covered by Playback sessions” may be on schedule; however, the service quality parameter “Playback session evaluation”, whose target score is 2, was not satisfied due to the unsatisfactory score given by the customer this week, as shown in

Fig. 1(i). Based on such assessment, a green grading is assigned for the Topics covered parameter and a yellow grading for the Playback session evaluation in this week's UNIX team's TQK status, as shown in Fig. 1(d).
[0036] At block 108, a program status report is generated based on the TQKs of the
technology teams. The program status report may be described as a systematic assessment platform for the purpose of transition program quality evaluation. It may be subdivided into multiple sub-platforms. One sub-platform may be the generated progress and transition quality matrix, such as the number of tasks completed and playback session evaluation information from the TQKs. The numerical progress and quality information is a systematic consolidation of TQK data from all the technology teams. Another sub-platform may be narrative program management related data, such as comments from the team leads regarding the issues and risks they are currently facing.
[0037] At block 110, the program transition quality is evaluated based on the program
status report. The evaluation is performed based on a comparison of the actual data included in the program status report and the target data for program transition quality, i.e., the program target data. Further, based on such evaluation, the program status in the form of percent completion and a program transition grading indicative of transition quality are obtained. Further, a transition quality matrix may also be generated. The transition quality matrix is generated as a cumulative matrix, often represented as array(s), which can be further enhanced by the use of a weighted transition quality matrix. The weights of the weighted transition quality matrix depend on the phase of the transition and the customer preferences. For example, in the knowledge transfer phase more weight is given to the playback session rating, whereas during the Secondary Support phase this weight is reduced and more weight is given to the completeness of the Standard Operating Procedure documentation. Customer preferences also play a role in determining the weights of the weighted transition quality matrix. Further, an alert may be generated for taking corrective action to maintain the quality of the transition. Furthermore, based on the quality assessment at the individual team level through the TQK and the quality evaluation at the program level through the program status report, effective transition quality management is enabled.
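The phase-dependent weighting described above might look like the following sketch; the specific weight values and parameter names are illustrative assumptions and are not specified in the present subject matter.

```python
# Hypothetical phase-dependent weights for two of the quality parameters mentioned above.
PHASE_WEIGHTS = {
    "knowledge_transfer": {"playback_rating": 0.6, "sop_completeness": 0.4},
    "secondary_support":  {"playback_rating": 0.3, "sop_completeness": 0.7},
}


def weighted_quality_score(phase: str, scores: dict) -> float:
    """Combine normalised parameter scores (0..1) using phase-dependent weights."""
    weights = PHASE_WEIGHTS[phase]
    return sum(weights[p] * scores[p] for p in weights)


# During KT the playback rating dominates; during Secondary Support SOP completeness does.
print(weighted_quality_score("knowledge_transfer", {"playback_rating": 0.5, "sop_completeness": 0.9}))
```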
[0038] For instance, in the previous example of the banking company, the actual task
completion percentages obtained from the TQKs of the UNIX team, Windows team, and Database

team are 95%, 72%, and 85%, respectively. As the UNIX team, Windows team, and Database team are sub-towers of the CS Tower, the CS Tower's actual task completion percentage of 84% is obtained as an average of the actual task completion percentages of the UNIX team, Windows team, and Database team. The CS Tower's actual task completion of 84% is compared with the CS Tower's target data of 92%. Further, based on such comparison, the CS Tower status is evaluated to determine that the CS Tower is behind schedule in task completion by about 9% (i.e., 84%/92% = 91%). Further, the program transition grading indicative of the progress status is obtained based on the evaluation, where the grading criterion is the status colour green for 95% and above, yellow for between 80% and 95%, and red for below 80%. Furthermore, a grading of yellow is provided to the CS Tower in the current week's program status report. Similarly, based on the actual data obtained from the TQKs of the Data team and the Voice team, the actual task completion of the NS Tower of 103% is obtained and a green grading is provided. Further, an overall actual task completion at the program level of 97% is determined based on an average of the CS Tower and the NS Tower, and a grading of green is obtained. Further, an alert such as "CS Tower-attention required" is issued to the transition manager, who may take corrective action, thus enabling effective transition quality management. Further, Table 1 illustrates the percentage completion and the quality grading for each of the towers and their sub-towers in accordance with the example.
Table 1

Tower  Target  Actual  Actual/Target  Status Colour                Sub-Tower  Target  Actual  Actual/Target  Status Colour
CS     92%     84%     91%            Yellow (80-95% of target)    UNIX       92%     95%     103%           Green (95% or above target)
                                                                   Windows    92%     72%     78%            Red (below 80% of target)
                                                                   Database   92%     85%     92%            Yellow (80-95% of target)
NS     73%     75%     103%           Green (95% or above target)  Data       65%     64%     98%            Green (95% or above target)
                                                                   Voice      80%     85%     106%           Green (95% or above target)
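The roll-up shown in Table 1 is a straightforward averaging of percentages, first across sub-towers to the tower level and then across towers to the program level. A minimal sketch using the numbers from the example follows; the function names are illustrative, and rounding half up is an assumption made so the printed figures match Table 1.

```python
def pct(x: float) -> int:
    """Round half up to the nearest whole percent, matching the figures in Table 1."""
    return int(x + 0.5)


def tower_rollup(sub_towers):
    """sub_towers: list of (target %, actual %) pairs. Returns (target, actual, actual/target)."""
    target = sum(t for t, _ in sub_towers) / len(sub_towers)
    actual = sum(a for _, a in sub_towers) / len(sub_towers)
    return pct(target), pct(actual), pct(100.0 * actual / target)


# (target %, actual %) per sub-tower, taken from Table 1.
cs = tower_rollup([(92, 95), (92, 72), (92, 85)])   # UNIX, Windows, Database -> (92, 84, 91)
ns = tower_rollup([(65, 64), (80, 85)])             # Data, Voice             -> (73, 75, 103)

# Program-level completion: average of the towers' actual/target percentages.
program = pct((cs[2] + ns[2]) / 2)                  # -> 97, graded green

# Alert for any tower whose ratio falls below the green threshold of 95%.
for name, (_, _, ratio) in {"CS": cs, "NS": ns}.items():
    if ratio < 95:
        print(f"{name} Tower-attention required")   # prints "CS Tower-attention required"
```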
[0039] In another embodiment, program management related information included in the
program status report may be evaluated by a transition manager. After evaluation, the transition manager may add comments about notable accomplishments and challenges for that reporting week and action items to address any risky or problematic areas, along with the overall program level progress. The program status report may further be used for internal review by the third party service provider and for a status review with customers. Furthermore, the evaluated transition program quality may be used to effectively steer the transition program, manage customer expectations, and eventually obtain customer acceptance at milestones. In an example, the generated TQKs and the determined quality of the technology teams may be the acceptance criteria for obtaining customer satisfaction and milestone completion.
[0040] Fig. 2 illustrates a communication environment 200 implementing a transition
quality management system 202 for information technology infrastructure service transition quality management. For the sake of explanation, the transition quality management system 202 is referred to as the system 202 hereinafter. The present subject matter is explained in the context of a travel company offering a variety of services. It should be appreciated that the present example is provided for the sake of illustration only and should not be construed as a limitation on the scope of the present subject matter.
[0041] According to the example, the company may decide to outsource its infrastructure
services to a third party information technology infrastructure service provider in order to achieve cost efficiency and operational efficiency. The company may have offices in multiple countries, with multiple servers for supporting the day-to-day business operations. According to the present example, the travel company may be using UNIX servers in the Africa region while the Asia region may be using Windows as the operating system. In this case, the UNIX team is located in South Africa to cover the Africa region transition and the Windows team is located in India to cover the Asia region transition. In one implementation, these technology teams working at different transition sites may be connected with the system 202 using devices 204-1, 204-2,..., 204-N,

individually referred to as the device 204 and commonly referred to as the devices 204, through a transition network 206, referred to as the network 206 hereinafter, for transition quality management.
[0042] The system 202 may be implemented within a variety of communication devices,
such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, and the like. The system 202 described herein, can also be implemented in any network environment comprising a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
[0043] The devices 204 may be implemented as, but are not limited to, desktop
computers, hand-held devices, laptops or other portable computers, tablet computers, mobile phones, PDAs, smartphones, and the like. The system 202 may be located within the devices 204 or may be located outside the devices 204 at different geographic locations as compared to those of the devices 204. Further, the devices 204 may themselves be located either within the vicinity of each other, or may be located at different geographic locations.
[0044] The network 206 may be a wireless or a wired network, or a combination thereof.
The network 206 can be a collection of individual networks, interconnected with each other and functioning as a single large network (e.g., the internet or Internet of Things (IoT)). Examples of such individual networks include, but are not limited to, Global System for Mobile Communication (GSM) network, Universal Mobile Telecommunications System (UMTS) network, Personal Communications Service (PCS) network, Time Division Multiple Access (TDMA) network, Code Division Multiple Access (CDMA) network, Next Generation Network (NGN), Public Switched Telephone Network (PSTN), and Integrated Services Digital Network (ISDN). Depending on the technology, the network 206 includes various network entities, such as gateways, routers; however, such details have been omitted for ease of understanding.
[0045] In an implementation of the present subject matter, the system 202 is configured
to carry out the transition quality management in information technology infrastructure transition. Initially, the system 202 generates the TQKs based on predefined service parameters. The system 202 assesses the team transition quality based on a generated TQKs and team target data. Further, the system 202 evaluates program transition quality based on a program status report, where the program status report is generated utilizing the transition quality kits.

Furthermore, based on the evaluation and the assessment, the system 202 obtains program quality and program status update and team quality and team status update respectively, for effective transition quality management of the information technology infrastructure transition.
[0046] In one implementation, the system 202 includes a processor(s) 208. The processor
208 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 208 is configured to fetch and execute computer-readable instructions stored in a memory.
[0047] The functions of the various elements shown in the Fig. 2, including any
functional block labelled as “processor(s)”, may be provided through the use of hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), non-volatile storage. Other hardware, conventional and/or custom, may also be included.
[0048] Also, the system 202 includes an interface(s) 210. The interface(s) 210 may
include a variety of software and hardware interfaces that allow the system 202 to interact with the entities of the network 206, or with each other. The interface(s) 210 may facilitate multiple communications within a wide variety of networks and protocol types, such as IoT networks, including wired networks, for example, LAN, cable, etc., and wireless networks, for example, WLAN, cellular, satellite-based networks, etc.
[0049] The system 202 may also include a memory 212. The memory 212 may be
coupled to the processor 208. The memory 212 can include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM), and dynamic random access memory (DRAM), and/or non-volatile memory, such as

read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0050] Further, the system 202 may include module(s) 214 and data 216. The module(s)
214 may be coupled to the processor(s) 208 and, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. The module(s) 214 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulate signals based on operational instructions.
[0051] Further, the module(s) 214 can be implemented in hardware, instructions executed
by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, a state machine, a logic array or any other suitable devices capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the tasks or, the processing unit can be dedicated to perform the required functions.
[0052] In another aspect of the present subject matter, the module(s) 214 may be
machine-readable instructions (software) which, when executed by a processor/processing unit, perform any of the described functionalities. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk or other machine-readable storage medium or non-transitory medium. In one implementation, the machine-readable instructions can also be downloaded to the storage medium via a network connection.
[0053] In an implementation, the module(s) 214 include a transition module 218, an
assessment module 220, a program module 222, an evaluation module 224, and other module(s) 226. The other module(s) 226 may include programs or coded instructions that supplement applications or functions performed by the system 202. In said implementation, the data 216 includes transition data 228, assessed data 230, program data 232, and other data 234. The other data 234, amongst other things, may serve as a repository for storing data that is processed, received, or generated as a result of the execution of one or more modules in the module(s) 214. Although the data 216 is shown internal to the system 202, it may be understood that the data 216 can reside in an external repository (not shown in the figure), which may be coupled to the

system 202 or be a part of the devices 204. The system 202 may communicate with the external repository through the interface(s) 210 to obtain information from the data 216.
[0054] In an implementation of the present subject matter, the transition module 218 is
configured to obtain service parameters for transition quality management from a data source, where the data source may include a user or a database, as described above for the block 102 of Fig. 1(a). The service parameters include customer preferences, such as the target date for program completion and the cost of the program, and program requirements. Utilizing the service parameters, the transition module 218 is further configured to generate transition quality kits, as described above for the block 104 of Fig. 1(a). Further, the transition module 218 is configured to store the TQKs in the transition data 228. In another implementation, the transition module 218 may be configured to provide the TQKs to multiple technology teams. Further, the technology teams may update the provided TQKs at predefined intervals with actual data, employing the devices 204.
[0055] In the said implementation of the present subject matter, the assessment module
220 is configured to obtain the TQKs from the transition data 228. The TQKs include the actual data updated by the technology teams at predefined intervals. Further, based on the team target data and the transition quality kits, the assessment module 220 is configured to determine the team transition quality, as described above for the block 106 of Fig. 1(a). For this purpose, the assessment module 220 compares the actual data included in the TQKs and the team target data stored in the transition data 228. Further, based on the determination, the assessment module 220 may assign a grading based on grading criteria for the transition quality, using colour codes such as green, yellow, and red, as described above for the block 106 of Fig. 1(a). The assessment module 220 is further configured to store the team status and the team quality rating in the assessed data 230.
[0056] Further, the program module 222 is configured to obtain the TQKs from the
transition data 228 and generate a program status report, as described above for the block 108 of Fig. 1(a). Furthermore, the program module 222 may be configured to generate an executive dashboard. The executive dashboard may be understood as a one-page summary version of the program status report. Further, the executive dashboard may be used for internal review by the third party service provider and also for a status review with the customer organization. Further,

the program module 222 may store the program status report and the executive dashboard in the program data 232.
[0057] In an implementation of the present subject matter, the evaluation module 224 is
configured to evaluate the program transition quality to obtain a program status update, as described above for the block 110 of Fig. 1(a). The evaluation module 224 is further configured to store the obtained program status updates in the assessed data 230. Further, the evaluation module 224 is configured to generate an alert based on the team status and team quality rating and the program status updates stored in the assessed data 230. Further, corrective action may be performed for maintaining or improving the transition quality and enabling transition quality management.
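For orientation only, a skeleton of how the transition module 218, assessment module 220, program module 222, and evaluation module 224 might be wired to the data 228-232 is sketched below; the class name, method names, and internal logic are illustrative assumptions and not a definitive implementation of the system 202.

```python
class TransitionQualityManagementSystem:
    """Illustrative skeleton of system 202; method names mirror modules 218-224."""

    def __init__(self):
        self.transition_data = {}   # transition data 228: TQKs and team target data
        self.assessed_data = {}     # assessed data 230: team/program status and ratings
        self.program_data = {}      # program data 232: program status report, dashboard

    def generate_tqks(self, teams):                      # transition module 218
        self.transition_data = {team: {"tasks_done": 0} for team in teams}

    def assess_teams(self, team_targets):                # assessment module 220
        for team, kit in self.transition_data.items():
            target = team_targets.get(team, 1)
            self.assessed_data[team] = round(100.0 * kit["tasks_done"] / target)

    def build_program_report(self):                      # program module 222
        ratings = list(self.assessed_data.values())
        self.program_data["completion"] = sum(ratings) / max(len(ratings), 1)

    def evaluate_program(self, program_target):          # evaluation module 224
        if self.program_data["completion"] < program_target:
            print("Attention required")                  # alert enabling corrective action
        return self.program_data["completion"]
```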
[0058] Thus, utilizing the transition quality management techniques which include the
quality assessment at individual team level through the TQKs and quality evaluation at program level through the program status report, effective transition quality management is enabled.
[0059] Although implementations for information technology infrastructure service
transition quality management have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for information technology infrastructure service transition quality management.

I/We claim:
1. A computer implemented method for information technology infrastructure service
transition quality management, the method comprising:
generating a transition quality kit for each of a plurality of technology teams based on one or more service quality parameters, wherein the transition quality kit includes actual team data of the corresponding technology teams, and wherein the one or more service quality parameters include customer preferences and program requirements; and
ascertaining a team transition quality rating of each of the plurality of technology teams based on the corresponding transition quality kit and team target data corresponding to the each of the plurality of technology teams, wherein the team transition quality rating indicates a team status update of the corresponding technology team, wherein the team transition quality rating is used for information technology infrastructure service transition quality management, and wherein the team status update is indicative of progress in completion of information technology infrastructure service transition.
2. The computer implemented method as claimed in claim 1 further comprising:
generating a program status report based on the transition quality kit, wherein the program status report includes the team transition quality rating of each of the plurality of technology teams; and
evaluating a program transition quality based on the program status report and program target data, wherein the program transition quality indicates a program status update of information technology infrastructure service transition.
3. The computer implemented method as claimed in claim 2, wherein the method further
comprising:
generating an alert based on the ascertaining the team transition quality rating and the evaluating the program transition quality, wherein the alert indicates the team status update and the program status update; and

performing corrective actions based on the alert to enable information technology infrastructure service transition quality management.
4. The computer implemented method as claimed in claim 1, wherein the method further comprises obtaining the plurality of service quality parameters from a data source.
5. A transition quality management system (202) for information technology infrastructure service transition quality management, the transition quality management system (202) comprising:
a processor (208);
a transition module (218) coupled to the processor (208), configured to generate a transition quality kit for each of a plurality of technology teams based on one or more service quality parameters, wherein each of the transition quality kit includes actual team data of the corresponding technology teams, and wherein the one or more service quality parameters include customer preferences and program requirements;
an assessment module (220) coupled to the processor (208), configured to ascertain a team transition quality rating of each of the plurality of technology teams based on the corresponding transition quality kit and team target data corresponding to the each of the plurality of technology teams, wherein the team transition quality rating indicates a team status update of the corresponding technology teams;
a program module (222) coupled to the processor (208), configured to generate a program status report based on the transition quality kit, wherein the program status report includes the team transition quality of each of the plurality of technology teams; and
an evaluation module (224) coupled to the processor (208), configured to evaluate a program transition quality based on the program status report and program target data, wherein the program transition quality indicates a program status update of information technology infrastructure service transition.
6. The transition quality management system (202) as claimed in claim 5, wherein the
program module (222) coupled to the processor (208), is further configured to generate
an executive dashboard based on the program status report.

7. A non-transitory computer-readable medium having embodied thereon a computer
readable program code for executing a method, the method comprising:
obtaining one or more service quality parameters from a data source, wherein the plurality of service quality parameters include customer preferences and program requirements;
generating a transition quality kit for a plurality of technology teams based on one or more service quality parameters, wherein each of the transition quality kit includes actual team data of the corresponding technology teams; and
ascertaining a team transition quality rating of each of the plurality of technology teams based on the corresponding transition quality kit and team target data corresponding to the each of the plurality of technology teams, wherein the team transition quality rating indicates a team status update of the corresponding technology teams.
8. The non-transitory computer-readable medium having embodied thereon a computer
readable program code for executing the method as claimed in claim 7, the method
further comprising:
generating a program status report based on the transition quality kit, wherein the program status report includes the team transition quality of each of the plurality of technology teams; and
evaluating a program transition quality based on the program status report and program target data, wherein the program transition quality indicates a program status update of information technology infrastructure service transition.

Documents

Application Documents

# Name Date
1 1315-MUM-2013-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26
2 1315-MUM-2013-IntimationOfGrant30-12-2021.pdf 2021-12-30
3 1315-MUM-2013-PatentCertificate30-12-2021.pdf 2021-12-30
4 1315-MUM-2013-ABSTRACT [11-12-2019(online)].pdf 2019-12-11
5 1315-MUM-2013-CLAIMS [11-12-2019(online)].pdf 2019-12-11
6 1315-MUM-2013-DRAWING [11-12-2019(online)].pdf 2019-12-11
7 1315-MUM-2013-FER_SER_REPLY [11-12-2019(online)].pdf 2019-12-11
8 1315-MUM-2013-FER.pdf 2019-06-14
9 1315-MUM-2013-CORRESPONDENCE(13-5-2013).pdf 2018-08-11
10 1315-MUM-2013-CORRESPONDENCE(17-4-2013).pdf 2018-08-11
11 1315-MUM-2013-FORM 1(17-4-2013).pdf 2018-08-11
12 1315-MUM-2013-FORM 18.pdf 2018-08-11
13 1315-MUM-2013-FORM 26(13-5-2013).pdf 2018-08-11
14 ABSTRACT1.jpg 2018-08-11
15 FIGURES IN.pdf 2018-08-11
16 FORM 3.pdf 2018-08-11
17 FORM 5.pdf 2018-08-11
18 SPEC IN.pdf 2018-08-11

Search Strategy

1 1315MUM2013search_13-06-2019.pdf
2 SearchAE_29-12-2021.pdf

ERegister / Renewals

3rd: 03 Jan 2022 (From 04/04/2015 To 04/04/2016)
4th: 03 Jan 2022 (From 04/04/2016 To 04/04/2017)
5th: 03 Jan 2022 (From 04/04/2017 To 04/04/2018)
6th: 03 Jan 2022 (From 04/04/2018 To 04/04/2019)
7th: 03 Jan 2022 (From 04/04/2019 To 04/04/2020)
8th: 03 Jan 2022 (From 04/04/2020 To 04/04/2021)
9th: 03 Jan 2022 (From 04/04/2021 To 04/04/2022)
10th: 03 Jan 2022 (From 04/04/2022 To 04/04/2023)
11th: 31 Mar 2023 (From 04/04/2023 To 04/04/2024)
12th: 02 Apr 2024 (From 04/04/2024 To 04/04/2025)
13th: 02 Apr 2025 (From 04/04/2025 To 04/04/2026)