Abstract: An automated system and method for assessing test process maturity levels during a testing project is disclosed. The system also provides key information to identify the gaps and strengths of the testing process with respect to industry standards. Significantly, the system remains compliant with existing industry models and generates extensive reports on maturity levels across process areas and other compliance reports.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
INTEGRATED ASSESSMENT FRAMEWORK FOR ASSURANCE
Applicant
TATA Consultancy Services Limited, a company incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention, and the manner in which it is to be performed.
FIELD OF THE INVENTION
The present invention relates generally to software process engineering and, more particularly, to a systematic approach towards the assessment of test process maturity levels in the software development process.
BACKGROUND OF THE INVENTION
Organizations are increasingly becoming aware of the need to improve their test process efficiency to meet their end quality standards. Though there are multiple test process assessment approaches available in the industry, there is no framework which provides compliance across various models while managing requests and reporting. These multiple test process assessment approaches also require a considerable amount of manual input and overhead to calculate test process maturity.
Moreover, the existing prior art requires the user to conduct a separate assessment for each prevailing industry model in order to determine compliance. There is also not enough focus on the non-functional testing aspects of the said industry models. Niche testing areas such as security testing or usability testing also remain unattended. Among the available industry tools, none manages assessment requests, conducts assessments or performs extensive reporting operations. Another limiting factor is that the available tools capture gaps only in the operational areas. Adding to these limitations is the absence of any consistent process for consulting.
The art has felt a need for an assessment process that can automatically measure test process maturity against the best practices of industry models and those practiced within an organization, and also generate compliance reports to industry models through a single assessment.
OBJECTIVES OF THE INVENTION
The principal object of the present invention is to provide an automated system and method for assessing the maturity levels of test processes within a software development life cycle through a single assessment.
Another object of the present invention is to make the single assessment system compliant with existing industry models and to automatically generate compliance reports with respect to those models.
Another major object of the present invention is to identify gaps and strengths of the testing process with respect to existing industry standards on maturity models.
It is another object of the present invention to provide a web enabled system to take due care of workflow, schedules, preparation of a standard exhaustive questionnaire, extensive reports and other maintenance tasks, including managing assessment requests and conducting assessments.
One of the other objects of the present invention is to focus upon non-functional testing and other niche testing areas of security testing and usability testing.
Yet another object of the present invention is to prepare extensive reports on maturity levels of the corresponding process areas, category level compliance and related assurance compliance reports.
Another object of the present invention is to improve productivity, test efficiency and coverage, overall test management and planning, clarity in understanding and uniformity in test processes of the testing team.
Yet another object of the present invention is to reduce time spent in performing testing process and save costs by enabling early defect detection.
Still another object of the present invention is to provide efficient training planning, efficient knowledge management, effective resource planning and work allocation, improved effort management, better ownership and accountability, and improved interaction and communication between stakeholders with better coordination and clarity in responsibilities.
SUMMARY
Before the present methods, systems, and hardware enablement are described, it is to be understood that this invention is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments of the present invention which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present invention, which will be limited only by the appended claims.
The invention relates to an automated system and method of assessing test process maturity levels based on heuristic analysis and compliant with various existing industry models like TMMi and TPI Next. The system of the present invention comprises five functions and associated test process areas, wherein the said test process areas are further associated with rated checkpoints and mapped with four maturity levels based on a test process maturity score. In an alternative embodiment, the system generates extensive reports of maturity levels across process areas, category level compliances and assurance compliance reports.
In accordance with one preferred embodiment of the present invention, the method of test process maturity level assessment comprises the following steps: defining and categorizing a plurality of test process areas across one or more functions; defining a set of checkpoints for each of the test process areas to identify the compliance level of the process and mapping the checkpoints to a set of indicators to assess the maturity level of the process area; assigning a maturity score to each of the identified process areas based upon the process definition and the compliance levels identified from the defined checkpoints; mapping each of the test process areas with corresponding maturity levels based upon the assigned maturity score; and generating the maturity level and compliance level reports of the assessed test processes.
In one of the other preferred embodiments of the present invention, a test process maturity assessment system compliant with a plurality of maturity models is presented, wherein the system comprises: a user management module hosted on a server and connected to the test assessment system for determining the accessibility of a plurality of users to a set of questions associated with test process areas defined and categorized across one or more functions; a question management module hosted on the system and connected to the server for containing the question set in compliance with one or more maturity models; an assessment module hosted on the system to assess the question set associated with the defined and categorized process areas and provide the corresponding maturity scores for each of the test process areas; and a report generating module hosted on the system for generating maturity level reports and compliance level reports for the assessed test processes.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention. These and other features of the present invention will become more fully apparent from the following description, or may be learned by the practice of the invention as set forth hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing summary, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings example constructions of the invention; however, the invention is not limited to the specific system and method disclosed in the drawings:
Figure 1 shows the categorization of test process areas across multiple functions for assessing the maturity levels in accordance with the most preferred embodiment of the present invention.
Figure 2 represents a sample maturity profile of the test process assessment system in accordance with a disclosed embodiment of the present invention.
Figure 3 is a sample compliance report in accordance with one of the preferred embodiments of the present invention.
Figure 4 is a sample subgroup level maturity report generated in accordance with one of the preferred embodiments of the present invention.
Figure 5 is a graphical illustration of a category level maturity report in accordance with a disclosed embodiment of the present invention.
Figure 6 represents the compliance of the test assessment system to TPI Next industry model in accordance with a disclosed embodiment of the present invention.
Figure 7 is a representation of the compliance of the test assessment system to TMMi model in accordance with a disclosed embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of this invention, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described.
As depicted in the figures and as described herein, the present invention provides a method and system for assessing an organization's testing maturity capabilities and its compliance with the best practices prevalent in existing industry models. The method assesses the maturity levels of the test processes and helps to identify gaps and strengths of the testing process with respect to industry standards.
The present invention presents the following set of key features:
Tool Driven- The present invention is a web enabled tool that takes due care of workflow, schedules, questionnaire, reporting and other maintenance tasks.
Complete Focus - The focus area of the invention also includes non-functional testing along with niche areas like security testing or usability testing.
Industry Compliant - The data types of all process areas are considered and the system is made compliant with existing industry models. The system also provides compliance reports with respect to the industry models TPI Next and TMMi from a single assessment.
Extensive Reporting - Extensive reports on maturity levels of the corresponding process areas, category level compliance and related assurance compliance reports are generated.
The system of the present invention comprises various components that are responsible for assessing the test process maturity levels based on heuristic analysis and various existing industry models like TPI Next and TMMi. The system further includes checkpoints that are grouped under functions and associated test process areas. A maturity level profile is generated using the system, and accordingly the extensive compliance reports with respect to TMMi and TPI Next get generated. Further, the system takes care of workflow, schedule, questionnaire, reporting and other maintenance tasks.
Referring now to Figure 1, an automated system is created to assess the test process maturity levels wherein the system comprises five functions - Test Value Chain, Test Tools & Automation, Test Data & Environment, Process and People. Each function has test process areas which further have checkpoints. The checkpoints are rated 0, 1 or 2 depending on the process definition and compliance. These five functions and their associated process areas are shown in Figure 1. The test process areas are categorized across the five functions of the system. The invention has identified 17 test processes that are grouped under the five functions. Table 1 below shows the association between the functions and test processes; an illustrative data-structure sketch of this grouping follows the table.
Function: Test Process Areas
Test Value Chain: Test Life Cycle & Integration; Test Policy & Strategy; Estimation & Planning; Test Case Design & Execution; Non Functional Testing
Test Data & Environment: Test Data & Environment
Test Tools & Automation: Test Tools & Automation
Process: Test Process Management; Metrics & Reporting; Defect Management; Testware Management
People: Test Organization; Work Environment; Commitment and Motivation; Staffing; Knowledge Management; Communication
Table 1
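By way of illustration only, the grouping of checkpoints under test process areas and functions described above may be represented with a simple data structure. The following Python sketch is an explanatory assumption; the class and field names are not part of the disclosed system:

# Illustrative sketch (not the claimed implementation): functions group test
# process areas, and each process area holds checkpoints rated 0, 1 or 2
# depending on process definition and compliance.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Checkpoint:
    text: str
    rating: int = 0          # 0 = not compliant, 1 = partially, 2 = fully compliant

@dataclass
class TestProcessArea:
    name: str
    checkpoints: List[Checkpoint] = field(default_factory=list)

@dataclass
class Function:
    name: str
    process_areas: List[TestProcessArea] = field(default_factory=list)

# Example instance following Table 1 (only one function shown)
test_value_chain = Function(
    name="Test Value Chain",
    process_areas=[
        TestProcessArea("Test Life Cycle & Integration"),
        TestProcessArea("Test Policy & Strategy"),
        TestProcessArea("Estimation & Planning"),
        TestProcessArea("Test Case Design & Execution"),
        TestProcessArea("Non Functional Testing"),
    ],
)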
Each process area is mapped with four maturity levels based on the test process maturity score. For each of these levels, a range has been provided based on heuristic analysis. The definitions of these maturity levels are depicted in Table 2 below:
• Reactive
• Managed
• Leading
• Optimizing
Reactive: No processes defined, testing activities are on the critical path, people dependent
Managed: Processes are defined, corrective actions are taken when required
Leading: Processes are mature, critical defects identified at lower cost in minimum time
Optimizing: Processes are in place, continuously adapting to changes, PDCA is adopted
Table 2
The maturity score ranges vary depending on the importance of the process area in practice and are depicted in Table 3 below (an illustrative score-to-level mapping is sketched after the table):
Test Process Areas: Reactive | Managed | Leading | Optimizing
1 - Test Life Cycle & Integration: 0-30% | 31-50% | 51-70% | 71-100%
2 - Test Policy & Strategy: 0-30% | 31-50% | 51-70% | 71-100%
3 - Estimation & Planning: 0-30% | 31-60% | 61-80% | 81-100%
4 - Test Case Design & Execution: 0-50% | 51-70% | 71-90% | 91-100%
5 - Test Data & Environment: 0-30% | 31-60% | 61-80% | 81-100%
6 - Test Tools & Automation: 0-30% | 31-60% | 61-70% | 71-100%
7 - Non Functional Testing: 0-30% | 31-60% | 61-80% | 81-100%
8 - Test Process Management: 0-40% | 41-70% | 71-90% | 91-100%
9 - Metrics & Reporting: 0-30% | 31-60% | 61-80% | 81-100%
10 - Defect Management: 0-40% | 41-70% | 71-90% | 91-100%
11 - Testware Management: 0-40% | 41-70% | 71-90% | 91-100%
12 - Test Organization: 0-50% | 51-80% | 81-95% | 96-100%
13 - Work Environment: 0-50% | 51-80% | 81-95% | 96-100%
14 - Commitment & Motivation: 0-50% | 51-80% | 81-95% | 96-100%
15 - Staffing: 0-50% | 51-80% | 81-95% | 96-100%
16 - Knowledge Management: 0-50% | 51-80% | 81-95% | 96-100%
17 - Communication: 0-50% | 51-80% | 81-95% | 96-100%
Table 3
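As an illustration of how the heuristic ranges of Table 3 may be applied, the following sketch maps a percentage maturity score for a process area to one of the four maturity levels. The threshold tuples reproduce rows 1 and 4 of Table 3; the function name and the assumption that the upper bound of each range is inclusive are illustrative only:

# Illustrative sketch: map a process-area maturity score (in percent) to a
# maturity level using the heuristic ranges of Table 3. Each tuple gives the
# upper bound of Reactive, Managed and Leading; anything above the Leading
# bound is Optimizing.
RANGES = {
    "Test Life Cycle & Integration": (30, 50, 70),   # from Table 3, row 1
    "Test Case Design & Execution":  (50, 70, 90),   # from Table 3, row 4
}

def maturity_level(process_area: str, score_percent: float) -> str:
    reactive_max, managed_max, leading_max = RANGES[process_area]
    if score_percent <= reactive_max:
        return "Reactive"
    if score_percent <= managed_max:
        return "Managed"
    if score_percent <= leading_max:
        return "Leading"
    return "Optimizing"

print(maturity_level("Test Life Cycle & Integration", 65))  # -> Leading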
Characteristics of Test Process Areas: Descriptions of the test process areas and typical characteristics at each maturity level are given below:
a) Test Life Cycle & Integration:
The purpose of this process area is to improve the predictability and controllability of the test process by categorizing the activities into phases and interactions with Software Development Life Cycle (SDLC) activities, thus increasing focus on the activities while enabling them to be planned and monitored effectively.
b) Test policy and strategy :
This facilitates planning and systematic implementation of testing while providing the mechanism for monitoring deviations, risks and controlling the process. The objective is checked both at the organization level (Policy) and at the program/project level (Strategy).
c) Estimation and Planning:
This test process area provides insight into the resource requirements and the timelines needed. This enables making efficient use of the resources to achieve the goals of the project most effectively.
d) Test Case Design and Execution:
This test process area provides a standardized way of deriving test cases from source information to produce an optimal number of reusable, quality test cases for efficient test execution and planning.
e) Test Data and Environment:
This test process area is to ensure that the effective change control process enables efficient testing with minimal disturbances and with robust test data and environment.
f) Test tools and Automation:
This test process area is to achieve consistency in testing, reduce cost, increase test coverage and achieve better utilization of resources through test tools and automation.
g) Non Functional Testing:
This test process area is to analyze the scalability and reliability of the application before rollout, and helps to reduce business risk at the end user site. This also includes aspects of usability, security, browser capability, accessibility and other niche areas.
h) Test Process Management:
This test process area is to facilitate planning and implementation of the test process
while providing a mechanism for monitoring the deviations and controlling the process.
i) Metrics and reporting:
This test process area is to develop and sustain a measurement capability that is used to support management information needs. It substantiates the current state, and enables monitoring and control of the process.
j) Defect management:
This test process area is to provide an integrated way of tracking the life-cycle of
defects, support the analysis of quality trends and provide quality advice.
k) Testware management:
This test process area is to facilitate a controlled creation/update of reusable test deliverables, integrate the deliverables and their sources at various stages and provide insight into the test coverage.
l) Test organization:
This test process area provides an insight into test organization structure, services provided, organization's performance and various communication channels to disseminate information across the organization.
m) Work Environment:
This test process area is to provide an encouraging environment for testing team in
terms of office infrastructure, security and according to work environment standards.
n) Commitment and motivation:
This test process area is to ensure that teams maintain good relationships and motivation, and that the team maintains accountability and responsibility for their own work and for continuous improvement of their work process.
o) Staffing:
This test process area is to ensure that the team is sufficiently staffed and resource
requirements are addressed according to the demand.
p) Knowledge management:
This test process area is to ensure that knowledge management and retention happen within the team and to plan for knowledge enhancements on a continuous basis.
q) Communication:
This test process area is to provide an effective mechanism to synchronize various
activities across all levels of the test organization.
Maturity level based characteristics for each process area are mentioned in Table 4 below:
1 - Test Life Cycle & Integration:
Reactive: Testing phases are not distinguished; no documentation on test process and interactions with SDLC activities.
Managed: Early involvement of the testing team; interaction model defined.
Leading: Effectively coupled IT-SDLC, TDLC and business processes; testing activities are valued outside the test organization.
Optimizing: Testing team participates in project evaluation; lessons learnt and best practices of the testing team are used for future project setup.
2 - Test Policy & Strategy:
Reactive: No policy is in place; strategy not available or does not cover all requirements for testing.
Managed: Test strategy templates/guidelines available.
Leading: Test strategy is a living document; test policy is based on business needs & objectives and product/application risk analysis.
Optimizing: Test strategy addresses the needs and requirements of reuse strategy; the strategies and techniques used for effective testing have been continuously evaluated with respect to incidents and refined.
3 - Estimation & Planning:
Reactive: Testing scope is not finalized; resources are not available or are inadequate.
Managed: Estimation & planning process is in place.
Leading: Standard empirical models are in place; a set of estimation techniques and principles is maintained at an organizational level; metrics are collected to support estimation and planning.
Optimizing: At least two estimation techniques are used; increase in productivity is demonstrated through estimation; tightly coupled demand management, estimation and training.
4 - Test Case Design & Execution:
Reactive: Test cases are not documented; ad hoc testing performed.
Managed: Procedures for test case writing, logging and defect logging are in place.
Leading: Best practices like effective test design, smoke testing, regression profiling, risk based testing, static testing, readiness tests, progressive regression and early integration testing are in place.
Optimizing: RCAs and evaluation of test design techniques resulting in improvement of test cases.
5 - Test Data & Environment:
Reactive: Test environment may not be available well before the execution.
Managed: Clear documentation on test environment and data needs is available.
Leading: Clear ownership of environment; service catalogue; formal tracking of incidents for environment issues; readiness tests to capture defects early; test data strategy & readiness.
Optimizing: Reuse of test data and automation of test data creation.
6 - Test Tools & Automation:
Reactive: No tools or automation in place.
Managed: Automation framework is defined; pockets of excellence in automation; test tools are available.
Leading: Uniformity in the use and deployment of tools/automation; automated regression tests; early and repeated execution of automated scripts.
Optimizing: Best practices and expertise on test tools collected and used for future projects; regular evaluation of tools against test tool policy.
7 - Non Functional Testing:
Reactive: Not much focus on non functional testing; non functional requirements are not analyzed early in the life cycle.
Managed: Process defined on handling non functional requirements.
Leading: Risk based non-functional testing; definition of SLAs for business scenarios.
Optimizing: Use of best practices, artifacts and templates for the entire life cycle.
8 - Test Process Management:
Reactive: No formal review process; review is done on a need basis.
Managed: A test methodology is available with a list of activities and supporting artifacts - guidelines, templates etc.
Leading: Mandatory, conditional and optional elements of the test methodology are available; test process improvements are in place; focus is on product quality; projects comply with centralized processes.
Optimizing: Coordination between static and dynamic testing; continuous improvement of the test methodology.
9 - Metrics & Reporting:
Reactive: No metrics collected or metrics not used for monitoring and control.
Managed: Metrics framework is in place.
Leading: Tools are used in metrics collection and reporting; product quality is monitored; trends are collected.
Optimizing: Optimization of metrics is in place; metrics are used for current and future improvement of the project.
10 - Defect Management:
Reactive: Defect tracking is manual or the defect life cycle is not documented.
Managed: Defect management and tracking is in place.
Leading: Formal defect triage meetings are conducted; SLA driven defect management.
Optimizing: Use of defect prediction models; periodic causal analysis along with all stakeholders.
11 - Testware Management:
Reactive: No version control system is in place and traceability is not maintained.
Managed: Test artifacts are managed in a procedural way for delivery, registration and archival.
Leading: Use of tools for testware management & traceability.
Optimizing: Reuse of test artifacts; guidelines for conserving artifacts for reuse.
12 - Test Organization:
Reactive: No independent test organization.
Managed: Organization structure & communication model is in place.
Leading: Test organization is well positioned and empowered to take decisions based on outcomes.
Optimizing: Test organization performance is regularly evaluated & compared with industry.
13 - Work Environment:
Reactive: Issues with office infrastructure.
Managed: Good office infrastructure.
Leading: Work environment standards established and maintained.
Optimizing: Sufficient infrastructure for weekend support or after office hours.
14 - Commitment & Motivation:
Reactive: There might be some commitment and motivation issues.
Managed: Intergroup activities are performed well.
Leading: Testing team is highly motivated.
Optimizing: Testing team strives towards accountability and responsibility for their own work and continuous improvement of their work process.
15 - Staffing:
Reactive: Shortage of skilled resources.
Managed: Tasks defined, allocated and executed in line with expectations.
Leading: Resource allocation is as per estimation and optimization of resources.
Optimizing: Teams are grouped as per functions.
16 - Knowledge Management:
Reactive: No knowledge repository exists.
Managed: Knowledge repository is in place.
Leading: Trainings on business, testing and process are conducted; metrics are in place.
Optimizing: Feedback collection, evaluation mechanism and continuous improvement in training are in place.
17 - Communication:
Reactive: No clarity on roles and responsibilities.
Managed: RACI matrix exists.
Leading: Information flow mechanism exists across all levels; there are well laid gating mechanisms; testing team is part of the change control board.
Optimizing: Lessons learnt & best practices are evaluated at the end of the project and taken into consideration for future projects.
Table 4
Checkpoints Classification: The checkpoints of each test process area are mapped to four categories - Basic, Emerging, Matured and Optimizing as shown below:
• Basic - Checkpoints that relate to the processes that are mandatory, basic requirements to fulfill the needs
• Emerging - Checkpoints that relate to the definition of process
• Matured - Checkpoints that relate to the definition of process that are mature, provide benefits on cost / time / quality
• Optimizing - Checkpoints that relate to optimization of process, collect feedback and refine the process
The above classification is done to help the consultants or assessors ask questions depending on the current compliance level of the test process against these categories.
The checkpoints are also mapped to four Assurance indicators - Business, Testing, Process, and Delivery. This is to provide confidence to various stakeholders on the maturity of processes pertaining to Business, Testing, Design & Development and Process. An illustrative sketch of a checkpoint record combining these mappings follows the list below.
• Business - This group provides business assurance compliance which includes involvement of Business in testing activities, handshake, satisfaction level, confidence provided for production rollout
• Process - This group provides process assurance compliance which includes existence of process definition and the compliance
• Delivery - This group provides delivery assurance compliance which includes involvement of design/ development team in testing life cycle, handshake with development team and the confidence provided to the design and development team
• Testing - This group provides compliance on key testing activities, confidence to testing team on moving to next stage
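Purely as an illustrative sketch, and using field names assumed for explanation rather than taken from the specification, a single checkpoint record carrying its rating, category indicator and assurance indicator might be represented as follows (the sample checkpoint text is hypothetical):

# Illustrative sketch: a checkpoint carries a compliance rating plus its
# category indicator (Basic/Emerging/Matured/Optimizing) and assurance
# indicator (Business/Process/Delivery/Testing). Names are assumptions.
from dataclasses import dataclass

CATEGORIES = ("Basic", "Emerging", "Matured", "Optimizing")
ASSURANCE_INDICATORS = ("Business", "Process", "Delivery", "Testing")

@dataclass
class MappedCheckpoint:
    text: str
    rating: int                 # 0, 1 or 2 as per process definition and compliance
    category: str               # one of CATEGORIES
    assurance_indicator: str    # one of ASSURANCE_INDICATORS

# Hypothetical example record
cp = MappedCheckpoint(
    text="A documented test strategy exists and covers all testing requirements",
    rating=1,
    category="Emerging",
    assurance_indicator="Process",
)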
Reports: Now referring to Figure 2, a maturity profile of the test process assessment system is shown wherein the current test process maturity is indicated by the blue line. As an example, for 1 - Test Life Cycle & Integration the current maturity level (blue line) is above the red dotted line and below the purple dotted line, and hence the maturity stands at Managed.
Similarly, Figure 3 shows a sample report depicting the compliance across Assurance Indicators, Functions and TCS Best Practices while Figure 4 shows a sample report for Process Area Maturity level across subgroups.
Another category of report generated is shown in Figure 5 depicting Category level wise scores for each Process Area. A graphical depiction of compliance to TPI Next is shown in Figure 6 while Figure 7 shows compliance to models like TMMi through a single assessment system.
As shown in Figure 1, the varying components of the system assess the test process and reflect its strengths and weaknesses. The system also captures gaps at tactical and strategic levels and is backed by supporting components to take care of workflow, schedule, questionnaire, reporting and other maintenance tasks. The system generates extensive reports on maturity levels across process areas, category level compliance and assurance compliance reports. Also, the needs of external and internal assessments and category based assessments are taken care of by the system.
In accordance with a preferred embodiment of the present invention, in the single test process assessment system a user management module is hosted on a server and creates an admin profile to perform multiple user-associated tasks. These tasks can include adding a new user with mandatory fields like user name, name, user type, password, mobile, email id, security question and an answer. The admin is also given the flexibility to modify user details, but the user name remains frozen and cannot be modified. An admin can also delete an existing or active user, change passwords, view the list of users, add users with new roles and privileges, modify role privileges, create new accounts, view them, add a new program, modify an existing program, view a program, add a new project, add new Key Process Areas (KPA), modify a KPA, delete a KPA or view a KPA.
The question management module of the system comprises a question set which the admin user is allowed to modify. All details like question details, KPA, category indicator, assurance indicator, model selection, options and scores are modifiable except the question id. The admin user is enabled to map a question to a KPA during the question creation stage within the module. An exhaustive questionnaire is prepared that includes questions collected from the best practices adopted by an organization and various industry maturity models. Each question is mapped with a key test process area, category indicator, assurance indicator, alignment to models - TPI Next, TMMi - and options with scores (for example, 'yes' mapped to rating 2, 'no' mapped to rating 0, 'sometimes' mapped to rating 1). An illustrative sketch of such a question record is given below.
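The following sketch illustrates, under assumed field names, how a question record of the question management module and its option-to-rating mapping (yes = 2, sometimes = 1, no = 0) might be represented; it is an explanatory example rather than the disclosed implementation:

# Illustrative sketch of a question record. Field names and the scoring
# helper are assumptions; the option-to-rating mapping follows the
# description above (yes=2, sometimes=1, no=0).
QUESTION = {
    "question_id": "Q-001",                      # not modifiable after creation
    "kpa": "Test Policy & Strategy",             # mapped Key Process Area
    "category_indicator": "Emerging",
    "assurance_indicator": "Process",
    "models": ["TPI Next", "TMMi"],              # alignment to industry models
    "options": {"yes": 2, "sometimes": 1, "no": 0},
}

def score_answer(question: dict, answer: str) -> int:
    """Return the rating for a selected option (0 for unknown answers)."""
    return question["options"].get(answer.lower(), 0)

print(score_answer(QUESTION, "sometimes"))  # -> 1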
The question set is assessed within the assessment module of the system by the assessor. Within the module, the Lead Assessor can add or modify the assessment period by defining the starting and ending quarters; however, the assessment period name remains non-editable. Also, the schedule of an assessment period is modifiable, wherein all other details like account, program, project, assessment period and others are editable except the assessment name. The Lead Assessor can also, for the purposes of assessment, add any KPA with its name, description, the function across which it is categorized and the maturity level to which it belongs (reactive, managed, leading or optimizing). The lead assessor is allowed to modify a KPA or delete it. The lead assessor can modify the question set contained within the question bank management module with respect to question details or the mapping of questions to a KPA for assigning the corresponding maturity score. The lead assessor can modify the assessment period, schedule an assessment for a project in an assessment period or modify the schedule of an assessment for a project.
Additionally, the lead assessor can assign an assessor to an assessment and can also decide the type of assignment. The assessor thereupon takes the assessment and provides recommendations and assessment notes for the assessment. The reports specific to the assessment are generated by the assessor and include assessment details, participants, challenges, recommendations, artifacts submitted and additional notes. These are later reviewed and approved by the lead assessor. Once the assessment report is approved by the lead assessor, the assessee can view the recommendations provided and the maturity profile, and agree or disagree with the report. The lead assessor closes the assessment, and that completes the assessment process for a project/program/account; an illustrative sketch of this workflow is given below.
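For illustration only, the assessment workflow described above can be viewed as a sequence of states; the state names below are assumptions drawn from the description and not a prescribed implementation:

# Illustrative sketch of the assessment workflow as a simple state sequence.
from enum import Enum, auto

class AssessmentState(Enum):
    SCHEDULED = auto()          # lead assessor schedules and assigns an assessor
    IN_PROGRESS = auto()        # assessor takes the assessment, adds notes/recommendations
    REPORT_GENERATED = auto()   # assessor generates the assessment report
    UNDER_REVIEW = auto()       # lead assessor reviews and approves the report
    ASSESSEE_REVIEW = auto()    # assessee views recommendations, agrees or disagrees
    CLOSED = auto()             # lead assessor closes the assessment

WORKFLOW = [
    AssessmentState.SCHEDULED,
    AssessmentState.IN_PROGRESS,
    AssessmentState.REPORT_GENERATED,
    AssessmentState.UNDER_REVIEW,
    AssessmentState.ASSESSEE_REVIEW,
    AssessmentState.CLOSED,
]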
Following the assessment done by the assessment module of the system, a corresponding TPI Next compliance report and a TMMi compliance report are generated by the report generating module of the system. The report is generated by specifying the report name and selecting the assessment for which the report needs to be generated. The reports reflecting the test process maturity across the process areas are also generated after the assessment process. The compliance graphs on assurance, subgroups and categories can also be generated (as shown in Figures 3, 4 and 5). For process areas where subgroups are available, a subgroup compliance report is generated. This provides the overall maturity score of a process area and then a maturity score of each subgroup. For example, for the process area Estimation and Planning, a bar chart with the overall score including the score of estimation (subgroup 1) and the score of planning (subgroup 2) is provided. This gives visibility into the maturity of subgroups of a process area, as shown in Figure 4. An illustrative sketch of this subgroup computation follows.
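As an explanatory sketch of the subgroup report described above, the following code computes subgroup and overall scores for a process area such as Estimation and Planning. The aggregation rule shown (mean checkpoint rating scaled to a percentage of the maximum rating of 2) and the hypothetical ratings are assumptions for illustration, not the disclosed scoring method:

# Illustrative sketch: subgroup and overall maturity scores for a process area.
def subgroup_score(ratings: list[int]) -> float:
    """Percentage score for one subgroup from its checkpoint ratings (0, 1 or 2)."""
    return 100.0 * sum(ratings) / (2 * len(ratings))

subgroups = {
    "Estimation": [2, 1, 2, 1],   # hypothetical checkpoint ratings
    "Planning":   [1, 1, 0, 2],
}

scores = {name: subgroup_score(r) for name, r in subgroups.items()}
overall = subgroup_score([r for ratings in subgroups.values() for r in ratings])

print(scores)    # {'Estimation': 75.0, 'Planning': 50.0}
print(overall)   # overall process-area score, 62.5 here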
For the purposes of the present invention, the data types of all process areas are considered and the system is made compliant with existing industry models. The system thus obviates the need to conduct a separate assessment for different industry models in order to determine compliance. The non functional testing features of the said industry models are recognized and focused upon along with other niche testing areas like security testing or usability testing. The checkpoints associated with niche testing areas are also included while assessing the test process maturity levels. The system further supports managing assessment requests, scheduling assessments, conducting assessments and reporting.
The foregoing description has been directed to one or more specific embodiments of this invention. It will be apparent, however, that other variations and modifications may be made to the described embodiments, with the attainment of some or all of their advantages. For instance, it is expressly contemplated that the teachings of this invention can be implemented as software, including a computer-readable medium having program instructions executing on a computer, hardware, firmware, or a combination thereof. In addition, it is understood that the data structures described herein can include additional information while remaining within the scope of the present invention. Accordingly, this description is to be taken only by way of example and not to otherwise limit the scope of the invention. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the invention.
We Claim:
1) A computer implemented test process maturity assessment method, the method having computer executable code tangibly embodied on a computer readable storage medium and comprising:
a) defining and categorizing plurality of test process areas across one or more functions;
b) defining a set of checkpoints for each of the test process area to identify compliance level of the process and mapping the checkpoints to a set of indicators;
c) assigning a maturity score to each of the identified process area based upon the process definition and the compliance levels identified from the defined checkpoints;
d) mapping each of the test process area with corresponding maturity levels based upon the assigned maturity score; and
e) generating the maturity level and the compliance level report of the assessed test processes.
2) The method of claim 1, wherein the test process areas are categorized across five functions, the functions including test value chain, test tools and automation, test data and environment, people and process.
3) The method of claim 1, wherein the test process areas that are categorized across functions includes test life cycle and integration, test policy and strategy, estimation and planning, test case design and execution, non functional testing, test data and environment, test tools and automation, test process management, metrics and reporting, defect management, testware management, test organization, work environment, commitment and motivation, staffing, knowledge management and communication.
4) The method of claim 1, wherein the checkpoints are mapped to four assurance indicators including business, process, delivery and testing as well as the category indicators.
5) The method of claim 1, wherein the checkpoints are further mapped to four
categories of basic, emerging, matured and optimizing.
6) The method of claim 1, wherein the checkpoints are further mapped as industry models or organization best practices.
7) The method of claim 1, wherein the test process area is mapped across four maturity levels namely reactive, managed, leading and optimizing.
8) The method of claim 1, wherein the compliance level report is generated with respect to TMMi and TPI Next assessment models.
9) A test process maturity assessment system compliant to plurality of maturity models, the system comprising:
a) a user management module hosted on a server and connected to the test
assessment system for determining the accessibility of plurality of users to a set
of questions associated with test process defined and categorized across one or
more functions;
b) a question management module hosted on the system and connected to the server for containing the question set in compliance to one or more maturity model;
c) an assessment module hosted on the system to assess the question set associated with the defined and categorized process areas and provide the corresponding maturity scores for each of the test process area; and
d) a report generating module hosted on the system for generating maturity levels and the compliance level reports for the assessed test processes.
10) The system of claim 9, further comprising a model management module hosted on the system for generating compliance across a question set of models TPI Next and TMMi, and the question set of the question management module.
11) The system of claim 9, further comprising an assessment period management module and an assessment schedule management module, the modules hosted on the system and configured to record the assessment period of the assessment module and the schedule of the assessment.
12) The system of claim 9, further comprising an assessment progress management module hosted on the system to assess progress of assessment by way of workflows on completion of tasks assigned to a plurality of users.
13) The system of claim 9, wherein the plurality of users include admin user, lead assessor, assessor or assessee with varied accessibility.
14) The system of claim 9, wherein the test process areas are categorized across five functions, the functions including test value chain, test tools and automation, test data and environment, people and process.
15) The system of claim 9, wherein the test process areas that are categorized across functions includes test life cycle and integration, test policy and strategy, estimation and planning, test case design and execution, non functional testing, test data and environment, test tools and automation, test process management, metrics and reporting, defect management, testware management, test organization, work environment, commitment and motivation, staffing, knowledge management and communication.
16) The system of claim 9, wherein the assessment module provides a maturity score based on the process area definition and the compliance levels identified from checkpoints mapped to a set of indicators.
17) The system of claim 9, wherein the assessment module further maps each of the test process area with corresponding maturity levels based on the assigned maturity score.
18) The system of claim 9, wherein the report generation module further generates compliance graphs on assurance indicators, Functions, Organization Best practices, subgroups and categories.
19) The system of claim 9, wherein the compliance level report is generated with respect to TMMi and TPI Next assessment models.
20) The system of claim 9, where the report generation further includes report on progress of the assessment.