Digital Test Value Chain

Abstract: Systems and methods related to automating the software testing lifecycle are described. In one implementation, a method of creating a digital test value chain (140) for automating the software testing life cycle comprises identifying automatable test value points from a plurality of test value points based on automation feasibility rules (130). Each of the plurality of test value points corresponds to a key activity in a testing phase of the software testing life cycle. Further, the method comprises associating automated solutions (110) to the automatable test value points, where at least one automated solution (110) is associated to each automatable test value point. Furthermore, the method comprises determining inter-operability of the automated solutions (110). Based on the inter-operability determination, the automated solutions (110) are integrated to create the digital test value chain (140).

Patent Information

Application #:
Filing Date: 11 April 2011
Publication Number: 09/2014
Publication Type: INA
Invention Field: ELECTRICAL
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2020-02-03
Renewal Date:

Applicants

TATA CONSULTANCY SERVICES LIMITED
Nirmal Building, 9th Floor, Nariman Point, Mumbai, Maharashtra, India

Inventors

1. SHAH, Vipul
Tata Research Development & Design Centre, 54, Hadapsar Industrial Estate, Hadapsar, Pune - 411 013, Maharashtra, India
2. GOPAL, Vijayalakshmi
M/s. Tata Consultancy Services Ltd., 79, IT Highway, Karapakkam, Chennai - 600096, Tamil Nadu, India
3. ARUMUGHAM, Prabhu
M/s. Tata Consultancy Services Ltd., 79, IT Highway, Karapakkam, Chennai - 600096, Tamil Nadu, India
4. GANESAN, Siva
M/s. Tata Consultancy Services Ltd., 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
5. NARAYANASWAMY, Kumaresan
M/s. Tata Consultancy Services Ltd., 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
6. NATARAJAN, Jayashree
M/s. Tata Consultancy Services Ltd., 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
7. NATARAJASUNDARAM, Shriram
M/s. Tata Consultancy Services Ltd., 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
8. JOSEPH, Spencer
M/s. Tata Consultancy Services Ltd., 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
1. Title of the invention: DIGITAL TEST VALUE CHAIN
2. Applicant(s)
NAME: TATA CONSULTANCY SERVICES LIMITED
NATIONALITY: Indian
ADDRESS: Nirmal Building, 9th Floor, Nariman Point, Mumbai-400021, Maharashtra, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD
[0001] The present subject matter, in general, relates to the field of software testing, in
particular to a system and a method for automating a software testing lifecycle.
BACKGROUND
[0002] Software testing is an important phase of the software development process.
Software testing describes the process of interacting with software with the aim of revealing errors. Software testing is necessary in order to ensure that the software actually performs what it is supposed to do, and does so correctly. The complete software testing process, i.e., software testing lifecycle, which is generally referred to as a test value chain, includes a series of testing phases that may be iterative in nature.
[0003] With an increase in the complexity of software, as well as the stringent quality requirements to which it should adhere, the need for effective testing has increased. Furthermore, the testing lifecycle is very exhaustive and consumes immense resources of a software organization. It is estimated that almost half of the total effort in the software development process is dedicated to testing and debugging of the software. Moreover, the efficiency and profitability of the software organization depend, in large part, on its testing capabilities.
SUMMARY
[0004] This summary is provided to introduce concepts related to automating the
software testing lifecycle and the concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0005] In one implementation, a method of creating a digital test value chain for automating a software testing life cycle comprises identifying automatable test value points from a plurality of test value points based on automation feasibility rules. Each of the plurality of test value points corresponds to a key activity in a testing phase of the software testing life cycle. Further, the method comprises associating automated solutions to the automatable test value points, where at least one automated solution is associated to each automatable test value point. Furthermore, the method comprises determining inter-operability of the automated solutions. Based on the inter-operability determination, the automated solutions are integrated to create the digital test value chain.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The detailed description is provided with reference to the accompanying
figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0007] Fig. 1(a) illustrates a network environment implementing a test automation
system, in accordance with an embodiment of the present subject matter.
[0008] Fig. 1(b) illustrates components of the test automation system, in accordance
with an embodiment of the present subject matter.
[0009] Fig. 1(c) illustrates a digital test value chain, in accordance with an
embodiment of the present subject matter.
[0010] Fig. 2 illustrates a method for automating a software testing life cycle, in
accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0011] The present subject matter relates to systems and methods for automating a
software testing lifecycle. Typically, the software testing lifecycle of a program code, hereinafter referred to as a build, includes a series of testing phases. Examples of such phases include a planning phase, a design phase, an execution phase, and a closure phase.
[0012] The requirements from the build are specific to the performance parameters and functionalities that are associated with the build. For example, specifications to be fulfilled by a user interface (UI) of the build can be one requirement, and a desired central processing unit (CPU) usage by the build can be another. These requirements are often defined by a client for whom an application is being developed and are identified and documented by, say, a business analyst. In certain cases, non-functional requirements, such as the requirements for performance testing of the build, may also be specified. Documentation of the requirements may involve listing the performance as well as the functional requirements of the build in a requirements specification document. These requirements are typically communicated to a development team that creates a program code, i.e., the build, which conforms to these requirements, and simultaneously to a testing team that is responsible for testing the fulfillment of these requirements.
[0013] A series of activities performed by the testing team to ensure that a given build fulfills a set of predefined requirements is referred to as the software testing lifecycle. The software testing lifecycle begins with a planning phase. The planning phase of the software testing lifecycle involves requirements modeling, preparation of test plans, etc. During requirement modeling, a plurality of requirements, such as performance and functional requirements to be fulfilled by the build, are identified. Typically, this process requires an expert to analyze the requirements and is largely manual.
[0014] During test plan preparation, the testing team defines a process for the testing to
be done on the build. For example, the test plan preparation may involve defining the steps and the approach that are to be followed and a timeline that needs to be maintained for timely completion of the software testing lifecycle. Conventionally, this process is again carried out manually.
[0015] Subsequent to the planning phase, the design phase begins in the software testing
lifecycle. The design phase, for example, involves preparing test scenarios, test data, and creating test cases based on the test scenarios and the test data. The test cases are prepared in accordance with the requirements of the build. The test cases are optimized scenarios having certain input values to allow generation of certain desired output values when executed on the build. Conventionally, though some activities in this phase may be performed using automated tools, a significant amount of manual work is nevertheless involved.
[0016] Further to the preparation of the test cases, the test cases are executed on the build in the execution phase of the software testing lifecycle. The test execution involves executing the test cases on the build to test fulfillment of a given set of requirements, and identification of any defects or bugs occurring during the execution of the test cases on the build. Occurrence of a defect indicates failure of a requirement. Once the defects are identified, the defects are communicated to the development team so that the development team may fix or rectify the defects. The build may thereafter be modified by the development team to remove the defect(s). The build is tested again iteratively, until no more defects are identified. Once all the defects are fixed, the build is sent for release in the closure phase, and a test report for the whole testing process is created, containing test details such as the requirements, the test cases, and the defects identified during the testing process. Typically, the execution phase too, like the previous phases, involves manual effort.
[0017] It is to be understood that the phases described above are only by way of example; various alternatives to the same are possible. For example, one or more of these phases can be combined or considered as one phase. Also, a phase can be split into two or more phases and considered as separate phases in the software testing lifecycle.
[0018] Accordingly, along with the stringent quality requirements to be fulfilled by the software during the software testing activities, the significant amount of manual intervention involved in the iterative software testing activities utilizes an extensive amount of time and resources of the organization.
[0019] Conventionally, several approaches have been followed to automate one or more of the above mentioned phases of the software testing lifecycle. Although the extent of automation achieved in some phases may be high, for example, the automation level achieved in the execution phase may be about 60% within some organizations, considering an aggregate of all the phases of the software testing lifecycle, conventional approaches have been successful in achieving only around 20% automation of the software testing lifecycle. Further, the conventional approaches automate the software testing lifecycle by individually automating each of the phases. This, more often than not, results in a situation where an automated solution derived in one phase of the software testing lifecycle is incompatible with the next phase. In other words, conventional approaches have a compartmentalized approach towards automation of the software testing lifecycle, where each phase is treated individually for the purposes of automation.
[0020] In accordance with the present subject matter, methods and systems for automating the software testing lifecycle are described herein. The methods and systems achieve end-to-end automation of the software testing lifecycle by viewing the complete software testing lifecycle in its entirety for the purposes of automation. The systems and methods involve creating a digital test value chain that provides end-to-end automation of the software testing lifecycle.
[0021] In an implementation, key activities performed in each phase of the software testing lifecycle, which is also referred to as a test value chain, are treated as test value points. The system and method create a digital test value chain that automates the testing lifecycle by automating each of the key activities or test value points. For the purpose of such automation, a determination of all the various test value points involved in each phase of software testing is made. For example, activities such as creation of test scenarios and test data, test case creation, and test case execution may be identified as test value points.
[0022] In an implementation, the various test value points in each phase of the software testing lifecycle are identified. Once identified, the test value points are classified as either manual test value points or test value points that may be automated (referred to as automatable test value points, hereinafter). For each of the automatable test value points, an automated solution, interchangeably referred to as a boxed solution, is provided, ensuring that the automated solution provided for any test value point is compatible with the automated solutions provided for the other test value points. The automated solutions provided for the test value points are integrated to create the digital test value chain.
[0023] Fig. 1(a) illustrates a network environment 100 implementing a test automation
system 102, in accordance with an embodiment of the present subject matter. In one implementation, the network environment 100 can be a public network environment, including thousands of personal computers, laptops, various servers, such as blade servers, and other computing devices. In another implementation, the network environment 100 can be a private network environment, such as the network of an enterprise, with a limited number of personal computers, servers, laptops and other computing devices.
[0024] The test automation system 102 is communicatively connected to a plurality of user devices 104-1, 104-2, ... 104-N, collectively referred to as the user devices 104 and individually referred to as a user device 104, through a network 106. The test automation system 102 includes a digital test value chain system 108 and a plurality of testing systems 110-1, 110-2, ... 110-N, collectively referred to as the testing systems 110 and individually referred to as a testing system 110, associated with the digital test value chain system 108 through the network 106. In one implementation, a plurality of users, such as test analysts and subject matter experts (SMEs), may use the user devices 104 to communicate with the test automation system 102 for automating one or more phases of the software testing life cycle.
[0025] The digital test value chain system 108, the testing systems 110, and the user devices 104 may be implemented as any of a variety of conventional computing devices, including servers, desktop personal computers, notebook or portable computers, workstations, mainframe computers, mobile computing devices, and laptops. Further, in one implementation, the digital test value chain system 108 and the testing systems 110 can be a single integrated system. In said implementation, the various components of the digital test value chain system 108 and the testing systems 110 may be implemented as a part of the same computing device.
[0026] The test automation system 102 is connected to the user devices 104 over the
network 106 through one or more communication links. The communication links between the test automation system 102 and the user devices 104 are enabled through a desired form of communication, for example, via dial-up modem connections, cable links, digital subscriber lines (DSL), wireless or satellite links, or any other suitable form of communication.
[0027] The network 106 may be a wireless network, a wired network, or a combination thereof. The network 106 can also be an individual network or a collection of many such individual networks, interconnected with each other and functioning as a single large network, e.g., the Internet or an intranet. The network 106 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), and the Internet. The network 106 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), etc., to communicate with each other. Further, the network 106 may include network devices, such as network switches, hubs, and routers, for providing a link between the test automation system 102 and the user devices 104. The network devices within the network 106 may interact with the test automation system 102 and the user devices 104 through the communication links.

[0028] In one implementation, the test automation system 102 provides for automation
of test value points. In one implementation, the test automation system 102 receives test value points as input from a user, such as a test analyst. As indicated previously, the test value points are key activities performed in each phase of the software testing lifecycle. The test value points may include, for example, creation of test scenario and test data, test case creation, and test case execution. Subsequent to receiving the test value points, the test automation system 102 processes the received test value points to create a digital test value chain. For this purpose, the digital test value chain system 108 within the test automation system 102 communicates with the testing systems 110. Each of the testing systems 110 is configured to provide an automated solution to the digital test value chain system 108. In other words, each of the testing systems 110 is configured to automate a test value point associated with the digital test value chain system 108.
[0029] In one embodiment, the automated solutions may be configured as routines,
programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In said embodiment, one or more of the automated solutions may be implemented as an integrated testing system 110 or separate testing systems 110.
[0030] The interaction of the digital test value chain system 108 with the various
testing systems 110 to achieve automation of the various test value points is elaborated with respect to Fig. 1(b) and Fig. 1(c).
[0031] As shown in the Fig. 1(b), the digital test value chain system 108 includes one
or more processor(s) 112, a memory 114 coupled to the processor(s) 112, and interface(s) 116. The processor(s) 112 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 112 are configured to fetch and execute computer-readable instructions and data stored in the memory 114.
[0032] The interface(s) 116 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, an external memory, a printer, etc. Further, the interface(s) 116 may enable the digital test value chain system 108 to communicate over the network 106, and may include one or more ports for connecting the digital test value chain system 108 with other computing devices, such as the testing systems 110, web servers, and external databases. The interface(s) 116 may facilitate multiple communications within a wide variety of protocols and networks, such as a network, including wired networks, e.g., LAN, cable, etc., and wireless networks, e.g., WLAN, cellular, satellite, etc.
[0033] The memory 114 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 114 also includes modules 118 and data 120.
[0034] The modules 118 include routines, programs, objects, components, data
structures, etc., which perform particular tasks or implement particular abstract data types. The modules 118 further include an analysis module 122, an integration module 124, a reporting module 126, and other module(s) 128. The other module(s) 128 may include programs or coded instructions that supplement applications and functions on the digital test value chain system 108, for example, programs in the operating system.
[0035] The data 120, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 118. The data 120 includes automation feasibility rules 130, integration data 132, reporting data 134, and other data 136. The other data 136 may include data generated as a result of the execution of one or more modules in the other module(s) 128.
[0036] In one implementation, the analysis module 122 identifies automatable test value points from the plurality of test value points based on predefined automation feasibility rules 130. The automation feasibility rules 130, for example, map each of the test value points to the testing systems 110 to determine whether a testing system 110 that provides an automated solution exists for each test value point. Such testing systems 110 are also referred to as automated solutions 110. Thus, those test value points for which corresponding automated solutions 110 exist are identified as automatable test value points.
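By way of illustration only, automation feasibility rules of this kind can be pictured as a simple mapping from test value points to available automated solutions. The following Python sketch is a hypothetical rendering; the names FEASIBILITY_RULES and identify_automatable_points, and the string identifiers, are assumptions and not part of the specification.

```python
# Hypothetical sketch of automation feasibility rules 130: each test value
# point maps to the automated solution 110 that can automate it, or to None
# if the activity is considered a manual test value point.
FEASIBILITY_RULES = {
    "model_requirements":    "requirement_modeler",      # 110-1
    "create_test_scenarios": "test_scenario_generator",  # 110-2
    "create_test_data":      "test_data_generator",      # 110-3
    "create_test_cases":     "test_case_generator",      # 110-4
    "create_test_scripts":   "script_creator",           # 110-5
    "execute_test_scripts":  "script_executor",          # 110-6
    "plan_tests":            "test_planner",             # 110-7
    "fix_defects":           None,  # no automated solution: manual test value point
}

def identify_automatable_points(test_value_points):
    """Return only those test value points for which an automated solution exists."""
    return [point for point in test_value_points
            if FEASIBILITY_RULES.get(point) is not None]
```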

[0037] In one implementation, the automated solutions 110 include a requirement modeler 110-1, a test scenario generator 110-2, a test data generator 110-3, a test case generator 110-4, a script creator 110-5, a script executor 110-6, and a test planner 110-7. In one embodiment, each of the automated solutions 110 may be configured as a separate testing system 110-1, 110-2, 110-3, and so on. For example, the requirement modeler 110-1 may be configured as the testing system 110-1, the test scenario generator may be configured as the testing system 110-2, etc. In said implementation, various other automated solutions 110 can also be provided, configured as other testing system(s) 110-N, as shown in Fig. 1(b).
[0038] In one implementation, a user, such as an automated solution developer, may determine the feasibility of creating a new automated solution if an automated solution for any test value point does not exist. The new automated solution can be created and associated with the digital test value chain system 108.
[0039] Once the automatable test value points are identified, the integration module 124 within the digital test value chain system 108 associates automated solutions 110 with the automatable test value points. Thereafter, the integration module 124 integrates the automated solutions 110 to create a digital test value chain, where one automated solution 110 is linked to another automated solution 110, providing end-to-end automation. During integration, the integration module 124 determines the inter-operability of the automated solutions 110 based on the integration data 132 containing details of the various automated solutions. For example, the integration module 124 determines whether the output of one automated solution is readable or interpretable by another automated solution.
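A minimal sketch of this inter-operability determination follows, assuming a hypothetical shape for the integration data 132 in which each automated solution declares the input formats it accepts and the output format it emits; the entries shown are illustrative.

```python
# Assumed shape of the integration data 132; the specification does not
# prescribe these formats, so the entries below are illustrative only.
INTEGRATION_DATA = {
    "test_case_generator": {"accepts": {"test_scenarios", "test_data"},
                            "emits": "test_cases"},
    "script_creator":      {"accepts": {"keyword_statements"},
                            "emits": "automation_scripts"},
    "script_executor":     {"accepts": {"automation_scripts"},
                            "emits": "defect_report"},
}

def are_interoperable(producer, consumer):
    """True if the consumer can directly read what the producer emits."""
    emitted = INTEGRATION_DATA[producer]["emits"]
    return emitted in INTEGRATION_DATA[consumer]["accepts"]

# In this illustration the script creator cannot directly read test cases,
# so the integration module would convert them (see paragraph [0041]):
print(are_interoperable("test_case_generator", "script_creator"))  # False
print(are_interoperable("script_creator", "script_executor"))      # True
```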
[0040] When the result of the inter-operability determination indicates that the automated solutions 110 are inter-operable, the integration module 124 integrates the automated solutions 110 in the form of a chain, which is referred to herein as the digital test value chain. Once the digital test value chain is created, end-to-end automation can be achieved. Accordingly, any input received from the user devices 104 in the digital test value chain system 108 passes through a series of automated solutions 110 and final testing results are provided to the user.
[0041] On the other hand, when the result of the inter-operability determination indicates that the automated solutions are not inter-operable, in one implementation, the integration module 124 makes the automated solutions inter-operable or compatible. In said implementation, the integration module 124 receives the output of one automated solution and makes it compatible with the input of the next automated solution in the digital test value chain, for example, using various converters and compatibility tools known in the art. As an instance, if the output obtained from the test case generator 110-4 is not in a form readable or understandable by the script creator 110-5, the integration module 124 receives the output of the test case generator 110-4, converts it into a format readable by the script creator 110-5, and provides the same to the script creator 110-5 as input.
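Such a conversion step might be sketched as below; the converter registry and the rendering of test cases into English-like keyword statements (anticipating paragraph [0047]) are hypothetical assumptions, not the claimed implementation.

```python
# Hypothetical converter: render structured test cases as English-like
# keyword statements of the kind a script creator might accept.
def test_cases_to_keywords(test_cases):
    """Convert structured test cases into keyword statements."""
    lines = []
    for case in test_cases:
        lines.append(f"OPEN {case['screen']}")
        for field, value in case["inputs"].items():
            lines.append(f"ENTER {field} {value}")
        lines.append(f"VERIFY {case['expected']}")
    return "\n".join(lines)

# Registry of converters keyed by (emitted format, accepted format).
CONVERTERS = {
    ("test_cases", "keyword_statements"): test_cases_to_keywords,
}

def adapt(output, emitted, accepted):
    """Convert a producer's output into a format the consumer accepts."""
    if emitted == accepted:
        return output
    converter = CONVERTERS.get((emitted, accepted))
    if converter is None:
        raise ValueError(f"no converter from {emitted!r} to {accepted!r}")
    return converter(output)
```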
[0042] A digital test value chain 140 according to an embodiment of the present
subject matter is shown in the Fig. 1(c). In said embodiment, the digital test value chain 140 comprises the automated solutions 110 including the requirement modeler 110-1, the test scenario generator 110-2, the test data generator 110-3, the test case generator 110-4, the script creator 110-5, the script executor 110-6, and test planner 110-7.
[0043] In said embodiment, the integration module 124 receives business requirement
data as input from a user device 104, say, the user device 104-1. Upon receiving the business requirement data, the integration module 124 provides the business requirement data as input to the requirement modeler 110-1. The business requirement data may include a plurality of requirements such as performance and functional requirements to be fulfilled by the build. These requirements are specific to the performance parameters and functionalities that the build is required to fulfill.
[0044] The requirement modeler 110-1 organizes the received business requirement data. The requirement modeler 110-1, for example, may organize the business requirement data by classifying the business requirements into various templates. Further, the requirement modeler 110-1 may organize the business requirements in the form of a flowchart or a process flow diagram with a set of business constraints incorporated therein. The requirement modeler 110-1 is therefore an automated solution that receives raw business requirement data as input and generates organized business requirements in the form of templates, flowcharts, etc. In one implementation, the business requirement data may be processed and organized by a Subject Matter Expert (SME).
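As a rough illustration of such organization, raw requirement statements might be classified into simple templates as sketched below; the template fields and the keyword-based classification into functional and performance requirements are assumptions made for the example.

```python
# Hypothetical requirement templates: the specification only states that
# business requirements are classified into templates, so both the fields
# and the crude keyword classification here are illustrative.
PERFORMANCE_HINTS = ("cpu", "memory", "latency", "throughput", "response time")

def organize_requirements(raw_requirements):
    """Classify raw requirement statements into simple requirement templates."""
    organized = []
    for index, text in enumerate(raw_requirements, start=1):
        kind = ("performance"
                if any(hint in text.lower() for hint in PERFORMANCE_HINTS)
                else "functional")
        organized.append({"id": f"REQ-{index:03d}", "type": kind, "text": text})
    return organized

# Example usage with two raw requirement statements.
templates = organize_requirements([
    "The login page shall display an error for invalid credentials.",
    "CPU usage shall not exceed 70% under peak load.",
])
```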

[0045] The integration module 124 receives the organized business requirements thus generated by the requirement modeler 110-1 and provides these organized business requirements as input to the test scenario generator 110-2 and the test data generator 110-3. In case the requirement modeler 110-1 is not compatible with the test scenario generator 110-2 and/or the test data generator 110-3, the integration module 124 makes the test scenario generator 110-2 and/or the test data generator 110-3 compatible or inter-operable. For example, considering that the organized business requirements are received in the form of a template, and the test scenario generator 110-2 is configured to receive business process flow diagrams as input, the integration module 124 may convert the organized business requirements into business process flow diagrams and provide the business process flow diagrams as input to the test scenario generator 110-2.
[0046] Once the organized business requirements are provided as input to the test scenario generator 110-2 and the test data generator 110-3, the test scenario generator 110-2, which is also an automated solution, generates test scenarios based on the business requirements. Further, the test data generator 110-3 generates test data based on the business requirements. The test scenarios and the test data generated by the test scenario generator 110-2 and the test data generator 110-3, respectively, are provided as an input to the test case generator 110-4, by the integration module 124, for the purpose of generating test cases. The test cases generated by the test case generator 110-4 may be high level or optimized test scenarios. In one implementation, the integration module 124 may make the test scenario generator 110-2 and the test data generator 110-3 inter-operable with the test case generator 110-4, if the output of the test scenario generator 110-2 or the test data generator 110-3 is not interpretable by the test case generator 110-4.
[0047] The integration module 124 provides the test cases generated by the test case generator 110-4 as an input to the script creator 110-5. Upon receiving the test cases, the script creator 110-5 generates automation scripts based on the test cases. In case the script creator 110-5 accepts input as English-like statements or keywords, the integration module 124 converts the test cases into the English-like statements or keywords, and provides such statements or keywords as input to the script creator 110-5.

[0048] The automation scripts generated by the script creator 110-5 are thereafter provided as an input to the script executor 110-6 by the integration module 124. The script executor 110-6 executes these scripts for testing the build and identifies bugs or defects in the build. Further, the script executor 110-6 reports the identified bugs to the development team, for example, via an email. The development team fixes the bugs or defects in the build, and the iterative software testing is performed until all the bugs in the build are fixed. Once all the bugs or defects are fixed, the build may be considered ready for release.
[0049] At every stage of the testing process, the reporting module 126 receives the reporting data 134 generated throughout the stages, collates the reporting data 134, and provides the collated reporting data 134 as a test report in a suitable reporting format, such as a dashboard, Excel, or XML. Therefore, the test report is generated containing details including the business requirements that have been tested, the test cases executed for the same, and the defects identified during the testing process.
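A minimal sketch of this collation follows, assuming hypothetical keys for the reporting data 134 and showing only an XML rendering of the report.

```python
from xml.sax.saxutils import escape

# Hypothetical collation of stage-wise reporting data 134 into one XML test
# report; the section names and layout are assumptions for illustration.
def collate_test_report(reporting_data):
    """Collate stage-wise reporting data into a single XML test report."""
    parts = ["<test_report>"]
    for section in ("requirements", "test_cases", "defects"):
        parts.append(f"  <{section}>")
        for item in reporting_data.get(section, []):
            parts.append(f"    <item>{escape(str(item))}</item>")
        parts.append(f"  </{section}>")
    parts.append("</test_report>")
    return "\n".join(parts)
```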
[0050] In one implementation, the test planner 110-7 receives the generated test report, analyzes the test report, and identifies learning points from the test report for improvement of the future testing process. Further, the test planner 110-7 may also track various testing activities and their schedule. The test planner 110-7 may generate a test planning report containing learning points identified from the test report and/or common defects/bugs identified during the testing process. Such a test planning report can be accessed and analyzed by the planning and development team for the purpose of improving the testing process.
[0051] Fig. 2 illustrates a method 200 for automating a software testing life cycle, in accordance with an implementation of the present subject matter. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method or an alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
[0052] The method 200 is initiated at block 202, where automatable test value points are identified from a plurality of test value points. The test value points may be understood as granular level activities in a test value chain. As indicated above, the test value chain may be understood as a software testing lifecycle. The test value chain typically involves a series of phases, such as the planning phase, the design phase, the execution phase, and the closure phase. Each key activity that is carried out in the various phases of the test value chain is treated as a test value point. In one example, the test value points may be considered as those activities in the test value chain which result in a predefined output. For example, key activities, such as modeling requirements, creating a test plan, creating a test strategy, creating test scenarios, creating test data, creating high level or optimized test scenarios, identifying bugs, reporting bugs, and fixing bugs, result in the generation of a definite output, such as a list of requirements, test plans and strategy, test cases, and so on, and are considered as test value points.
[0053] A plurality of such test value points is received. Subsequent to receiving the test value points, each test value point is classified as either a manual test value point or an automatable test value point. The manual test value point may be understood as a test value point or key activity that is considered not feasible for automation. On the other hand, the automatable test value point may be understood as a test value point that is considered feasible for automation. For example, creating test scenarios, creating test data, creating test cases, identifying bugs, reporting bugs, preparing test summary reports, etc., may be classified as automatable test value points, while fixing defects may be identified as a manual test value point, i.e., not feasible for automation.
[0054] Thus, automatable test value points are identified. In one implementation, the analysis module 122 within the digital test value chain system 108 identifies those test value points as automatable test value points for which automated solutions 110 associated with the digital test value chain system 108 exist. As indicated previously, an automated solution automates a given test value point. The analysis module 122 may identify such automatable test value points based on the automation feasibility rules 130.
[0055] In one implementation, a user may be provided with an option of selecting, for automation, the automatable test value points identified by the analysis module 122. For example, a user may choose to automate creating test scenarios, creating test data, creating test cases, identifying bugs, reporting bugs, etc., and may reject preparing test summary reports for automation.

[0056] At block 204, automated solutions are associated with the automatable test value points. In one implementation, the integration module 124 within the digital test value chain system 108 associates an automated solution with each of the test value points. For instance, creating test scenarios may be an automatable test value point, and the integration module 124 may associate an automated solution 110, such as the test scenario generator 110-2, with such an automatable test value point to automatically create test scenarios based on business requirements. Thus, for each selected automatable key activity, an automated solution 110 is provided, i.e., each automatable activity has a one-to-one mapping with an automated solution 110.
[0057] At block 206, inter-operability of the automated solutions is determined, to ensure that the output of one automated solution can be provided as an input to another automated solution. In one implementation, the integration module 124 determines the inter-operability of the automated solutions based on the integration data 132, such as details regarding the input and output formats of the automated solutions 110.
[0058] At block 208, the automated solutions are integrated to create a digital test value chain based on the determining. In one implementation, when the automated solutions 110 are found to be inter-operable, the integration module 124 integrates the automated solutions to create the digital test value chain 140. In another implementation, when an automated solution 110 is found to be not compatible with another automated solution 110, the integration module acts as an intermediary to make such automated solutions 110 compatible or inter-operable. For example, the integration module 124 may convert the output of one automated solution 110 into a format acceptable by another automated solution 110. In another implementation, the integration module 124 may provide the result of the inter-operability determination to the automated solution development team, which may work on the automated solutions 110 to make them compatible. Thus, the integration module 124 creates a chain of automated solutions 110 that inter-operate with one another to achieve automation of the entire test value chain, thereby saving manual effort and resources of an enterprise. Such a chain of automated solutions is referred to as the digital test value chain 140.
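Taken together, the digital test value chain 140 can be pictured as a pipeline in which each automated solution 110 feeds the next, with a conversion step applied wherever the inter-operability determination fails. The composition below is a hypothetical sketch with trivial stand-in solutions, not the claimed implementation.

```python
# Hypothetical end-to-end composition of a digital test value chain: each
# stage is a callable automated solution paired with an optional adapter
# standing in for the integration module's format conversion.
def run_digital_test_value_chain(stages, business_requirement_data):
    """Run input through a chain of (solution, adapter) pairs in order."""
    data = business_requirement_data
    for solution, adapter in stages:
        data = solution(data)        # e.g. requirement modeler ... script executor
        if adapter is not None:
            data = adapter(data)     # make output readable by the next stage
    return data                      # e.g. the final defect/test report

# Example with trivial stand-in solutions and no adapters required.
stages = [
    (lambda raw: {"requirements": raw}, None),              # requirement modeler
    (lambda reqs: {"scenarios": [reqs]}, None),             # test scenario generator
    (lambda sc: {"report": "all scripts executed"}, None),  # downstream, abridged
]
result = run_digital_test_value_chain(stages, ["raw requirement text"])
```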
[0059] Although embodiments for automating the software testing lifecycle have been described in language specific to structural features and/or methods, it is to be understood that the present subject matter is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for automating the software testing lifecycle in an organization.

I/We claim:
1. A method of creating a digital test value chain (140) for automating a software testing
lifecycle, the method comprising:
identifying automatable test value points from a plurality of test value points based on automation feasibility rules (130), wherein each of the plurality of test value points corresponds to a key activity in a testing phase of the software testing life cycle;
associating automated solutions (110) to the automatable test value points, wherein at least one automated solution (110) is associated to each automatable test value point;
determining inter-operability of the automated solutions (110); and
integrating the automated solutions (110) based on the determining, to create the digital test value chain (140).
2. The method as claimed in claim 1, wherein the automatable test value points comprise modeling business requirements, creating test scenarios, creating test data, creating test cases, creating test scripts, executing test scripts, generating test report, and generating test planning report.
3. The method as claimed in claim 1, wherein the automated solutions (110) comprise a requirement modeler (110-1), a test scenario generator (110-2), a test data generator (110-3), a test case generator (110-4), a script creator (110-5), a script executor (110-6), and a test planner (110-7).
4. The method as claimed in claim 1, wherein the determining comprises identifying compatibility of an output generated by one automated solution amongst the automated solutions (110) with an input of next automated solution amongst the automated solutions (110).
5. The method as claimed in claim 1, wherein the integrating comprises:
converting output of one automated solution amongst the automated solutions (110) in a format interpretable by next automated solution amongst the automated solutions (110); and
providing the converted output as an input to the next automated solution.
6. A digital test value chain system (108) comprising:
a processor (112); and
a memory (114) coupled to the processor (112), the memory (114) comprising:
an analysis module (122) configured to identify automatable test value points from a plurality of test value points based on automation feasibility rules (130), wherein each of the plurality of test value points corresponds to a key activity in a testing phase of the software testing life cycle; and
an integration module (124) configured to:
associate automated solutions (110) to the automatable test value points, wherein at least one automated solution (110) is associated to each automatable test value point; and
integrate the automated solutions (110) to create a digital test value chain (140) providing automation of the software testing life cycle, wherein the automated solutions (110) are inter-operable.
7. The digital test value chain system (108) as claimed in claim 6 further comprises a
reporting module (126) configured to generate a test report containing test results associated with each of the automated solutions (110).
8. The digital test value chain system (108) as claimed in claim 6, wherein the automated solutions (110) comprise a requirement modeler (110-1), a test scenario generator (110-2), a test data generator (110-3), a test case generator (110-4), a script creator (110-5), a script executor (110-6), and a test planner (110-7).
9. The digital test value chain system (108) as claimed in claim 6, wherein the integration
module (124) is further configured to convert an output of one automated solution amongst the automated solutions (110) in a format compatible to be provided as an input to next automated solution amongst the automated solutions (110).
10. A computer-readable medium having embodied thereon a computer program for executing a method comprising:
identifying automatable test value points from a plurality of test value points based on automation feasibility rules (130), wherein each of the plurality of test value points corresponds to a key activity in a testing phase of a software testing life cycle;
associating automated solutions (110) to the automatable test value points, wherein at least one automated solution (110) is associated to each automatable test value point;
determining inter-operability of the automated solutions (110); and
integrating the automated solutions (110) based on the determining, to create a digital test value chain (140).

Documents

Application Documents

# Name Date
1 1199-MUM-2011-FORM 5(28-12-2011).pdf 2011-12-28
2 1199-MUM-2011-FORM 3(28-12-2011).pdf 2011-12-28
3 1199-MUM-2011-FORM 2(TITLE PAGE)-(28-12-2011).pdf 2011-12-28
4 1199-MUM-2011-FORM 2(28-12-2011).pdf 2011-12-28
5 1199-MUM-2011-FORM 18(28-12-2011).pdf 2011-12-28
6 1199-MUM-2011-FORM 1(28-12-2011).pdf 2011-12-28
7 1199-MUM-2011-DRAWING(28-12-2011).pdf 2011-12-28
8 1199-MUM-2011-DESCRIPTION(COMPLETE)-(28-12-2011).pdf 2011-12-28
9 1199-MUM-2011-CORRESPONDENCE(28-12-2011).pdf 2011-12-28
10 1199-MUM-2011-CORRESPONDENCE(28-12-2011)-.pdf 2011-12-28
11 1199-MUM-2011-CLAIMS(28-12-2011).pdf 2011-12-28
12 1199-MUM-2011-ABSTRACT(28-12-2011).pdf 2011-12-28
13 1199-MUM-2011-OTHERS [06-06-2018(online)].pdf 2018-06-06
14 1199-MUM-2011-FER_SER_REPLY [06-06-2018(online)].pdf 2018-06-06
15 1199-MUM-2011-COMPLETE SPECIFICATION [06-06-2018(online)].pdf 2018-06-06
16 1199-MUM-2011-CLAIMS [06-06-2018(online)].pdf 2018-06-06
17 Form-3.pdf 2018-08-10
18 Form-1.pdf 2018-08-10
19 Drawings.pdf 2018-08-10
20 ABSTRACT1.jpg 2018-08-10
21 1199-MUM-2011-FORM 26(17-6-2011).pdf 2018-08-10
22 1199-MUM-2011-FORM 1(9-5-2011).pdf 2018-08-10
23 1199-MUM-2011-FER.pdf 2018-08-10
24 1199-MUM-2011-CORRESPONDENCE(9-5-2011).pdf 2018-08-10
25 1199-MUM-2011-CORRESPONDENCE(17-6-2011).pdf 2018-08-10
26 1199-MUM-2011-HearingNoticeLetter-(DateOfHearing-16-12-2019).pdf 2019-11-18
27 1199-MUM-2011-Correspondence to notify the Controller (Mandatory) [25-11-2019(online)].pdf 2019-11-25
28 1199-MUM-2011-FORM-26 [12-12-2019(online)].pdf 2019-12-12
29 1199-MUM-2011-ORIGINAL UR 6(1A) FORM 26-201219.pdf 2019-12-23
30 1199-MUM-2011-Written submissions and relevant documents (MANDATORY) [31-12-2019(online)].pdf 2019-12-31
31 1199-MUM-2011-PatentCertificate03-02-2020.pdf 2020-02-03
32 1199-MUM-2011-RELEVANT DOCUMENTS [28-09-2021(online)].pdf 2021-09-28
33 1199-MUM-2011-RELEVANT DOCUMENTS [27-09-2022(online)].pdf 2022-09-27
34 1199-MUM-2011-RELEVANT DOCUMENTS [26-09-2023(online)].pdf 2023-09-26

Search Strategy

1 1199_27-11-2017.pdf

ERegister / Renewals

3rd: 04 Feb 2020

From 11/04/2013 - To 11/04/2014

4th: 04 Feb 2020

From 11/04/2014 - To 11/04/2015

5th: 04 Feb 2020

From 11/04/2015 - To 11/04/2016

6th: 04 Feb 2020

From 11/04/2016 - To 11/04/2017

7th: 04 Feb 2020

From 11/04/2017 - To 11/04/2018

8th: 04 Feb 2020

From 11/04/2018 - To 11/04/2019

9th: 04 Feb 2020

From 11/04/2019 - To 11/04/2020

10th: 04 Feb 2020

From 11/04/2020 - To 11/04/2021

11th: 20 Mar 2021

From 11/04/2021 - To 11/04/2022

12th: 23 Mar 2022

From 11/04/2022 - To 11/04/2023

13th: 04 Apr 2023

From 11/04/2023 - To 11/04/2024

14th: 08 Apr 2024

From 11/04/2024 - To 11/04/2025

15th: 08 Apr 2025

From 11/04/2025 - To 11/04/2026