Abstract: Methods and systems for testing document composition system (105) migration are described. In one implementation, a method of preparing test data (132) for testing a document composition system (105) migration includes structuring unstructured historical data to generate structured historical data (124). The structured historical data (124) comprises at least one variant extracted from the unstructured historical data mapped to at least one corresponding business parameter. Further, the method includes determining a test coverage of the structured historical data (124) by mapping the structured historical data (124) to a plurality of test cases in a test coverage matrix (126). The test coverage is indicative of sufficiency of the structured historical data (124) for testing functionality of the document composition system (105) across the plurality of test cases. Based on the test coverage, the test data (132) is generated that is used for testing the document composition system (105).
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. Title of the invention: TESTING DOCUMENT COMPOSITION SYSTEM MIGRATION
2. Applicant(s)
NAME NATIONALITY ADDRESS
TATA CONSULTANCY SERVICES LIMITED    Indian    Nirmal Building, 9th Floor, Nariman Point, Mumbai-400021, Maharashtra, India
3. Preamble to the description
COMPLETE SPECIFICATION
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[0001] The present subject matter, in general, relates to the field of testing, in particular, to
systems and methods for test data preparation to test a document composition system migration.
BACKGROUND
[0002] In general, organizations, such as insurance, banking, finance, and government
organizations, generate millions of documents or printed communications every year. Such documents may include personalized letters, contracts, policies, forms, and so on. The content within these documents is often based on various details that may be related to the respective users. Such documents may be generated using document composition systems (interchangeably referred to as systems), which tend to automate the entire document generation process. As will be appreciated, with growing business, the IT infrastructure supporting the business or corresponding business function may periodically require upgradation. In such a case, the document generation process can be moved or migrated to a more robust and modern system or to a newer version of the existing system.
[0003] With the advent of technology, the document composition market has evolved with
the development of a variety of advanced document composition systems. Amongst a variety of
document composition systems available in the market, it is challenging for organizations to
differentiate between such document composition systems. Despite clear requirements, it is not
always easy for the organizations to select the right document composition system for a given
document application. More than just the system's design interface, functionality of the system
needs to be assessed in selecting the best suited document composition system.
[0004] Once a new or advanced system is selected, before implementing the selected system in
place, the selected system has to be tested in order to ensure that it meets all the business and functional requirements. The functional requirements may include data handling, document layout, and printer support, while the business requirements may include examining workflow capabilities and determining whether or not the selected system aligns with the document development process of the organization. Based on the results of the testing, the selected system can be implemented in place, and the entire document generation process can be migrated from the old document composition system to the newly implemented document composition system.
SUMMARY
[0005] This summary is provided to introduce concepts related to testing document
composition system migration and the concepts are further described below in the detailed description. This summary is neither intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0006] In one implementation, a method of preparing test data for testing a document
composition system migration includes structuring unstructured historical data to generate structured historical data. The structured historical data comprises at least one variant extracted from the unstructured historical data mapped to at least one corresponding business parameter. Further, the method includes determining a test coverage of the structured historical data by mapping the structured historical data to a plurality of test cases in a test coverage matrix. The test coverage is indicative of sufficiency of the structured historical data for testing functionality of the document composition system across the plurality of test cases. Based on the test coverage, the test data is generated that is used for testing the document composition system.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The detailed description is described with reference to the accompanying figures.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0008] Fig. 1 illustrates a network environment implementing a test data preparation
system, in accordance with an embodiment of the present subject matter.
[0009] Fig. 2 illustrates a method of preparing test data for testing a document
composition system migration, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0010] The present subject matter relates to systems and methods for testing a document
composition system migration. As described previously, various functional or business groups within an organization may generate a large number of documents for their clients. Such communications are typically generated by document composition systems which substantially automate the entire communication generation process. As will be appreciated, with growing
business, the IT infrastructure supporting the business or the relevant business function may periodically require upgrading. Similarly, generation of documents, hereinafter referred to as the document generation process, can be moved or migrated to another system. However, before such migrations can be implemented, the system that is selected for implementation has to be tested.
[0011] For example, insurance companies print numerous documents or communication
letters that are to be dispatched to customers. The documents can be related to various insurance
products or policies which the customers may hold. These documents may be generated based on
one or more pre-stored formats or templates. The document composition systems may then
populate the template based on a number of parameters, say formatting options, customer related
data, one or more rules, etc., which can be used for generating the documents.
[0012] In cases where the existing document composition system needs to be changed, for
example, upgraded, various templates, forms, internal tables, etc., have to be migrated from the existing document composition system to another document composition system. This other document composition system may be understood as a system that is selected for implementation in place of the existing document composition system and is hereinafter referred to as the selected document composition system. In such a case, the selected document composition system needs to be tested before migration. The testing is required to validate whether the selected document composition system is capable of generating documents in the same manner as the existing document composition system.
[0013] Conventional mechanisms for testing involve manually identifying business related
scenarios or test cases based on which the selected document composition system can be tested. In such a case, one or more Subject Matter Experts (SMEs) and/or test analysts take considerable time to come up with various business related scenarios to obtain a coverage matrix and test data for testing the new document composition system that is to be implemented. Once the test coverage matrix is in place, the test cases have to be triggered to determine whether the new document composition system is working in an expected or planned manner. In such a case, the test data has to be manually entered. For situations or systems involving a large number of cases, entering the required test data manually would be a tedious process. Furthermore, some of the test cases may not have been contemplated for the existing document composition system. In such a case, it is quite likely that test data for triggering such test cases may not be
provided, thus making the system susceptible to unexpected behavior in the likely event of such a condition occurring.
[0014] As would be evident to a person skilled in the art, such a process is heavily
dependent on the SMEs and test analysts for different businesses, making the system susceptible to human errors. Furthermore, manual analysis is a time consuming process. For example, a print request file takes a significant amount of time, for example, nearly half a day, to be matched with a test case owing to the complexity of the print request file. A test coverage matrix can be based on various business requirements and testing conditions. Such a test coverage matrix may be considered as a checklist in order to ensure that the functionality of the given software is checked in all possible combinations of test conditions. As will be noted, these conventional techniques require a lot of manual effort. Specifically, the historical data, which is also referred to as previous or old production data, for example, production data of the last two years, is required to be manually verified by comparing it with the test coverage matrix to ensure that all the testing conditions are met. As explained previously, these conventional techniques can be tedious and time consuming. Further, such techniques are prone to errors if strict timelines are provided for conducting such validations.
[0015] In accordance with the present subject matter, systems and methods for testing a
document composition system migration are described. In one implementation, document composition system, such as printing software is tested based on test data that is generated in turn based on the historical data. The historical data may include data that is related to the business function. For example, continuing with the insurance example provided above, historical data may include data related to one or more users who are clients of the insurance business function.
[0016] The historical data to be printed, in its native form, is typically in an unstructured or
unreadable format. For example, communications or letters to be printed may have data that is static and data that is dynamic, such as data that is related to the user to whom the communication may be addressed. In such a case, the communication to be printed is represented by a file format which is often only understandable by the document composition system. In such a case, the historical data is in an unstructured format. In one implementation, the historical data is analyzed or processed to generate structured historical data. The structured historical data is mapped and associated with one or more test cases within a test coverage matrix.
[0017] The test coverage matrix, in one implementation, can indicate the scope of the test
coverage of the system, say, the new document composition system. The test cases within the
test coverage matrix can be based on cases or business related scenarios based on which the new
document composition system is to be tested. In this manner, whenever a mapped test case is
selected, the associated structured historical data is picked as test data. As will be evident, the
associated structured historical data would be utilized as test data for the mapped test cases.
[0018] It may also be the case that not all test cases are mapped to the structured historical
data. In such cases, the presence of an unmapped test case would be indicative of one or more real-life scenarios which may not have occurred yet and, consequently, which the existing document composition system had not yet processed. For example, referring to the insurance example, clients may not have opted for an XYZ insurance policy. Accordingly, the existing document composition system may not yet have processed any communication letters corresponding to the XYZ insurance policy. In such a case, for the unmapped test cases, the relevant structured data can be defined by a user, such as a tester, a test analyst, or an SME. In such a case, when tested, the response of the new document composition system can be observed and corrective action can be taken when required.
[0019] As will be appreciated by a person skilled in the art, the test data for the test cases is
not manually entered but is based on the historical data. In the absence of manual entry of the
test data for the test cases within the test coverage matrix, the amount of effort required is
reduced drastically. Furthermore, conditions which had not yet occurred in relation to the
existing document composition system can be further tested and response of the new document
composition system be assessed, based on the structured data defined by the user. In this way,
full test coverage for testing the new document composition system can be achieved.
[0020] The manner in which the document composition system migration is tested shall be
explained in detail with respect to the Figs. 1-2. While aspects of systems and methods can be implemented in any number of different computing systems environments, and/or configurations, the embodiments are described in the context of the following exemplary system architecture(s). Furthermore, the present description has been provided with implementations that are specific to certain business functions or certain businesses. It would be appreciated that other implementations are also covered without deviating from the scope of the present subject matter.
[0021] Fig. 1 illustrates a network environment 100 implementing a test data preparation
system 102, in accordance with an embodiment of the present subject matter. In one implementation, the network environment 100 can be a public network environment, including thousands of personal computers, laptops, various servers, such as blade servers, and other computing devices. In another implementation, the network environment 100 can be a private network environment with a limited number of personal computers, servers, laptops and other computing devices.
[0022] The test data preparation system 102 (hereinafter referred to as system 102) is
communicatively connected to a plurality of user devices 104-1, 104-2,...104-N, collectively
referred to as the user devices 104 and individually referred to as a user device 104, through a
network 106. In one implementation, a plurality of users, such as test analysts and subject matter
experts (SMEs) may use the user devices 104 to communicate with the system 102 for preparing
test data, which can be utilized for testing a document composition system 105.
[0023] The system 102 and the user devices 104 may be implemented as any of a variety of
conventional computing devices, including, servers, a desktop personal computer, a notebook or
portable computer, a workstation, a mainframe computer, a mobile computing device, and a
laptop. Further, in one implementation, the system 102 may itself be a distributed or centralized
network system in which different computing devices may host one or more of the hardware or
software components of the system 102. In another implementation, the various components of
the system 102 may be implemented as a part of the same computing device.
[0024] The system 102 is connected to the user devices 104 over the network 106 through
one or more communication links. The communication links between the system 102 and the user devices 104 are enabled through a desired form of communication, for example, via dial-up modem connections, cable links, digital subscriber lines (DSL), wireless or satellite links, or any other suitable form of communication.
[0025] The network 106 may be a wireless network, a wired network, or a combination
thereof. The network 106 can also be an individual network or a collection of many such individual networks, interconnected with each other and functioning as a single large network, e.g., the Internet or an intranet. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and such. The network 106 may either be a dedicated network or a shared network,
which represents an association of the different types of networks that use a variety of protocols,
for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet
Protocol (TCP/IP), etc., to communicate with each other. Further, the network 106 may include
network devices, such as network switches, hubs, routers, for providing a link between the
system 102 and the user devices 104. The network devices within the network 106 may interact
with the system 102 and the user devices 104 through the communication links.
[0026] In one implementation, the system 102 receives historical data from a data source
107. Although the data source 107 is shown as an external repository in the figure, it is to be understood that the data source 107 may be an internal repository within the document composition system 105. The historical data may be understood as any data that is persistently stored and used to conduct business processes. For example, the historical data in banking organizations may include, but is not limited to, customer related data, such as customer name, mailing address, contact information, account number, type of account, and transactions made by the customers. The data source 107 may be a database or a repository containing historical data stored therein. For example, the data source 107 may be a repository associated with the existing or currently deployed document composition system 105.
[0027] Subsequent to receiving the historical data, the system 102 processes the received
historical data. For this purpose, the system 102 includes one or more processor(s) 108, a memory 112 coupled to the processor(s) 108, and interface(s) 110. The processor(s) 108 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) 108 are configured to fetch and execute computer-readable instructions and data stored in the memory 112.
[0028] The interface(s) 110 may include a variety of software and hardware interfaces, for
example, interface for peripheral device(s), such as a keyboard, a mouse, an external memory, a printer, etc. Further, the interface(s) 110 may enable the system 102 to communicate over the network 106, and may include one or more ports for connecting the system 102 with other computing devices, such as web servers and external databases. The interface(s) 110 may facilitate multiple communications within a wide variety of protocols and networks, such as a
network, including wired networks, e.g., LAN, cable, etc., and wireless networks, e.g., WLAN, cellular, satellite, etc.
[0029] The memory 112 may include any computer-readable medium known in the art
including, for example, volatile memory, such as static random access memory (SRAM) and
dynamic random access memory (DRAM), and/or non-volatile memory, such as read only
memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and
magnetic tapes. The memory 112 also includes modules 114 and data 116.
[0030] The modules 114 include routines, programs, objects, components, data structures,
etc., which perform particular tasks or implement particular abstract data types. The modules 114
further include an interpretation module 118, an execution module 120, and other module(s) 122.
The other module(s) 122 may include programs or coded instructions that supplement
applications and functions on the system 102, for example, programs in the operating system.
[0031] The data 116, amongst other things, serves as a repository for storing data
processed, received, and generated by one or more of the modules 114. The data 116 includes
structured historical data 124, test coverage matrix 126, mapping information 128, user-defined
data 130, test data 132, print request file 134, and other data 136. The other data 136 may include
data generated as a result of the execution of one or more modules in the other module(s) 122.
[0032] In one implementation, the interpretation module 118 of the system 102 receives
historical data from the data source 107. It is to be noted that the historical data retrieved from the data source 107 is typically present in an unstructured or unreadable format (hereinafter referred to as unstructured historical data). In one implementation, the unstructured historical data is in the form of print request files. The interpretation module 118 processes this unstructured historical data to generate test data 132 for testing the document composition system 105. In one implementation, the generated test data 132 is used for testing printing software within the document composition system 105. As a part of the processing, the interpretation module 118 analyses the unstructured historical data and organizes the unstructured historical data to generate structured historical data 124.
[0033] In an implementation, the interpretation module 118 organizes the unstructured
historical data in the form of a matrix or in a tabular format. In said implementation, the interpretation module 118 scans the unstructured historical data and identifies various key elements, say variants corresponding to a set of dynamic business parameters/variables. For
example, the business parameters or variables can be name, age, type of policy, date of policy,
date of maturity, and the key elements or variants could be the value corresponding to each of
these variables. Upon identification, the interpretation module 118 extracts those key elements
and arranges the key elements in the form of a matrix. The matrix may be indicative of the
parameters/variables as headers and identified key elements as columns under the respective
parameters/variables. The matrix therefore represents the structured historical data 124. In one
implementation, the interpretation module 118 displays the matrix to the user.
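The structuring step described in paragraph [0033] may be sketched as follows. This is a minimal, hypothetical illustration: the "key=value" record format, the parameter names, and the function name are assumptions for illustration only, since actual print request files are specific to the deployed document composition system.

```python
# Hypothetical sketch of the structuring performed by the interpretation
# module: extract variants for known business parameters from raw records
# and arrange them as a matrix (parameters as headers, variants as rows).
# The "key=value" record format is an assumption for illustration only.

BUSINESS_PARAMETERS = ["name", "age", "type_of_policy", "date_of_policy"]

def structure_records(raw_records):
    """Scan each raw record, pick out values for the recognized business
    parameters, and return a matrix with parameters as headers."""
    rows = []
    for record in raw_records:
        fields = dict(
            line.split("=", 1) for line in record.splitlines() if "=" in line
        )
        # Keep only values for the recognized business parameters.
        rows.append([fields.get(param, "") for param in BUSINESS_PARAMETERS])
    return {"headers": BUSINESS_PARAMETERS, "rows": rows}

raw = [
    "name=A. Rao\nage=42\ntype_of_policy=group\ndate_of_policy=2011-03-01",
    "name=B. Shah\nage=35\ntype_of_policy=individual\ndate_of_policy=2012-07-15",
]
structured = structure_records(raw)
```

The resulting matrix can then be rendered to the user in a tabular, viewable format.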
[0034] In one implementation, the interpretation module 118 maps the structured historical
data 124 to a plurality of test cases within the test coverage matrix 126. The interpretation
module 118 may store such information in relation to the mapping as mapping information 128.
As a result of the mapping, the test cases within the test coverage matrix 126 are associated with
the structured historical data 124. As will be evident, with such an association, whenever a
mapped test case, say one of the test cases within the test coverage matrix 126, is triggered, the
structured historical data 124 associated with the test case is picked as test data 132.
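The mapping of paragraph [0034] may be illustrated with a hedged sketch, under the assumption that each test case constrains a subset of the business parameters and that a structured record maps to a test case when it satisfies every such constraint; all names below are hypothetical:

```python
# Hypothetical sketch of mapping structured historical data 124 to test
# cases within the test coverage matrix 126. A structured record maps to
# a test case when it satisfies every attribute the test case specifies.

def map_to_test_cases(records, test_cases):
    """Return mapping information: test case name -> indices of records
    whose values match all attributes of that test case."""
    mapping = {}
    for name, attributes in test_cases.items():
        mapping[name] = [
            i for i, record in enumerate(records)
            if all(record.get(k) == v for k, v in attributes.items())
        ]
    return mapping

records = [
    {"type_of_policy": "group", "payment_on_schedule": "fixed"},
    {"type_of_policy": "group", "payment_on_schedule": "flexible"},
]
test_cases = {
    "TC1": {"type_of_policy": "group", "payment_on_schedule": "fixed"},
    "TC2": {"type_of_policy": "group", "payment_on_schedule": "flexible"},
}
mapping = map_to_test_cases(records, test_cases)
```

The returned mapping corresponds to the stored mapping information 128: triggering a mapped test case then amounts to looking up its associated record indices.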
[0035] The test coverage matrix 126 described herein may be indicative of the test case
attributes/parameters as headers and test cases as columns under the respective test case attributes. Continuing with the above mentioned insurance example, the attributes may be type of policy, payment on schedule, etc. One of the test cases includes type of policy as 'group' and payment on schedule as 'fixed', while another test case includes type of policy as 'group' and payment on schedule as 'flexible'.
[0036] In one implementation, a test coverage indicating the number of test cases that are
covered by the structured historical data 124 can be determined based on the mapping. The test coverage may be around 100%, which is considered as sufficient test coverage, when all the test cases have corresponding structured historical data 124 mapped to them. It may also be the case that not all test cases within the test coverage matrix 126 are mapped to the structured historical data 124. In such cases, the presence of an unmapped test case would be indicative of one or more real-life scenarios which may not have occurred yet and, consequently, which the existing composition system had not yet processed. In such a case, for the unmapped test cases, the relevant structured data can be defined by a user, such as a tester, a test analyst, or an SME, say a testing administrator. Such structured data defined by the user is hereinafter referred to as user-defined data 130.
[0037] The user-defined data 130 can then be associated with the relevant test cases within
test coverage matrix 126 by the interpretation module 118. In such a case, when tested, the response of the document composition system 105 can be observed and corrective action can be taken when required.
[0038] As will be appreciated by a person skilled in the art, the test data for the test cases
within the test coverage matrix 126 is not manually entered but is based on the historical data. In the absence of manual entry of the test data for the test cases, the amount of effort required is reduced drastically. Furthermore, conditions which had not yet occurred in relation to the existing document composition system can be further tested and response of the document composition system 105 be assessed, based on the pre-specified test data. In this way, sufficient test coverage, for example, around 100% test coverage, for testing the document composition system 105 can be achieved.
[0039] In one implementation, organizations with various business functions, such as
insurance companies that use a legacy protection system, create data print request files that are picked up by printing software. Conventionally, in order to generate the variety of customer specific documents, the print request files are manually analyzed to identify the appropriate file for each test case. However, the interpretation module 118 of the system 102 interprets each data print request file and creates a user friendly data sheet with all the variables for the documents by structuring the historical data and mapping it to the test cases.
[0040] Subsequent to mapping, the execution module 120 executes the test cases within the
test coverage matrix 126 to generate the test data 132. In one implementation, the execution module 120 executes all the test cases in the test coverage matrix 126. In said implementation, the execution module 120 picks up structured data including the structured historical data 124 and the user-defined data 130 associated with all the test cases as test data 132. In another implementation, the execution module 120 picks up structured data associated with user selected test cases as test data 132. For example, the user may select one or more tests cases within the test coverage matrix 126. In said implementation, the execution module 120 determines the set of the structured data that corresponds to the user selected test cases within the test coverage matrix 126 based on the mapping information 128. Once the set of the structured historical data is determined, the execution module 120 picks up structured data corresponding to only the user
selected test cases as test data 132. Such test data 132 obtained by the execution of user selected test cases is a reduced set or subset of available structured data.
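The selection step of paragraph [0040] may be sketched as below, assuming the mapping representation used earlier; the function and parameter names are hypothetical:

```python
# Hypothetical sketch of the execution step: pick the structured data
# (historical and user-defined) for either all test cases or only the
# user selected subset, based on the mapping information 128.

def pick_test_data(structured_data, mapping, selected_cases=None):
    """Return the test data 132 for the selected test cases; when no
    selection is given, data for every test case is picked."""
    names = selected_cases if selected_cases is not None else list(mapping)
    # Collect record indices once, even if several cases share a record.
    indices = sorted({i for name in names for i in mapping.get(name, [])})
    return [structured_data[i] for i in indices]

structured_data = [
    {"policy": "group"}, {"policy": "individual"}, {"policy": "group"},
]
mapping = {"TC1": [0, 2], "TC2": [1]}

full_set = pick_test_data(structured_data, mapping)          # all cases
subset = pick_test_data(structured_data, mapping, ["TC2"])   # user selection
```

As the example shows, the data picked for a user selection is a subset of the data picked when every test case is executed.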
[0041] The test data, thus obtained, can be converted into unstructured data that is
understandable by the document composition system 105. In one implementation, the test data is converted into a print request file 134. In said implementation, the print request file 134 generated by the execution of the user selected test cases represents a subset of the print request file 134 generated by the execution of all the test cases in the test coverage matrix. In testing, such as function testing, where only certain functionality of a software is required to be tested, a subset or reduced set of the print request file 134 can be generated. In other cases, the complete print request file 134 can be generated to test the document composition system 105 in all the business related scenarios.
[0042] Fig. 2 illustrates a method 200 of test data preparation for testing a document
composition system migration, in accordance with an embodiment of the present subject matter. The method may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types. The method may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communication network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0043] The order in which the method 200 is described is not intended to be construed as a
limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternative method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. The method 200 is described with reference to system 102, however the method 200 may be implemented in other systems, albeit with a few modifications, as will be understood by a person skilled in the art.
[0044] At block 202, the historical data is received by the test data preparation system 102.
As indicated previously, the historical data may be understood as any data that is persistently stored and used by professionals to conduct business processes. The historical data is typically
present in an unstructured format that is difficult to analyze and interpret. In an implementation,
the test data preparation system 102 retrieves all the historical data from the data source 107. The
data source 107 may be, for example, a repository associated with the currently deployed
document composition system. In another implementation, the test data preparation system 102
retrieves some of the historical data from the data source 107. For example, the test data
preparation system 102 retrieves the historical data for the last two years.
[0045] At block 204, the unstructured historical data is organized to generate structured
historical data. The interpretation module 118 analyzes the unstructured historical data and organizes the unstructured data in a structured format in order to generate structured historical data 124. For example, the interpretation module 118 organizes the unstructured historical data in form of a matrix or a table. The generated structured historical data 124 may be displayed to a user in a suitable viewable format.
[0046] At block 206, the structured historical data is mapped to a plurality of test cases
within the test coverage matrix. In one implementation, the interpretation module 118 maps the
structured historical data 124 to the test cases within the test coverage matrix 126. The mapping
of the structured historical data 124 to the test cases, for example, may determine a test coverage,
i.e., coverage of the structured historical data 124 with respect to the test cases within the test
coverage matrix 126. Data pertaining to the mapping may be stored as mapping information 128.
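One possible reading of this mapping step is sketched below, under the assumption that each test case is keyed by a condition on business parameters; the matching rule, the test-case identifiers, and the dictionary representation of the matrix are all illustrative choices, not requirements of the specification.

```python
# Hypothetical test cases keyed by the condition they exercise.
TEST_CASES = {
    "TC01": {"state": "CA", "product": "Auto"},
    "TC02": {"state": "NY", "product": "Home"},
    "TC03": {"state": "TX", "product": "Life"},
}

STRUCTURED_ROWS = [
    {"variant": "RenewalNotice", "state": "CA", "product": "Auto"},
    {"variant": "WelcomePack", "state": "NY", "product": "Home"},
]

def build_coverage_matrix(test_cases, rows):
    """Map each structured row to every test case whose condition it
    satisfies; the result doubles as the mapping information."""
    matrix = {tc_id: [] for tc_id in test_cases}
    for row in rows:
        for tc_id, cond in test_cases.items():
            if all(row.get(k) == v for k, v in cond.items()):
                matrix[tc_id].append(row)
    return matrix

coverage = build_coverage_matrix(TEST_CASES, STRUCTURED_ROWS)
```

A test case left with an empty row list (here, TC03) has no historical data covering it, which is exactly the situation examined at block 208.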
[0047] At block 208, a determination is made whether all the test cases have corresponding structured historical data mapped to them. It is to be noted that there might be certain conditions that have never occurred during operation of the currently deployed document composition system. Accordingly, no historical data is available for those conditions. Therefore, the interpretation module 118 determines whether all the test cases have corresponding structured historical data 124 mapped to them, or whether there are certain test cases for which the structured historical data 124 is missing.
[0048] If the determination indicates that not all the test cases have corresponding structured historical data mapped to them ("No" branch from block 208), the interpretation module 118 receives user-defined data from a user, such as a test analyst or a Subject Matter Expert (SME), at block 210. For example, the interpretation module 118 prompts the user, indicating that some of the test cases do not have corresponding structured historical data 124 mapped to them. The interpretation module 118 then receives the user-defined data corresponding to such test cases from the user and associates the received user-defined data with such test cases to complete the mapping. The determination at block 208 is performed iteratively until complete mapping is achieved.
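The iterative gap-filling at blocks 208 and 210 can be sketched as below. The dictionary-shaped matrix and the `user_data` lookup standing in for the user prompt are illustrative assumptions rather than the specification's data structures.

```python
def unmapped_cases(matrix):
    """Test cases with no structured historical data mapped to them."""
    return [tc_id for tc_id, rows in matrix.items() if not rows]

def fill_with_user_data(matrix, user_data):
    """Associate user-defined rows with unmapped test cases; repeat
    until every test case has at least one row (complete mapping)."""
    while unmapped_cases(matrix):
        for tc_id in unmapped_cases(matrix):
            if tc_id not in user_data:
                raise LookupError(f"user-defined data needed for {tc_id}")
            matrix[tc_id].append(dict(user_data[tc_id], source="user"))
    return matrix

# TC03 has no historical data, so user-defined data is requested for it.
matrix = {"TC01": [{"state": "CA"}], "TC03": []}
filled = fill_with_user_data(matrix, {"TC03": {"state": "TX", "product": "Life"}})
```

Tagging the filled-in rows (here with `source="user"`) lets the later execution step distinguish structured historical data 124 from user-defined data 130 while still treating both uniformly.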
[0049] On the other hand, if the determination indicates that the structured historical data 124 is mapped to all the test cases ("Yes" branch from block 208), the test data 132 is generated at block 212 for testing the document composition system 105 that is selected for migration. In one implementation, all the test cases in the test coverage matrix 126 are executed to generate the test data 132. During execution, the execution module 120 picks up the structured data, including the structured historical data 124 and the user-defined data 130, corresponding to all the test cases based on the mapping information 128. The picked data can thereafter be utilized as the test data 132 for testing purposes. In said implementation, the test data 132 can be converted into a test file, such as a print request file 134, which can be used directly for testing the document composition system 105, for example, new printing software within the document composition system 105 selected for implementing the migration.
[0050] In another implementation, the user is provided with an option to select one or more test cases of interest amongst the plurality of test cases in the test coverage matrix 126 for execution. For example, if the document composition system is to be tested only for certain test cases, then those test cases can be selected for generating a reduced set of test data 132. Once the test cases are selected, the execution module 120 picks up the structured historical data corresponding to the selected test cases (also referred to as user-selected test cases) as the test data 132 for testing the document composition system 105. In said implementation, the test data 132 can be converted into a test file, say a print request file 134. Such a print request file 134 represents a subset of the print request file 134 generated for all the test cases.
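Restricting execution to user-selected test cases reduces, in this reading, to a simple filter over the coverage matrix. The dictionary-shaped matrix is again an illustrative assumption, not the specification's structure.

```python
def generate_subset(matrix, selected):
    """Restrict execution to user-selected test cases, producing a
    reduced set of test data (a subset of the full print request file)."""
    return {tc_id: rows for tc_id, rows in matrix.items() if tc_id in selected}

matrix = {
    "TC01": [{"variant": "RenewalNotice"}],
    "TC02": [{"variant": "WelcomePack"}],
    "TC03": [{"variant": "ClaimForm"}],
}
subset = generate_subset(matrix, {"TC02"})
```

Feeding `subset` rather than `matrix` into the execution step yields a print request file that is, by construction, a subset of the one generated for all the test cases.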
[0051] Although implementations for test data preparation to test a document composition
system migration have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations for the test data preparation system 102.
I/We Claim:
1. A method of preparing test data (132) for testing a document composition system (105)
migration, the method comprising:
structuring unstructured historical data to generate structured historical data (124), wherein the structured historical data (124) comprises at least one variant extracted from the unstructured historical data mapped to at least one corresponding business parameter;
determining a test coverage of the structured historical data (124) by mapping the structured historical data (124) to a plurality of test cases in a test coverage matrix (126), wherein the test coverage is indicative of sufficiency of the structured historical data (124) for testing functionality of the document composition system (105) across the plurality of test cases; and
generating the test data (132) based on the test coverage, wherein the test data (132) is used for testing the document composition system (105).
2. The method as claimed in claim 1, further comprising:
identifying one or more unmapped test cases amongst the plurality of test cases based on the determining; and
associating user-defined data (130) for the one or more unmapped test cases.
3. The method as claimed in claim 1, wherein the generating the test data (132) comprises executing the plurality of test cases, wherein the executing is based at least on the structured historical data (124) and user-defined data (130).
4. The method as claimed in claim 1, wherein the generating the test data (132) comprises executing user selected test cases amongst the plurality of test cases, wherein the executing is based on at least one of the structured historical data (124) and user-defined data (130).
5. The method as claimed in claim 1, further comprising creating a print request file, for testing the document composition system (105), based on the generated test data (132).
6. A test data preparation system (102) comprising:
a processor (108); and
a memory (112) coupled to the processor (108), the memory (112) comprising: an interpretation module (118) configured to:
prepare structured historical data (124) based on extraction of at least one variant from unstructured historical data and mapping of the at least one variant to at least one corresponding business parameter; and
map the structured historical data (124) to a plurality of test cases in a test coverage matrix (126) to determine a test coverage of the structured historical data (124); and
an execution module (120) configured to generate test data (132) based on the test coverage, wherein the test data (132) is used for testing a document composition system (105).
7. The test data preparation system (102) as claimed in claim 6, wherein the interpretation
module (118) is further configured to:
identify one or more unmapped test cases amongst the plurality of test cases based on mapping information (128) obtained as a result of the mapping of the structured historical data (124) to the plurality of test cases in the test coverage matrix (126);
obtain user-defined data (130) for the one or more unmapped test cases; and associate obtained user-defined data (130) with the corresponding one or more unmapped test cases.
8. The test data preparation system (102) as claimed in claim 6, wherein the execution module (120) is configured to generate the test data (132) by executing the plurality of test cases, wherein the execution is based on the structured historical data (124) and user-defined data (130).
9. The test data preparation system (102) as claimed in claim 6, wherein the execution module (120) is configured to generate the test data (132) by executing selected test cases amongst the plurality of test cases, wherein the execution is based on at least one of the structured historical data (124) and user-defined data (130).
10. A computer-readable medium having embodied thereon a computer program for executing a method comprising:
structuring unstructured historical data to generate structured historical data (124), wherein the structured historical data (124) comprises at least one variant extracted from the unstructured historical data mapped to at least one corresponding business parameter;
determining a test coverage of the structured historical data (124) by mapping the structured historical data (124) to a plurality of test cases in a test coverage matrix (126), wherein the test coverage is indicative of sufficiency of the structured historical data (124) for testing functionality of a document composition system (105) across the plurality of test cases; and
generating test data (132) based on the test coverage, wherein the test data (132) is used for testing the document composition system (105).