
System And Method For Software Testing

Abstract: The present subject matter discloses a system and method for testing a software service implemented in SOA. The user selects a WSDL file and an operation to be tested, whereby the operation is associated with the software service. The system parses the WSDL file to generate a first Request XML message and a first Response XML message corresponding to the operation. The system creates a test data template comprising an input data sheet and an expected data sheet. Further, a test data sheet is created when the user populates the input data sheet and the expected data sheet with input test data and expected test data respectively. The input test data and the expected test data, along with the assertions configured in the test data sheet, are converted into test cases by the system. Further, the system executes the test cases to test the selected operation.


Patent Information

Application #: 2060/MUM/2014
Filing Date: 25 June 2014
Publication Number: 01/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ip@legasis.in
Parent Application:
Patent Number:
Legal Status:
Grant Date: 28 March 2025
Renewal Date:

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. NAYAK, Sita
Tata Consultancy Services Limited, IT/ITES Special Economic Zone, Plot - 35, Chandaka Industrial Estate, Patia, Chandrasekharpur, Bhubaneswar - 751024, Odisha, India
2. HATI, Rama Krishna
Tata Consultancy Services Limited, IT/ITES Special Economic Zone, Plot - 35, Chandaka Industrial Estate, Patia, Chandrasekharpur, Bhubaneswar - 751024, Odisha, India
3. KODERI, Sidharth
Tata Consultancy Services Limited, Vismaya Building, Infopark, Kusumagiri P.O. Kakkanad, Kochi 682030, Kerala, India
4. SUDHA, Sundararaman
Tata Consultancy Services Limited, (ETL Infrastructure Services Ltd. SEZ) 200 Ft. Thoraipakkam - Pallavaram Ring Road, Thoraipakkam, Chennai - 600096, Tamil Nadu, India

Specification

WE CLAIM:

1. A method for testing a software implemented in a service oriented architecture (SOA), the method comprising:
enabling, by a processor, a user to select
a Web Services Description Language (WSDL) file comprising one or more operations associated with a software, and
an operation of the one or more operations to be tested;
parsing, by the processor, the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected;
creating, by the processor, a test data template comprising an input data sheet and an expected data sheet, wherein the input data sheet and the expected data sheet are created based upon the first Request XML message and the first Response XML message respectively;
receiving, by the processor, an input test data and an expected test data by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet, wherein the input test data and the expected test data are received based on test requirements of the operation to be tested, and wherein the input test data comprises test scenarios, and wherein the expected test data comprises expected output for the test scenarios;
converting, by the processor, the input test data and the expected test data into a second Request XML message and a second Response XML message respectively;
creating, by the processor, one or more test cases based upon the second Request XML message and the second Response XML message; and
executing, by the processor, the one or more test cases in order to test the operation associated with the software.

2. The method of claim 1, wherein the executing further comprises verifying the expected output corresponding to each test scenario.

3. The method of claim 1 further comprises validating the first Request XML message and the first Response XML message corresponding to the operation selected.

4. The method of claim 1, wherein the test data template further comprises a template detail sheet and a namespace sheet, and wherein
the template detail sheet further comprises a template name, a number of operations associated with the test data template, an operation sequence, an operation name, a binding name, a port type name, and at least one of a WSDL file name and a WSDL uniform resource locator (URL), and
the namespace sheet further comprises an operation sequence column, a message column, a prefix column, and a uniform resource identifier (URI) column, wherein
the operation sequence column comprises the operation sequence identifying the operation,
the message column comprises a plurality of elements, wherein each element, of the plurality of elements, is either associated with the first Request XML message or the first Response XML message,
the prefix column comprises a plurality of element prefixes corresponding to the plurality of elements such that each element has a corresponding element prefix, and
the URI column comprises a plurality of URIs corresponding to the plurality of elements such that each element has a corresponding URI.

5. The method of claim 4, wherein for each element of the plurality of elements a unique address is generated, and wherein the unique address generated is represented as XPath in the input data sheet and the expected data sheet of the test data template.

6. A system 102 for testing a software implemented in a service oriented architecture (SOA), the system 102 comprising:
a processor 202;
a memory 206 coupled to the processor 202, wherein the processor 202 executes a plurality of modules 208 stored in the memory 206, and wherein the plurality of modules 208 comprises:
a selection module 210 enabling a user to select,
a Web Service Description Language (WSDL) file comprising one or more operations associated with a software, and
an operation of the one or more operations to be tested;
a parsing module 212 to parse the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected;
a template creation module 214 to create a test data template comprising an input data sheet and an expected data sheet, wherein the input data sheet and the expected data sheet are created based on the first Request XML message and the first Response XML message respectively;
a receiving module 216 to receive an input test data and an expected test data by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to generate a test data sheet, wherein the input test data and the expected test data are populated based on the test requirements of the operation to be tested, and wherein the input test data comprises test scenarios, and wherein the expected test data comprises expected output for the test scenarios;
a test case creation module 218 to
convert the input test data and the expected test data into a second Request XML message and a second Response XML message respectively, and
create one or more test cases based upon the second Request XML message and the second Response XML message; and
an execution module 220 to execute the one or more test cases in order to test the operation associated with the software.

7. The system of claim 6 further comprising a reporting module 222 to verify the expected output corresponding to each test scenario.

8. The system of claim 6, wherein the test data template further comprises a template detail sheet and a namespace sheet, and wherein
the template detail sheet further comprises a template name, a number of operations associated with the test data template, an operation sequence, an operation name, a binding name, a port type name, and at least one of a WSDL file name and a WSDL uniform resource locator (URL), and
the namespace sheet further comprises an operation sequence column, a message column, a prefix column, and a uniform resource identifier (URI) column, wherein
the operation sequence column comprises the operation sequence identifying the operation,
the message column comprises a plurality of elements, wherein each element, of the plurality of elements, is either associated with the first Request XML message or the first Response XML message,
the prefix column comprises a plurality of element prefixes corresponding to the plurality of elements such that each element has a corresponding element prefix, and
the URI column comprises a plurality of URIs corresponding to the plurality of elements such that each element has a corresponding URI.

9. The system of claim 8, wherein for each element of the plurality of elements a unique address is generated, and wherein the unique address generated is represented as XPath in the input data sheet and the expected data sheet of the test data template.

10. The system of claim 6 further comprising a stubbing module 224 to create a stub by using the test data sheet created for the one or more test cases.

11. A non-transitory computer readable medium embodying a program executable in a computing device for testing a software implemented in a service oriented architecture (SOA), the program comprising:
a program code for enabling a user to select,
a Web Services Description Language (WSDL) file comprising one or more operations associated with a software, and
an operation of the one or more operations to be tested;
a program code for parsing the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected;
a program code for creating a test data template comprising an input data sheet and an expected data sheet, wherein the input data sheet and the expected data sheet are created based upon the first Request XML message and the first Response XML message respectively;
a program code for receiving an input test data and an expected test data by a user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet, wherein the input test data and the expected test data are populated based on the test requirements of the operation to be tested, and wherein the input test data comprises test scenarios, and wherein the expected test data comprises expected output for the test scenarios;
a program code for converting the input test data and the expected test data into a second Request XML message and a second Response XML message respectively;
a program code for creating one or more test cases based upon the second Request XML message and the second Response XML message; and
a program code for executing the one or more test cases in order to test the operation associated with the software.

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
SYSTEM AND METHOD FOR SOFTWARE TESTING

Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India

The following specification describes the invention and the manner in which it is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application does not claim priority from any patent application.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to a method and a system for software testing and, more specifically, to testing software services by creating test cases.
BACKGROUND
[003] In today’s software testing environment, well qualified testers are required for testing and/or validating software applications. The testers are expected to be well versed in the usage of different languages like Extensible Markup Language (XML), JavaScript Object Notation (JSON) and the like. Sometimes, a user who may be a customer having no knowledge of these technologies/languages may wish to perform tests in order to check the functionalities/operations of a software application’s underlying web service. Unfortunately, due to lack of technical knowledge, the user is not able to perform the test, and hence he/she depends on a professional tester having the desired skill-sets to perform service testing. Thus, it becomes a challenge for such non-technical testers (usually the customers) to test the services.
[004] Also, for performing service testing, appropriate service testing tools or testing frameworks are required to be installed on each individual user’s device at the user location. Installation of testing tools or frameworks locally upon the user devices leads to high installation effort and cost which has to be borne by the users or organizations facilitating the software testing. As a part of the installation and setup, significant time and effort is spent on pre- and post-installation configuration activities on all user devices where the testing tool or testing framework gets installed. Also, installations consume storage space on all individual user devices.
SUMMARY
[005] This summary is provided to introduce aspects related to systems and methods for testing software, which are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for testing software implemented in a service oriented architecture (SOA) is disclosed. The system comprises a processor and a memory coupled to the processor for executing a plurality of modules stored in the memory. The plurality of modules comprises a selection module, a parsing module, a template creation module, a receiving module, a test case creation module, an execution module, a reporting module and a stubbing module. The selection module enables a user to select a Web Service Description Language (WSDL) file comprising one or more operations associated with software. Further, the selection module enables the user to select an operation of the one or more operations to be tested. The parsing module parses the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected. The template creation module creates a test data template comprising an input data sheet and an expected data sheet. The input data sheet and the expected data sheet are created based on the first Request XML message and the first Response XML message respectively. Further, the receiving module receives an input test data and an expected test data by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet. The input test data and the expected test data are populated based on the test requirements of the operation to be tested, wherein the input test data comprises test scenarios, and wherein the expected test data comprises expected output for the test scenarios. Further, the test case creation module converts the input test data and the expected test data into a second Request XML message and a second Response XML message respectively. Further, the test case creation module creates one or more test cases based upon the second Request XML message and the second Response XML message. Further, the execution module executes the one or more test cases in order to test the operation associated with the software.
[007] In another implementation, a method for testing software implemented in service oriented architecture (SOA) is disclosed. The method may comprise enabling, by a processor, a user to select a Web Services Description Language (WSDL) file comprising one or more operations associated with software, and an operation of the one or more operations to be tested. The method may further comprise parsing, by the processor, the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected. The method may further comprise creating, by the processor, a test data template comprising an input data sheet and an expected data sheet. Further, the input data sheet and the expected data sheet are created based upon the first Request XML message and the first Response XML message respectively. The method may further comprise receiving, by the processor, an input test data and an expected test data by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet. Further, the input test data and the expected test data are populated based on the test requirements of the operation to be tested. Further, the input test data comprises test scenarios, and the expected test data comprises expected output for the test scenarios. The method may further comprise converting, by the processor, the input test data and the expected test data into a second Request XML message and a second Response XML message respectively. Further, the method may comprise creating, by the processor, one or more test cases based upon the second Request XML message and the second Response XML message. The method may further comprise executing, by the processor, the one or more test cases in order to test the operation associated with the software.
[008] Yet in another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for testing software implemented in a service oriented architecture (SOA) is disclosed. The program comprises a program code for enabling a user to select a Web Services Description Language (WSDL) file comprising one or more operations associated with software, and an operation of the one or more operations to be tested. The program further comprises a program code for parsing the WSDL file in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected. The program further comprises a program code for creating a test data template comprising an input data sheet and an expected data sheet. Further, the input data sheet and the expected data sheet are created based upon the first Request XML message and the first Response XML message respectively. Further, the program comprises a program code for receiving an input test data and an expected test data by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet. Further, the input test data and the expected test data are populated based on the test requirements of the operation to be tested. Further, the input test data comprises test scenarios, and the expected test data comprises expected output for the test scenarios. The program further comprises a program code for converting the input test data and the expected test data into a second Request XML message and a second Response XML message respectively. Further, the program comprises a program code for creating one or more test cases based upon the second Request XML message and the second Response XML message. The program further comprises a program code for executing the one or more test cases in order to test the operation associated with the software.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figure 1 illustrates a network implementation of a system for testing software implemented in service oriented architecture (SOA), in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system, in accordance with an embodiment of the present subject matter.
[0012] Figures 3A-3C illustrate an example of testing the software in detail, in accordance with an embodiment of the present subject matter.
[0013] Figure 4 illustrates a method for testing the software, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0014] Systems and methods for testing software, more particularly, software services implemented in a service oriented architecture (SOA), are described. The present disclosure relates to a service validation framework (SVF) for facilitating software service testing or SOA testing. The SVF may be installed upon a centralized server or the system disclosed in the present disclosure. Once the SVF is installed upon the system, there is no need to locally install the SVF upon user devices for performing the testing. The user (usually a tester or customer performing the software testing) may access the SVF installed upon the system through a web browser. Thus, extra installation cost and effort may be saved by eliminating the need for locally installing the SVF in the user environment. Also, the SVF is designed in such a manner that it provides simple user-friendly steps for performing the testing by non-technical users like customers or other users who may wish to perform the software testing. In general, such type of software testing is also known as functional testing, in which the user tests different functionalities/operations associated with the software service. These functionalities/operations are tested to verify whether the software service is meeting business requirements or any other specific requirements. These business requirements are generally based on the needs of customers. For example, considering the software service as “Purchase Order Service”, the operations associated with this service (i.e., Purchase Order Service) may be “place_order”, “order_status” and the like. The “Purchase Order Service” and its associated operations (place_order and order_status) are explained later in detail.
[0015] In order to initiate the testing process, at first, the user may login to the SVF installed upon the system. The system provides access to the SVF only to authenticated users. After the login, the system displays an interface for enabling the user to select a particular application of the SVF. The system also provides a home link, whereby by clicking on the home link the user can change the application at any time. In one example, the user may select “Service Validation” as one of the applications of the SVF. After selecting the application (i.e., Service Validation), the system may display various features associated with the application to the user. The features may include, but are not limited to, test data management, generate middleware test/stub, test suite management, middleware test management, stub service management, QC test project management, and reports.
[0016] The test data management feature may facilitate the user to create a test data template. For creating the test data template, at first, the system enables the user to select a Web Services Description Language (WSDL) file. The user may select the WSDL file either by a browsing option or by entering the uniform resource locator (URL) of the WSDL file. After selecting the WSDL file, a list of operations of the WSDL file is displayed by the system for user selection. The list of operations is associated with the software service being tested. For example, the operations (i.e., place_order and order_status) are associated with the software service (i.e., Purchase Order Service). In another example, if the software service is considered to be a “calculator”, then the list of operations displayed to the user may be addition, subtraction, multiplication, change length unit and the like. The system further enables the user to select one or more operations from the list of operations displayed to the user.
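By way of illustration only, the operation-listing step described above could be implemented with an open-source WSDL parsing API. The following minimal sketch uses the WSDL4J API (javax.wsdl); the class name and the WSDL file path are illustrative assumptions and not part of the present disclosure:

```java
import javax.wsdl.Definition;
import javax.wsdl.Operation;
import javax.wsdl.PortType;
import javax.wsdl.factory.WSDLFactory;
import javax.wsdl.xml.WSDLReader;

// Illustrative sketch: enumerate the operations declared in a WSDL file
// using the open-source WSDL4J API, one way a "list operations" step
// such as the one described above could be implemented.
public class WsdlOperationLister {
    public static void main(String[] args) throws Exception {
        WSDLReader reader = WSDLFactory.newInstance().newWSDLReader();
        // Accepts a local file path or a WSDL URL, matching the two
        // selection options (browse / enter URL) described above.
        Definition definition = reader.readWSDL("PurchaseOrderService.wsdl");
        for (Object portTypeObj : definition.getPortTypes().values()) {
            PortType portType = (PortType) portTypeObj;
            for (Object operationObj : portType.getOperations()) {
                Operation operation = (Operation) operationObj;
                System.out.println(operation.getName()); // e.g. place_order, order_status
            }
        }
    }
}
```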
[0017] Further, the system parses the WSDL file selected by the user to generate default XML messages (a request XML and a response XML) corresponding to the operation selected. If the default XML messages do not match the structure required to trigger a test case, the system also facilitates the user to edit the default XML messages, or may allow the user to replace the default XML messages completely. The default XML messages comprise a plurality of elements (also called XML elements). In the next step, the system may validate the default XML messages for their structure. The system notifies the user whether the default XML messages are valid or invalid, along with validation errors (if any). Further, the system allows the operation selected by the user to be added to the test data template only when the default XML message is found to be valid. After the validation, the system creates the test data template comprising an input data sheet, an expected data sheet, a template detail sheet, and a namespace sheet. The input data sheet and the expected data sheet may be created based upon the default XML messages (request XML and response XML). According to an embodiment of the present disclosure, the test data template may be created in an Excel format, and the input data sheet, the expected data sheet, the template detail sheet, and the namespace sheet may be represented in the form of worksheets of the Excel workbook.
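A minimal sketch of the validation step described above, assuming a plain well-formedness check with the standard Java XML parser (the class name and notification strings are illustrative assumptions):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.xml.sax.InputSource;
import org.xml.sax.SAXParseException;

// Illustrative sketch of a "Validate XML" step: a well-formedness check
// over an edited Request/Response XML message, reporting the position of
// the first error so the user can correct the edit.
public class XmlMessageValidator {

    public static String validate(String xmlMessage) {
        try {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setNamespaceAware(true); // SOAP messages rely on namespaces
            factory.newDocumentBuilder().parse(new InputSource(new StringReader(xmlMessage)));
            return "XML message is valid";
        } catch (SAXParseException e) {
            return "Invalid XML at line " + e.getLineNumber()
                 + ", column " + e.getColumnNumber() + ": " + e.getMessage();
        } catch (Exception e) {
            return "Invalid XML: " + e.getMessage();
        }
    }
}
```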
[0018] The creation of the test data template in the Excel format facilitates the user (a tester or a non-technical user having no/little knowledge of technologies/languages like JSON/XML) in populating the input data sheet and the expected data sheet with input test data and expected test data respectively. Since the default XML messages comprise XML elements (or elements), the input data sheet and the expected data sheet list all the XML elements (i.e., in the Excel sheet format). Thus, the conversion of the elements of the default XML messages into the Excel format (i.e., the test data template) helps the users to understand and populate the input test data and the expected test data in an easier manner rather than spending time to understand the default XML messages. After receiving the input test data and the expected test data into the test data template, the test data template is considered to be a test data sheet. The input test data and the expected test data are populated based on the test requirements of the operation to be tested. Further, the input test data comprises test scenarios, and the expected test data comprises expected output for the test scenarios. For example, for the operation “place_order” of the software service “Purchase Order Service”, the input test data, the expected test data, and conditions for each test scenario are shown below in table 1:
| Test Scenario   | Input test data | Expected test data | Conditions                                    |
|-----------------|-----------------|--------------------|-----------------------------------------------|
| Test scenario 1 | 3 Items added   | Success message    | Item 1, Item 2, and Item 3 are available      |
| Test scenario 2 | 3 Items added   | Fail message       | Item 1 present, Item 2 & Item 3 not available |
| Test scenario 3 | 3 Items added   | Error message      | Order placed having invalid data              |

Table 1: Test scenarios
[0019] In test scenario 1 (table 1), it may be observed that for the input test data “3 Items added”, the expected test data entered by the user may be “Success message” if the condition is fulfilled, i.e., Item 1, Item 2, and Item 3 are available. Similarly, in test scenario 2 it may be observed that, for the input test data “3 Items added”, the expected test data entered by the user may be “Fail message” if the condition is fulfilled, i.e., Item 1 is present while Item 2 and Item 3 are not available. Further, in test scenario 3 it may be observed that, for the input test data “3 Items added”, the expected test data entered by the user may be an “Error message” if the condition is fulfilled, i.e., the order is placed with invalid data. Further, it may be observed that, in the above test data template, a maximum of 3 Items is considered for placing the order, i.e., Item 1, Item 2, and Item 3.
[0020] After receiving the input test data and the expected test data in the input data sheet and the expected data sheet respectively, of the test data sheet (which is in Excel format), the system converts the input data sheet and the expected data sheet back into XML messages, thereby creating test cases. Further, the system executes the test cases created in order to test the operation (which is to be tested) of the software service. Further, the system may also verify the expected output corresponding to each test scenario. Further, the system may create mock services (i.e., stubs) at the click of a button by reusing the test data sheet created for the test cases, which includes the input test data and the expected test data, thereby allowing non-technical users to create the mock services (stubs), which is otherwise not possible without knowledge of a service stubbing tool and the scripting languages used within the SVF.
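As an illustrative sketch of reading a populated test data sheet back out of the Excel workbook, the open-source Apache POI API may be used. The sheet name "Input Data" and the column layout (XPath in column B, test case columns from column D onwards, test case names in the header row) are assumptions taken from the example tables later in this description:

```java
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

// Illustrative sketch: read the populated input data sheet out of the
// Excel workbook so that each test case's values can later be mapped
// back into an XML message. Sheet name and column positions are assumed.
public class TestDataSheetReader {

    public static Map<String, Map<String, String>> read(File workbookFile) throws Exception {
        DataFormatter formatter = new DataFormatter();
        // testCaseName -> (xpath -> value)
        Map<String, Map<String, String>> testCases = new LinkedHashMap<>();
        try (Workbook workbook = WorkbookFactory.create(workbookFile)) {
            Sheet sheet = workbook.getSheet("Input Data"); // assumed sheet name
            Row header = sheet.getRow(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) continue; // skip header row
                String xpath = formatter.formatCellValue(row.getCell(1)); // column B
                if (xpath.isEmpty()) continue;      // skip blank rows
                for (int col = 3; col < header.getLastCellNum(); col++) { // column D onwards
                    String testCase = formatter.formatCellValue(header.getCell(col));
                    String value = formatter.formatCellValue(row.getCell(col));
                    testCases.computeIfAbsent(testCase, k -> new LinkedHashMap<>())
                             .put(xpath, value);
                }
            }
        }
        return testCases;
    }
}
```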
[0021] While aspects of described system and method for software service testing may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0022] Referring to Figure 1, a network implementation 100 of system 102 for testing the software service is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 facilitates the testing of the software services along with their operations. Although the present subject matter is explained considering that the system 102 is implemented for testing the software services on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a tablet, a mobile phone, and the like. In one embodiment, the system 102 may be implemented in a cloud-based environment. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.
[0023] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0024] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions or modules stored in the memory 206.
[0025] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0026] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, compact discs (CDs), digital versatile discs or digital video discs (DVDs) and magnetic tapes. The memory 206 may include modules 208 and data 228.
[0027] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a selection module 210, a parsing module 212, a template creation module 214, a receiving module 216, a test case creation module 218, an execution module 220, a reporting module 222, a stubbing module 224, and other modules 226. The other modules 226 may include programs or coded instructions that supplement applications and functions of the system 102.
[0028] The data 228, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 228 may also include a WSDL database 230, an XML database 232, a test case database 234, and other data 236.
[0029] Referring now to Figures 3A-3C, an example for testing the software services is illustrated in detail, in accordance with an embodiment of the present subject matter. For facilitating the testing of the software services, the service validation framework (SVF) may be implemented in the system 102. The software services come with a set of functionalities/operations for meeting a business objective. For example, considering the software service under test as “Purchase Order Service”, the operations associated with this service (i.e., Purchase Order Service) may be “place_order”, “order_status” and the like. Further, different test scenarios may be associated with the operations of the software services. In the present example, Scenario 1 may be considered for the operation “place_order” of the software service “Purchase Order Service”. According to Scenario 1, the test cases and their descriptions are shown below in table 2.

| Test Case Name    | Test Case Description                                                                                                                                                                             | Test Data                             |
|-------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------|
| Pos_PO-Order_TC_1 | Place an order for an item which is in stock, and ensure that the service returns a valid orderId. The orderId field should be a number starting with 5.                                           | Quantity: 2; productName: I-Pad       |
| Pos_PO-Order_TC_2 | Place an order for an item which is in stock, and ensure that the service returns a valid orderId. The orderId field should be a number starting with 5.                                           | Quantity: 3; productName: Samsung S3  |
| Neg_PO-Order_TC_2 | Place an order for an item which is out of stock, and ensure that the service returns a message indicating that the item is out of stock. The orderId field should have the value “OUT OF STOCK”. | Quantity: 1; productName: I-Phone     |

Table 2: Scenario 1 for the operation place_order
[0030] According to embodiments of the present disclosure, the system 102 may provide a login interface to enable the user to login to the SVF installed upon the system 102. After logging into the SVF, at step 302, the selection module 210 of the system 102 may enable the user to select the Web Service Description Language (WSDL) file corresponding to the software service, i.e., the “Purchase Order Service”. The WSDL file may be selected either by a browsing option 302A or by entering the uniform resource locator (URL) 302B of the WSDL. The WSDL file selected in the present case is “PurchaseOrderService.wsdl”. Further, the WSDL file may be stored in a WSDL database 230 of the system 102. The selection module 210 further enables the user to select one or more operations by clicking the “list operations” button 302C on the display. In response to the clicking of the “list operations” button 302C, at step 304, a list of operations may be displayed for user selection. In the present example, the list of operations may be displayed as order (i.e., place_order) and order_status. In the present case, the user has selected the order or place_order operation from the display.
[0031] After selecting the operation (place_order), at step 306, the parsing module 212 of the system 102 parses the WSDL file in order to generate a first Request XML message 306A and a first Response XML message 306B corresponding to the operation selected (i.e., place_order). According to embodiments of the present disclosure, the first Request XML message and the first Response XML message may be generated with the help of existing open source Java application programming interfaces (APIs). In general, these Java APIs are configured to take in the WSDL file and the WSDL operation name as inputs to build default Request/Response XML messages corresponding to the WSDL operation passed. According to embodiments, if the first Request XML message 306A and the first Response XML message 306B do not match the message structure required for triggering test execution, the system 102 may further facilitate the user to edit/modify the first Request XML message 306A and the first Response XML message 306B. The system 102 may be further configured for validating the first Request XML message 306A and the first Response XML message 306B by providing a “Validate XML” button 306C. By clicking on the “Validate XML” button 306C, the system 102 performs the validation and notifies the user about the validity of the first Request XML message 306A and the first Response XML message 306B. According to embodiments, the validation may be performed by the system 102 to ensure that the edits/changes made by the user have not made the XML messages (the first Request XML message 306A and the first Response XML message 306B) invalid. If the XML messages are valid, the system 102 allows the operation (place_order) to be added to a test data template. The system 102 may further allow the user to add more operations to be tested in the test data template. According to embodiments of the present disclosure, the system 102 may allow a next operation to be added from a different WSDL file or from the same WSDL file. After adding the operation(s), at step 308, the template creation module 214 may create a test data template upon receiving a click action of the user on a “Generate Test Data Template” button 308A.
[0032] Once the test data template is created, the system 102 may facilitate the user to download the test data template. According to an embodiment of the present disclosure, the test data template may be created in an Excel format, making it easier for the user to understand the test data template and feed the data required for executing the test. Further, the test data template created comprises a template detail sheet, a namespace sheet, an input data sheet and an expected data sheet. The input data sheet and the expected data sheet are created based on the first Request XML message and the first Response XML message respectively.
[0033] The template detail sheet further comprises the template name, the number of operations associated with the test data template, the operation sequence, the operation name, the binding name, the port type name, and at least one of a WSDL file name and a WSDL uniform resource locator (URL). In one example, the template detail sheet, of the test data template, is shown below in table 3:
| Template Name            | PO-Order                  |
|--------------------------|---------------------------|
| Number of operations     | 1                         |
| Operation sequence       | 1                         |
| Operation Name           | place_order               |
| Binding name             | GareAdminWebServicePort   |
| Port type name           | GareAdminWebServicePort   |
| WSDL file name/WSDL URL  | PurchaseOrderService.wsdl |

Table 3: Template Detail Sheet
[0034] In table 3, the operation sequence indicates the sequence in which the operations may be executed. The operation name indicates the name of the operation (i.e., place_order in this case). Further, the binding name contains a SOAP binding name, the port type name contains the type of the port, and the WSDL file name/WSDL URL contains the WSDL file name (which is uploaded) or the entered WSDL URL from which the operation has been selected.
[0035] Further, the namespace sheet comprises an operation sequence column, a message column, a prefix column, and a uniform resource identifier (URI) column. In one example, the namespace sheet, of the test data template, is shown below in table 4.
| Operation Sequence | Message       | Prefix  | URI                    |
|--------------------|---------------|---------|------------------------|
| 1                  | Test Data     | soapenv | http://schemas.xmlsoap |
| 1                  | Test Data     | tem     | http://tempuri.org/    |
| 2                  | Expected Data | soapenv | http://schemas.xmlsoap |
| 2                  | Expected Data | tem     | http://tempuri.org/    |

Table 4: Namespace Sheet
[0036] In table 4, the operation sequence column comprises the operation sequence identifying the operation. The message column comprises a plurality of elements, whereby each element, of the plurality of elements, is either associated with the first Request XML message or the first Response XML message. Further, the prefix column comprises a plurality of element prefixes corresponding to the plurality of elements such that each element has a corresponding element prefix. Further, the URI column comprises a plurality of URIs corresponding to the plurality of elements such that each element has a corresponding URI.
[0037] After creating the test data template, the receiving module 216 creates a test data sheet by enabling the user to populate the input data sheet and the expected data sheet, of the test data template, with an input test data and an expected test data respectively. In the input data sheet, for each element of the plurality of elements present in the first Request XML message, a unique address is generated. The unique address generated may be represented as an XPath in the input data sheet. In one example, the input data sheet, of the test data template, is shown below in table 5.
| A                  | B                                                                                    | C                | D |
|--------------------|--------------------------------------------------------------------------------------|------------------|---|
| Operation sequence | XPath                                                                                | XML field        |   |
| 1                  | //soapenv:Envelope(1)/soapenv:Body(1)/exam:PurchaseOrderType(1)/exam:quantity(1)     | exam:quantity    |   |
| 2                  | //soapenv:Envelope(1)/soapenv:Body(1)/exam:PurchaseOrderType(1)/exam:productName(1)  | exam:productName |   |

Table 5: Input data sheet
[0038] In table 5, the XPath column, i.e., column B of the input data sheet, indicates the unique addresses of the elements (XML elements) present in the first Request XML message. For creating the test data sheet, from column D onwards, the system 102 may allow the user to populate input test data corresponding to the elements for the various test cases identified (as per table 2). The test case names may be entered by the user in the column headers of the input data sheet. Further, the input test data populated in columns D, E, and F (of table 6), of the input data sheet, indicates the test scenarios (as shown in table 1 above). According to embodiments, the input test data may be populated based on the test requirements of the operations to be tested, whereby the input test data comprises different test scenarios. After populating the input test data, the test data sheet is created, which is shown below in table 6.
| C                | D                 | E                 | F                 |
|------------------|-------------------|-------------------|-------------------|
| XML field        | Pos_PO-Order_TC_1 | Pos_PO-Order_TC_2 | Pos_PO-Order_TC_3 |
| exam:quantity    | 2                 | 3                 | 1                 |
| exam:productName | I-Pad             | Samsung S3        | I-Phone           |

Table 6: Test data sheet (i.e., input test data populated in the input data sheet)
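An illustrative sketch of how the unique XPath-style addresses in column B may be derived from the first Request XML message: walk the parsed message and emit, for every leaf element, a path of "name(occurrence)" segments in the style of table 5 (the class and method names are illustrative assumptions):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

// Illustrative sketch: generate a unique "name(occurrence)" address for
// every leaf element of a Request XML message, in the style of Table 5.
public class XPathAddressGenerator {

    public static List<String> leafAddresses(String xml) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        Document doc = factory.newDocumentBuilder()
                              .parse(new InputSource(new StringReader(xml)));
        List<String> addresses = new ArrayList<>();
        walk(doc.getDocumentElement(), "/", addresses);
        return addresses;
    }

    private static void walk(Element element, String parentPath, List<String> out) {
        // 1-based occurrence index among same-named preceding siblings.
        int occurrence = 1;
        for (Node sibling = element.getPreviousSibling(); sibling != null;
                 sibling = sibling.getPreviousSibling()) {
            if (sibling.getNodeType() == Node.ELEMENT_NODE
                    && sibling.getNodeName().equals(element.getNodeName())) {
                occurrence++;
            }
        }
        String path = parentPath + "/" + element.getNodeName() + "(" + occurrence + ")";
        boolean hasChildElement = false;
        for (Node child = element.getFirstChild(); child != null; child = child.getNextSibling()) {
            if (child.getNodeType() == Node.ELEMENT_NODE) {
                hasChildElement = true;
                walk((Element) child, path, out);
            }
        }
        if (!hasChildElement) {
            out.add(path); // leaf element, e.g. .../exam:quantity(1)
        }
    }
}
```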
[0039] Similarly, in the expected data sheet, for each element of the plurality of elements present in the first Response XML message, a unique address is generated. The unique address generated may be represented as XPath in the expected data sheet. In one example, the expected data sheet, of the test data template, is shown below in table 7.
| B                                                                                             | C                     | D                   | E                     |
|-----------------------------------------------------------------------------------------------|-----------------------|---------------------|-----------------------|
| XPath                                                                                         | XML fields            | Store in properties | Assertion Option      |
| //soapenv:Envelope(1)/soapenv:Body(1)/exam:OrderConfirmationType(1)/exam:orderID(1)           | exam:orderID          |                     | Assert with wild card |
| //soapenv:Envelope(1)/soapenv:Body(1)/exam:OrderConfirmationType(1)/exam:expectedShipDate(1)  | exam:expectedShipDate |                     | Do not assert         |

Table 7: Expected data sheet
[0040] In table 7, the XPath column, i.e., column B of the expected data sheet, indicates the unique addresses of the elements (XML elements) present in the first Response XML message. Further, in the expected data sheet, the user may choose an assertion option (column E) for the elements present in the first Response XML message. According to embodiments, the assertion options entered are considered as keywords which may be used for creating test cases. From table 7, it can be seen that the “orderID” (column C) is asserted using the wildcard option, and the “expectedShipDate” (column C) is not asserted.
[0041] Further, the receiving module 216 may receive the expected test data along with the type of assertion (input by the user) into the expected data sheet. The test data sheet created after receiving the expected test data into the expected data sheet is shown below in table 8.
| E                     | F                 | G                 | H                 |
|-----------------------|-------------------|-------------------|-------------------|
| Assert option         | Pos_PO-Order_TC_1 | Pos_PO-Order_TC_2 | Pos_PO-Order_TC_3 |
| Assert with wild card | 5*                | 5*                | Out of Stock      |
| Do not assert         |                   |                   |                   |

Table 8: Test data sheet (i.e., expected test data populated in the expected data sheet)
[0042] It may be observed from table 8 that, for the first two test cases (i.e., Pos_PO-Order_TC_1 and Pos_PO-Order_TC_2), the assertion with the wildcard indicates that the orderID should start with the number “5”. Further, the third test case, i.e., Pos_PO-Order_TC_3, indicates that the orderID should have the value “Out of Stock”. Thus, the test data sheet is created (tables 6 and 8) after receiving the input test data and the expected test data into the input data sheet and the expected data sheet, of the test data template.
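A minimal sketch of the “Assert with wild card” option, assuming '*' matches any run of characters, so the expected value "5*" from table 8 passes for an orderID such as "5123" and fails for "OUT OF STOCK" (the class name is illustrative):

```java
import java.util.regex.Pattern;

// Illustrative sketch of a wildcard assertion: '*' in the expected value
// is treated as a glob matching any run of characters.
public class WildcardAssertion {

    public static boolean matches(String wildcardPattern, String actualValue) {
        // Escape regex metacharacters, then translate '*' into '.*'.
        String regex = Pattern.quote(wildcardPattern).replace("*", "\\E.*\\Q");
        return actualValue.matches(regex);
    }

    public static void main(String[] args) {
        System.out.println(matches("5*", "5123"));         // true
        System.out.println(matches("5*", "OUT OF STOCK")); // false
    }
}
```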
[0043] After creating the test data sheet, the test case creation module 218 may convert the input test data and the expected test data into a second Request XML message and a second Response XML message respectively. According to embodiments, the second Request XML message and the second Response XML message may be used to build test cases. Each test case consists of the second Request XML message created by mapping the data elements corresponding to that test case into the first Request XML message. The test case assertions are built using the assertion option and the expected data that is configured in the expected data sheet for that test case. According to embodiments of the present disclosure, the creation of the second Request XML message and the second Response XML message may be done with the help of existing open source Java APIs. These APIs are configured to take in a template XML (i.e., the first Request XML message or the first Response XML message), an XPath expression and a data element, and the API then maps the data element onto the template XML at the mapping location specified by the XPath expression.
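An illustrative sketch of this mapping step, using the standard Java XPath API to set a data element's value at a given location in a template XML. Standard XPath syntax (e.g., //*[local-name()='quantity']) is assumed here; the "(1)" address notation used in the data sheets would first be translated into such an expression:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

// Illustrative sketch: map one data element from the test data sheet onto
// the template XML at the location selected by an XPath expression.
public class TemplateXmlMapper {

    public static Document map(String templateXml, String xpathExpression, String value)
            throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);
        Document doc = factory.newDocumentBuilder()
                              .parse(new InputSource(new StringReader(templateXml)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        Node node = (Node) xpath.evaluate(xpathExpression, doc, XPathConstants.NODE);
        node.setTextContent(value); // e.g. quantity = "2" for Pos_PO-Order_TC_1
        return doc;
    }
}
```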
[0044] In the present example, the test cases created are “Pos_PO-Order_TC_1”, “Pos_PO-Order_TC_2”, and “Pos_PO-Order_TC_3”. Further, the test cases created may be stored in the test case database 234 of the system 102. According to embodiments of present disclosure, the stubbing module 224 of the system 102 also facilitates the user to create mock services (i.e., stubs) by using the test data sheet created for the one or more test cases. For creating the stubs, the user may browse the test data sheet already created by the system 102.
[0045] After creation of the test cases, the execution module 220 may be configured to execute the test cases created in order to test the operation associated with the software service. During the test execution, at step 310, a test execution page may be displayed by the system 102. For example, the test cases (“Pos_PO-Order_TC_1”, “Pos_PO-Order_TC_2”, and “Pos_PO-Order_TC_3”) created may be executed by selecting a checkbox 310A associated with each test case. After executing the test cases, a RunID corresponding to the test cases may be created. Further, after executing the test cases, the system 102 may provide the reporting module 222 to verify the expected output corresponding to each test scenario. After the verification is performed, at step 312, a report may be displayed to the user indicating the result of the test execution. In step 312, the result of the test execution is shown as “Passed”, indicating successful execution of all three test cases, i.e., “Pos_PO-Order_TC_1”, “Pos_PO-Order_TC_2”, and “Pos_PO-Order_TC_3”. Thus, the system 102 provides the testing of the software service implemented in the SOA.
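An illustrative sketch of the execution step, posting a second Request XML message to the service endpoint with the standard SAAJ API (javax.xml.soap) and returning the live response for assertion checking; the endpoint URL parameter is a placeholder:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPConnection;
import javax.xml.soap.SOAPConnectionFactory;
import javax.xml.soap.SOAPMessage;

// Illustrative sketch: send a test case's second Request XML message to
// the service over SOAP and capture the live response for comparison
// against the configured assertions (e.g. orderID asserted with "5*").
public class TestCaseExecutor {

    public static SOAPMessage execute(String secondRequestXml, String endpointUrl)
            throws Exception {
        SOAPConnection connection = SOAPConnectionFactory.newInstance().createConnection();
        try {
            SOAPMessage request = MessageFactory.newInstance().createMessage(
                    null,
                    new ByteArrayInputStream(secondRequestXml.getBytes(StandardCharsets.UTF_8)));
            return connection.call(request, endpointUrl);
        } finally {
            connection.close();
        }
    }
}
```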
[0046] Referring now to Figure 4, the method 400 of testing a software service implemented in a service oriented architecture (SOA) is shown, in accordance with an embodiment of the present subject matter. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 400 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0047] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 400 or alternate methods. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 400 may be considered to be implemented in the above described system 102.
[0048] At block 402, the system 102 may enable the user to select a WSDL file comprising one or more operations. The system 102 further enables the user to select an operation of the one or more operations to be tested.
[0049] At block 404, the WSDL file is parsed in order to generate a first Request XML message and a first Response XML message corresponding to the operation selected.
[0050] At block 406, a test data template is created, wherein the test data template comprises an input data sheet and an expected data sheet based upon the first Request XML message and the first Response XML message respectively.
[0051] At block 408, an input test data and an expected test data may be received by the user into the input data sheet and the expected data sheet, of the test data template, respectively in order to create a test data sheet. Further, the input test data and the expected test data are populated based on the test requirements of the operation to be tested. Further, the input test data comprises test scenarios, and the expected test data comprises expected output for the test scenarios.
[0052] At block 410, the input test data and the expected test data may be converted into a second Request XML message and a second Response XML message respectively.
[0053] At block 412, one or more test cases may be created based on the second Request XML message and the second Response XML message.
[0054] At block 414, the one or more test cases may be executed in order to test the operation associated with the software.
[0055] Although implementations for methods and systems for testing the software services have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for testing the software services implemented in the service oriented architecture (SOA).

Documents

Application Documents

| #  | Name                                                                               | Date       |
|----|------------------------------------------------------------------------------------|------------|
| 1  | 2060-MUM-2014-CLAIMS [21-02-2020(online)].pdf                                      | 2020-02-21 |
| 2  | 2060-MUM-2014-FORM 1(30-10-2014).pdf                                               | 2014-10-30 |
| 3  | 2060-MUM-2014-FORM-26 [06-03-2025(online)]-1.pdf                                   | 2025-03-06 |
| 4  | 2060-MUM-2014-IntimationOfGrant28-03-2025.pdf                                      | 2025-03-28 |
| 5  | 2060-MUM-2014-COMPLETE SPECIFICATION [21-02-2020(online)].pdf                      | 2020-02-21 |
| 6  | 2060-MUM-2014-CORRESPONDENCE(30-10-2014).pdf                                       | 2014-10-30 |
| 7  | 2060-MUM-2014-FORM-26 [06-03-2025(online)].pdf                                     | 2025-03-06 |
| 8  | 2060-MUM-2014-PatentCertificate28-03-2025.pdf                                      | 2025-03-28 |
| 9  | Form-3.pdf                                                                         | 2018-08-11 |
| 10 | 2060-MUM-2014-Written submissions and relevant documents [20-03-2025(online)].pdf  | 2025-03-20 |
| 11 | 2060-MUM-2014-Correspondence to notify the Controller [03-03-2025(online)].pdf     | 2025-03-03 |
| 12 | 2060-MUM-2014-DRAWING [21-02-2020(online)].pdf                                     | 2020-02-21 |
| 13 | 2060-MUM-2014-FER_SER_REPLY [21-02-2020(online)].pdf                               | 2020-02-21 |
| 14 | 2060-MUM-2014-US(14)-HearingNotice-(HearingDate-06-03-2025).pdf                    | 2025-02-21 |
| 15 | Form 2.pdf                                                                         | 2018-08-11 |
| 16 | 2060-MUM-2014-OTHERS [21-02-2020(online)].pdf                                      | 2020-02-21 |
| 17 | Figure of Abstract.jpg                                                             | 2018-08-11 |
| 18 | 2060-MUM-2014-FER.pdf                                                              | 2019-08-23 |
| 19 | Drawing.pdf                                                                        | 2018-08-11 |
| 20 | 2060-MUM-2014-CORRESPONDENCE(1-8-2014).pdf                                         | 2018-08-11 |
| 21 | 2060-MUM-2014-FORM 26(1-8-2014).pdf                                                | 2018-08-11 |
| 22 | 2060-MUM-2014-FORM 18.pdf                                                          | 2018-08-11 |

Search Strategy

1 2060mum2014searchstd_19-08-2019.pdf

ERegister / Renewals

3rd: 21 Jun 2025 (from 25/06/2016 to 25/06/2017)
4th: 21 Jun 2025 (from 25/06/2017 to 25/06/2018)
5th: 21 Jun 2025 (from 25/06/2018 to 25/06/2019)
6th: 21 Jun 2025 (from 25/06/2019 to 25/06/2020)
7th: 21 Jun 2025 (from 25/06/2020 to 25/06/2021)
8th: 21 Jun 2025 (from 25/06/2021 to 25/06/2022)
9th: 21 Jun 2025 (from 25/06/2022 to 25/06/2023)
10th: 21 Jun 2025 (from 25/06/2023 to 25/06/2024)
11th: 21 Jun 2025 (from 25/06/2024 to 25/06/2025)
12th: 21 Jun 2025 (from 25/06/2025 to 25/06/2026)