
An Intelligent Testing System And Method For Automatic Software Application Testing

Abstract: Disclosed is an intelligent testing system and method for automatic software application testing. The disclosed invention provides an intelligent system and method for end-to-end automation of application testing by reducing dependency on testing components and binding test cases with intelligent data provisioning. Automation of testing and Quality Assurance is ensured for all layers of a multi-tier architecture: front end, middleware and backend.


Patent Information

Filing Date: 15 February 2013
Publication Number: 50/2014
Publication Type: INA
Invention Field: BIO-MEDICAL ENGINEERING
Email: ip@legasis.in
Grant Date: 2022-05-12

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. GANESAN, Siva Raman
Tata Consultancy Services Limited, (ETL Infrastructure Services Ltd. SEZ) 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
2. MEHROTRA, Vishal
Tata Consultancy Services Limited, Yantra Park , SDC 5 ODC G, Opp Voltas HRD Training Center, Pokhran II, Subhash Nagar, Thane (W) 400601, Maharashtra, India
3. R, Deivasigamani
Tata Consultancy Services Limited, (ETL Infrastructure Services Ltd. SEZ) 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
4. PASUPATHY, Vaithiya Subramani
Tata Consultancy Services Limited, (ETL Infrastructure Services Ltd. SEZ) 200 Ft. Thoraipakkam - Pallavaram Ring Road, Chennai - 600096, Tamil Nadu, India
5. BHATNAGAR, Suraj Kumar
Tata Consultancy Services Limited, TCS Awadh Park, Vibhuti Khand, Gomti Nagar Lucknow - 226 010, Uttar Pradesh
6. PARIHAR, Bhavesh
Tata Consultancy Services Limited, TCS Awadh Park, Vibhuti Khand, Gomti Nagar Lucknow - 226 010, Uttar Pradesh
7. SINGH, Surya
Tata Consultancy Services Limited, TCS Awadh Park, Vibhuti Khand, Gomti Nagar Lucknow - 226 010, Uttar Pradesh
8. MATHUR, Ratnesh
Tata Consultancy Services Limited, TCS Awadh Park, Vibhuti Khand, Gomti Nagar Lucknow - 226 010, Uttar Pradesh

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
AN INTELLIGENT TESTING SYSTEM AND METHOD FOR AUTOMATIC SOFTWARE APPLICATION TESTING
APPLICANT:
Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD
[001] The present subject matter described herein, in general, relates to a method and system for automated software application testing, and more particularly to a system and method that provide an intelligent and integrated test environment for evaluating software quality.
BACKGROUND
[002] Software testing, which investigates the quality of a software product or service under test, is often associated with multiple defects, or leakage of defects, where the software system testing process is inadequate. Manual testing, as is well understood, is time-intensive, error-prone, tedious and expensive, and is often seen as a tradeoff between reliability and cost/time, which makes it a generally unwelcome, though necessary, procedure. Further, software testing is also associated with inordinate delays due to non-availability of the integration components needed for system testing and its reliance on multiple applications for performing the task. This challenge becomes formidable especially when these components are provided by or controlled by a third party.
[003] Further, end-to-end software application development requires multiple sub-systems to be tested independently and then tested together after integration of the sub-systems. Testing such large applications that include multiple sub-systems makes the overall process of testing very error-prone, time consuming and expensive. An end-to-end software application system requires testing of various components during development, post-enhancement and post-maintenance. This includes, but is not limited to, front-end testing in which the user interface (UI) is tested with the rest of the sub-systems, middleware testing in which web services and application services (SOA based applications) are required to be tested, and backend testing in which database validation testing is required. The overall process of testing increases Time (Time to Market) and Cost (Return on Investment), and reduces Quality (bug-free nature) of the application.

[004] Comprehensive test automation is therefore a pressing need which, though it has been endeavored by many, remains largely unfulfilled. A complete testing environment that can provide end-to-end automation across the various layers of an application is what is understandably required. A complete testing environment that can ensure effective testing of newly developed applications as well as comprehensive testing of applications after enhancement and maintenance is needed. However, in visualizing such an automated testing environment, accurate test case preparation and execution can be difficult and costly. Typically, automatically managing test data and associating it with test cases to simulate a real time transaction/web services/applications/modules has not been actualized so far, and is observed to pose a continual technical challenge to the software testing industry.
[005] In the light of the foregoing problems, there exists a need for a system and a method that can provide a one-stop solution for automating the software testing environment and that can effectively and efficiently overcome the deficiencies and technical limitations described hereinabove.
SUMMARY
[006] This summary is provided to introduce aspects related to systems and methods for automatic software application testing, and the aspects are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[007] In one implementation, an intelligent testing system and method for automatic software application testing is disclosed. The computer-implemented system comprises a processor and a memory coupled to the processor for executing a plurality of modules present in the memory. The plurality of modules comprises a receiving module, a scenario generator module, a test data generator module, a test case execution module, a test environment management module, and a report generation module. The receiving module is configured to receive application data, wherein the application data may comprise a plurality of application requirements, an application design, application service artifacts, or a combination thereof. The scenario generator module, which is configured to generate a plurality of test scenarios, further comprises a user interface scenario generator module, a middleware scenario generator module, and a back-end scenario generator module. The test data generator module is configured to fetch test data based on the plurality of test scenarios generated. The test case execution module, which is configured to form and execute a plurality of test cases, further comprises an integration module and an execution module. The test environment management module is configured to provision a quality assurance environment. The report generation module is configured to produce a test case execution report.
[008] The user interface scenario generator module is configured to auto-generate a test scenario for user interface testing. The middleware scenario generator module is configured to auto-generate a test scenario for middleware testing. The back-end scenario generator module is configured to auto-generate a test scenario for backend testing.
[009] The integration module is configured to integrate the test data and the plurality of test scenarios to generate a plurality of test cases. The execution module is configured to execute the plurality of test cases.
[0010] In one implementation, a computer-implemented method for automatic software application testing comprises receiving application data. The application data received is used for auto-generating a plurality of test scenarios. The plurality of test scenarios generated is further used for fetching test data. The test data and the plurality of test scenarios are integrated to generate a plurality of test cases. Further, the plurality of test cases is executed in order to generate a test case execution report.
[0011] In one implementation, test data is fetched by assigning a plurality of tags
to each of the meta-data present in the test scenarios generated. The plurality of tags is mapped with a corresponding test schema meta-data table, wherein the test schema metadata table is pre-stored in the repository. The test data is then auto-generated based on the mapping.

BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0013] Figure 1 illustrates a network implementation of an intelligent testing system for automatic software application testing, in accordance with an embodiment of the present subject matter.
[0014] Figure 2 illustrates the intelligent testing system, in accordance with an
embodiment of the present subject matter.
[0015] Figure 3 illustrates a block diagram of the intelligent testing system, in
accordance with an embodiment of the present subject matter.
[0016] Figure 4 illustrates a block diagram of a scenario generator module in
intelligent testing system, in accordance with an embodiment of the present subject matter.
[0017] Figure 5 illustrates an exemplary test data generation in the intelligent
testing system, in accordance with an embodiment of the present subject matter.
[0018] Figure 6 illustrates an exemplary flowchart for creation of test database, in
accordance with an embodiment of the present subject matter.
[0019] Figure 7 illustrates an exemplary tagging method for test data generation in
the intelligent testing system, in accordance with an embodiment of the present subject matter.
[0020] Figure 8 illustrates a block diagram of a test case execution module in the intelligent testing system, in accordance with an embodiment of the present subject matter.
[0021] Figure 9 illustrates a method for intelligent testing, in accordance with an
embodiment of the present subject matter.
[0022] Figure 10 illustrates a method for auto-generating test data, in accordance
with an embodiment of the present subject matter.

[0023] Figure 11 illustrates system architecture of the intelligent testing system, in
accordance with an embodiment of the present subject matter.
[0024] Figure 12 illustrates a logical architecture of the intelligent testing system,
in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0025] The present invention generally relates to providing end-to-end automation in software testing, and more specifically the present invention includes an automated system and method enabling service virtualization, test data management and test environment management, along with UI, middleware and backend testing.
[0026] Furthermore, the intelligent testing system hereby serves as a solution to ease and automate the process of application testing by reducing the dependency on the availability of testing components and binding the test cases with intelligent data provisioning. The solution provisions the test environment, and provides business service virtualization, test scenario generation, and test scenario and test data automation for testing composite applications. The intelligent system thus takes care of end-to-end testing in a multi-tier architecture, right from the presentation tier to the application tier to the data tier.
[0027] The system supports automation of testing and, importantly, quality assurance (QA) during all phases of the Software Development Life Cycle (SDLC). Additionally, the system assists in building the test regression suite as well.
[0028] Systems and methods for automatic software application testing are described. The proposed system may perform software testing concurrently with software development. The present subject matter discloses an effective and efficient system that provides an end-to-end integrated environment for software testing lifecycle management. The system further discloses a comprehensive and automated solution to ease the process of application testing by reducing the dependency on component availability and binding the testing environment with intelligent data provisioning. The system provides integrated application testing covering front-end testing, middleware testing and back-end testing during new application development and enhancement(s), and comprehensive regression testing during application maintenance.

[0029] In one implementation, a computer-implemented system comprises a processor and a memory coupled to the processor for executing a plurality of modules present in the memory. The plurality of modules comprises a receiving module, a scenario generator module, a test data generator module, a test case execution module, a test environment management module, and a report generation module. The receiving module is configured to receive application data, wherein the application data may comprise a plurality of application requirements, an application design, application service artifacts, or a combination thereof. The scenario generator module, which is configured to generate a plurality of test scenarios, further comprises a user interface scenario generator module, a middleware scenario generator module, and a back-end scenario generator module. The test data generator module is configured to fetch test data based on the plurality of test scenarios generated. The test case execution module, which is configured to form and execute a plurality of test cases, further comprises an integration module and an execution module. The test environment management module is configured to provision a quality assurance environment. The report generation module is configured to produce a test case execution report.
[0030] The user interface scenario generator module is configured to auto-generate a test scenario for user interface testing. The middleware scenario generator module is configured to auto-generate a test scenario for middleware testing. The back-end scenario generator module is configured to auto-generate a test scenario for backend testing.
[0031] The integration module is configured to integrate the test data and the plurality of test scenarios to generate a plurality of test cases. The execution module is configured to execute the plurality of test cases.
[0032] In one aspect of the invention, a system for automating graphical user interface (UI) testing is disclosed. The system comprises generating test scenarios automatically based on the application data received. A plurality of test scenarios is generated based on the application data received and stored for subsequent use in the application lifecycle. Based on the test scenarios generated, user interface scripts are generated automatically. After the creation of the user interface scripts, relevant test data for the test scripts is fetched for test execution. The test scripts are further stored for subsequent use in the application lifecycle. The test scripts generated are then executed. Further, the system may provide a provision to monitor and support the test environment for maintaining a quality aspect in the system. Finally, the execution results are recorded.
[0033] In another aspect of the invention, a system for validating middleware testing is disclosed. The system comprises identifying the application data, wherein the application data may include, but is not limited to, service artifacts like Web Services Description Language files (WSDLs), Copybooks, XML Schema Definitions (XSDs), etc. Further, a plurality of test scenarios is generated based on the application data received and stored for subsequent use in the application lifecycle. After the creation of the test scenarios, relevant test data for test scripts is fetched for test execution. Based on the test data fetched, service test cases for all possible combinations are generated. Further, the service test cases generated are stored for subsequent use in the application life cycle. The system may provide a provision to monitor and support the test environment for maintaining a quality aspect in the system. Further, the service test cases are executed. Finally, the execution results are recorded.
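By way of a non-limiting illustration, the following Java sketch shows one way in which service test cases covering all possible combinations of input values could be enumerated; the operation name, parameter names and candidate values are assumptions introduced only for this example and are not taken from the specification.

import java.util.*;

// Illustrative sketch only: enumerates every combination of candidate input values
// for a hypothetical web-service operation, one combination per service test case.
public class ServiceCaseCombinations {

    // Builds the cartesian product of the given parameter value lists.
    static List<Map<String, String>> combinations(LinkedHashMap<String, List<String>> params) {
        List<Map<String, String>> cases = new ArrayList<>();
        cases.add(new LinkedHashMap<>());
        for (Map.Entry<String, List<String>> p : params.entrySet()) {
            List<Map<String, String>> next = new ArrayList<>();
            for (Map<String, String> partial : cases) {
                for (String value : p.getValue()) {
                    Map<String, String> extended = new LinkedHashMap<>(partial);
                    extended.put(p.getKey(), value);
                    next.add(extended);
                }
            }
            cases = next;
        }
        return cases;
    }

    public static void main(String[] args) {
        // Assumed parameters of a hypothetical "getBalance" operation.
        LinkedHashMap<String, List<String>> params = new LinkedHashMap<>();
        params.put("accountType", Arrays.asList("SAVINGS", "CURRENT"));
        params.put("currency", Arrays.asList("INR", "USD"));
        combinations(params).forEach(c ->
                System.out.println("Service test case for getBalance with inputs " + c));
    }
}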
[0034] In another aspect of the invention, a system for creating virtual models of sub-systems is disclosed. The system includes identifying the application data, wherein the application data may include, but is not limited to, service artifacts like WSDLs, Copybooks, XSDs, WADLs, etc. Based on the application data received, the virtualized service models are generated. The relevant data set is attached to the service models created.
[0035] While aspects of described system and method for automatic software
application testing may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0036] Referring now to Figure 1, a network implementation 100 of an intelligent testing system for automatic software application testing is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, an intelligent testing system for automatic software application testing is provided. In one embodiment, the intelligent testing system 102 provides end-to-end automation of application testing by reducing dependency on testing components and binding test cases with intelligent data provisioning. Automation of testing and Quality Assurance is ensured for all layers of a multi-tier architecture: front end, middleware and backend.
[0037] Although the present subject matter is explained considering that the intelligent testing system 102 is implemented as an intelligent testing system on a server, it may be understood that the intelligent testing system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It shall be understood that multiple users may access the intelligent testing system 102 through one or more user devices 104-1, 104-2... 104-N, collectively referred to as user 104 hereinafter, or through applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the intelligent testing system 102 through a network 106.
[0038] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0039] Referring now to Figure 2, the intelligent testing system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the intelligent testing system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries and/or any devices that

manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0040] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the intelligent testing system 102 to interact with a user directly or through the client devices 104. Further, the I/O interface 204 may enable the intelligent testing system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0041] The memory 206 may include any computer-readable medium known in
the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0042] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 212, a scenario generator module 214, a test data generator module 216, a test case execution module 218, a test environment management module 220, a report generation module 222 and other modules 224. The other modules 224 may include programs or coded instructions that supplement applications and functions of the intelligent testing system 102.
[0043] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 210 may also include a received database 226, a scenario generated database 228, a test data generated database 230, a test case executed database 232, a test environment management database 234, a report generated database 236 and other data 236. The other data 236 may include data generated as a result of the execution of one or more modules in the other modules 224.
[0044] In one implementation, at first, a user may use the client device 104 to access the intelligent testing system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the intelligent testing system 102. The working of the intelligent testing system 102 is explained in detail in the figures described below. The intelligent testing system 102 may be used for automatic software application testing. In order to perform software application testing, the intelligent testing system 102, at first, receives a plurality of application data. Specifically, in the present implementation, the plurality of application data is received by the receiving module 212.
[0045] In one implementation, the system 102 provides the ability to monitor the QA environment running as provisioned from the I/O interface 204. It also provides a snapshot of the QA process test cases executed in sub-module 104b, which ultimately assists in creating a one-stop dashboard for the end user for viewing QA metrics.
[0046] Referring now to figure 3, a block diagram 300 of the intelligent testing
system 102 is illustrated, in accordance with an embodiment of the present subject matter.
RECEIVING MODULE
[0047] In one implementation, the receiving module 212 is configured to receive a
plurality of application data. The plurality of application data may be obtained from various devices 104 interacting with the intelligent testing system 102. In one example, the plurality of application data may be obtained from the memory 206.
[0048] In one example, the plurality of application data received may include, but is not limited to, a plurality of application requirements, an application design, application service artifacts or a combination thereof.
[0049] In one example, the application data may include, but is not limited to, service artifacts like Web Services Description Language files (WSDLs), Copybooks, XML Schema Definitions (XSDs), SQLs, JDBC queries, JNDI, EJB, RMI, JSON, REST, etc. The data may be received through URLs, HTTP/HTTPS, TCP/IP, application servers and the like.
[0050] In one implementation, a data related to the receiving module 212 is stored
in the receiving database 226. In one example, the receiving database 226 may include but not limited to the data related to the application artifacts, WSDLs, XSDs, and the like.
SCENARIO GENERATOR MODULE
[0051] After the application data is received, the intelligent testing system 102
automatically generates a plurality of test scenarios based on the application data received using scenario generator module 214.
[0052] In one implementation, the plurality of test scenarios is generated using application data which consists of design artifacts and business process flows. The application data is fed into the scenario generator module, which generates all possible combinations of test iterations for the scenarios. The possible combinations are generated by a process which first identifies source and target states, and then, depending on the correlation between these states, finds all possible paths/flows using a switch-coverage algorithm and other standard APIs to generate test scenarios. Each scenario represents a flow from start to end in the application data. Together, the scenarios cover each and every branch of the application data.
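As a hedged illustration of the path-finding step described above, the short Java sketch below enumerates all start-to-end paths through a small state graph derived from application flows using a simple depth-first traversal; the flow graph and state names are invented for the example, and the sketch does not reproduce the switch-coverage algorithm itself.

import java.util.*;

// Illustrative sketch: depth-first enumeration of all start-to-end paths in a small
// state graph built from application flows. Each discovered path corresponds to one
// candidate test scenario. The graph below is a hypothetical example.
public class ScenarioPaths {

    static void paths(String node, String end, Map<String, List<String>> graph,
                      Deque<String> current, List<List<String>> result) {
        current.addLast(node);
        if (node.equals(end)) {
            result.add(new ArrayList<>(current));
        } else {
            for (String next : graph.getOrDefault(node, List.of())) {
                if (!current.contains(next)) {          // avoid revisiting states (cycles)
                    paths(next, end, graph, current, result);
                }
            }
        }
        current.removeLast();
    }

    public static void main(String[] args) {
        Map<String, List<String>> flow = Map.of(
                "Login", List.of("Search", "Profile"),
                "Search", List.of("Checkout"),
                "Profile", List.of("Checkout"));
        List<List<String>> scenarios = new ArrayList<>();
        paths("Login", "Checkout", flow, new ArrayDeque<>(), scenarios);
        scenarios.forEach(s -> System.out.println("Scenario: " + String.join(" -> ", s)));
    }
}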
[0053] In one implementation, the plurality of test scenarios comprises of a user
interface scenario data, a middleware scenario data, a back-end scenario data, or a combination thereof. The details regarding the user interface scenario data, the middleware scenario data and the back-end scenario data, is explained in the figure 4.
[0054] In one implementation, the plurality of test scenarios consists of at least one meta-data. The meta-data comprises at least one of a service name, an operation name, an element type, an element name, or a combination thereof.
[0055] In one implementation, the scenario generator module 214 also considers
specific business processes inputted as Business Process Model flows and other artifacts while creating test scenarios to ensure that all requirements of the application are covered.

[0056] In one implementation, a data related to the scenario generator module 214
is stored in the scenario generated database 228. In one example, the scenario generated database 228 may include but not limited to the user interface scenarios, the middleware scenario, the back-end scenario or the combination thereof.
TEST DATA GENERATOR MODULE
[0057] A test data generator module 216 is configured to fetch test data based on
the test scenario generated. The test data generator module 216 is configured to generate the test data by assigning a plurality of tags to each of the meta-data. The plurality of tags is mapped with a corresponding test schema meta-data table, wherein the test schema meta-data table is pre-stored in the repository. The test data is generated based on the mapping.
[0058] The details related to test data creation is explained in figure 5, figure 6 and
figure 7, as an exemplary embodiment below.
[0059] In one implementation, a data related to the test data generator module 216
is stored in the test data generated database 230. In one example, the test data generated database 230 may include but not limited to the data related to user interface, data related to management repository, a service name, an operation name, an element type, an element name, a plurality of schemas and combination thereof.
[0060] In one implementation, the test data generated database 230 may include a meta-data repository including a plurality of schemas that includes, but is not limited to, a schema containing meta-data which enables automatic generation of test data on the arrival of a data request, a schema containing meta-data of the Test Database (the Test Database being the database present in the client environment that contains the test data), and the like.
TEST CASE EXECUTION MODULE
[0061] A test case execution module 218 is configured to integrate the test data
created and the plurality of test scenarios generated to generate a plurality of test cases. Further, the test case execution module 218 is configured to execute the plurality of test cases.

[0062] In one implementation, the test case execution module 218 comprises of an
integration module 802 and an execution module 804. The details related to integration module 802 and the execution module 804 is explained in figure 8.
[0063] In one implementation, the test case execution module 218 enables providing a quintessential functionality of integrating the test scenarios with test data in a real time production environment as it judiciously identifies the integration points, as applicable. In an embodiment of the present invention, the system 102 utilizes LISA's single environment to generate automated test scenarios to test and ensure quality across all the implementation layers. Further, the test case execution module 218 executes the test cases so created and stores them as a test suite for subsequent regression tests.
[0064] In one implementation, a data related to the test case execution module 218
is stored in the test case executed database 232. In one example, the test case executed database 232 may include but not limited to the execution details like the execution time of a test case, compiling time of test case and the like.
TEST ENVIRONMENT MANAGEMENT MODULE
[0065] The test environment management module 220 is configured to provision
quality assurance environment for the intelligent testing system 102. The test environment management module 220 is configured to manage test environment, test applications and reservations of test environments. In one example, a reservation management sub-module is configured for reserving different test environments. The test environments may be configured through an environment management sub-module. The application management sub-module helps in configuring and deploying test applications in different test environments.
[0066] In one example, the test environment management module 220 provides
heterogeneous test environment that includes but not limited to multi-protocol, middleware, databases via hardware infrastructure for application testing and/or service virtualization.

[0067] In one implementation, the test environment management module 220
provides the ability to create the QA environment on the fly to support running of the present system 102. The test environment management module 220 services the provisioning of the QA environment, if needed; provisioning & deployment of the virtual business services and the provisioned QA data.
[0068] In one implementation, a data related to the test environment management module 220 is stored in the test environment management database 234. In one example, the test environment management database 234 may include, but is not limited to, data related to the middleware testing environment, data related to the user interface testing environment, data related to the multi-protocol testing environment and the like.
REPORT GENERATION MODULE
[0069] The report generation module 222 is configured to produce a test case
execution report.
[0070] In one implementation, the report generation module 222 produces a report
depicting the execution and execution status of the plurality of test cases executed.
[0071] In one implementation, a data related to the report generation module 222 is stored in the report generated database 236. In one example, the report generated database 236 may include, but is not limited to, a plurality of reports and the like.
[0072] Referring now to figure 4, a block diagram 400 of a scenario generator
module 214 in intelligent testing system 102 is illustrated, in accordance with an embodiment of the present subject matter.
[0073] The scenario generator module 214 further comprises a user interface scenario generator module 402, a middleware scenario generator module 404 and a back-end scenario generator module 406. The user interface scenario generator module 402 is configured to auto-generate a test scenario for user interface testing. The middleware scenario generator module 404 is configured to auto-generate a test scenario for middleware testing. The back-end scenario generator module 406 is configured to auto-generate a test scenario for backend testing.

[0074] In one implementation, the user interface scenario generator module 402 is configured to generate a plurality of user interface test scenarios based on the application data received from the receiving module 212.
[0075] In one implementation, a data related to the user interface scenario
generator module 402 is stored in the scenario-generated database 228. The data stored may include but not limited to a mouse position capture, an event on user interface, and the like data.
[0076] In one implementation, the middleware scenario generator module 404 is
configured to generate a plurality of middleware test scenarios based on the application data received from the receiving module 212.
[0077] In one implementation, a data related to the middleware scenario generator
module 404 is stored in the scenario-generated database 228. The data stored may include various services related data that includes but not limited to enterprise application integration, data integration, message oriented middleware (MOM), object request brokers (ORBs), and the enterprise service bus (ESB), and the like data.
[0078] In one implementation, the back-end scenario generator module 406 is
configured to generate a plurality of backend test scenarios based on the application data received from the receiving module 212.
[0079] In one implementation, a data related to the back-end scenario generator module 406 is stored in the scenario-generated database 228. The data stored may include database testing data, and the like.
[0080] Referring now to figure 5, an exemplary test data generation in the
intelligent testing system 102 is illustrated, in accordance with an embodiment of the present subject matter.
[0081] In one example, the intelligent testing system 102 generates an
artificial/synthetic test data to test a system that is not available for testing.

[0082] In one example, the intelligent testing system 102 generates a clean and
compliant Test Data for application testing. Test data is generated by profiling, sub-setting and masking of the data from production environment.
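The following Java sketch gives a hedged, simplified view of such profiling, sub-setting and masking of production-like records into test data; the record fields, the segment filter and the masking rules are assumptions made only for this example.

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch: sub-setting production-like records and masking sensitive
// fields before using them as test data. Field names and rules are assumed.
public class TestDataMasking {

    static String maskEmail(String email) {
        int at = email.indexOf('@');
        return at <= 1 ? "***" : email.charAt(0) + "****" + email.substring(at);
    }

    static String maskAccount(String account) {
        // keep only the last four digits visible
        return "XXXX-XXXX-" + account.substring(Math.max(0, account.length() - 4));
    }

    public static void main(String[] args) {
        List<Map<String, String>> production = List.of(
                Map.of("customer", "Asha", "email", "asha@example.com",
                       "account", "123456789012", "segment", "RETAIL"),
                Map.of("customer", "Ravi", "email", "ravi@example.com",
                       "account", "987654321098", "segment", "CORPORATE"));

        // sub-set: keep only the segment under test, then mask sensitive fields
        List<Map<String, String>> testData = production.stream()
                .filter(r -> "RETAIL".equals(r.get("segment")))
                .map(r -> Map.of("customer", r.get("customer"),
                                 "email", maskEmail(r.get("email")),
                                 "account", maskAccount(r.get("account"))))
                .collect(Collectors.toList());

        testData.forEach(System.out::println);
    }
}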
[0083] In one implementation, the scenario generator module 214 requests for the
test data mapping the plurality of test scenario generated from the test data generated database 230.
[0084] In one implementation, the data related to the test data generator module
216 is stored in the test data generated database 230. In one example, the test data generated database 230 may include but not limited to the data related to user interface, data related to management repository, a service name, an operation name, an element type, an element name, a plurality of schemas and combination thereof.
[0085] The creation of test data is explained in the figure 6.
[0086] Referring to figure 6, an exemplary flowchart for creation of test database
is illustrated, in accordance with an embodiment of the present subject matter.
[0087] In one implementation, the test data generator module 216 is configured to
generate the test data by assigning a plurality of tags to each of the meta-data. The plurality of tags is mapped with a corresponding test schema meta-data table, wherein the test schema meta-data table is pre-stored in the repository. The test data is generated based on the mapping.
[0088] In one example, the test scenarios generated by the scenario generator
module 214 has at least one meta-data. The meta-data comprises of at least one of a service name, an operation name, an element type, an element name or a combination thereof. The test data generator module 216 extracts the meta-data stored in the scenario-generated database 228. The test data generator module 216 further assigns XML tags to the plurality of meta-data. The plurality of tags is mapped with a corresponding test schema meta-data table, wherein the test schema meta-data table is pre-stored in the repository. The mapping is done by an XML mapper.

[0089] Based on the mapping, the test data generator module 216 generates an output in the form of a tag mapped exactly with a column name, a tag mapped with multiple matching column names, or a tag without a matching column.
[0090] In order to generate a plurality of test data, the output of the mapping is used for query generation, which is then based on a join result and on an input from a user. The plurality of test data generated is stored in the test data generated database 230, which is further provided for creating test cases and service virtualizations.
[0091] The process of mapping, building a query and generating test data is
illustrated in figure 7 below.
[0092] Referring now to Figure 7, an exemplary tagging method for test data generation in the intelligent testing system is illustrated, in accordance with an embodiment of the present subject matter.
[0093] Figure 7 (a) illustrates an exemplary mapping of meta-data with the table. Figure 7 (b) and Figure 7 (c) illustrate an exemplary verification of duplication of tag mapping in the existing table. Figure 7 (d) illustrates exemplary building of a query based on the mapped tags.
[0094] In one implementation, the XML files generated by the scenario generator, including the meta-tags, are read and the corresponding XML tags are loaded into the wsdl_meta_data table by a Java program, as illustrated in figure 7 (a). The table may include a service name, an operation name, an element name (XML Tag Name), an element type (Data Type), and the like.
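A hedged Java sketch of this loading step is given below: element names and types are read from a scenario XML file with the standard DOM parser and inserted into a wsdl_meta_data table over JDBC. The input file name, JDBC URL, credentials, service and operation names, and column names are assumptions for the example, and a suitable JDBC driver is presumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Illustrative sketch: read XML tags from a scenario file and load them into a
// wsdl_meta_data table. Connection details and constant values are assumed.
public class WsdlMetaDataLoader {

    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse("scenario.xml");                           // assumed input file

        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/testdb", "qa_user", "qa_pwd");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO wsdl_meta_data "
                 + "(service_name, operation_name, element_name, element_type) "
                 + "VALUES (?, ?, ?, ?)")) {

            NodeList elements = doc.getElementsByTagName("*");    // every XML tag
            for (int i = 0; i < elements.getLength(); i++) {
                Element e = (Element) elements.item(i);
                ps.setString(1, "AccountService");                 // assumed service name
                ps.setString(2, "getBalance");                     // assumed operation name
                ps.setString(3, e.getTagName());
                ps.setString(4, e.getAttribute("type"));
                ps.addBatch();
            }
            ps.executeBatch();                                     // one insert per tag
        }
    }
}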
[0095] In one implementation, the XML tags are mapped with the corresponding test schema meta-data tables. In one example, during this step, data is loaded into the Map_Table by a Java program (a stored procedure embedded in the Java code is used). While loading data into the Map_Table, a join is performed on the User_tab_columns and wsdl_meta_data tables, as illustrated in figure 7 (b).
[0096] In one example, the existence of tags is verified in the Mst_User_Map_Table; say, for a particular WSDL operation, 3 new tags are coming in and 1 old tag is already present. The system performs a lookup on the Mst_User_Map_Table since it maintains history data for the incoming tags. If the tag is found, then that particular tag is loaded into the Trans_User_Map_Table with indicator 'C' (data copied from Mst_User_Map_Table to Trans_User_Map_Table), as illustrated in figure 7 (c).
[0097] In one implementation, once the Trans_User_Map_Table is built with the tag meta-data and referential integrity details, the process will further query the data in the Trans_User_Map_Table for the scenarios below:
[0098] Scenario 1: Tags mapped correctly with columns (1-1 match). This is the perfect match case where, for a particular XML tag, the corresponding table column is present.
[0099] Scenario 2: Multiple matches (multi-columns) listed for a single XML tag. The process will query the Trans_User_Map_Table, user_constraints and user_cons_columns to get the PK-FK relationship between the tables. These relationship data are inserted into the Trans_User_Map_Table for all non-existing (new) tags.
[00100] Scenario 3: Tags without any matches; in this case user inputs will be needed. This is identified by performing a right outer join on wsdl_meta_data and map_table. Tester data is inserted into the Trans_User_Map_Table with match_type as 'U' (User Input).
[00101] In one example, based on the inputs, a SQL query is generated and loaded into the Mst_WSDL table. A test case is built and the response XML is written from the Mst_WSDL table, and CSV output files are generated, as illustrated in figure 7 (d).
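To make the query-building step concrete, the hedged Java sketch below composes a SELECT statement from tags that have already been mapped to table columns; the table name, column names and tag names are assumptions used only for this example.

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch: build a SELECT statement for fetching test data once each
// XML tag has been mapped to a table column. Names below are assumed.
public class TestDataQueryBuilder {

    static String buildQuery(String table, Map<String, String> tagToColumn) {
        String columns = tagToColumn.entrySet().stream()
                .map(e -> e.getValue() + " AS \"" + e.getKey() + "\"")
                .collect(Collectors.joining(", "));
        return "SELECT " + columns + " FROM " + table;
    }

    public static void main(String[] args) {
        Map<String, String> mapping = new LinkedHashMap<>();
        mapping.put("accountNumber", "ACCT_NO");      // 1-1 match (Scenario 1)
        mapping.put("customerName", "CUST_NAME");     // 1-1 match (Scenario 1)
        // a tag flagged 'U' (user input) would be resolved by the tester instead

        System.out.println(buildQuery("CUSTOMER_ACCOUNT", mapping));
        // prints: SELECT ACCT_NO AS "accountNumber", CUST_NAME AS "customerName" FROM CUSTOMER_ACCOUNT
    }
}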
[00102] Referring now to figure 8, a block diagram 800 of a test case execution module 218 in the intelligent testing system 102 is illustrated, in accordance with an embodiment of the present subject matter.
[00103] The test case execution module 218 further comprises of an integration
module 802 and an execution module 804. The integration module 802 is configured to integrate the test data and the plurality of test scenario to generate a plurality of test cases. The execution module 804 is configured to execute the plurality of test cases generated by the integration module 802.

[00104] In one implementation, the integration module 802 provides a capability for
automated test cases generation from the plurality of scenarios generated and the test data fetched from the scenario-generated database and test data generated database respectively.
[00105] The automated test cases generation is performed by existing techniques of
test case generation.
[00106] In one implementation, the plurality of test cases generated is executed by the execution module 804. The execution module is further configured to store a plurality of testing tools selected from a group comprising Quick Test Professional (QTP), Selenium, Silk Test, Visual Studio Team System (VSTS), and Rational Functional Tester, which may be used for execution of the plurality of test cases generated.
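For illustration, the Java sketch below executes a single generated UI test case with Selenium WebDriver, one of the tools named above; the application URL, element locators and expected page title are hypothetical, and the Selenium library together with a matching browser driver is presumed to be available.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Illustrative sketch: one generated login test case replayed through Selenium
// WebDriver. URL, locators and the expected title are assumed values.
public class LoginTestCase {

    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://app.example.com/login");           // assumed application URL
            driver.findElement(By.id("username")).sendKeys("qa_user");
            driver.findElement(By.id("password")).sendKeys("qa_password");
            driver.findElement(By.id("loginButton")).click();

            String title = driver.getTitle();
            System.out.println(title.contains("Dashboard")
                    ? "Test case PASSED" : "Test case FAILED: " + title);
        } finally {
            driver.quit();
        }
    }
}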
[00107] The principal advantage of the invention described above is that it is able to operate in any location and at any time, and is extremely simple and safe to operate since it uses well-known devices like laptops, personal computers, and the like for testing the application.
[00108] Another advantage of the invention is that it is able to provide an intelligent system and method capable of automating the software application testing environment.
[00109] Another advantage of the invention is that it is able to provide end-to-end testing across a multi-tier architecture and quality assurance in the entire testing process.
[00110] Another advantage of the invention is that it is able to provide auto-building of test cases along with test data to simulate a real transaction / web services / applications / modules.
[00111] Another advantage of the invention is that it is able to provide a one-stop quality assurance (QA) dashboard to view QA metrics.
[00112] Another advantage of the invention is that it is able to provide intelligent provisioning of test data to end users / virtual business services / test cases.
[00113] Another advantage of the invention is that it is able to automate the building of test regression suites as well.
[00114] Another advantage of the invention is that it is able to provide an integrated solution assuring quality right from test design to test execution.
[00115] Referring now to Figure 9, a method 900 for intelligent testing is shown, in accordance with an embodiment of the present subject matter. The method 900 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 900 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[00116] The order in which the method 900 is described is not intended to be
construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 900 or alternate methods. Additionally, individual blocks may be deleted from the method 900 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 900 may be considered to be implemented in the above described intelligent testing system 102.
[00117] At block 902, an application data is received. In one implementation, the
application data is received by the receiving module 212.
[00118] In one example, the application data comprises a plurality of application requirements, an application design, application service artifacts or a combination thereof.
[00119] At block 904, a plurality of test scenarios are auto-generated, based on the
application data received. In one implementation, the plurality of test scenarios is auto-generated by the scenario generator module 214.

[00120] In one example, the plurality of test scenarios comprises of a user interface
scenario data, a middleware scenario data, a back-end scenario data, or a combination thereof.
[00121] At block 906, test data based on the plurality of test scenarios generated is fetched. In one implementation, the test data is fetched by the test data generator module 216.
[00122] In one example, the test data based on the plurality of test scenarios
generated is fetched by assigning a plurality of tags to each of the meta-data, mapping the
plurality of tags with a corresponding test schema meta-data table, wherein the test
schema meta-data table is pre-stored in the repository, and auto-generating test data based on the mapping.
[00123] At block 908, the test data and the plurality of test scenarios are integrated
in order to generate plurality of test cases. In one implementation, the test case execution module 218 generates the plurality of test cases.
[00124] At block 910, the plurality of test cases is executed. In one implementation,
the test case execution module 218 executes the plurality of test cases.
[00125] In one example, the plurality of test cases is executed using at least one testing tool selected from a group comprising Quick Test Professional (QTP), Selenium, Silk Test, Visual Studio Team System (VSTS), and Rational Functional Tester.
[00126] In one implementation, quality assurance environment is maintained after
this step. The test environment management module 220 is used to provision quality assurance environment.
[00127] At block 912, a test case execution report is produced. The test case
execution report is produced by the report generation module 222.
[00128] Referring now to Figure 10, a method 1000 for auto-generating test data is illustrated, in accordance with an embodiment of the present subject matter.

[00129] In one implementation, the test data is fetched based on the plurality of test
scenarios generated. In one implementation, the test data is fetched by the test data generator module 216.
[00130] In one implementation, the plurality of test scenarios consists of at least one meta-data. In one example, the meta-data comprises at least one of a service name, an operation name, an element type, an element name, or a combination thereof.
[00131] At block 1002, a plurality of tags to each of the meta-data is assigned.
[00132] At block 1004, the plurality of tags is mapped with a corresponding test schema meta-data table, wherein the test schema meta-data table is pre-stored in the repository.
[00133] At block 1006, the test data based on the mapping is auto-generated.
[00134] The overall process of test data generation is disclosed in figure 7.
[00135] Referring now to figure 11, a system architecture of the intelligent testing system 102 is illustrated, in accordance with an embodiment of the present subject matter.
[00136] The architecture of intelligent testing system 102 according to an
exemplary embodiment of the present invention is shown. The test automation method of the present invention is applicable to all types of software programs. The intelligent system 102 generally comprises of the following components in an exemplary embodiment:
[00137] Service Virtualization provides the ability to virtualize third party services or applications or application modules. It reduces the dependency constraints on development and testing teams as it emulates the behavior and functions of specific dependent components that testers need to exercise. In one example, processes from banking, telecom, retail and other such domains act as third-party controlled components whose behavior the service virtualization emulates in order to complete end-to-end transactions. Thereon, the service virtualization provides automatic virtual business service creation for testing, which gets subsequently transferred to the test environment management module.
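A hedged Java sketch of such a virtual service is shown below: a lightweight HTTP endpoint, built with the JDK's built-in HTTP server, emulates a dependent third-party component by returning a canned response so that end-to-end test cases can run even when the real component is unavailable; the port, path and response payload are assumptions for the example.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Illustrative sketch: a virtual service stub that stands in for an unavailable
// third-party balance service and returns a fixed JSON response.
public class VirtualBalanceService {

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);
        server.createContext("/bank/getBalance", exchange -> {
            byte[] body = "{\"accountNumber\":\"123456\",\"balance\":\"1000.00\"}"
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        System.out.println("Virtual service listening on http://localhost:8089/bank/getBalance");
    }
}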

[00138] The test scenario generator for the UI is configured to create test scenarios for UI testing, the test scenario generator for middleware (web services) is configured to create test scenarios specific to middleware testing, and the test scenario generator for the back-end is configured to create test scenarios specific to back-end testing. The test scenario generator also considers specific business processes inputted as Business Process Model flows and other artifacts while creating test scenarios to ensure that all requirements of the application are covered.
[00139] The test data generator assists in automatic creation of test data and thereby in the facilitation of quick data generation or provisioning. The automatic migration of data from the production environment to the QA environment is enabled by the test data generator. The test data generator is provided with an ability to apply the regulatory norms on data privacy for the data in QA. The test data generator, on the whole, is strategically organized to produce more efficient and reliable test data that will assist in early detection of defects in the process and subsequently in accelerated release cycles of systems.
[00140] Test Case Generator is a unique component of system 102 and is
configured to effectively bind the test scenarios generated by the test scenario generator, with test data so created by the test data generator. The test case generator enables in providing a quintessential functionality of integrating the test scenarios with test data in a real time production environment as it judiciously identifies the integration points, as applicable. In an embodiment of the present invention, the intelligent system 102 utilizes LISA's single environment to generate automated test scenarios to test and ensure quality across all the implementation layers. Further, the test execution engine executes the test cases so created and stores them as test suite for subsequent regression tests.
[00141] Test Environment Management provides the ability to create the QA
environment on the fly to support running of the present system 102. The test environment management services the provisioning of the QA environment, if needed; provisioning & deployment of the virtual business services and the provisioned QA data.
[00142] Dashboard and Metrics provides the ability to monitor the QA environment running as provisioned from the service virtualization. It also provides a snapshot of the QA process test cases executed, which ultimately assists in creating a one-stop dashboard for the end user for viewing QA metrics.
[00143] Another alternate embodiment of the present system 102 comprises of a
data validation module (not shown in Figure 11) for validating Source and Target databases.
[00144] Re-referring to Figure 11, an exemplary intelligent testing system 102 is illustrated. The system 102 receives application requirements/design and/or other service artifacts as input. The test scenario generators are triggered to generate UI, middleware, and database specific test scenarios respectively. Next, the test data generator automatically generates/creates test data best imitating a business application in an actual-use environment such that the most specific and reliable conditions are defined for the generation of test cases. The test case generator then performs the most important task of integrating test scenarios with test data to generate specific test cases that, when performed, exercise a particular transaction or function. A test report is then generated containing a result of the execution of the test cases. The system 102 thus provisions an automated test environment utilizing test environment management, and provides business service virtualization, test scenario generation, test scenario automation and test data for testing composite applications.
[00145] Referring now to Figure 12, a logical architecture of the intelligent testing system 102 is illustrated, in accordance with an embodiment of the present subject matter.
[00146] In one implementation, the logical architecture of the intelligent testing
system 102 comprises of application data, receiving module, test scenario generator module, test case/ service virtualization module, test case execution module, test data, and test application.
[00147] In one example, the application data, which comprises a plurality of application requirements, an application design, application service artifacts or a combination thereof, is given as input to the receiving module 212. The receiving module 212 forwards the application data to the scenario generator module 214. The scenario generator module 214 is configured for auto-generating a plurality of test scenarios based on the application data received. The scenario generator module 214 is configured to auto-generate a test scenario for user interface testing, a test scenario for middleware testing, and a test scenario for backend testing.
[00148] The scenario generator module 214 further transfers the scenarios to the test data generator module 216, which is configured to fetch test data based on the test scenarios generated. The plurality of test scenarios consists of at least one meta-data, wherein the meta-data comprises at least one of a service name, an operation name, an element type, an element name, or a combination thereof.
[00149] The test data generator module 216 is configured to fetch the test data
based on the plurality of test scenarios generated. The test data is created by assigning a plurality of tags to each of the meta-data, mapping the plurality of tags with a corresponding test schema meta-data table, wherein the test schema meta-data table is pre-stored in the repository, and auto-generating test data based on the mapping. The test data created is further used for database testing i.e., back-end testing.
[00150] The test case execution module 218 is configured to integrate the test data
and the plurality of test scenario to generate a plurality of test cases and execute the plurality of test cases generated.
[00151] Further, the test case execution module 218 is configured to test the web services, service virtualization, SOA testing, and the like for middleware testing. Also, the test case execution module 218 is further configured to perform functional testing, GUI testing, regression testing and the like for front-end testing.
[00152] All the testing related information is given to test applications which further
includes but not limited to application server, web server, and database server, respectively.
[00153] The Test Environment Management Services provide the ability to create the QA environment on the fly to support the running of the present system 102. The test environment management services the provisioning of the QA environment, if needed; provisioning and deployment of the virtual business services and the provisioned QA data.

[00154] Although implementations for methods and systems for automatic software
application testing have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for automatic software application testing.

WE CLAIM:
1. A computer-implemented method for testing a software application, the method comprising:
receiving an application data comprising data associated with a plurality of layers, wherein each layer of the plurality of layers facilitates execution of the software application;
generating a plurality of test scenarios and a plurality of virtual services for testing the each layer of the plurality of layers, wherein the plurality of test scenarios and the plurality of virtual services are generated based upon the application data received, and wherein the plurality of test scenarios comprises a plurality of metadata;
fetching test data from a test database, wherein the test data is fetched based on the plurality of test scenarios and virtual services generated;
integrating the test data with the virtual services and the plurality of test scenarios, wherein the integration of the test data with the plurality of test scenarios enables generation of a plurality of test cases;
deploying the virtual services to execute the plurality of test cases, wherein the execution of the test cases facilitates the testing of the each layer of the plurality of layers; and
generating a report indicative of validation of the each layer of the plurality of layers based upon the execution of the test cases, wherein the receiving, the generating, the fetching, the integrating, the deploying, and the generating a report are performed by a processor.
2. The computer-implemented method of claim 1, wherein the application data comprises of a plurality of application requirements, an application design, application service artifacts, or a combination thereof.
3. The computer-implemented method of claim 1, wherein the plurality of test scenarios comprises a scenario data associated with each layer of the plurality of layers, and wherein the plurality of layers comprises a user interface layer, a middleware layer and a backend layer.
4. The computer-implemented method of claim 1, wherein the plurality of test scenarios are generated by matching all possible combinations of test iterations for a plurality of scenarios.
5. The computer-implemented method of claim 1, wherein the meta-data comprises of at least one of a service name, an operation name, an element type, an element name, or a combination thereof.
6. The computer-implemented method of claim 1, wherein the plurality of virtual services comprises of a middleware image and model, a back-end image and model, or a combination thereof.
7. The computer-implemented method of claim 1, wherein the plurality of virtual services is generated by capturing input-output and behavior of the original application.
8. The computer-implemented method of claim 1, wherein test data is fetched by:
assigning a tag to a metadata of the plurality of the meta-data;
mapping the tag with a test schema meta-data table from a plurality of test schema meta-data tables, stored in the repository, wherein the test schema meta-data table is associated with the metadata; and
generating the test data based on the mapping of the tag with the test schema meta-data table, and wherein the test schema meta-data table is associated with the test data stored in the test database.
9. The computer-implemented method of claim 1, wherein the plurality of test cases
are generated by:
determining unique iterations from the plurality of test scenarios; and

binding the test data with the unique iterations determined to generate the test cases.
10. The computer-implemented method of claim 1, wherein the plurality of test cases are executed using at least one testing tool selected from a group comprising Quick Test Professional (QTP), Selenium, Silk Test, Visual Studio Team System (VSTS), and Rational Functional Tester.
11. A computer-implemented system for automatic software application testing, the system comprising:
a processor; and
a memory coupled to the processor, wherein the processor is capable of executing a plurality of modules stored in the memory, and wherein the plurality of modules comprises:
a receiving module configured to receive an application data comprising data associated with a plurality of layers, wherein each layer of the plurality of layers facilitates execution of the software application;
a scenario generator module configured to generate a plurality of test scenarios and a plurality of virtual services for testing the each layer of the plurality of layers, wherein the plurality of test scenarios and the plurality of virtual services are generated based upon the application data received, and wherein the plurality of test scenarios comprises a plurality of metadata;
a test data generator module configured to fetch test data from a test database, wherein the test data is fetched based on the plurality of test scenarios and virtual services generated;
a test case execution module further comprising:
an integration module configured to integrate the test data with the virtual services and the plurality of test scenarios to generate a plurality of test cases; and
an execution module configured to execute the plurality of test
cases, wherein the execution of the test cases facilitates the testing of each layer of the plurality of layers;

a test environment management module configured to provision quality assurance environment; and
a report generation module configured to generate a report indicative of validation of the each layer of the plurality of layers based upon the execution of the test cases.
12. The computer-implemented system of claim 11, wherein the meta-data comprises of at least one of a service name, an operation name, an element type, an element name or a combination thereof.
13. The computer-implemented system of claim 11, wherein the execution module is further configured to store a plurality of testing tools selected from a group comprising Quick Test Professional (QTP), Selenium, Silk Test, Visual Studio Team System (VSTS), and Rational Functional Tester.
14. The computer-implemented system of claim 11, wherein the test data generator module is further configured to generate the test data by performing the steps of:
assigning a tag to a metadata of the plurality of the meta-data;
mapping the tag with a test schema meta-data table from a plurality of test schema meta-data tables, stored in the repository, wherein the test schema meta-data table is associated with the metadata; and
generating the test data based on the mapping of the tag with the test schema meta-data table, and wherein the test schema meta-data table is associated with the test data stored in the test database.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 450-MUM-2013-IntimationOfGrant12-05-2022.pdf 2022-05-12
2 450-MUM-2013-PatentCertificate12-05-2022.pdf 2022-05-12
3 450-MUM-2013-Written submissions and relevant documents [17-03-2022(online)].pdf 2022-03-17
4 450-MUM-2013-Correspondence to notify the Controller [23-02-2022(online)].pdf 2022-02-23
5 450-MUM-2013-FORM-26 [23-02-2022(online)]-1.pdf 2022-02-23
6 450-MUM-2013-FORM-26 [23-02-2022(online)].pdf 2022-02-23
7 450-MUM-2013-US(14)-HearingNotice-(HearingDate-11-03-2022).pdf 2022-02-16
8 450-MUM-2013-CLAIMS [22-11-2018(online)].pdf 2018-11-22
9 450-MUM-2013-COMPLETE SPECIFICATION [22-11-2018(online)].pdf 2018-11-22
10 450-MUM-2013-FER_SER_REPLY [22-11-2018(online)].pdf 2018-11-22
11 450-MUM-2013-OTHERS [22-11-2018(online)].pdf 2018-11-22
12 450-MUM-2013-PETITION UNDER RULE 137 [22-11-2018(online)].pdf 2018-11-22
13 450-MUM-2013-RELEVANT DOCUMENTS [22-11-2018(online)].pdf 2018-11-22
14 450-MUM-2013-FORM 4(ii) [19-09-2018(online)].pdf 2018-09-19
15 450-MUM-2013-FER.pdf 2018-08-11
16 450-MUM-2013-ABSTRACT(12-8-2013).pdf 2018-08-11
17 450-MUM-2013-CLAIMS(12-8-2013).pdf 2018-08-11
18 450-MUM-2013-CORRESPONDENCE(12-8-2013).pdf 2018-08-11
19 450-MUM-2013-CORRESPONDENCE(16-8-2013).pdf 2018-08-11
20 450-MUM-2013-CORRESPONDENCE(4-4-2013).pdf 2018-08-11
21 450-MUM-2013-DESCRIPTION(COMPLETE)-(12-8-2013).pdf 2018-08-11
22 450-MUM-2013-DRAWING(12-8-2013).pdf 2018-08-11
23 450-MUM-2013-FORM 1(16-8-2013).pdf 2018-08-11
24 450-MUM-2013-FORM 18(12-8-2013).pdf 2018-08-11
25 450-MUM-2013-FORM 2(12-8-2013).pdf 2018-08-11
26 450-MUM-2013-FORM 2(TITLE PAGE)-(12-8-2013).pdf 2018-08-11
27 450-MUM-2013-FORM 26(4-4-2013).pdf 2018-08-11
28 450-MUM-2013-FORM 3(12-8-2013).pdf 2018-08-11
29 450-MUM-2013-FORM 5(12-8-2013).pdf 2018-08-11
30 ABSTRACT1.jpg 2018-08-11
31 Form 2.pdf 2018-08-11

Search Strategy

1 Search_01-11-2017.pdf

ERegister / Renewals

3rd: 25 May 2022 (15/02/2015 to 15/02/2016)
4th: 25 May 2022 (15/02/2016 to 15/02/2017)
5th: 25 May 2022 (15/02/2017 to 15/02/2018)
6th: 25 May 2022 (15/02/2018 to 15/02/2019)
7th: 25 May 2022 (15/02/2019 to 15/02/2020)
8th: 25 May 2022 (15/02/2020 to 15/02/2021)
9th: 25 May 2022 (15/02/2021 to 15/02/2022)
10th: 25 May 2022 (15/02/2022 to 15/02/2023)
11th: 14 Feb 2023 (15/02/2023 to 15/02/2024)
12th: 14 Feb 2024 (15/02/2024 to 15/02/2025)
13th: 15 Feb 2025 (15/02/2025 to 15/02/2026)