
Method And System For Script Less Automated Testing Of Services

Abstract: A method and system for script less automated testing of a service is disclosed. In some embodiments, the method includes analyzing (304) an input file comprising at least one service end-point link and a plurality of test attributes. The method further includes identifying (306) an external testing application relevant for each of the at least one service end-point link. The method further includes invoking (308) the associated external testing application to perform a test on a service end-point connected with each of the at least one service end-point link in at least one of a plurality of test environments; receiving (310) at least one test result of the test performed by the associated external testing application for each of the at least one service end-point link; and converting (312) for each of the at least one service end-point link, the at least one test result into one of a set of predefined output formats.


Patent Information

Application #
Filing Date
09 June 2021
Publication Number
25/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
jashandeep@inventip.in
Parent Application

Applicants

HCL Technologies Limited
806, Siddharth, 96, Nehru Place, New Delhi - 110019, India

Inventors

1. Srinivas T
HCL Technologies Ltd., L2 & L3, Building No. H08, Sy. No. 30, 34, 35 & 38 (L&T Phoenix Infoparks Pvt Ltd), Serlingampally Mandal, Hyderabad - 500081. Phone Number: +91 9573202405
2. Narender S
HCL Technologies Ltd., L2 & L3, Building No. H08, Sy. No. 30, 34, 35 & 38 (L&T Phoenix Infoparks Pvt Ltd), Serlingampally Mandal, Hyderabad - 500081. Phone Number: +91 9989284333
3. Subramanyam P
HCL Technologies Ltd., L2 & L3, Building No. H08, Sy. No. 30, 34, 35 & 38 (L&T Phoenix Infoparks Pvt Ltd), Serlingampally Mandal, Hyderabad - 500081. Phone Number: +91 9121091768
4. Jagadish Reddy
HCL Technologies Ltd., L2 & L3, Building No. H08, Sy. No. 30, 34, 35 & 38 (L&T Phoenix Infoparks Pvt Ltd), Serlingampally Mandal, Hyderabad - 500081. Phone Number: +91 9036818441

Specification

[001] Generally, the invention relates to software testing. More
specifically, the invention relates to a method and system for script less
automated testing of services.
Background
[002] Over the past few years, there has been a series of rapid
technological advancements in the field of digital technology. Due to this
advancement, the expectations of customers are increasing, which in turn
forces enterprises to digitize products and services in order to maximize the
end-to-end customer experience. Moreover, digital products and services
delivered to customers are defined based on integrated digital capabilities
or delivery of digital content. Therefore, based on the quality of the digital
products and services, customers are beginning to judge products and
services not only against enterprises in similar sectors but also against the
best customer service provided by enterprises in any sector for their products
and services.
[003] Due to the above listed reasons, enterprises are migrating or
enhancing their product architectures based on their business goals. For this,
the enterprises are opting for product architectures that are easy to
enhance, thereby releasing products more frequently to improve the customer
experience. In the current technological world, microservices and webservices
are playing a crucial role in providing an open architecture that services
customers efficiently and enables frequent product releases. The existing
process for testing these software services, i.e., microservices/webservices,
requires a combination of manual and automation testing in order to validate
these services on target systems. Moreover, these existing software testing
processes consume high manual effort and long durations to complete
execution on a target system. Additionally, in existing software testing
processes, there is a need to understand manual test cases and create
automation test scripts using open source or commercial tools.
[004] Apart from the above listed reasons, challenges in the existing
software testing processes for testing these software services include high
manual effort to execute test cases on multiple/different target
environments, limited test coverage in a release cycle, multiple rounds of
execution before a product/application is released to market, limited timelines
to perform test execution in Agile and DevOps models, development of
automation test scripts in tool-specific languages and functions, the
requirement that automation developers be skilled in a tool-specific
programming language, high maintenance effort for any change in
product/application functionality, and many more. Due to the above listed
reasons, IT industries are constantly looking for innovative solutions to
reduce the effort and time required to release products/applications to the
market in a short duration.
[005] Therefore, there is a need for an efficient and reliable method
and system for providing script less automated testing of services.
SUMMARY OF INVENTION
[006] In one embodiment, a method for script less automated
testing of a service is disclosed. The method may include analyzing an input
file comprising at least one service end-point link and a plurality of test
attributes. The method may include identifying for each of the at least one
service end-point link, an external testing application relevant for each of the
at least one service end-point link. The method may include invoking for
each of the at least one service end-point link, the associated external
testing application to perform a test on a service end-point connected with
each of the at least one service end-point link in at least one of a plurality of
test environments. It should be noted that, the service end-point connected
with each of the at least one service end-point link is tested based on at
least one of the plurality of test attributes provided for each of the at least
one service end-point link. The method may include receiving for each of
the at least one service end-point link, at least one test result of the test
performed by the associated external testing application on the service end-point connected with each of the at least one service end-point link. The
method may include converting for each of the at least one service end-point link, the at least one test result into one of a set of predefined output
formats.
[007] In another embodiment, a system for script less automated
testing of a service is disclosed. The system includes a processor and a
memory communicatively coupled to the processor. The memory may store
processor-executable instructions, which, on execution, may cause the
processor to analyze an input file comprising at least one service end-point
link and a plurality of test attributes. The processor-executable instructions,
on execution, may further cause the processor to identify for each of the at
least one service end-point link, an external testing application relevant for
each of the at least one service end-point link. The processor-executable
instructions, on execution, may further cause the processor to invoke for
each of the at least one service end-point link, the associated external
testing application to perform a test on a service end-point connected with
each of the at least one service end-point link in at least one of a plurality of
test environments. It should be noted that, the service end-point connected
with each of the at least one service end-point link is tested based on at
least one of the plurality of test attributes provided for each of the at least
one service end-point link. The processor-executable instructions, on
execution, may further cause the processor to receive for each of the at
least one service end-point link, at least one test result of the test performed
by the associated external testing application on the service end-point
connected with each of the at least one service end-point link. The
processor-executable instructions, on execution, may further cause the
processor to convert for each of the at least one service end-point link, the
at least one test result into one of a set of predefined output formats.
[008] It is to be understood that both the foregoing general
description and the following detailed description are exemplary and
explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The present application can be best understood by reference
to the following description taken in conjunction with the accompanying
drawing figures, in which like parts may be referred to by like numerals.
[010] FIG. 1 illustrates a functional block diagram of a system for
script less automated testing of services, in accordance with an
embodiment.
[011] FIG. 2 illustrates a functional block diagram of a memory of a
testing device used for script less automated testing of services, in
accordance with an embodiment.
[012] FIG. 3 illustrates a flowchart of a method for script less
automated testing of a service, in accordance with an embodiment.
[013] FIG. 4 illustrates a flowchart of a method for invoking an
existing testing application for performing a test corresponding to at least
one service end-point link, in accordance with an embodiment.
[014] FIG. 5 illustrates a flowchart of a method for modifying
representation of at least one test result in a generated report, in accordance
with an exemplary embodiment.
[015] FIGs. 6A and 6B illustrate an exemplary representation of
analyzing each of at least one service end-point link corresponding to a
plurality of test attributes, in accordance with an exemplary embodiment.
[016] FIGs. 7A and 7B illustrate an exemplary representation of a
framework used to perform script less automation testing of a service, in
accordance with an exemplary embodiment.
[017] FIGs. 8A-8E illustrate a flow diagram of each of a plurality of
tests performed for at least one service end-point link in a plurality of test
environments, in accordance with an exemplary embodiment.
DETAILED DESCRIPTION OF THE DRAWINGS
[018] The following description is presented to enable a person of
ordinary skill in the art to make and use the invention and is provided in the
context of particular applications and their requirements. Various
modifications to the embodiments will be readily apparent to those skilled in
the art, and the generic principles defined herein may be applied to other
embodiments and applications without departing from the spirit and scope
of the invention. Moreover, in the following description, numerous details
are set forth for the purpose of explanation. However, one of ordinary skill
in the art will realize that the invention might be practiced without the use of
these specific details. In other instances, well-known structures and devices
are shown in block diagram form in order not to obscure the description of
the invention with unnecessary detail. Thus, the invention is not intended to
be limited to the embodiments shown, but is to be accorded the widest
scope consistent with the principles and features disclosed herein.
[019] While the invention is described in terms of particular
examples and illustrative figures, those of ordinary skill in the art will
recognize that the invention is not limited to the examples or figures
described. Those skilled in the art will recognize that the operations of the
various embodiments may be implemented using hardware, software,
firmware, or combinations thereof, as appropriate. For example, some
processes can be carried out using processors or other digital circuitry under
the control of software, firmware, or hard-wired logic. (The term “logic”
herein refers to fixed hardware, programmable logic and/or an appropriate
combination thereof, as would be recognized by one skilled in the art to carry
out the recited functions.) Software and firmware can be stored on
computer-readable storage media. Some other processes can be
implemented using analog circuitry, as is well known to one of ordinary skill
in the art. Additionally, memory or other storage, as well as communication
components, may be employed in embodiments of the invention.
[020] A system 100 for performing script less automated testing of
a service, is illustrated in FIG. 1. In an embodiment, the service may also be
referred to as at least one service end-point link. The at least one service
end-point link may correspond to one of a microservice or a webservice. In
particular, the system 100 may include a testing device 102 that may
perform script less automated testing of the service. The testing device 102
may perform script less automated testing based on a plurality of test
attributes in at least one of a plurality of test environments. The plurality of
test attributes may include, but are not limited to at least one parameter,
authentication details, and at least one non-functional detail. Further, the at
least one non-functional detail may include, but is not limited to at least one
of performance details, load details, or security details. In an embodiment,
the plurality of test attributes may correspond to a set of generic attributes.
Examples of the set of generic attributes may include, but are not limited to,
payload, authentication type, authentication, headers, number of threads,
ramp-up, and number of queries per second. Moreover, the
plurality of test environments may include at least one of contract test
environment, load test environment, performance test environment, security
test environment, end-point test environment, or unit testcase generation
environment.
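By way of a purely illustrative, non-limiting sketch, the generic test attributes described above could be captured in a simple key-value structure, for example in Python; every field name and value here is an assumption made for explanation and is not part of the disclosed system.

    # Illustrative only: one possible shape for the generic test attributes.
    # All field names and values are assumed, not taken from the disclosure.
    test_attributes = {
        "payload": {"orderId": 1001, "quantity": 2},          # request body
        "auth_type": "basic",                                 # authentication type
        "auth": {"username": "tester", "password": "secret"}, # authentication
        "headers": {"Content-Type": "application/json"},
        "threads": 50,           # number of threads
        "ramp_up_seconds": 10,   # ramp-up
        "qps": 100,              # number of queries per second
    }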
[021] In order to perform script less automation testing, the testing
device 102 may analyze an input file received from a user. The input file
may include at least one service end-point link (also referred as the service)
and the plurality of test attributes. In an embodiment, the user may
correspond to a tester of the at least one service end-point link. Based on
analysis, the testing device 102 may identify an external testing application
relevant for the at least one service end point link. In an embodiment, the
external testing application may correspond to one of Representational
State Transfer (REST) and Simple Object Access Protocol (SOAP). Once
the associated external testing application is identified, the testing device
102 may invoke the associated external testing application to perform a test
on a service end-point connected with the at least one service end-point
link. Moreover, the test on the connected service end-point may be
performed in at least one of the plurality of test environments. Further, based
on the test performed, the testing device 102 may receive at least one test
result from the associated external testing application. This is further
explained in detail in conjunction with FIG. 2 to FIG. 8.
[022] Examples of the testing device 102 may include, but are not
limited to, a server, a desktop, a laptop, a notebook, a tablet, a smartphone,
a mobile phone, an application server, or the like. The testing device 102
may include a memory 104, a processor 106, and a display 108. The display
108 may further include the user interface 110. A user or an administrator
may interact with the testing device 102 and vice versa through the display
108.
[023] By way of an example, the display 108 may be used to display
results (i.e., a report generated based on the at least one test result, a
modified report) based on actions performed by the testing device 102, to
the user (i.e., a tester, a developer or an administrator of the at least one
service end-point link). Moreover, the display 108 may be used to display
the plurality of test attributes associated with the at least one service end-point link. The plurality of test attributes may change based on a new user
or a new service end-point link introduced. In addition, the display 108 may
be used to display an option provided to the user to select one or more of
the at least one service end-point link for testing.
[024] By way of another example, the user interface 110 may be
used by the user to provide inputs to the testing device 102. Thus, for
example, in some embodiments, the testing device 102 may ingest an input
comprising a user selection of the one or more of the at least one service
end-point link that need to be tested. Further, for example, in some
embodiments, the testing device 102 may render intermediate results (e.g.,
the associated external testing application, the test that needs to be
performed) or final results (e.g., the at least one test result) to the user via
the user interface 110.
[025] The memory 104 may store instructions that, when executed
by the processor 106, may cause the processor 106 to perform script less
automated testing of the at least one service end-point link. The processor
106 may perform script less automated testing based on analysis of the
input file received from the user, in accordance with some embodiments. As
will be described in greater detail in conjunction with FIG. 2 to FIG. 8, in
order to perform script less testing of the at least one service end point link,
the processor 106 in conjunction with the memory 104 may perform various
functions including analysis of the input file, identification of the
associated external testing application, invocation of the associated external
testing application to perform the test, generation of the at least one test
result, conversion of the at least one test result, generation of the report,
etc.
[026] The memory 104 may also store various data (e.g., the plurality of
test attributes, the plurality of test environments, an external testing
application relevant to each of the at least one service end-point link, etc.)
that may be captured, processed, and/or required by the testing device 102.
The memory 104 may be a non-volatile memory (e.g., flash memory, Read
Only Memory (ROM), Programmable ROM (PROM), Erasable PROM
(EPROM), Electrically EPROM (EEPROM) memory, etc.) or a volatile
memory (e.g., Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), etc.).
[027] The testing device 102 may be connected to a database 112.
The database 112 may be used to store a data template of each of the plurality
of test environments. In addition, the database 112 may store results
generated based on test performed by the associated external testing
application for each of the at least one service end-point link. Additionally,
the database 112 may be periodically updated based on the data template
modified for at least one of the plurality of test environments as per
requirement of the user.
[028] Further, the testing device 102 may interact with a server 114
or external devices 120 over a network 118 for sending and receiving
various data. The external devices 120 may be used by a plurality of users
to provide their selection of an option provided to select one or more of the
at least one service end-point link for testing by the testing device 102. The
external devices 120 may include, but are not limited to, a desktop, a
laptop, a notebook, a netbook, a tablet, a smartphone, a remote server, a
mobile phone, or another computing system/device. The network 118, for
example, may be any wired or wireless communication network and
examples may include, but are not limited to, the Internet, Wireless
Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide
Interoperability for Microwave Access (WiMAX), and General Packet Radio
Service (GPRS).
[029] In some embodiments, the testing device 102 may fetch
information associated with each of the plurality of test environments from
the server 114. In addition, the server 114 may provide access of the at least
one service end-point link to the plurality of users. The server 114 may
further include a database 116. The database 116 may store information
associated with the at least one service end-point link. By way of an
example, the database 116 may store the information associated with the
at least one service end-point link in order to distinguish a new service end-point link from existing ones. The database 116 may be periodically updated
with the new service end-point link that may be successfully tested.
Alternatively, the testing device 102 may receive the user input from one of
the external devices 120.
[030] Referring now to FIG. 2, a functional block diagram of the
memory 104 of the testing device 102 configured to perform script less
automated testing of a service is illustrated, in accordance with an
embodiment. The service may also be referred to as the at least one service
end-point link. In an embodiment, the at least one service end-point link may
correspond to one of a micro-service or a webservice. Initially, an input file
may be received as a user input 202 from at least one of the external devices
120 by the memory 104. The input file may include at least one service end-point link and a plurality of test attributes. The plurality of test attributes may
include at least one parameter, authentication details, and at least one non-functional detail. Moreover, the at least one non-functional detail may
include at least one of performance details, load details, or security details.
In an embodiment, the plurality of test attributes may correspond to a set of
generic attributes. Examples of the set of generic attributes may include, but
are not limited to, payload, authentication type, authentication, headers,
number of threads, ramp-up, and number of queries per second.
[031] In an embodiment, the memory 104 may include a reception
module 204, an evaluation module 206, an identification module 208,
an invocation module 210, a generation module 212, and a modification
module 214. The modules 204-214 may include routines, programs,
objects, components, data structures, etc., which perform particular tasks or
implement particular abstract data types. The modules 204-214 described
herein may be implemented as software modules that may be executed in
a cloud-based computing environment of the testing device 102.
[032] The reception module 204 may be configured to receive the
user input 202. The user input 202 may correspond to the input file. The
input file may include at least one service end-point link and the plurality of
test attributes. In other words, the reception module 204 may be configured
to receive a user selection of one or more of the at least one service end-point
link that needs to be tested as the user input 202. Further, the
reception module 204 may be configured to send the input file received from
the user to the evaluation module 206.
[033] The evaluation module 206 may be configured to receive the
input file from the reception module 204. Upon receiving the input file, the
evaluation module 206 may be configured to analyze the input file received.
In order to analyze the input file received, the evaluation module 206 may
perform analysis of at least one service end-point link and the plurality of
test attributes. In an embodiment, the at least one service end-point link may
correspond to one of a micro-service or a webservice. The plurality of test
attributes may include at least one parameter, authentication details, and at
least one non-functional detail. Moreover, the at least one non-functional
detail may include at least one of performance details, load details, or
security details.
[034] By way of an example, the analysis of one of the plurality of
test attributes, i.e., the authentication details may be done in order to verify
the user (e.g., the tester) of the testing device 102, based on his login
credentials. Once the analysis of the at least one service end-point link and
the plurality of test attributes is performed, the evaluation module 206 may
be configured to send the results of analysis along with the at least one service end-point link and the plurality of test attributes to the identification module 208.
[035] Upon receiving the results of analysis along with the at least
one service end-point link and the plurality of test attributes from the
evaluation module 206, the identification module 208 may be configured to
identify an external testing application for each of the at least one service
end-point link. Moreover, the identification module 208 may identify the
external testing application that may be relevant for each of the at least one
service end-point link. In an embodiment, the external testing application
may correspond to one of REST and SOAP. Thereafter, the identification
module 208 may be configured to send the associated external testing
application identified to the invocation module 210.
[036] The invocation module 210 may be configured to receive the
associated external testing application identified for each of at least one
service end-point link from the identification module 208. Further, the
invocation module 210 may be configured to invoke the associated external
testing application to perform a test on a service end-point connected with
each of the at least one service end-point link. Moreover, the invocation
module 210 may perform the test on the service end-point connected with
each of the at least one service end-point link in at least one of a plurality of
test environments. The plurality of test environments may include at least
one of contract test environment, load test environment, performance test
environment, security test environment, end-point test environment, or unit
testcase generation environment. Thereafter, the invocation module 210
may be configured to send test performed by the associated external testing
application for each of the at least one service end-point link to the
generation module 212.
[037] The generation module 212 may be configured to generate
results of the test performed by the associated external testing application
invoked by the invocation module 210. In other words, the generation
module 212 may generate at least one test result of the test performed by
the associated external testing application on the service end-points
connected with each of the at least one service end-point link. Further, the
generation module 212 may be configured to generate a report in one of the
set of predefined output formats based on the at least one test result
generated. Further, the generation module 212 may be configured to send
the at least one test result generated to the modification module 214.
[038] Upon receiving the at least one test result, the modification
module 214 may convert the at least one test result generated for each of
the at least one service end-point link into one of a set of predefined output
formats. Moreover, the modification module 214 may be configured to
modify the at least one test result generated for each of the at least one
service end-point link based on inputs received from the user to modify
representation of the at least one test result in the report.
[039] In particular, as will be appreciated by those of ordinary skill in
the art, various modules 204-214 for performing the techniques and steps
described herein may be implemented in the testing device 102, either by
hardware, software, or combinations of hardware and software. For
example, suitable code may be accessed and executed by the one or more
processors on the testing device 102 to perform some or all of the
techniques described herein. Similarly, application specific integrated
circuits (ASICs) configured to perform some or all of the processes
described herein may be included in the one or more processors on the host
computing system. Even though FIGs. 1-2 describe the testing device 102,
the functionality of the components of the testing device 102 may be
implemented in any computing devices.
[040] Referring now to FIG. 3, a flowchart of a method for script less
automated testing of a service is illustrated, in accordance with an
embodiment. At step 302, an input file may be received from a user. In an
embodiment, the user may correspond to a tester of the at least one service
end-point link. Moreover, the input file received may include at least one
service end-point link and a plurality of test attributes. In addition, the
plurality of test attributes may include at least one parameter, authentication
details, and at least one non-functional detail. The at least one non-functional
detail may include at least one of performance details, load
details, or security details. In an embodiment, the plurality of test attributes
may correspond to a set of generic attributes. Examples of the set of generic
attributes may include, but are not limited to, payload, authentication type,
authentication, headers, number of threads, ramp-up, and number of queries per
second. In addition, the at least one service end-point link may correspond
to one of a microservice or a webservice.
[041] In an embodiment, the input file received may be created in at
least one predefined input format. By way of an example, the at least one
predefined input format used for creating the input file may include a
sequence of each of the plurality of attributes for which a user input may be
received along with the at least one end-point link. For example, the
sequence may first include a ‘test ID’. The test ID may be unique for each
of the at least one service end-point link. Further, the sequence may include
‘an end-point Uniform Resource Locator (URL)’. The end-point URL may be
used to attach the at least one service end-point link. Further, the sequence
may include ‘authentication details’. The authentication details may be used
to verify the tester of the at least one service end-point link.
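As a non-limiting illustration of such a predefined input format, one row of the input file might be modelled as sketched below; the field names, the example URL, and the helper function parse_input_file are assumptions introduced for explanation only.

    # Hypothetical input-file row following the sequence described above:
    # a unique test ID, the end-point URL, and authentication details.
    input_row = {
        "test_id": "TC-001",                               # unique per end-point link
        "endpoint_url": "https://api.example.com/orders",  # service end-point link
        "auth_details": {"type": "basic", "username": "tester", "password": "secret"},
    }

    def parse_input_file(rows):
        """Analyze the input file: split it into the at least one service
        end-point link and the remaining test attributes (step 304)."""
        links = [row["endpoint_url"] for row in rows]
        attributes = [{k: v for k, v in row.items() if k != "endpoint_url"}
                      for row in rows]
        return links, attributes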
[042] Once the input file is received, at step 304, the received input
file may be analyzed. In order to analyze the input file, the at least one
service end-point link and each of the plurality of test attributes received in
the input file may be analyzed. By way of an example, the at least one
service end-point link may be analyzed in order to identify whether the at
least one service end-point received is a microservice or a webservice.
In addition, at least one of the plurality of test attributes may be analyzed in
order to verify the user (e.g., the tester) of the at least one service end-point
link. Further, at step 306, an external testing application relevant for each of
the at least one service end-point link may be identified. In an embodiment,
the external testing application may correspond to one of REST or SOAP.
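One minimal, assumed heuristic for such identification is sketched below; it merely illustrates routing an end-point link to a REST-style or SOAP-style testing application and is not the identification logic of the disclosure.

    def identify_testing_application(endpoint_url):
        """Crude illustrative heuristic: WSDL-style links suggest SOAP,
        anything else is treated as REST. Assumed, not the claimed logic."""
        url = endpoint_url.lower()
        if url.endswith("?wsdl") or "/soap" in url:
            return "SOAP"
        return "REST"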
[043] Once the associated external testing application is identified,
at step 308, the associated external testing application may be invoked for
each of the at least one service end-point link. Moreover, the associated
testing application is invoked to perform a test on a service end-point
connected with each of the at least one service end-point link in at least one
of a plurality of test environments. In an embodiment, the service end-point
connected with each of the at least one service end-point link may be tested
based on at least one of the plurality of test attributes. It should be noted
that, the at least one of the plurality of test attributes may be provided for
each of the at least one service end-point link.
[044] In an embodiment, the plurality of test environments may be
automatically generated based on the at least one service end-point link.
Further, the plurality of test environments may include at least one of
contract test environment, load test environment, performance test
environment, security test environment, end-point test environment, or unit
testcase generation environment. As will be appreciated, the user may
modify each of the plurality of test environments based on their requirements.
In addition, the user may modify each of the plurality of test environments
using a data template associated with each of the plurality of test
environments.
[045] Based on the test performed by the associated external
testing application, at step 310, at least one test result may be received.
Further, at step 312, the at least one test result received may be converted
into one of a set of predefined output formats. The one of the set of
predefined output formats may include at least one of color codes,
predefined textual codes, or predefined pattern codes. In an embodiment,
the at least one of the color codes may be used to represent pass or fail of
the test performed for each of the at least one service end-point link. By way
of an example, when the at least one service end-point link may pass the
test performed, then the result of the test performed may be represented in
green color. Further, when the at least one service end-point link may fail
the test performed, then the result of the test performed may be represented
in red color.
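A minimal sketch of this conversion step, reusing the green/red mapping given in the example above (the function shape itself is an assumption):

    def to_color_code(passed):
        """Convert a pass/fail result into the color codes described above:
        green for a passed test, red for a failed one."""
        return "green" if passed else "red"

    results = {"TC-001": True, "TC-002": False}
    report = {test_id: to_color_code(ok) for test_id, ok in results.items()}
    # report == {"TC-001": "green", "TC-002": "red"}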
[046] In another embodiment, the predefined textual codes may be
used to represent a detailed summary report of the test performed for each
of the at least one service end-point link. Moreover, the detailed summary
report may represent a detailed reason for the result associated with
execution of the test in at least one of the plurality of test environments. In
yet another embodiment, the predefined pattern codes may be used to
represent a performance report associated with each of the at least one
service end-point link. The performance report may depict a performance
chart, vulnerability assessment charts, and reports based on the test
performed in at least one of the plurality of test environments.
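Taken together, steps 304-312 of FIG. 3 may be summarized by the following illustrative sketch, which reuses the helper sketches above; the function names mirror the described steps but are otherwise assumptions, and the invocation stub merely stands in for the external testing application.

    def invoke_external_application(app, link, attrs):
        """Hypothetical stand-in for steps 308-310: a real system would
        invoke the identified REST or SOAP testing application on the
        service end-point and collect its pass/fail results."""
        return [True]  # placeholder result

    def scriptless_test(input_rows):
        """Illustrative end-to-end flow of FIG. 3 (steps 304-312)."""
        links, attributes = parse_input_file(input_rows)             # step 304
        outputs = []
        for link, attrs in zip(links, attributes):
            app = identify_testing_application(link)                 # step 306
            results = invoke_external_application(app, link, attrs)  # steps 308-310
            outputs.append(to_color_code(all(results)))              # step 312
        return outputs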
[047] Referring now to FIG. 4, a flowchart of a method for invoking
an existing testing application for performing a test corresponding to at least
one service end-point link is illustrated, in accordance with an embodiment.
At step 402, an option may be provided to a user to select one or more of
the at least one service end-point link that requires testing. In reference to
FIG. 1, the option may be provided to the user via the user interface 110. In
an embodiment, the user may correspond to a tester (also referred as a
testing engineer). In addition, the one or more of the at least one service
end-point link may correspond to a microservice or a webservice. Based on
the option provided, at step 404, a user selection may be received from the
user for the provided options. The user selection received may depict
selection of one or more of the at least one service end-point link that the
user wants to test. Once the user selection is received, at step 406, the
associated external testing application may be invoked to perform a test.
The associated external testing application may be invoked based on user
selection of one or more of the at least one service end-point link for testing.
[048] Referring now to FIG. 5, a flowchart of a method for modifying
representation of at least one test result in generated report is illustrated, in
accordance with an exemplary embodiment. In reference to FIG. 3, based
on the test performed by the associated external testing application, the at least one
test result may be generated for the at least one service end-point link being
tested. Once the at least one test result is generated, at step 502, a report
may be generated. The report may include the at least one test result in one
of the set of predefined output formats. The set of predefined output formats
may include at least one of color codes, predefined textual codes, or
predefined pattern codes.
[049] Further, at step 504, inputs may be received from a user to
modify representation of the at least one test result in the report. In an
embodiment, the user may correspond to a tester of the at least one service
end-point link. In other words, the user may correspond to a person that may
be performing testing of the at least one service end-point link. By way of an
example, an input may be received from the user to display the report in a
second output format (i.e., the predefined textual codes) from the set of
predefined output formats.
[050] Once the inputs are received from the user, at step 506, the
generated report may be modified based on the inputs received from the
user. By way of an example, suppose the report generated for the at least
one service end-point link may only depict an execution report. The
execution report may only display pass or fail of the at least one service end-point link being tested. Now, suppose the user may want to know a reason behind
the success or failure of the at least one service end-point link under test. For
this, the user may provide his selection to modify the representation of the at
least one test result in the report to display the reason for success or failure.
In an embodiment, in order to see the reason behind the success or failure
of the at least one service end-point link, a detailed summary report may be
generated. The detailed summary report may represent a detailed reason
behind the at least one test result generated.
[051] Referring now to FIGs. 6A and 6B, an exemplary representation of
analyzing each of at least one service end-point link corresponding to a
plurality of test attributes is illustrated, in accordance with an exemplary
embodiment. In FIG. 6A, a process 600a may represent a standard test
automation process that is being used to perform testing of at least one
service end-point link. In an embodiment, the at least one service end-point
link may correspond to one of a microservice or a web-service. In order to
create an automation script for testing each of the at least one service end-point link, at step 602a, end-point details associated with each of the at
least one service end-point link may be received and analyzed. Based on
analysis of the end-point details, a set of tools required to perform testing of
each of the at least one service end-point link may be identified. Once the
set of tools are identified, at step 604a, a services automation framework
may be developed. By way of an example, the services automation
framework developed may be utilized for testing a similar type of a plurality
of new service end-point links received.
[052] Once the services automation framework is developed, at step
606a, a set of test scripts required for testing each of the at least one service
end-point link may be created. It should be noted that, each of the set of test
scripts may be created based on the services automation framework
developed. Further, at step 608a, each of the set of test scripts created may
be executed to perform testing of each of the at least one service end-point
link. Moreover, each of the set of test scripts may be executed using one of
a Continuous Integration (CI) or a Continuous Delivery (CD) infrastructure.
Further, based on execution of each of the set of test scripts, at step 610a,
a report may be generated. The report may display a result of execution of
each of the set of test scripts. The displayed result may depict success or
failure of each of the at least one service end-point link.
[053] In FIG. 6B, a process 600b may represent script less
automated testing of a plurality of services. In an embodiment, the plurality
of services may also be referred to as at least one service end-point link that
requires testing. In process 600b, at step 602b, test data preparation may
be done. By way of an example, the test data preparation may include
creation of the input file by a user (also referred to as a tester). In other words,
the test data preparation may be done by the tester of the at least one
service end-point link. The input file may include the at least one service
end-point link and the plurality of test attributes. The plurality of test
attributes may include at least one parameter, authentication details, and at
least one non-functional detail. In addition, the at least one non-functional
detail may include at least one of performance details, load details, or
security details.
[054] Once the test data preparation is completed, at step 604b,
execution of a test may start. In order to start execution of the test, the
external testing application relevant for the at least one service end-point
link may be identified. Once the associated testing application is identified,
at step 606b, a test pre-validation check may be performed. The test data
pre-validation check may be performed to check the completeness of the input
file received from the user. In one embodiment, based on the validation
check performed, when the input file received is not complete, then the
process 600b may quit processing of the input file, as represented via step
608b. In another embodiment, when the input file received is validated to
be complete, then at step 610b, a test may be executed by the associated
external testing application in at least one of a plurality of test environments.
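A minimal sketch of such a pre-validation check (step 606b), with the quit-on-incomplete behavior of step 608b, is given below; the set of required fields is an assumption for illustration.

    REQUIRED_FIELDS = ("test_id", "endpoint_url", "auth_details")  # assumed set

    def prevalidate(row):
        """Step 606b sketch: treat the input file as complete only if every
        mandatory field is present and non-empty; otherwise the caller
        quits processing (step 608b)."""
        return all(row.get(field) for field in REQUIRED_FIELDS)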
[055] As represented in the process 600b, a set of test environments
from the plurality of test environments in which the test may be executed
may include functional validation 612b, load test environment 614b,
performance test environment 616b, and security test environment 618b. In
other words, the test may be performed on the at least one service end-point link in each of the set of test environments selected. In reference to
FIG. 3, the test may be executed by the associated external testing
application identified for the at least one service end-point link. Based on
execution of the test, at step 620b, a report may be generated and
represented on an execution dashboard. The report may include the at least
one test result generated for the at least one service end-point link in one of
the set of predefined output formats. The set of predefined output formats
may include at least one of color codes, predefined textual codes, or
predefined pattern codes.
[056] Referring now to FIG. 7A, an exemplary representation of a
framework used to perform script less automation testing of a service is
illustrated, in accordance with an exemplary embodiment. In FIG. 7A, a
template generator 702a is depicted. The template generator 702a may be
responsible to generate various templates responsible for performing script
less automated testing of services (also referred as one or more service
end-point links). In one embodiment, the template generator 702a may be
utilized to generate at least one predefined input format. The at least one
predefined input format may be used to create an input file. In reference to
FIGs. 1 to 6B above, the input file may correspond to the file including
at least one service end-point link and the plurality of test attributes. In
another embodiment, the template generator 702a may be utilized to create
a set of predefined output formats. The set of predefined output formats may
be utilized to display at least one test result generated based on the test
performed.
[057] In order to generate the template, the template generator 702a
may receive a swagger 704a. In an embodiment, the swagger 704a may
correspond to a set of rules used for creating a format (i.e., the input format
and the output format) describing REST Application Programming
Interfaces (APIs). Further, in order to generate various templates, the
template generator 702a may include a swagger import 706a, a swagger
validator 708a, and an input processor 710a. Upon receiving the swagger
704a, the swagger import 706a may be responsible to import the swagger
704a. In other words, the swagger import 706a may be responsible to create
and deploy REST APIs to generate a test data template based on the
swagger 704a received. In an embodiment, the test data template
generated may correspond to the input file including the at least one service
end-point link and the plurality of test attributes.
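Purely by way of illustration, and assuming a Swagger 2.0 document stored as JSON, the swagger import step might derive a test data template as sketched below; the template fields are assumptions carried over from the earlier input-file sketch.

    import json

    def generate_test_data_template(swagger_path):
        """Illustrative swagger import: read a Swagger/OpenAPI document and
        emit one template row per path/method pair for the tester to complete."""
        with open(swagger_path) as f:
            spec = json.load(f)
        base = spec.get("basePath", "")
        template = []
        for path, methods in spec.get("paths", {}).items():
            for method in methods:
                if method.lower() not in ("get", "post", "put", "delete", "patch"):
                    continue
                template.append({
                    "test_id": f"{method.upper()} {path}",
                    "endpoint_url": base + path,
                    "auth_details": None,  # to be filled in by the tester
                })
        return template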
[058] Once the test data template is generated, the swagger
validator 708a may be responsible to validate the generated test data
template for the swagger 704a received. In one embodiment, if the swagger
validator 708a does not validate the generated test data template, then the
process 700a represented in FIG. 7A may fail. Moreover, when the
process 700a fails, then a new swagger (i.e., the swagger 704a) may again
be provided to the template generator 702a. In another embodiment, when
the swagger validator 708a validates the generated test data template,
then the input processor 710a may process the generated test data
template. Based on processing of the test data template, a template 712a
may be generated. The template 712a may correspond to the analyzed and
validated input file, i.e., the input file comprising the at least one service end-point link and the plurality of test attributes. Once the template 712a is
generated, then the generated template 712a may be provided as an input
to a core engine 702b of FIG. 7B.
[059] Upon receiving the template 712a as the input, the core
engine 702b may perform a process 700b to process the template 712a
received. In order to process the template 712a, the core engine 702b may
validate the template 712a via a completeness validator 704b. The
completeness validator 704b may be responsible to validate the completeness
of the template 712a. By way of an example, the completeness validator
704b may validate the at least one service end-point link along with the details
received for each of the plurality of test attributes.
[060] Once the completeness validator 704b validates the template
712a, then a core test engine 706b may be responsible to identify an
external testing application relevant for each of the at least one service end-point link. The external testing application may correspond to one of REST
or SOAP. Further, the associated external testing application identified
may be responsible to perform a test on a service end-point connected with
each of the at least one service end-point link. The test on the service end-point connected with each of the at least one service end-point link may be
performed in at least one of the plurality of test environments. Moreover, the
service end-point connected with each of the at least one service end-point
link may be tested based on at least one of the plurality of test attributes
provided for each of the at least one service end-point link.
[061] In an embodiment, the plurality of test environments may
include at least one of contract test environment, load test environment,
performance test environment, security test environment, end-point test
environment, or unit testcase generation environment. As depicted in
present FIG. 7B, the contract test environment may correspond to a contract
test module (TM) 708b. The load test environment may correspond to a load
TM 710b. The performance test environment may correspond to a
performance TM 712b. The security test environment may correspond to a
security TM 714b. The end-point test environment may correspond to an
end-point TM 716b. Lastly, the unit testcase generation environment may
correspond to a unit TM 718b.
[062] Once the associated external testing application is identified
and the at least one of the plurality of test environments are selected, then
a test case (TC) generator 720b may be responsible to generate a test. In
other words, the test generator 720b may be configured to generate various
tests based on the plurality of test attributes shared in a test data template.
It should be noted that, the plurality of test attributes may be associated with
each of the at least one service end-point link. Further, the test generated
may be performed on the service end-point connected with the at least one
service end-point link in at least one of the plurality of test environments
selected. Once the test that needs to be performed is generated, then a test
validator 722b may be responsible to validate the generated test. Based on
validation check of the generated test, when the test is successfully
validated, then the test validator 722b may send the validated test to a test
executor 724b.
[063] The test executor 724b may be responsible to execute the
validated test generated for the at least one service end-point link. Thereafter,
the test executor 724b may be configured to integrate the validated test with
a microservice/webservice 730b via a connector 726b. In an embodiment,
the connector 726b may be configured to send the test generated to the
microservice/webservice 730b. In other words, the connector 726b may be
configured to send and employ the test generated on the
microservice/webservice 730b that needs to be tested. Once the validated
test is executed, then the result, i.e., the at least one test result generated,
based on execution of the test may be displayed as an output to the user
(also referred as the tester) via a reporting dashboard 728b. In an
embodiment, the at least one test result may be displayed in one of a set of
predefined output formats. The set of predefined output formats may include
at least one of color codes, predefined textual codes, or predefined pattern
codes.
[064] Referring now to FIGs. 8A-8E, a flow diagram of each of a
plurality of tests performed for at least one service end-point link in a plurality
of test environments is illustrated, in accordance with an exemplary
embodiment. In FIG. 8A, a test performed on each of the at least one service
end-point link in a first test environment, i.e., the end-point test environment, is
depicted via a process 800a. In the end-point test environment, a plurality of
functionalities may be performed in order to test the at least one service
end-point link. The plurality of functionalities performed may include
response validation, status code verification, header validation, creation of
positive and negative tests, internal end-point validation, support provided
for different authentications, i.e., basic authentication, open authentications,
etc., and support provided for both hypertext transfer protocol (http)/
hypertext transfer protocol secure (https).
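These end-point test functionalities can be illustrated, under stated assumptions, with the widely used Python requests library; the expected status code and content type below are illustrative defaults rather than values from the disclosure.

    import requests

    def endpoint_test(url, expected_status=200,
                      expected_content_type="application/json"):
        """Illustrative end-point test: status code verification, header
        validation, and a basic response check over http/https."""
        resp = requests.get(url, timeout=10)
        checks = {
            "status_code": resp.status_code == expected_status,
            "content_type": expected_content_type
                            in resp.headers.get("Content-Type", ""),
            "non_empty_body": bool(resp.content),
        }
        return all(checks.values()), checks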
[065] In process 800a, a data template 802a may be created. The
data template 802a may correspond to an input file created by a user, i.e.,
a tester. The input file created may include at least one service end-point
link and a plurality of test attributes. In an embodiment, the at least one
service end-point link may correspond to a microservice or a webservice. In
addition, the plurality of test attributes may include at least one parameter,
authentication details, and at least one non-functional detail. Moreover, the
at least one non-functional detail may include at least one of performance
details, load details, or security details. Once the input file is received, a data
completeness check may be performed as depicted via a data complete
804a. In one embodiment, when the input file received is not complete, then
the process 800a may fail and restart again with the creation of a new data
template. In another embodiment, when the input file received is determined
to be complete, then a data generator 806a may identify an external testing
application. The external testing application identified may be relevant for
the at least one service end-point link. In an embodiment, the external
testing application may correspond to one of REST or SOAP.
[066] Once the external testing application is identified, then a test
to be performed may be generated by a test case generator 808a. Further,
a test case runner 810a may be responsible to execute the test generated
to test the at least one service end-point link. Based on execution of the test, a
connector 812a may integrate the result (also referred to as the at least one test
result) of execution of the test with at least one connector. In an embodiment, each
of the at least one connector may correspond to an entity that is used to
establish connection with the at least one service end-point link, i.e., the
microservice or the webservice, that needs to be tested. Moreover, the
connector may establish connection with the at least one service end-point
link in order to send and employ the test generated on the at least one
service end-point link. In reference to FIG. 7B, the at least one connector
may correspond to the connector 726b. Examples of connector 812a may
include one of Microsoft (MS) services, database, and a switch. Once the
result is generated, a check may be performed to validate the at least one
test result generated as depicted via a step pass 814a. The check may be
performed to identify any error in the at least one test result generated.
[067] In one embodiment, based on the check performed, when no
error is identified, then a result validator/consolidator 816a may generate a
report comprising the at least one test result in one of the set of predefined
output formats. In another embodiment, based on the check performed,
when an error 818a is identified in the at least one test result, then the result
validator/consolidator 816a may generate the report including the error
818a. Thereafter, the generated report may be displayed via a dashboard
820a and an external dashboard 822a. The dashboard 820a may display
the report in one of a set of output formats 824a. Examples of the set of
output formats 824a may include, but are not limited to, Excel (XL),
hypertext markup language (html), and portable document format (PDF).
[068] In FIG. 8B, a test performed on each of the at least one service
end-point link in second and third test environments, i.e., the performance test
environment and the load test environment, is depicted via a process 800b. In
the performance test environment, a plurality of functionalities may be
performed in order to test each of the at least one service end-point link.
The plurality of functionalities performed may include determination of
response time and throughput of all end-points and maintaining of historical
data. It should be noted that a framework (i.e., the framework depicted in
FIGs. 7A-7B) may use JMeter in order to perform testing of the at
least one service end-point link in the performance test environment.
JMeter may correspond to open-source testing software. Additionally, in
the load test environment, a plurality of functionalities may be performed in
order to test each of the at least one service end-point link. The plurality of
functionalities performed may include creation of parallel loads,
determination of average response time, and configurable loads. It should
be noted that a framework (i.e., the framework depicted in FIGs. 7A-7B) may
support the load test environment using the fortio open-source tool.
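Without reproducing JMeter or fortio behavior, the load-test ideas above (parallel loads and average response time) can be illustrated by the following toy Python sketch; all parameters are assumptions.

    import time
    from concurrent.futures import ThreadPoolExecutor
    import requests

    def load_test(url, parallel=10, total_requests=100):
        """Toy load test: fire requests in parallel and report the average
        response time in seconds. Not a substitute for JMeter or fortio."""
        def timed_call(_):
            start = time.perf_counter()
            requests.get(url, timeout=10)
            return time.perf_counter() - start

        with ThreadPoolExecutor(max_workers=parallel) as pool:
            durations = list(pool.map(timed_call, range(total_requests)))
        return sum(durations) / len(durations)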
[069] In process 800b, a data template 802b may be created. The
data template 802b may correspond to an input file created by a user, i.e.,
a tester. The input file created may include at least one service end-point
link and a plurality of test attributes. In an embodiment, the at least one
service end-point link may correspond to a microservice or a webservice. In
addition, the plurality of test attributes may include at least one parameter,
authentication details, and at least one non-functional detail. Moreover, the
at least one non-functional detail may include at least one of performance
details, load details, or security details. Once the input file is received, a data
completeness check may be performed as depicted via data complete 804b.
In one embodiment, when the input file received is not complete, then the
process 800b may fail and restart again with the creation of a new data
template. In another embodiment, when the input file received is determined
to be complete, then a data generator 806b may identify an external testing
application. The external testing application identified may be relevant for
the at least one service end-point link. In an embodiment, the external
testing application may correspond to one of REST or SOAP.
[070] Once the external testing application is identified, then a test
to be performed may be generated by a test case generator 808b. Further,
a test case runner 810b may be responsible to execute the test generated
to test the at least one service end-point link. Based on execution of the test, a
load generator 812b may be configured to configure loads and create
parallel loads. In addition, the load generator 812b may be configured to
determine average response time when the at least one service end-point
link is being tested in the load test environment. Moreover, the load
generator 812b may be configured to provide response time and throughput
of all end-points in case of the performance test environment.
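
A minimal sketch of what the load generator 812b might do, using only the Python standard library: it creates configurable parallel loads against an end-point and determines the average response time. The target URL and load figures are hypothetical.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    def timed_request(url: str) -> float:
        # Issue one request and return its response time in seconds.
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()
        return time.perf_counter() - start

    def generate_load(url: str, total_requests: int = 50, parallel: int = 10) -> float:
        # Create parallel loads (configurable) and compute the average response time.
        with ThreadPoolExecutor(max_workers=parallel) as pool:
            timings = list(pool.map(timed_request, [url] * total_requests))
        return sum(timings) / len(timings)

    # avg = generate_load("http://localhost:8080/api/orders")  # hypothetical target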
[071] Once the average response time or the response time and
throughput are determined, then a connector 814b may integrate the result
(also referred to as the at least one test result) of the execution of the test
with at least one connector. In reference to FIG. 7B, the at least one
connector may correspond to the connector 726b. Examples of the connector
814b may include one of MS services, a database, and a switch. Once the
result is generated, a check may be performed to validate the at least one
test result generated, as depicted via a step pass 816b. The check may be
performed to identify any error in the at least one test result generated.
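
The check of step pass 816b could be as simple as scanning each test result for an error marker before the report is generated; the result structure below is an illustrative assumption.

    def has_error(result: dict) -> bool:
        # Illustrative check: flag non-2xx status codes or an explicit error field.
        status = result.get("status_code", 0)
        return bool(result.get("error")) or not (200 <= status < 300)

    results = [
        {"endpoint": "/api/orders", "status_code": 200},
        {"endpoint": "/api/users", "status_code": 500, "error": "timeout"},
    ]
    errors = [r for r in results if has_error(r)]  # forwarded to the report with the error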
[072] In one embodiment, based on the check performed, when no
error is identified, then a result validator/consolidator 818b may generate a
report comprising the at least one test result in one of the set of predefined
output formats. In another embodiment, based on the check performed,
when an error 820b is identified in the at least one test result, then the result
validator/consolidator 818b may generate the report including the error
820b. Moreover, the result validator/consolidator 818b may generate the
report using a test database 822b. Thereafter, the generated report may be
displayed via a dashboard 824b and an external dashboard 826b. The
dashboard 824b may display the report in one of a set of output formats
828b. Examples of the set of output formats 828b may include, but are not
limited to, XL, HTML, and PDF.
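
As a sketch of the format conversion, the snippet below renders consolidated test results as an HTML report, one of the output formats 828b; XL and PDF output would typically require third-party libraries, so only the HTML path is shown, and the result fields are hypothetical.

    import html

    def to_html_report(results: list[dict]) -> str:
        # Render each test result as a table row in a minimal HTML report.
        rows = "".join(
            "<tr><td>{}</td><td>{}</td></tr>".format(
                html.escape(str(r.get("endpoint", ""))),
                html.escape(str(r.get("status_code", ""))),
            )
            for r in results
        )
        return (
            "<html><body><table border='1'>"
            "<tr><th>End-point</th><th>Status</th></tr>"
            f"{rows}</table></body></html>"
        )

    with open("report.html", "w") as fh:
        fh.write(to_html_report([{"endpoint": "/api/orders", "status_code": 200}]))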
[073] In FIG. 8C, a test performed on each of the at least one service
end-point link in the fourth test environment, i.e., the security test
environment, is depicted via a process 800c. In the security test
environment, a plurality of functionalities may be performed in order to test
the at least one service end-point link. The plurality of functionalities
performed may include a vulnerability scan for each end-point and
recommendations for each issue identified. It should be noted that a
framework (i.e., the framework depicted in FIG. 7A-7B) may utilize an
open-source tool, i.e., Zed Attack Proxy (ZAP), to perform testing of the at
least one service end-point link in the security test environment.
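
One common script-less way to drive ZAP is its packaged baseline scan; the sketch below shells out to the official ZAP Docker image, which is an assumption about deployment, since the specification names only the tool itself.

    import subprocess

    def zap_baseline_scan(target_url: str, report_file: str = "zap_report.html") -> int:
        # zap-baseline.py ships with the ZAP Docker images:
        #   -t sets the target URL, -r names the HTML report file.
        completed = subprocess.run([
            "docker", "run", "--rm",
            "-v", "/tmp/zap:/zap/wrk",  # hypothetical host directory for the report
            "ghcr.io/zaproxy/zaproxy:stable",
            "zap-baseline.py", "-t", target_url, "-r", report_file,
        ])
        # A non-zero exit code signals warnings, failures, or errors (see the ZAP docs).
        return completed.returncode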
[074] In process 800c, a data template 802c may be created. The
data template 802c may correspond to an input file created by a user, i.e.,
a tester. The input file created may include at least one service end-point
link and a plurality of test attributes. In an embodiment, the at least one
service end-point link may correspond to a microservice or a webservice. In
addition, the plurality of test attributes may include at least one parameter,
authentication details, and at least one non-functional detail. Moreover, the
at least one non-functional detail may include at least one of performance
details, load details, or security details. Once the input file is received, a data
completeness check may be performed as depicted via data complete 804c.
In one embodiment, when the input file received is not complete, then the
process 800c may fail and restart with the creation of a new data
template. In another embodiment, when the input file received is determined
to be complete, then a data generator 806c may identify an external testing
application. The external testing application identified may be relevant for
the at least one service end-point link. In an embodiment, the external
testing application may correspond to one of REST or SOAP.
[075] Once the external testing application is identified, then a test
to be performed may be generated by a test case generator 808c. Further,
a test case runner 810c may be responsible for executing the generated test
to perform testing of the at least one service end-point link. Based on the
execution of the test, a connector 812c may integrate the result (also
referred to as the at least one test result) of the execution of the test with at
least one connector. Examples of the connector 812c may include one of MS
services, a database, and a switch. Once the result is generated, a check may
be performed to validate the at least one test result generated, as depicted
via a step pass 814c. The check may be performed to identify any error in
the at least one test result generated.
[076] In one embodiment, based on the check performed, when no
error is identified, then a result validator/consolidator 816c may generate a
report comprising the at least one test result in one of the set of predefined
output formats. In another embodiment, based on the check performed,
when an error 818c is identified in the at least one test result, then the result
validator/consolidator 816c may generate the report including the error
818c. Moreover, the result validator/consolidator 816c may generate the
report using a vulnerability database 820c. Thereafter, the generated report
may be displayed via a dashboard 822c and an external dashboard 824c.
The dashboard 822c may display the report in one of a set of output formats
826c. Examples of the set of output formats 826c may include, but are not
limited to, XL, HTML, and PDF.
[077] In FIG. 8D, a test performed on each of the at least one service
end-point link in the fifth test environment, i.e., the contract test
environment, is depicted via a process 800d. In the contract test
environment, a plurality of functionalities may be performed in order to test
the at least one service end-point link. The plurality of functionalities
performed may include execution of integration testing through a contract,
capturing of details on both the provider and consumer sides, internal
maintenance of a mock server, and verification of a contract.
[078] In process 800d, a data template 802d may be created. The
data template 802d may correspond to an input file created by a user, i.e.,
a tester. The input file created may include at least one service end-point
link and a plurality of test attributes. In an embodiment, the at least one
service end-point link may correspond to a microservice or a webservice. In
addition, the plurality of test attributes may include at least one parameter,
authentication details, and at least one non-functional detail. Moreover, the
at least one non-functional detail may include at least one of performance
details, load details, or security details. Once the input file is received, a data
completeness check may be performed as depicted via a data complete
804d. In one embodiment, when the input file received is not complete, then
the process 800d may fail and restart with the creation of a new data
template. In another embodiment, when the input file received is determined
to be complete, then a contract data 806d may determine a contract
between a consumer and a provider. The consumer may correspond to a
client who wants to receive some data from an application, and the provider
may correspond to an API on a server that provides the data the client
needs. Once the contract between the consumer and the provider is
identified, then the contract may pass through a fake server depicted as a
mock server 808d. The fake server may simulate a real server. In an
embodiment, the fake server may be configured to test and check APIs
associated with the at least one service end-point link along with the API
responses.
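
A minimal sketch of a fake server such as the mock server 808d, built with the Python standard library: it serves the canned response agreed in a hypothetical consumer-provider contract so that the consumer can be tested without the real provider.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical contract: the provider promises this response for GET /api/orders/123.
    CONTRACT = {"path": "/api/orders/123",
                "response": {"orderId": "123", "status": "SHIPPED"}}

    class MockProvider(BaseHTTPRequestHandler):
        def do_GET(self):
            # Simulate the real server: honour the contracted path only.
            if self.path == CONTRACT["path"]:
                body = json.dumps(CONTRACT["response"]).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404, "Not in contract")

    # HTTPServer(("localhost", 8081), MockProvider).serve_forever()  # hypothetical port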
[079] Once the contract passes through the mock server 808d, then
a test to be performed may be generated by a test case generator 810d.
Further, a test case runner 812d may be responsible for executing the test
generated for testing the at least one service end-point link. In addition, in
order to execute the generated test, a set of third-party services may be
mocked as depicted via step 814d. In an embodiment, mocking of third-party
services is done in order to add a large set of functions. It should be noted
that the large set of functions may be added in order to ensure that
everything is covered while performing testing of the at least one service
end-point link.
[080] Based on the execution of the test, a connector 816d may
integrate the result (also referred to as the at least one test result) of the
execution of the test with at least one connector. Examples of the connector
816d may include one of MS services, a database, and a switch. Once the
result is generated, a check may be performed to validate the at least one
test result generated, as depicted via a step pass 818d. The check may be
performed to identify any error in the at least one test result generated.
[081] In one embodiment, based on the check performed, when no
error is identified, then a result validator/consolidator 820d may generate a
report comprising the at least one test result in one of the set of predefined
output formats. In another embodiment, based on the check performed,
when an error 822d is identified in the at least one test result, then the result
validator/consolidator 820d may generate the report including the error
822d. Thereafter, the generated report may be displayed via a dashboard
824d and an external dashboard 826d. The dashboard 824d may display
the report in one of a set of output formats 828d. Examples of the set of
output formats 828d may include, but are not limited to, XL, HTML, and PDF.
[082] In FIG. 8E, a test performed on each of the at least one service
end-point link in the sixth test environment, i.e., the unit test environment,
is depicted via a process 800e. In the unit test environment, a plurality of
functionalities may be performed in order to test the at least one service
end-point link. The plurality of functionalities performed may include
generation of unit test cases for all service classes, and testing and adding
of the unit test cases to a code base developed by developers.
[083] In process 800e, a data template 802e may be created. The
data template 802e may correspond to an input file created by a user, i.e.,
a tester. The input file created may include at least one service end-point
link and a plurality of test attributes. In an embodiment, the at least one
service end-point link may correspond to a microservice or a webservice. In
addition, the plurality of test attributes may include at least one parameter,
authentication details, and at least one non-functional detail. Moreover, the
at least one non-functional detail may include at least one of performance
details, load details, or security details. Once the input file is received, a data
completeness check may be performed as depicted via a data complete
804e. In one embodiment, when the input file received is not complete, then
the process 800e may fail and restart with the creation of a new data
template.
[084] In another embodiment, when the input file received is
determined to be complete, then configuration of a plurality of services (i.e.,
the one or more service end-point links) with a set of classes may be done
as depicted via step 806e. Once the configuration of the plurality of services
with the set of classes is done, then boundary data may be generated for
each of the plurality of services as depicted via step 808e.
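
The specification does not define how the boundary data of step 808e is derived; a common approach is boundary-value analysis over each parameter's declared range, sketched below with hypothetical parameter metadata.

    def boundary_values(minimum: int, maximum: int) -> list[int]:
        # Classic boundary-value analysis: edges of the range plus their neighbours.
        return sorted({minimum, minimum + 1, maximum - 1, maximum})

    # Hypothetical parameter metadata for a service class:
    params = {"quantity": (1, 100), "priority": (0, 9)}
    boundary_data = {name: boundary_values(lo, hi) for name, (lo, hi) in params.items()}
    # {'quantity': [1, 2, 99, 100], 'priority': [0, 1, 8, 9]}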
[085] Once the boundary data is generated, then a test to be
performed may be generated by a test case generator 810e. Further, a test
case runner 812e may be responsible for executing the test generated for
testing the at least one service end-point link. In addition, in order to execute
the generated test, a set of mock services may be utilized as depicted via
step 814e. It should be noted that mock services may correspond to
services that imitate real REST APIs or SOAP APIs.
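
As an illustration of steps 810e through 814e, the unit test below exercises a hypothetical service class against a mock that imitates a real REST API, using unittest.mock from the Python standard library; the class and method names are inventions for the sketch.

    import unittest
    from unittest import mock

    # Hypothetical service class under test.
    class OrderService:
        def __init__(self, http_get):
            self._http_get = http_get  # injected so the real API can be mocked

        def order_status(self, order_id: str) -> str:
            return self._http_get(f"/api/orders/{order_id}")["status"]

    class OrderServiceTest(unittest.TestCase):
        def test_order_status_uses_rest_response(self):
            # Mock service imitating the real REST API (step 814e).
            fake_get = mock.Mock(return_value={"orderId": "123", "status": "SHIPPED"})
            service = OrderService(fake_get)
            self.assertEqual(service.order_status("123"), "SHIPPED")
            fake_get.assert_called_once_with("/api/orders/123")

    if __name__ == "__main__":
        unittest.main()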
[086] Based on the execution of the test, a connector 816e may
integrate the result (also referred to as the at least one test result) of the
execution of the test with at least one connector. Examples of the connector
816e may include one of MS services, a database, and a switch. Once the
result is generated, a check may be performed to validate the at least one
test result generated, as depicted via a step pass 818e. The check may be
performed to identify any error in the at least one test result generated.
[087] In one embodiment, based on the check performed, when no
error is identified, then a result validator/consolidator 820e may generate a
report comprising the at least one test result in one of the set of predefined
output formats. In another embodiment, based on the check performed,
when an error 822e is identified in the at least one test result, then the result
validator/consolidator 820e may generate the report including the error
822e. Thereafter, the generated report may be displayed via a dashboard
824e and an external dashboard 826e. The dashboard 824e may display
the report in one of a set of output formats 828e. Examples of the set of
output formats 828e may include, but are not limited to, XL, HTML, and PDF.
[088] Various embodiments provide a method and system for script
less automated testing of a service. The disclosed method and system may
analyze an input file comprising at least one service end-point link and a
plurality of test attributes. Moreover, the disclosed method and system may
identify an external testing application relevant for each of the at least one
service end-point link. Further, the disclosed method and system may
invoke the associated external testing application to perform a test on a
service end-point connected with each of the at least one service end-point
link in at least one of a plurality of test environments. In addition, the
disclosed method and system may receive at least one test result of the test
performed by the associated external testing application. Thereafter, the
disclosed method and system may convert the at least one test result into
one of a set of predefined output formats.
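
Bringing the disclosed steps together, a high-level skeleton of the method might look like the following; each helper is a hypothetical stand-in for the components described above, with steps 308 and 310 reduced to a stub.

    def analyze(input_file: dict) -> dict:
        # Step 304 (sketch): the input file is taken as an already-parsed dict here.
        return input_file

    def identify_external_app(data: dict) -> str:
        # Step 306 (sketch): WSDL links suggest SOAP, otherwise REST.
        return "SOAP" if data["endpoint_link"].endswith("?wsdl") else "REST"

    def invoke_and_receive(app: str, data: dict) -> list[dict]:
        # Steps 308/310 (stub): a real framework would call JMeter, fortio, ZAP, etc.
        return [{"endpoint": data["endpoint_link"], "app": app, "status_code": 200}]

    def convert(results: list[dict]) -> str:
        # Step 312 (sketch): render results in one predefined output format (HTML).
        rows = "".join(f"<tr><td>{r['endpoint']}</td><td>{r['status_code']}</td></tr>"
                       for r in results)
        return f"<table>{rows}</table>"

    def run_scriptless_test(input_file: dict) -> str:
        data = analyze(input_file)
        app = identify_external_app(data)
        results = invoke_and_receive(app, data)
        return convert(results)

    print(run_scriptless_test({"endpoint_link": "http://localhost:8080/api/orders"}))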
[089] The system and method provide several advantages. For
example, the system and the method may enable Shift Left Testing in the
true sense. Moreover, the disclosed system and method may accelerate
script less test automation development by five times and may increase test
coverage beyond that of existing testing frameworks. In addition to the
increase in test coverage, the disclosed system and method may also
increase non-functional test coverage. Further, the disclosed system and
method may not require skilled automation developers and may be easy to
learn for any unskilled testing professional or a testing beginner. The
disclosed system and method may seamlessly integrate with Agile and
DevOps frameworks. Also, the disclosed system and method may integrate
with CI/CD tools in order to support continuous testing. Moreover, with the
help of the disclosed system, users may realize the Return on Investment
(ROI) associated with a microservice or a webservice faster. Additionally,
with the help of the disclosed system, maintenance of tests becomes easy
and less time consuming. The disclosed system and method may allow
manual testers to create test automation for microservices or webservices.
[090] It will be appreciated that, for clarity purposes, the above
description has described embodiments of the invention with reference to
different functional units and processors. However, it will be apparent that
any suitable distribution of functionality between different functional units,
processors or domains may be used without detracting from the invention.
For example, functionality illustrated to be performed by separate
processors or controllers may be performed by the same processor or
controller. Hence, references to specific functional units are only to be seen
as references to suitable means for providing the described functionality,
rather than indicative of a strict logical or physical structure or organization.
[091] Although the present invention has been described in
connection with some embodiments, it is not intended to be limited to the
specific form set forth herein. Rather, the scope of the present invention is
limited only by the claims. Additionally, although a feature may appear to be
described in connection with particular embodiments, one skilled in the art
would recognize that various features of the described embodiments may
be combined in accordance with the invention.
[092] Furthermore, although individually listed, a plurality of means,
elements or process steps may be implemented by, for example, a single
unit or processor. Additionally, although individual features may be included
in different claims, these may possibly be advantageously combined, and
the inclusion in different claims does not imply that a combination of features
is not feasible and/or advantageous. Also, the inclusion of a feature in one
category of claims does not imply a limitation to this category, but rather the
feature may be equally applicable to other claim categories, as appropriate.

CLAIMS
WHAT IS CLAIMED IS:
1. A method (300) for script less automated testing of a service, the method
comprising:
analyzing (304), by a testing device (102), an input file comprising at
least one service end-point link and a plurality of test attributes;
identifying (306) for each of the at least one service end-point link, by
the testing device (102), an external testing application relevant for each of
the at least one service end-point link;
invoking (308) for each of the at least one service end-point link, by
the testing device (102), the associated external testing application to
perform a test on a service end-point connected with each of the at least
one service end-point link in at least one of a plurality of test environments,
wherein the service end-point connected with each of the at least one
service end-point link is tested based on at least one of the plurality of test
attributes provided for each of the at least one service end-point link;
receiving (310) for each of the at least one service end-point link, by
the testing device (102), at least one test result of the test performed by the
associated external testing application on the service end-point connected
with each of the at least one service end-point link; and
converting (312) for each of the at least one service end-point link,
by the testing device (102), the at least one test result into one of a set of
predefined output formats.
2. The method (300) of claim 1 further comprising receiving (302) the input
file from a user, wherein the input file is created in at least one predefined
input format.
3. The method (300) of claim 1, wherein:
each of the at least one service end-point link corresponds to one of
a microservice or a web service;
the at least one test attribute comprises at least one parameter,
authentication details, and at least one non-functional detail, and wherein
the at least one non-functional detail comprises at least one of performance
details, load details, or security details;
the set of predefined output formats comprises at least one of color
codes, predefined textual codes, or predefined pattern codes;
the plurality of test environments comprises at least one of contract
test environment, load test environment, performance test environment,
security test environment, end-point test environment, or unit testcase
generation environment; and
the external testing application corresponds to one of
Representational State Transfer (REST) and Simple Object Access
Protocol (SOAP).
4. The method (300) of claim 1, further comprising:
providing (402) an option to a user to select one or more of the at
least one service end-point link for testing;
receiving (404) a user selection of the one or more of the at least one
service end-point link for testing; and
invoking (406) the associated external testing application to perform
a test based on the user selection.
5. The method (300) of claim 1, further comprising:
generating (502) a report comprising the at least one test result in
one of the set of predefined output formats;
receiving (504) inputs from a user to modify representation of the at
least one test result in the report; and
modifying (506) the report based on the inputs received from the
user.

6. A system (100) for script less automated testing of a service, the system
(100) comprising:
a processor (106); and
a memory (104) communicatively coupled to the processor (106),
wherein the memory (104) stores processor executable instructions, which,
on execution, cause the processor (106) to:
analyze (304) an input file comprising at least one service end-point link and a plurality of test attributes;
identify (306) for each of the at least one service end-point
link, an external testing application relevant for each of the at least
one service end-point link;
invoke (308) for each of the at least one service end-point link,
the associated external testing application to perform a test on a
service end-point connected with each of the at least one service
end-point link in at least one of a plurality of test environments,
wherein the service end-point connected with each of the at least one
service end-point link is tested based on at least one of the plurality
of test attributes provided for each of the at least one service end-point link;
receive (310) for each of the at least one service end-point
link, at least one test result of the test performed by the associated
external testing application on the service end-point connected with
each of the at least one service end-point link; and
convert (312) for each of the at least one service end-point
link, the at least one test result into one of a set of predefined output
formats.
7. The system (100) of claim 6, wherein the processor executable
instructions cause the processor (106) to receive (302) the input file from a
user, wherein the input file is created in at least one predefined input format.
8. The system (100) of claim 6, wherein:
each of the at least one end-point link corresponds to one of a
microservice or a web service;
the at least one test attribute comprises at least one parameter,
authentication details, and at least one non-functional detail, and wherein
the at least one non-functional detail comprises at least one of performance
details, load details, or security details;
the set of predefined output formats comprises at least one of color
codes, predefined textual codes, or predefined pattern codes;
the plurality of test environments comprises at least one of contract
test environment, load test environment, performance test environment,
security test environment, end-point test environment, or unit testcase
generation environment; and
the external testing application corresponds to one of
Representational State Transfer (REST) and Simple Object Access
Protocol (SOAP).
9. The system (100) of claim 6, wherein the processor executable
instructions further cause the processor (106) to:
provide (402) an option to a user to select one or more of the at least
one service end-point link for testing;
receive (404) a user selection of the one or more of the at least one
service end-point link for testing; and
invoke (406) the associated external testing application to perform a
test based on the user selection.

10. The system (100) of claim 6, wherein the processor executable
instructions further cause the processor (106) to:
generate (502) a report comprising the at least one test result in one
of the set of predefined output formats;
receive (504) inputs from a user to modify representation of the at
least one test result in the report; and
modify (506) the report based on the inputs received from the user.

Documents

Application Documents

# Name Date
1 202111025604-CLAIMS [12-10-2022(online)].pdf 2022-10-12
2 202111025604-STATEMENT OF UNDERTAKING (FORM 3) [09-06-2021(online)].pdf 2021-06-09
3 202111025604-CORRESPONDENCE [12-10-2022(online)].pdf 2022-10-12
4 202111025604-REQUEST FOR EXAMINATION (FORM-18) [09-06-2021(online)].pdf 2021-06-09
5 202111025604-REQUEST FOR EARLY PUBLICATION(FORM-9) [09-06-2021(online)].pdf 2021-06-09
6 202111025604-FER_SER_REPLY [12-10-2022(online)].pdf 2022-10-12
7 202111025604-PROOF OF RIGHT [09-06-2021(online)].pdf 2021-06-09
8 202111025604-OTHERS [12-10-2022(online)].pdf 2022-10-12
9 202111025604-POWER OF AUTHORITY [09-06-2021(online)].pdf 2021-06-09
10 202111025604-FER.pdf 2022-05-12
11 202111025604-FORM-9 [09-06-2021(online)].pdf 2021-06-09
12 202111025604-COMPLETE SPECIFICATION [09-06-2021(online)].pdf 2021-06-09
13 202111025604-FORM 18 [09-06-2021(online)].pdf 2021-06-09
14 202111025604-DECLARATION OF INVENTORSHIP (FORM 5) [09-06-2021(online)].pdf 2021-06-09
15 202111025604-DRAWINGS [09-06-2021(online)].pdf 2021-06-09
16 202111025604-FORM 1 [09-06-2021(online)].pdf 2021-06-09
17 202111025604-FIGURE OF ABSTRACT [09-06-2021(online)].jpg 2021-06-09

Search Strategy

1 202111025604E_11-05-2022.pdf