Abstract: The present invention relates to a method of automated testing of an application designed to run on a computing device, said method comprising the steps of: monitoring usage of said application on the device; transferring session details or scenarios; comparing said scenarios with a standard scenario; converting a prioritized scenario into a test automation script; enabling a user to run said automated test on the device; and reporting a detailed result of the test performed. The present invention also relates to a system for the evaluation of an application designed to run on mobile devices, the said system comprising a test development kit for the collection and transfer of the details required to be analyzed; a scenario engine for receiving and determining the scenarios suitable for the application; an auto-script engine for converting the scenarios into automation scripts; and a platform for running the automated tests.
Field of the invention
The present invention relates to a system and method for evaluation of an application that runs on computing devices. More specifically, the present invention relates to assessment of the application for various parameters; auto-generation of automation scripts; scrutiny of the results; and test recommendation based on test analytics.
Background of the invention
Software testing is a very crucial part of software development. Software testing is performed to verify that a software program or software application meets the business as well as the technical requirements; to ensure it is efficient as per the requirements of the developers or consumers; to determine the risks of the software implementation; to ascertain the quality of the software program; and to aid in the development of the software technology.
The assessment of a software application requires a testing device or system employed for the purpose of evaluating several parameters of the software program or application, and a method for performing tests on the software application so as to provide an appropriate result.
Such a device and method have been disclosed in CN 103049371 A. The method comprises the steps of: parsing the configuration values of the keywords of each operation step from a test case; executing the operation steps sequentially in accordance with the parsed configuration values and the serial numbers of the operation steps; and executing the test for an operation step by calling an element operation tool, wherein the element operation tool performs the corresponding operation on the corresponding user interface (UI) element of the tested Android application program in accordance with the operation type and the configuration value of the operation object of the operation step. A tester can therefore conveniently configure test cases based on keywords; the keywords of the test cases are parsed and the relevant tools are called to operate the UI elements of the tested application program, so that the testing process is completed automatically and the testing is convenient. The disclosed Android application testing apparatus comprises: a test case parsing module for parsing the configuration value of each keyword step from the test case, wherein the key steps include a serial number, an operation type and an operation target, the configuration value of the operation target being UI element identifying information or text; a test execution module configured to execute the steps in sequence based on the configuration values and serial numbers parsed out by the parsing module, each step being executed by calling the element operation tool, which performs the corresponding operation on the corresponding UI element of the tested Android application based on the configured operation type and operation object of the step; a test storage module; and a test case generation module for generating the test case from the configuration values input for each key step.
Such a testing method and a testing tool have been disclosed in CN 102419732 A. The said method comprises the steps of: a recording step of recording operation events and generating Android-platform executable scripts; a script execution step in which a device object, after receiving a command to execute the script, executes the steps of the executable script; and a judgment step of judging, based on the results of the script execution, whether the automation use case has passed. The said automated testing tool comprises a script recording unit for recording operation events and generating Android-platform executable scripts; a script execution unit for executing the executable script on the device object after receiving the execution command; and a comparison and judgment unit for judging, based on the results of the script execution, whether the automation use case has passed.
Such a method has been disclosed in CN 102141960 A. The said method comprises the steps of: encoding calls to various kinds of Android testing procedures and internal shell commands into code; compiling the code so that a series of calls forms a random test sequence and corresponding test events are generated; writing the code with a reserved expansion interface and reserved storage space for further programs; compiling the code in a Linux build environment to form an executable file; and calling the executable file with the Android Monkey random test command so as to perform random testing of the various test procedures.
Such a system and method have been disclosed in US 20120198279 A1. The method comprises the steps of receiving, at a server, a request to perform a test instruction on one or more of a plurality of computing devices in communication with the server via a network; selecting a first computing device from the plurality of computing devices, the first computing device being capable of performing instructions written in a first one of the computer programming language instruction sets; transmitting the test instruction to the first computing device via the network; and receiving a response message from the first computing device, the response message comprising a result of an attempt to perform the test instruction at the first computing device. The testing system comprises a server in communication with a plurality of computing devices via a network, the server being configured to perform the procedure described above.
The application testing devices and methods described in the prior art are quite expensive, require a cumbersome procedure to be followed, and involve a lot of human intervention during the performance of the tests; some testing tools demand writing of the test scripts separately for each device and require a lot of time and effort.
Objective of the Invention
The objective of the invention is to provide a testing system and a testing method for the evaluation of software applications designed to run on computing devices that is reasonable in cost, less time consuming, convenient to perform and involves minimum human interference, by using real usage scenarios and analytics for automation testing.
Yet another objective of the present invention is to design a method for evaluation of said application by automatic generation of test scripts that does not involve any human intervention.
The present invention achieves the objective in that the testing device evaluates varied parameters and actions of the software application; performs test analytics; and automatically generates test scripts enabling test automation.
Summary of the Invention
The present invention relates to a method of automated testing of an application designed to run on a computing device, said method comprising the steps of: monitoring usage of said application on the device; transferring session details or scenarios; comparing said scenarios with a standard scenario; converting a prioritized scenario into a test automation script; enabling a user to run said automated test on the device; and reporting a detailed result of the test performed.
According to a preferred embodiment of the invention, the computing device may be a mobile device configured to run on a cellular data network.
According to one of the preferred embodiments of the invention, the data collected by the test development kit is injected into the application package of the mobile operating system.
According to another preferred embodiment of the invention, the injection of the data collected by the test development kit to the application package of the mobile operating system may be performed automatically or by the developers based on the request made by the users.
According to another preferred embodiment of the invention, the data is stitched together for determining the number of tests to be executed based on conditions which include, but are not limited to, the most common sessions and flows of the users, the most common elements across the builds, the most failures, critical flows, and test analytics.
According to yet another preferred embodiment of the invention, the said scenario engine analyzes flows by connecting to and fetching details from analytics such as, but not limited to, Google, Crashlytics and Firebase.
According to another preferred embodiment of the invention, the analysis is performed by the server depending upon the sessions generated based on the interaction with the users; the most common flows are filtered, and a comparison is made between the existing flows and the new flows.
According to yet another preferred embodiment of the invention, recommendations of the tests are provided to the users of various devices based on the server platform. The test recommendations are made based on features including, but not limited to, failures and success rate of tests across other parameters, at every level of Test Plan creation.
According to another preferred embodiment of the invention, the execution of the test on the collected data is based on the creation of the test suite for the device.
The present invention also relates to a system for the evaluation of an application designed to run on mobile devices, the said system comprising a test development kit for the collection and transfer of the details required to be analyzed; a scenario engine for receiving and determining the scenarios suitable for the application; an auto-script engine for converting the scenarios into automation scripts; and a platform for running the automated tests.
According to one of the preferred embodiments of the invention, the test development kit captures the data for evaluation, which includes, but is not limited to, screenshotting, session recording, profiling, flow engine, device details, network calls, CPU, applications installed, logs, hardware, layout details, screen elements, user actions, user input data and critical code paths.
According to a preferred embodiment of the invention, the test development kit captures workflow details including, but not limited to, layout details, screen elements, user actions, user input data and critical code paths.
According to another preferred embodiment of the invention, the test development kit records core system actions including, but not limited to, switching between applications, network calls and monitoring, installed applications and background processes.
According to yet another preferred embodiment of the invention, the test development kit records every action of the application with data points including, but not limited to, activity name, package name, text, index, content description and bounds.
According to another preferred embodiment of the invention, the test development kit captures the device details of the users, which include, but are not limited to, model name, model number, screen resolution, GPU, CPU, memory, processor type, cores, clock speed, internal memory, RAM, network type, battery capacity, manufacturer, chipset and operating system version.
According to another preferred embodiment of the invention, the test development kit measures the data for performance profiling, which includes, but is not limited to, the application's RAM utilization, data utilization, battery utilization, CPU load and utilization, threads, processes, core information and context switches.
According to yet another preferred embodiment of the invention, the test development kit records data for determining the performance of the application. The said data includes, but is not limited to, usage of memory of the application during the test execution such as App Dalvik, App PSS, App Native, App total, Dalvik limit, garbage collection, resources, stack, native objects and memory, static variables and functions, threads and objects, classes and objects, finalizers and unfinalized objects, busy monitor objects, and allocation and deallocation patterns.
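For illustration only, the following is a minimal sketch of how a few of the memory figures listed above (App Dalvik, App Native, App PSS, App total and the Dalvik limit) could be sampled during test execution. It assumes an Android runtime and uses standard android.os.Debug and Runtime calls; the surrounding class and method names are hypothetical and do not represent the disclosed kit itself.

```java
// Minimal sketch, assuming an Android runtime; class/method names are hypothetical.
import android.os.Debug;

public final class MemorySampler {

    /** Collects a one-off snapshot of the application's memory usage. */
    public static String snapshot() {
        Debug.MemoryInfo info = new Debug.MemoryInfo();
        Debug.getMemoryInfo(info);                       // fills the per-area PSS figures

        long dalvikUsedKb = (Runtime.getRuntime().totalMemory()
                - Runtime.getRuntime().freeMemory()) / 1024;
        long dalvikLimitKb = Runtime.getRuntime().maxMemory() / 1024;
        long nativeUsedKb = Debug.getNativeHeapAllocatedSize() / 1024;

        return "App Dalvik=" + dalvikUsedKb + "kB"
                + ", Dalvik limit=" + dalvikLimitKb + "kB"
                + ", App Native=" + nativeUsedKb + "kB"
                + ", App PSS(dalvik)=" + info.dalvikPss + "kB"
                + ", App PSS(native)=" + info.nativePss + "kB"
                + ", App total PSS=" + info.getTotalPss() + "kB";
    }
}
```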
According to yet another preferred embodiment of the invention, the test development kit records data related to GPU usage including, but not limited to, frames with issues, swap buffers, command issues, sync, draw, measure/layout, animation, input handling, Vsync delay and overdraws.
According to yet another preferred embodiment of the invention, the test development kit records data related to network usage which includes, not limited to, Data Transferred in, Data Transferred Out and Network type.
According to yet another preferred embodiment of the invention, the flow engine of the test development kit records the actions including, but not limited to, clicks, scrolls, long presses, check boxes, swipes, slides, activity lists, IDs which are unique and automatically generated by the Android compiling format, field types, layout details and indexes.
According to another preferred embodiment of the invention, the test development kit is installed in the live version of the application and records the live environment details including, but not limited to, network calls, network details, installed applications, location, background processes, battery conditions and sensors.
According to a preferred embodiment of the invention, the application package of a mobile operating system is modified so as to enable the server to get information for test analysis.
According to another preferred embodiment of the invention, the data captured by the scenario engine includes, but is not limited to, events, inputs (fields, search terms), behavior flows, clicks, dropouts, audience, devices, networks, screens, crashes and exceptions, user flows, activities, and test development kits.
According to yet another preferred embodiment of the invention, the steps for analyzing the captured data comprise: analysis of the environment, including, but not limited to, developer/system, testers and user devices; generation of several sessions based on user interaction; stored mapping of said sessions; filtering of the user flows and the devices; and comparison of existing flows with new flows.
According to another preferred embodiment of the invention, an auto-schedule playback will determine new builds from a connected build server, analyze them, and run an appropriate test.
Example
The execution of the test script is successful if said test script runs till the final logging module. In case the flow of the test script fails, the test script skips the rest of the execution and jumps to the final logging module.
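A minimal sketch of this behaviour is given below; the class, step and message names are hypothetical and only illustrate the guard-and-always-log structure described above, in which a failing step skips the remaining steps and control always reaches the final logging module.

```java
// Minimal sketch; all names are hypothetical, not the actual runner.
import java.util.List;

public class ScriptRunner {

    interface Step { void run() throws Exception; }

    static String execute(List<Step> steps) {
        String status = "SUCCESS";
        try {
            for (Step step : steps) {
                step.run();                       // a failing step throws ...
            }
        } catch (Exception e) {
            status = "FAILED: " + e.getMessage(); // ... and the remaining steps are skipped
        } finally {
            // final logging module: always reached, whether the flow passed or failed
            System.out.println("Final screen status: " + status);
        }
        return status;
    }

    public static void main(String[] args) {
        execute(List.of(
                () -> System.out.println("Screen1: add item to cart"),
                () -> { throw new Exception("Server Error on Screen2"); },
                () -> System.out.println("Screen2: place order")));
    }
}
```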
The outcome of the script as displayed to the customers is the final screen status, and the coverage report of the same is illustrated as:
| | Motorola1 | Samsung1 | Motorola2 | Xiaomi1 |
|---|---|---|---|---|
| Test1 | Screen1=Done; Screen2=Order placed | Screen1= - ; Screen2=Server Error | Screen1=Done; Screen2=Order Placed | Screen1=Done; Screen2=Order placed with few items removed |
| Test2 | Screen1=Item added to cart | Screen1=Item(s) added to cart | Screen1=Wrong screen reached | |
Detailed Description of the invention
The present invention relates to a method of automated testing of an application designed to run on a computing device, said method comprising the steps of: monitoring usage of said application on the device; transferring session details or scenarios; comparing said scenarios with a standard scenario; converting a prioritized scenario into a test automation script; enabling a user to run said automated test on the device; and reporting a detailed result of the test performed. The present invention also relates to a system for the evaluation of an application designed to run on mobile devices, the said system comprising a test development kit for the collection and transfer of the details required to be analyzed; a scenario engine for receiving and determining the scenarios suitable for the application; an auto-script engine for converting the scenarios into automation scripts; and a platform for running the automated tests.
The instant invention provides a system and method for assessment of an application designed to run on mobile devices. The said system comprises the following components; an illustrative sketch of these components follows the list:
1. Test Development Kit (TDK) which is a library that is bundled within a mobile application and monitors application usage and device context in real world applications and transfers session details to a Scenario Engine.
2. Scenario Engine (SE) that receives user workflows from multiple TDKs installed in user phones; gets the standard user workflows from internal and third party analytics and compares them to prioritize which scenarios are suitable to test the application.
3. Auto-Script Engine (AE) converts the prioritized scenarios into test automation scripts that can be run on real phones multiple times.
4. Playback and Reporting Platform (Platform) enables an application owner/tester to run the automated tests to identify issues and bugs in the mobile application. Tests can be run with extended coverage i.e. using other builds, using multiple devices, conditions, networks, carriers, interrupts and user data.
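For illustration only, the following sketch outlines how the four components and the session data passed between them could be represented. All class, field and method names below are hypothetical assumptions and are not the actual implementation disclosed here.

```java
// Purely illustrative sketch of the TDK -> Scenario Engine -> Auto-Script Engine
// -> Platform pipeline; every name is a hypothetical placeholder.
import java.util.List;

/** One user action recorded by the Test Development Kit inside the app. */
class RecordedAction {
    String activityName;        // screen on which the action happened
    String elementId;           // identifier of the UI element acted upon
    String actionType;          // "click", "scroll", "long-press", "input", ...
    String inputText;           // text entered by the user, if any
}

/** One usage session transferred from a TDK to the Scenario Engine. */
class Session {
    String deviceModel;
    String osVersion;
    String networkType;
    List<RecordedAction> actions;
}

/** Scenario Engine: compares incoming sessions with standard workflows
 *  (internal and third-party analytics) and prioritizes test scenarios. */
interface ScenarioEngine {
    List<Session> prioritize(List<Session> incoming, List<Session> standardWorkflows);
}

/** Auto-Script Engine: converts a prioritized scenario into an automation
 *  script for a given framework ("espresso", "appium", ...). */
interface AutoScriptEngine {
    String toScript(Session prioritizedScenario, String framework);
}

/** Playback and Reporting Platform: runs a generated script on a real device
 *  and reports the detailed result. */
interface PlaybackPlatform {
    String run(String script, String deviceId);   // e.g. Pass / Fail / Crash / Risk
}
```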
The test development kit monitors the sessions, further enabling the platform to replicate a similar workflow and environment when the prioritized test on the application is being initiated. The said workflows and environments are stored in the platform for test intelligence capability. Furthermore, the test development kit records all the flows that are being performed by the user or a tester for each application. Said flows may include, for example, a checkout performed by the user, which is recorded successfully. The user flows may be recorded from varied sources including analytics, the test development kit and third party SDKs. The said user flows are categorized and forwarded to the scenario engine.
The scenario engine connects to third party analytics, which include Google, Crashlytics and Firebase, and utilizes the test development kit analytics to derive the common workflows used by the users. The data recorded is sufficient for indicating potential success and failure for a workflow.
The AutoScript Engine completely automates the creation of the automation test scripts by converting the prioritized scenarios into test automation scripts without any human intervention. The AutoScript Engine can generate scripts for multiple frameworks such as Robotium, Appium, Espresso and Calabash, given a workflow group as input.
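As an illustration of the kind of script such an engine could emit for a prioritized "add to cart and place order" flow, a minimal Espresso sketch is given below. The activity and view identifiers (MainActivity, R.id.add_to_cart, R.id.checkout, R.id.order_status) and the expected text are hypothetical and would in practice come from the recorded workflow, not from this document.

```java
// Minimal sketch of a generated Espresso script; identifiers are hypothetical.
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static androidx.test.espresso.matcher.ViewMatchers.withText;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import org.junit.Rule;
import org.junit.Test;

public class GeneratedCheckoutTest {

    @Rule
    public ActivityScenarioRule<MainActivity> rule =
            new ActivityScenarioRule<>(MainActivity.class);

    @Test
    public void checkoutFlow() {
        onView(withId(R.id.add_to_cart)).perform(click());       // recorded click 1
        onView(withId(R.id.checkout)).perform(click());          // recorded click 2
        onView(withId(R.id.order_status))
                .check(matches(withText("Order placed")));       // assertion on end screen
    }
}
```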
The Playback and Report Platform enables an application user or a tester to run the automated tests to identify issues and bugs in the said application. Tests may be run with extended coverage using multiple devices, conditions, networks, carriers, interrupts and user data. The data will be received from TDK / Test and accordingly will be analyzed based on the filters and a test suite will be created. The selected test suites will be executed on the devices and the tests will be run accordingly based on user’s acceptance.
Mobile devices are configured to run without any restrictions or to replicate the user's flow conditions: network, interruptions, battery state, in-built apps and background processes. The automation tests may run on the following, as illustrated in the sketch after this list:
1. Conditions that may include low RAM/CPU/battery/disk space/resources, varying RAM/CPU/battery/disk space/resources and orientation changes.
2. Networks that may include 2G, 3G, 4G, Wi-Fi, CDMA and no network.
3. Carriers that may include various telecommunication operators such as Vodafone, Airtel and AT&T.
4. Interrupts that may include a phone call during the test, SMS interrupts, low battery alerts and notifications.
5. Data that may include various data combinations of usernames, email ids, countries, cities and search terms.
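The sketch below illustrates, under assumed names and values, how a single generated test could be fanned out across the coverage dimensions listed above; it is not the platform's actual scheduling code.

```java
// Purely illustrative sketch of a coverage matrix; devices, networks and data are assumed values.
import java.util.List;

public class CoverageMatrix {
    public static void main(String[] args) {
        List<String> devices    = List.of("Motorola1", "Samsung1", "Xiaomi1");
        List<String> networks   = List.of("2G", "3G", "4G", "Wi-Fi", "no-network");
        List<String> conditions = List.of("low-RAM", "low-battery", "orientation-change");
        List<String> interrupts = List.of("none", "incoming-call", "SMS");
        List<String> userData   = List.of("user1@example.com", "user2@example.com");

        int run = 0;
        // Every combination of the dimensions becomes one execution of the same
        // generated script on a real device.
        for (String d : devices)
            for (String n : networks)
                for (String c : conditions)
                    for (String i : interrupts)
                        for (String u : userData)
                            System.out.printf("run %d: %s | %s | %s | %s | %s%n",
                                    ++run, d, n, c, i, u);
    }
}
```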
Detailed test coverage reports are produced to showcase the issues in the mobile application once the tests are completed on the application. The playback module captures a screenshot of each click of the screen, and the same will be represented as a part of the detailed report.
Tests that have been executed on the computing devices will be provided a result that is either Pass, Fail, Crash, or Risk (based on performance) of the application.
Brief Description of the drawings
Figure 1 illustrates analytics driven testing workflow
Figure 2 illustrates storing user details in Appachhi database
Figure 3 depicts end-to-end back-end and platform integration
Figure 4 illustrates end screens and sessions
Figures 5 and 6 illustrate auto-scripting flow and switch flow
Figure 7 depicts classification of threshold based on sessions
Figure 8 illustrates analytics and analysis of the user sessions
Figure 9 illustrates user flow diagram
Figure 10 depicts CHHI score
Figure 11 illustrates screen coverage
Figure 12 illustrates test coverage
Figure 13 illustrates performance profiling
Detailed Description of the drawings
The drawings illustrate the system and the method for the evaluation of the application that runs on mobile devices. The method comprises the following steps:
1. Collection of the data required to be analyzed, performed by the test development kit;
2. Scrutinizing the number of tests to be executed on the data;
3. Generation of automation test scripts enabling test automation; and
4. Analyzing the result.
The data collected by the test development kit is injected into the application package of the mobile operating system, which is connected to the server for the processing of the data. A number of sessions are recorded from different sources such as the development environment, the user environment and the server platform. The recorded data is stitched together so as to be able to determine the number of tests to be executed. The test suite created is performed on the application and the result is analysed. The data is stored in the server, which analyzes the test and compares the existing test suite with the newly developed test suite. Test recommendations are provided to the users that are linked to the server platform.
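A minimal sketch of the comparison between the existing test suite and newly recorded flows is given below; the flow representation and all names are hypothetical and serve only to illustrate how flows not yet covered could be identified as candidates for recommendation.

```java
// Illustrative sketch only; a flow is modelled simply as the ordered screens it visits.
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class FlowComparison {

    static String signature(List<String> screens) {
        return String.join(">", screens);
    }

    /** Returns the recorded flows that are not already covered by the existing suite. */
    static List<List<String>> newFlows(List<List<String>> recorded,
                                       List<List<String>> existingSuite) {
        Set<String> known = existingSuite.stream()
                .map(FlowComparison::signature)
                .collect(Collectors.toSet());
        List<List<String>> candidates = new ArrayList<>();
        for (List<String> flow : recorded) {
            if (!known.contains(signature(flow))) {
                candidates.add(flow);          // not covered yet -> recommend a new test
            }
        }
        return candidates;
    }

    public static void main(String[] args) {
        List<List<String>> suite = List.of(List.of("Login", "Home", "Cart", "Checkout"));
        List<List<String>> recorded = List.of(
                List.of("Login", "Home", "Cart", "Checkout"),            // already covered
                List.of("Login", "Home", "Search", "Cart", "Checkout")); // new flow
        System.out.println(newFlows(recorded, suite));
    }
}
```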
Figure 1 describes the analytics driven testing workflow. Initially, the data (input values, location, events, background services and more) received from the user's phone and from analytics is processed by the scenario engine, which identifies the critical flows and converts them to scripts via the autoscript engine. The platform can be used as a medium by users to generate test plans and obtain test results.
According to Figure 2, the SDK on the user end captures information about the devices and monitors the performance parameters. The SDK also captures the user's actions, data and gestures, and pushes the same to the database.
As per Figure 3 of the instant invention, the information provided is fetched via the TDK, followed by capturing the details of the user's device or of the device from the test environment, to determine on which devices and in which environment the tests should run in the device laboratory that simulates the real user conditions, i.e. the details captured from the user's or tester's device. The same will be analyzed and the results will be published on the platform.
According to Figure 4 of the instant invention, the end points at screen 8 can be reached through any number of screens, and the same can have multiple sessions that end a specific user flow. These are identified, and the flow for each screen is analysed with an understanding of the number of sessions, so as to determine the end screen.
As per Figures 5 and 6, the automation creates a specific flow. Screens 1, 3, 5, 6 and 8 signify the flow of the user, with 8 being the assertion. The fallback mechanism applies when the script engine identifies screen 4 and adds the same to the flow between 3 and 5. The engine identifies the necessary screens and scripts to make a combination of workflows.
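The following sketch illustrates, with hypothetical names, the fallback insertion described for Figures 5 and 6: the recorded flow 1, 3, 5, 6, 8 is adjusted by inserting screen 4 between 3 and 5 when the engine detects it.

```java
// Illustrative sketch only; not the actual switch-flow implementation.
import java.util.ArrayList;
import java.util.List;

public class FallbackFlow {

    static List<Integer> insertFallback(List<Integer> flow, int fallbackScreen,
                                        int afterScreen) {
        List<Integer> adjusted = new ArrayList<>(flow);
        int idx = adjusted.indexOf(afterScreen);
        if (idx >= 0) {
            adjusted.add(idx + 1, fallbackScreen);   // slot the detected screen in
        }
        return adjusted;
    }

    public static void main(String[] args) {
        List<Integer> recorded = List.of(1, 3, 5, 6, 8);        // 8 = assertion screen
        System.out.println(insertFallback(recorded, 4, 3));     // [1, 3, 4, 5, 6, 8]
    }
}
```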
Figure 7 describes the creation and understanding of the number of sessions, so as to understand how the tests and screens would fail. This is generated to come up with test recommendations and to decide whether a screen passes or fails.
According to Figure 8, sessions that are recorded or fetched from analytics will be stored in the database as raw data to be processed for analysis to create the required test suite. A test suite is a set of tests that is created to accomplish a certain kind of testing, such as regression, sanity and smoke testing. If there are sessions that belong to an existing test suite, they will be added to that test suite.
Figure 9 is an example of what the user would do in an e-commerce application designed to run on a mobile device. The user enters the necessary data to log in and navigates onward to a successful payment. This is the ground rule for any test to be created.
Figure 10 describes the mechanism that shows the health of the application based on analysis and considering various performance attributes.
Figure 11 depicts the playback module capturing a screenshot of each click of the screen, which will be represented as part of the report.
Figure 12 describes that the tests that have been executed on devices will be provided a result that is either Pass, Fail, Crash, or Risk (based on performance) of the application. The same has been represented above.
Figure 13 illustrates the captured performance details. Applications with the TDK injected will be monitored for performance parameters such as memory, CPU and network, the details of which are used to analyse the defects of the application.
Claims
We Claim:
1. A method of automated testing of an application designed to run on a computing device, said method comprising the steps of:
i. monitoring usage of said application on the device;
ii. transferring session details or scenarios;
iii. comparing said scenarios with a standard scenario;
iv. converting a prioritized scenario into a test automation script;
v. enabling a user to run said automated test on the device; and
vi. reporting a detailed result of the test performed;
characterized in that the said test scripts are generated automatically thereby enabling test automation.
2. The method as claimed in Claim 1, wherein said computing device is a mobile computing device.
3. The method as claimed in Claim 1, wherein said monitoring of the application usage is performed by an automated testing kit facilitated by a web based user interface.
4. The method as claimed in Claim 1, wherein comparison of scenarios is executed by a software tool capable of creating an interactive priority simulation.
5. The method as claimed in Claim 1, wherein a software tool interacts with the user interface and converts the prioritized scenario into the test automation scripts.
6. The method as claimed in Claim 1, wherein a platform enables a user to run the automated tests for identifying issues and bugs in the computing device.
7. A system of automated testing of an application, said system comprising:
a memory for saving a database;
a server that runs on an operating system; and
a processor for processing the database;
wherein said system further comprises:
i. a test development kit for the collection and transfer of the details required to be analyzed;
ii. a scenario engine for receiving and determining the scenarios suitable for the application;
iii. an autoscript engine for converting the scenarios into automation scripts; and
iv. a playback and reporting platform for running the automated tests;
wherein said system is configured to generate the test scripts automatically for test automation.
8. The system as claimed in Claim 7, wherein the test development kit enables the platform to replicate a workflow and environment during initiation of a test, thereby leading to storage of said workflow and environment for test intelligence capability.
9. The system as claimed in Claim 7, wherein the details of data from said application captured by the test development kit comprise:
i. Workflow details;
ii. Core system actions;
iii. Device details;
iv. Performance profiling;
v. Actions;
vi. Environment details.
10. The system as claimed in Claim 7, wherein the scenario engine compares the scenarios received from the test development kit and third party analytics for prioritizing a scenario applicable for the test.
11. The system as claimed in Claim 7, wherein the functions effectuated by the scenario engine comprise:
i. capturing of data from said application;
ii. analysing said data;
iii. recommending the test to be performed.
12. The system as claimed in Claim 7 wherein the autoscript engine is capable of generating automated test scripts for multiple frameworks.
13. The system as claimed in Claim 7, wherein the varied aspects for performance of automated tests comprise:
i. conditions;
ii. networks;
iii. carriers;
iv. interrupts; and
v. data.