
Scenario Generation Framework For Software Product Testing And Automation

Abstract: The present invention provides a computing system for testing a software module under test (102). The computer system includes a test scenario generation module (111) and a test scenario execution module (112). The test scenario generation module (111) reads interface messages and a test scenario script. The test scenario generation module (111) generates a database (115) indicative of test scenarios and executable tasks. The test scenario execution module (112) executes aforesaid executable tasks in a user-defined sequence to generate a plurality of test events and provides the test events to the software module under test (102). Ref. Fig. 1


Patent Information

Application #
Filing Date
28 September 2018
Publication Number
14/2020
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@krishnaandsaurastri.com
Parent Application

Applicants

BHARAT ELECTRONICS LIMITED
Corporate Office, Outer Ring Road, Nagavara, Bangalore

Inventors

1. Bharath Kumar
BSTC/Central D&E, Bharat Electronics Limited, Jalahalli PO, Bangalore, Karnataka, India, Pin Code-560 013.
2. Anil Kumar
BSTC/Central D&E, Bharat Electronics Limited, Jalahalli PO, Bangalore, 560 013.
3. Jayanthi
BSTC/Central D&E, Bharat Electronics Limited, Jalahalli PO, Bangalore, 560 013.

Specification

FIELD OF INVENTION
The present invention relates generally to electronic devices and particularly to testing of software modules.
BACKGROUND
Software testing is an important phase in the development of software products. A typical product lifecycle includes software development, support, and maintenance. Testing of software applications requires a complex real-time testing environment, which in turn requires interaction with external systems that may or may not be available at the time of testing. Simulators are used to develop the conditions for testing the applications; these simulators create the signals and environment that would otherwise be created by the external systems.
Problems are reported for analysis and resolution during the product support and maintenance phase. Re-creating a problem scenario requires simulators, event generators, and tools, which involves developing the test environment along with the expected field conditions, administered with manual intervention. Problem re-creation requires analysis of the product log data when the symptoms cannot be directly inferred, and analysis of a reported problem may require visiting the product deployment site and conducting the analysis in the user's environment.
Once the problem scenario is analyzed, the test environment needs to be updated for the specific requirements, and the specific applications need to be debugged to resolve the problem. Creating the test environment demands substantial effort and cost in the case of large systems, as it involves updating the external interface simulators and involving the target systems.
Therefore, there is a need for an improved framework for software testing.

SUMMARY
[0006] This summary is provided to introduce concepts related to framework for software testing. This summary is neither intended to identify essential features of the present disclosure nor is it intended for use in determining or limiting the scope of the present disclosure.
[0007] In an embodiment of the present invention, a computing system is provided for testing a software module under test. The computer system includes a test scenario generation module and a test scenario execution module. The test scenario generation module is configured to read a plurality of interface messages to be communicated with the software module under test. The test scenario generation module further reads a test scenario script input by a user using predefined keywords. The test scenario generation module generates a database based on the interface messages and the test scenario script. The database is indicative of a plurality of test scenarios and a plurality of executable tasks. The test scenario generation module stores the database in the memory. The test scenario execution module reads the test scenarios and the executable tasks from the database stored in the memory. The test scenario execution module executes aforesaid executable tasks in a user-defined sequence to generate a plurality of test events and provides the test events to the software module under test.
[0008] In another embodiment of the present invention, a method performed by a computer system for testing a software module under test is provided. The method includes reading a plurality of interface messages by a test scenario generation module. The messages are to be communicated with the software module under test. Further, the test scenario generation module reads a test scenario script input by a user using predefined keywords. The test scenario generation module generates a database based on the interface messages and the test scenario script. The database indicates a plurality of test scenarios and a plurality of executable tasks. The database is stored in a memory. A test scenario execution module reads the executable tasks from the database stored in the memory and executes the executable tasks in a user-defined sequence to generate a plurality of test events. The test scenario execution module provides the test events to the software module under test.
[0009] In an exemplary embodiment, the test scenario generation module includes an interface message definition module, an interface message processing module, a test scenario definition module, and a test scenario processing module. The interface message definition module reads the interface messages. The interface messages include C/C++ structures. The interface message processing module reads the structures, processes the structures, generates test scenario data, and stores the test scenario data in the database within the memory. The test scenario definition module receives the user-defined test scenario script. The test scenario script includes a plurality of test cases. The test scenario processing module generates the test scenarios and the executable tasks based on the test cases.
[0010] In another exemplary embodiment, the test scenario execution module includes a task schedule module, a task monitoring module, and a data logging module. The task schedule module generates an execution schedule indicative of a sequence of execution of the executable tasks. The task monitoring module monitors the execution of the executable tasks. The data logging module stores execution logs of the executed tasks in the memory.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0011] The detailed description is described with reference to the accompanying figures. The same numbers are used throughout the drawings to reference like features and modules.
[0012] Fig. 1 illustrates a schematic block diagram of a testing system in accordance with an embodiment of the present invention.
[0013] Fig. 2 illustrates a schematic block diagram of a test scenario generation module in accordance with an embodiment of the present invention.

[0014] Fig. 3 illustrates a schematic block diagram of a test scenario execution module in accordance with an embodiment of the present invention.
[0015] Fig. 4 depicts a flowchart illustrating a method of testing software in accordance with an embodiment of the present invention.
[0016] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present disclosure.
[0017] Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0018] The various embodiments of the present invention provide a testing system.
[0019] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure may be practiced without these details.
[0020] One skilled in the art will recognize that embodiments of the present disclosure, some of which are described below, may be incorporated into a number of systems.
[0021] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present disclosure and are meant to avoid obscuring of the present disclosure.

[0022] Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
[0023] References in the present disclosure to “one embodiment” or “an embodiment” mean that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
[0024] The present invention provides a testing system.
[0025] Fig. 1 illustrates a testing system (101) in accordance with an embodiment of the present invention. The testing system (101) includes a test scenario generation module (111), a test scenario execution module (112), interface messages (113), a test scenario script (114), a database (115) stored in a memory (120), and a processor (130). The testing system (101) is in communication with a software module under test (102). The aforesaid components of the testing system (101) may be mutually interconnected. However, the interconnections are not depicted in the figure for simplicity.
[0026] The testing system (101) creates a testing environment for testing the software module under test (102) and provides testing automation for testing software modules and applications.
[0027] The test scenario generation module (111) reads the interface messages (113) to be exchanged with the software module under test (102). The interface messages (113) include C/C++ structures readable by a computer. The test scenario generation module (111) also reads the test scenario script (114) provided by a user. The test scenario script (114) may be provided manually by the user using multiple keywords. The keywords may be user-defined keywords or predefined keywords.

[0028] The test scenario generation module (111) generates the database (115) and stores the database in the memory (120). The database (115) includes information about multiple test scenarios and also includes multiple tasks. The tasks may be executed in parallel by the test scenario execution module (112).
[0029] The test scenario execution module (112) retrieves the tasks from the database (115) as per a user-defined sequence to generate multiple test events to be communicated with the software module under test (102) for creating an intended test scenario.
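The retrieve-and-execute flow described above can be sketched as follows. The patent does not specify an implementation language or API, so this is a minimal illustration in Python with hypothetical names (`run_scenario` and the task callables): tasks grouped in one step of the user-defined sequence run in parallel, and the steps run one after another, each task yielding a test event.

```python
# Hedged sketch of the test scenario execution module's core loop.
# All names here are illustrative, not from the patent.
from concurrent.futures import ThreadPoolExecutor

def run_scenario(tasks, sequence):
    """Execute tasks in the user-defined sequence; tasks listed together
    in one step run in parallel, and steps run one after another."""
    events = []
    with ThreadPoolExecutor() as pool:
        for step in sequence:                      # steps execute in order
            parallel = [tasks[name] for name in step]
            for result in pool.map(lambda t: t(), parallel):
                events.append(result)              # each task yields a test event
    return events

# Two toy tasks standing in for executable tasks read from the database.
tasks = {
    "send_track": lambda: ("TxUdp", "PRIMARY_TRACK"),
    "delay":      lambda: ("Delay", 100),
}
events = run_scenario(tasks, sequence=[["send_track"], ["delay"]])
# events == [("TxUdp", "PRIMARY_TRACK"), ("Delay", 100)]
```

The generated events would then be communicated to the software module under test to create the intended scenario.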
[0030] Fig. 2 illustrates a block diagram of the test scenario generation module (111). The test scenario generation module (111) includes an interface message definition module (201), an interface message processing module (202), a test scenario definition module (203), and a test scenario processing module (204).
[0031] The interface message definition module (201) reads the interface messages (113), which contain all the interface messages to be communicated with the software module under test (102). The interface messages are represented using C/C++ structures, along with additional information for each member field of the structure, such as a default value and a range of values with a step size. The syntax used to define a structure in the C/C++ language is used here to define an interface message to be exchanged between the CSCIs under test:
struct message-name {
    Data-Type Data-Member;                       Default-Value
    Data-Type BitFieldDataMember : Bit-value;    Default-Value
    Data-Type Data-Member;                       Default-Value Min Max Step-size
};
[0032] The keywords, including struct and the basic data types, are represented as “tag”. Member field information, including default values, ranges of values, and bit-wise fields, is represented as “data”. The following data types are supported in the present invention to define an interface message: unsigned char/UINT_8/BYTE, char/signed char, unsigned short/UINT_16/HWORD, short/signed short, unsigned int/UINT_32/WORD, int/signed int, float, and double.
[0033] An example of defining interface message is shown below:
struct PRIMARY_TRACK {
    HWORD msg_id;     8301
    HWORD trkno:8;    111 1 200 1
    HWORD class:8;    4 5 10 1
    float rng;        0.0
    HWORD type:8;     0 0 10 1
    HWORD iff:8;      1 1 5 1
};
[0034] The interface message processing module (202) processes all the structures defined in the interface messages (113) and stores the following information about each interface message in the database, including the total number of interface messages for the current test scenario, for further processing such as preparation of the test scenario data by the test scenario processing module (204): name; number of bytes to be allocated; number of data members; and, for each data member, its name, data type, default value, range of values (minimum and maximum limits) with step size, byte offset from the beginning of the structure, and number of bits allocated for bit-wise data members.
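The per-message metadata listed above can be illustrated with a small sketch. The patent does not specify an implementation language or record layout, so the function and field names below are hypothetical; for simplicity the sketch assumes a packed layout (no alignment padding) and omits bit-wise members.

```python
# Hedged sketch of the metadata record the interface message processing
# module might store per message. Names and packed layout are assumptions.
SIZES = {"BYTE": 1, "HWORD": 2, "WORD": 4, "float": 4, "double": 8}

def describe_message(name, members):
    """members: list of (field_name, data_type) pairs. Returns the message
    name, its total size in bytes, the member count, and each member's
    byte offset from the beginning of the structure."""
    fields, offset = [], 0
    for field, dtype in members:
        fields.append({"name": field, "type": dtype, "offset": offset})
        offset += SIZES[dtype]                 # packed: no padding assumed
    return {"name": name, "bytes": offset, "members": len(fields),
            "fields": fields}

meta = describe_message("PRIMARY_TRACK",
                        [("msg_id", "HWORD"), ("rng", "float")])
# meta["bytes"] == 6; meta["fields"][1]["offset"] == 2
```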
[0035] The test scenario definition module (203) allows the user to define test scenarios using pre-defined rules. A test scenario is a group of test cases to be executed in parallel; in turn, each test case is a set of instructions to be executed in sequence. An instruction is defined using a set of pre-defined keywords along with appropriate arguments, and comprises a command to be executed at a preset time together with a combination of pre-defined commands and arguments.
[0036] The following syntax can be used to define an instruction:

Instruction-No. Command Argument-List
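The instruction syntax above splits naturally into its three parts. As a hedged illustration (the parser name and returned field names are hypothetical, not from the patent):

```python
# Sketch: split one instruction line of the form
# "Instruction-No. Command Argument-List" into its parts.
def parse_instruction(line):
    parts = line.split()
    return {"no": int(parts[0]),      # instruction number
            "command": parts[1],      # pre-defined command keyword
            "args": parts[2:]}        # remaining tokens are arguments

instr = parse_instruction("1 TxUdp TimeSynch 192.168.3.10 0x5010")
# instr["command"] == "TxUdp"; instr["args"][1] == "192.168.3.10"
```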
[0037] The following is a table of supported commands along with corresponding arguments:
Command to be Executed | Syntax
Send an interface message to an external interface using UDP protocol | TxUdp (e.g. TxUdp TimeSynch 192.168.3.10 0x5010)
Delay between 2 | Delay
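The TxUdp command sends an interface message as a UDP datagram. The sketch below is a hedged illustration of that step; the `tx_udp` name and the little-endian packed wire format are assumptions, and a loopback receiver stands in for the external interface (192.168.3.10) of the table's example.

```python
# Hedged sketch of carrying out a TxUdp command: pack an interface
# message and send it as one UDP datagram. Names and wire format are
# illustrative assumptions, not specified by the patent.
import socket
import struct

def tx_udp(payload: bytes, ip: str, port: int) -> None:
    """Send one interface message as a fire-and-forget UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (ip, port))

# Pack a minimal two-field message (HWORD msg_id, HWORD track number),
# little-endian and packed -- an assumed layout for illustration.
payload = struct.pack("<HH", 8301, 111)

# Loopback demonstration instead of a live external interface:
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                      # OS picks a free port
rx.settimeout(5)
tx_udp(payload, "127.0.0.1", rx.getsockname()[1])
received, _ = rx.recvfrom(1024)
rx.close()
# struct.unpack("<HH", received) == (8301, 111)
```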

Documents

Application Documents

# Name Date
1 201841036739-FORM 13 [21-02-2025(online)].pdf 2025-02-21
2 201841036739-STATEMENT OF UNDERTAKING (FORM 3) [28-09-2018(online)].pdf 2018-09-28
3 201841036739-POA [21-02-2025(online)].pdf 2025-02-21
4 201841036739-FORM 1 [28-09-2018(online)].pdf 2018-09-28
5 201841036739-RELEVANT DOCUMENTS [21-02-2025(online)].pdf 2025-02-21
6 201841036739-FIGURE OF ABSTRACT [28-09-2018(online)].pdf 2018-09-28
7 201841036739-DRAWINGS [28-09-2018(online)].pdf 2018-09-28
8 201841036739-ABSTRACT [22-07-2022(online)].pdf 2022-07-22
9 201841036739-DECLARATION OF INVENTORSHIP (FORM 5) [28-09-2018(online)].pdf 2018-09-28
10 201841036739-CLAIMS [22-07-2022(online)].pdf 2022-07-22
11 201841036739-COMPLETE SPECIFICATION [28-09-2018(online)].pdf 2018-09-28
12 201841036739-COMPLETE SPECIFICATION [22-07-2022(online)].pdf 2022-07-22
13 201841036739-FORM-26 [27-12-2018(online)].pdf 2018-12-27
14 201841036739-DRAWING [22-07-2022(online)].pdf 2022-07-22
15 Correspondence by Agent_Power of Attorney_07-01-2019.pdf 2019-01-07
16 201841036739-FER_SER_REPLY [22-07-2022(online)].pdf 2022-07-22
17 201841036739-Proof of Right (MANDATORY) [20-02-2019(online)].pdf 2019-02-20
18 201841036739-OTHERS [22-07-2022(online)].pdf 2022-07-22
19 201841036739-FER.pdf 2022-01-25
20 Correspondence by Agent_Form1_25-02-2019.pdf 2019-02-25
21 201841036739-FORM 18 [10-02-2021(online)].pdf 2021-02-10
22 201841036739-US(14)-HearingNotice-(HearingDate-03-10-2025).pdf 2025-09-08
23 201841036739-Correspondence to notify the Controller [29-09-2025(online)].pdf 2025-09-29
24 201841036739-Written submissions and relevant documents [17-10-2025(online)].pdf 2025-10-17

Search Strategy

1 SearchHistory(90)E_21-01-2022.pdf
2 searchamendAE_27-03-2023.pdf