
A Test System And Method For Validating Functionalities Of A Software Defined System

Abstract: A test system (102) that efficiently tests functionalities of a containerized application (106A) is provided. The test system (102) includes an edge device (104) including the containerized application (106A), and a scenario generation system (124) that generates test cases required for testing the containerized application (106A). Each of the test cases includes common test steps that are present in all of the test cases, unique test steps that are unique to a respective test case, and corresponding metadata tags. A cloud server (120) generates a first command when all test steps in all the test cases are to be executed, and alternatively generates a second command when only selected test steps in each of the test cases are to be executed. A scenario execution system (128) tests the functionalities of the containerized application (106A) based on the first command or the second command received from the cloud server (120). FIG. 1


Patent Information

Filing Date: 24 September 2024
Publication Number: 40/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

TATA ELXSI LIMITED
ITPB Road, Whitefield, Bangalore – 560048, India.

Inventors

1. JIHAS KHAN
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India.
2. ABHIRAM REGHU
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India
3. CHANDNI SAPNA VIJAY
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India
4. AAMIR SOHAIL
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India
5. ADITHYA BALACHANDRAN
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India
6. MALAVIKA VASUDEVAN
TATA ELXSI LIMITED, ITPB Road, Whitefield, Bangalore – 560048, India

Specification

Description

RELATED ART

[0001] Embodiments of the present specification relate generally to a test system, and more particularly to an automotive test system that tests and validates functionalities of the applications that are to be deployed in main electronic control units (ECUs) of real-world software defined vehicles.
[0002] A software defined vehicle manages vehicle operations, performs associated functionalities, and enables new features primarily through software. The software defined vehicle is a result of transformation in the automotive industry from a traditional automotive setup that relies mainly on hardware to a software-centric electronic device on wheels. The software defined vehicle offers a variety of benefits over a traditional vehicle. For example, the software defined vehicle allows original equipment manufacturers (OEMs) to offer new features, improve vehicle performance, and enhance vehicle safety through software updates without vehicle owners needing to visit service centers to avail these benefits.
[0003] Generally, the software defined vehicle includes applications that execute safety critical functionalities in the vehicle. Examples of such applications include a lane departure warning application, a lane keep assist application, an intersection collision warning application, a vehicle dynamics controller application, a powertrain management application, an anti-lock braking application, and a traction control application. These applications residing in the software defined vehicle provide comfort and convenience to the driver and passengers of the vehicle and also ensure safe navigation of the vehicle, safety of the driver and passengers, and safety of the people and objects in the surroundings of the vehicle. Hence, these applications need to be thoroughly tested and validated for associated functionalities before deploying them in real-world vehicles.
[0003] Testing and validating the software defined vehicle and related applications, however, are complex procedures that require new testing technologies. This is because a traditional vehicle typically includes several ECUs, each of which executes only a specific vehicle functionality. In contrast, the software defined vehicle consolidates these functions into a small number of ECUs, for example, 3 to 4 ECUs, with all high-performance computing occurring in one particular main ECU. Usually, the main ECU includes a complicated architecture with a hypervisor and multiple operating systems running on top of the hypervisor to handle several applications running in the main ECU in parallel. Hence, conventional test systems and techniques cannot be used to effectively test the main ECU that runs several applications in parallel.
[0005] In addition, proper testing and validation of the software defined vehicle and associated applications require execution of at least a few thousand to millions of complex test cases. Hence, the related testing procedures are highly complex. In addition, certain currently available test systems require an actual vehicle in place for testing functionalities of software defined vehicle (SDV) applications. In such systems, developers need to completely develop the SDV applications to be tested, and deploy them in the actual vehicle. However, if any issue is found with the SDV applications while testing them using the actual vehicle, a huge overhead of effort, cost, and time is needed for fixing the issue.
[0006] In addition, it is difficult to generate all test scenarios required for testing the SDV applications using the actual vehicle. With a skilled driver driving the actual vehicle in predefined test tracks, the existing test systems may be able to generate only a limited number of test scenarios. Hence, it may be preferable to validate functionalities of the SDV applications early during associated product development lifecycles without requiring the actual vehicle in place for performing testing and validation studies.
[0007] Accordingly, there remains a need for an improved test system that effectively and accurately tests functionalities of the SDV applications early during their product development lifecycles, even in the absence of the target hardware device in which the tested SDV applications are ultimately going to be deployed.

BRIEF DESCRIPTION

[0008] It is an objective of the present disclosure to provide a test system. The test system includes an edge device including a containerized application, one or more of whose associated functionalities are to be tested by the test system. Further, the test system includes a scenario generation system that generates a plurality of test cases required for testing the functionalities of the containerized application. Each of the generated test cases includes one or more common test steps that are present in all of the generated test cases, one or more unique test steps that are unique to a respective test case selected from the plurality of test cases, and one or more corresponding metadata tags. Furthermore, the test system includes a cloud server that generates a first command when all test steps in all the generated test cases are to be executed, and alternatively generates a second command different from the first command when only one or more selected test steps in each of the generated test cases are to be executed.
[0009] In addition, the test system includes a scenario execution system that is communicatively coupled to one or more of the edge device, the cloud server, and the scenario generation system. The scenario execution system tests the functionalities of the containerized application by executing all test steps in all the generated test cases upon receiving the first command from the cloud server. The scenario execution system tests the functionalities of the containerized application by identifying the one or more selected test steps to be executed in each of the generated test cases based on one or more of the corresponding metadata tags associated with the second command and by executing only the identified test steps. The cloud server deploys the containerized application that is successfully tested to a software defined system. The software defined system corresponds to one or more of a software defined vehicle, a software defined network system, a mobile phone, a laptop, and a set-top-box.
[0010] The edge device corresponds to one of an actual main electronic control unit and a virtual main electronic control unit of a software defined vehicle. The edge device includes a set of components including one or more of a system-on-chip, a hypervisor, one or more operating systems, and a plurality of containers. The containerized application corresponds to one of a lane departure warning application, a lane keep assist application, an intersection collision warning application, a vehicle dynamics controller application, a powertrain management application, an anti-lock braking application, a traction control application, a connected vehicle application, a green light optimal speed advisory application, an advanced driver assistance system application, and an electric vehicle related application to be deployed in the software defined vehicle.
[0011] The test system is configured to one or more of test one or more functionalities of a simulated model of the containerized application using the edge device. Further, the test system tests the one or more functionalities of the containerized application of a production grade using one or more virtualized operating systems of the edge device. Furthermore, the test system tests the one or more functionalities of the production grade containerized application and one or more production grade operating systems using a virtualized hypervisor of the edge device. In addition, the test system tests the one or more functionalities of the production grade containerized application, the one or more production grade operating systems, and a hypervisor including binary files of the edge device using a virtualized system-on-chip of the edge device.
[0012] It is another objective of the present disclosure to provide a method for testing functionalities of a containerized application using a test system. The method includes generating a plurality of test cases required for testing the functionalities of the containerized application using a scenario generation system. Each of the generated test cases includes one or more common test steps that are present in all of the generated test cases, one or more unique test steps that are unique to a respective test case selected from the plurality of test cases, and one or more corresponding metadata tags. Further, the method includes generating one of a first command when all test steps in all the generated test cases are to be executed, and alternatively a second command when only one or more selected test steps in each of the generated test cases are to be executed using a cloud server. The first command is different from the second command. Furthermore, the method includes executing all test steps in all the generated test cases upon receiving the first command from the cloud server for testing the functionalities of the containerized application.
[0013] In addition, the method includes identifying the one or more selected test steps to be executed in each of the generated test cases based on one or more of the corresponding metadata tags associated with the second command and executing only the identified test steps for testing the functionalities of the containerized application. Generating the plurality of test cases required for testing the functionalities of the containerized application includes generating a test scenario including a simulated image of a vehicle using the scenario generation system. Further, the method includes transmitting the simulated image from the scenario generation system to a container in which the containerized application is yet to be deployed using a data extraction system. Furthermore, the method includes determining if the container has successfully received the simulated image from the data extraction system, an overall time taken for the container to receive the simulated image based on a time at which the simulated image is generated by the scenario generation system and a time at which the simulated image is received by the container, and a memory usage of the edge device while receiving the simulated image from the data extraction system.
[0014] In addition, the method includes performing a first level validation and successfully validating that all components of the edge device are functioning properly when the container is determined to have successfully received the simulated image from the data extraction system, the determined overall time taken is within a particular time limit, and the determined memory usage is within a designated threshold. The components of the edge device include one or more of a system-on-chip, a hypervisor, one or more operating systems, and a plurality of containers. Moreover, the method includes generating a warning message post successfully performing the first level validation using the container, and transmitting the warning message from the container to another container via a communications link. Additionally, the method includes identifying if the another container has successfully received the warning message from the container, and performing a second level validation and successfully validating that all the components of the edge device are functioning properly when the another container is identified to have successfully received the warning message from the container.
[0015] Generating the plurality of test cases required for testing the functionalities of the containerized application includes receiving an input prompt including a request to generate the plurality of test cases from the test system by the scenario generation system. The scenario generation system corresponds to a generative artificial intelligence based system. Further, the method includes retrieving domain specific information required for generating the plurality of test cases from one or more automotive databases. Furthermore, the method includes automatically generating the plurality of test cases required for testing functionalities of the containerized application based on the input prompt received from the test system and the domain specific information retrieved from the one or more automotive databases. In addition, the method includes automatically generating a scenario generation file corresponding to each of the plurality of generated test cases. The scenario generation file related to each of the plurality of test cases includes information required for recreating a test scenario defined by a respective test case in a simulated environment.
[0016] Testing the functionalities of the containerized application includes receiving a latest version of the containerized application to be tested from the cloud server and deploying the latest version in the container post successfully testing the functionalities of the components of the edge device. Further, the method includes receiving a test case selected from the plurality of test cases and the scenario generation file corresponding to the selected test case from the scenario generation system by a scenario execution system in the test system. The selected test case defines a test scenario in which a functionality of the containerized application has to be tested. Furthermore, the method includes executing all test steps in the selected test case upon receiving the first command from the cloud server, alternatively executing only the one or more selected test steps in the selected test case upon receiving the second command from the cloud server. Moreover, the method includes recreating the test scenario defined by the selected test case in a simulated environment using the scenario generation file corresponding to the selected test case.
[0017] Recreating the test scenario defined by the selected test case in the simulated environment includes generating a simulated image of a vehicle deviating towards a specific lane by the scenario execution system. Further, the method includes extracting the simulated image, generated by the scenario execution system, using a data extraction system, and providing the simulated image as an input to one or more of a ground truth generation system and the containerized application corresponding to a lane departure warning application to be tested. Furthermore, the method includes identifying if the vehicle in the simulated image deviates towards the specific lane using the ground truth generation system based on a location coordinate of the specific lane and a location coordinate of the vehicle, and generating a ground truth log file that specifies if the vehicle has deviated towards the specific lane. Moreover, the method includes processing the simulated image and identifying if the vehicle in the simulated image deviates towards the specific lane using the lane departure warning application.
[0018] In addition, the method includes generating and transmitting a lane deviation message from the lane departure warning application to a human-machine interface application deployed in another container when the lane departure warning application identifies that the vehicle in the simulated image deviates towards the specific lane. The method further includes generating a lane departure visual warning by the human-machine interface application upon receiving the lane departure message from the lane departure warning application, and further displaying the generated lane departure visual warning in a display unit of a human-machine interface. Further, the method includes capturing an image of the lane departure visual warning displayed on the display unit of the human-machine interface by an imaging system, and processing the captured image using the test system and identifying a specific type of lane deviation of the vehicle. The specific type of lane deviation corresponds to one of a right lane deviation, a left lane deviation and no lane deviation. Furthermore, the method includes generating an output log file using the test system based on the specific type of lane deviation identified from processing of the captured image. Moreover, the method includes successfully verifying a lane departure warning generation functionality of the lane departure warning application when information in the generated ground truth log file matches with information in the generated output log file.
[0019] Recreating the test scenario defined by the selected test case in the simulated environment includes simulating battery characteristics information including one or more of a simulated voltage, current, and temperature of a battery using the scenario execution system. Further, the method includes extracting the simulated battery characteristics information from the scenario execution system using a data extraction system, and providing the simulated battery characteristics information as an input to both a ground truth generation system and the containerized application that corresponds to a battery state-of-charge (SOC) determination application to be tested. Furthermore, the method includes determining ground truth information including a state-of-charge of the battery based on the received simulated battery characteristics information using a digital twin framework residing in the cloud server. Moreover, the method includes determining output information including the state-of-charge of the battery based on the received simulated battery characteristics information using the battery SOC determination application. In addition, the method includes successfully verifying a functionality of the battery SOC determination application when the ground truth information including the state-of-charge of the battery determined using the digital twin framework matches with the output information including the state-of-charge of the battery determined using the battery SOC determination application.
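By way of a non-limiting illustration only, the following minimal Python sketch indicates how such a comparison of the state-of-charge values might be structured. The coulomb-counting reference model, the tolerance value, and the function names used below are assumptions made for illustration and are not part of the digital twin framework described above.

# Hypothetical sketch: compare a reference SOC estimate against the application output.
def reference_soc(initial_soc, current_samples_a, dt_s, capacity_ah):
    """Simple coulomb-counting reference (an assumed stand-in for the digital twin)."""
    charge_used_ah = sum(i * dt_s for i in current_samples_a) / 3600.0
    return initial_soc - charge_used_ah / capacity_ah

def verify_soc(app_soc, ground_truth_soc, tolerance=0.01):
    # Pass when the application output matches the reference within an assumed tolerance.
    return abs(app_soc - ground_truth_soc) <= tolerance

ground_truth = reference_soc(0.80, current_samples_a=[5.0] * 60, dt_s=1.0, capacity_ah=50.0)
print("SOC functionality verified:", verify_soc(app_soc=0.7983, ground_truth_soc=ground_truth))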
[0020] The method further includes deploying the latest version of the containerized application, whose associated functionalities are successfully tested by the test system, from the cloud server to a software defined system. The software defined system corresponds to one or more of a software defined vehicle, a software defined network system, a mobile phone, a laptop, and a set-top-box. Deploying the latest version of the containerized application from the cloud server to the software defined system includes receiving an over-the-air update including the latest version of the containerized application successfully tested from the cloud server by the software defined system. The software defined system includes a plurality of associated containers including a plurality of containerized applications. Further, the method includes updating only a previous version of the containerized application deployed in a particular container of the software defined system to the latest version based on the received over-the-air update without updating other containerized applications selected from the containerized applications deployed in other containers selected from the containers.
[0021] Transmitting the latest version of the containerized application from the cloud server to the software defined system includes dividing a memory unit of the software defined system into an inactive memory bank and an active memory bank. The inactive memory bank includes no information related to the containerized applications of the software defined system initially. The active memory bank includes one or more blocks related to all of the containerized applications of the software defined system initially. Further, the method includes receiving blocks related to the latest version of the containerized application transmitted from the cloud server to the software defined system by the inactive memory bank, and rebooting one or more operating systems of the software defined system upon receiving the latest version of the containerized application from the cloud server. Moreover, the method includes copying the one or more blocks related to all of the containerized applications except the blocks related to the containerized application from the active memory bank to the inactive memory bank during rebooting of the one or more operating systems of the software defined system. In addition, the method includes using the inactive memory bank including the blocks related to the latest version of the containerized application and the blocks related to the containerized applications copied from the active memory bank as a new active memory bank.
[0022] The method further includes deactivating the active memory bank to a new inactive memory bank, and updating the new inactive memory bank back to the active memory bank when the blocks related to the latest version of the containerized application in the new active memory bank are not functioning properly. Testing the functionalities of the containerized application using the test system includes one or more of testing one or more functionalities of a simulated model of the containerized application using an edge device. Further, the method includes testing the one or more functionalities of the containerized application of a production grade using one or more virtualized operating systems of the edge device. Furthermore, the method includes testing the one or more functionalities of the production grade containerized application and one or more production grade operating systems using a virtualized hypervisor of the edge device. Moreover, the method includes testing the one or more functionalities of the production grade containerized application, the one or more production grade operating systems, and a hypervisor including binary files of the edge device using a virtualized system-on-chip of the edge device.
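By way of a non-limiting illustration only, the following minimal Python sketch outlines the A/B memory-bank swap and rollback flow described in the preceding paragraphs. The data structures, the health check, and the block contents are assumptions made for illustration and are not prescribed by the present disclosure.

# Hypothetical sketch of the described A/B memory-bank update and rollback flow.
def ota_update(active_bank, inactive_bank, app_name, new_blocks, health_check):
    # Receive blocks for the updated application into the inactive bank.
    inactive_bank[app_name] = new_blocks
    # During reboot, copy every other application's blocks from the active bank.
    for name, blocks in active_bank.items():
        if name != app_name:
            inactive_bank[name] = blocks
    # The inactive bank becomes the new active bank; the old one is deactivated.
    new_active, new_inactive = inactive_bank, active_bank
    # Roll back when the updated application's blocks are not functioning properly.
    if not health_check(new_active[app_name]):
        new_active, new_inactive = new_inactive, new_active
    return new_active, new_inactive

active = {"ldw": "ldw_v1_blocks", "hmi": "hmi_v1_blocks"}
inactive = {}
active, inactive = ota_update(active, inactive, "ldw", "ldw_v2_blocks",
                              health_check=lambda blocks: blocks.endswith("v2_blocks"))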

BRIEF DESCRIPTION OF DRAWINGS

[0023] These and other features, aspects, and advantages of the claimed subject matter will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0024] FIG. 1 illustrates a block diagram depicting an in-lab test setup including a test system for testing SDV applications, in accordance with aspects of the present disclosure;
[0025] FIG. 2 illustrates a graphical representation of an exemplary cloud server that transmits one or more containerized applications whose associated functionalities are successfully tested using the test system of FIG. 1 to one or more software defined vehicles, in accordance with aspects of the present disclosure;
[0026] FIG. 3 illustrates a flow diagram depicting an exemplary method for validating if components of a main ECU are functioning properly before testing functionalities of the containerized applications using the test system of FIG. 1, in accordance with aspects of the present disclosure;
[0027] FIG. 4 illustrates a flow diagram depicting an exemplary method for generating a plurality of test cases required for testing functionalities of a containerized application using the test system of FIG. 1, in accordance with aspects of the present disclosure;
[0028] FIGS. 5A-B illustrate a flow diagram depicting an exemplary method for testing functionalities of the containerized application using the test system of FIG. 1, in accordance with aspects of the present disclosure;
[0029] FIG. 6 illustrates a graphical representation of an exemplary test scenario of a vehicle deviating towards a right lane generated in a simulated environment for testing functionalities of the containerized application using the test system of FIG. 1, in accordance with aspects of the present disclosure; and
[0030] FIG. 7 illustrates an exemplary graphical representation depicting an exemplary visual warning generated and displayed on a display unit of a human-machine interface for a right lane deviation while testing functionalities of the containerized application using the test system of FIG. 1, in accordance with aspects of the present disclosure.

DETAILED DESCRIPTION

[0031] The following description presents an exemplary test system that accurately and efficiently tests and validates functionalities of containerized applications and associated source codes. Particularly, embodiments described herein disclose the test system that tests functionalities of the containerized applications in a laboratory environment using an actual hardware device in which the containerized applications are going to be deployed ultimately or using a simulated or a virtualized model of the hardware device when the actual hardware device is not readily available for testing.
[0032] The test system described herein throughout various embodiments, for example, may be used to perform in-lab testing of functionalities of the containerized applications that are to be deployed in real-world vehicles such as in software defined vehicles. As noted previously, conventional test systems are not capable of testing the containerized applications to be deployed in the software defined vehicles effectively as such testing generally involves complex procedures and requires execution of several thousands to millions of test cases. In addition, a main ECU of the software defined vehicles runs several safety critical applications in parallel and includes a complex architecture, which cannot be tested accurately with the conventional test systems and techniques.
[0033] Further, the presently available test systems require developers to fully develop the containerized applications and to subsequently deploy the fully developed applications in an actual main ECU for testing functionalities of those applications. However, if there are any issues identified with the fully developed applications while testing them with the actual main ECU, a huge amount of effort, time, and cost is required to fix the issues. In order to address the aforementioned issues, the present disclosure provides a test system that accurately and efficiently tests functionalities of the containerized applications to be deployed in the software defined vehicles despite the testing complexity and the large number of required test cases.
[0034] Further, the test system described in the present disclosure facilitates testing of the containerized applications early in their product development lifecycles using a simulated model of the main ECU. Additionally, the test system allows developers to fix issues, if any, identified during testing in the early phases of a product development lifecycle, which ensures that the fully developed applications are free from any major issues and can be quickly deployed in real-world vehicles without delay.
[0035] It may be noted that different embodiments of the present test system may be used in different application areas. For example, the test system may be used to test functionalities of the containerized applications that are to be deployed in software defined vehicles, as noted previously. In another example, the test system may be used to test functionalities of the containerized applications that are to be deployed in one or more of mobile phones, laptops, and set-top-boxes. In yet another example, the test system may be used to test functionalities of the containerized applications that are to be deployed in a software defined network (SDN) system. For clarity, an embodiment of the present test system is described herein in greater detail with reference to testing functionalities of one or more of the containerized applications that are to be deployed in real-world vehicles such as in software defined vehicles.
[0036] FIG. 1 illustrates a block diagram depicting an in-lab test setup (100) including a test system (102) that is communicatively coupled to an edge device (104) such as a main electronic control unit (ECU) (104), which in turn, includes one or more containerized applications (106A-N) whose associated functionalities are to be tested using the test system (102). In one embodiment, the test system (102) corresponds to one of a laptop, a desktop, a mobile phone, a tablet, and a cloud-based system that includes a graphical processing unit capable of providing graphics support. In certain embodiments, the main ECU (104) corresponds to a high-performance computer of a software defined vehicle capable of running multiple containerized applications (106A-N) in parallel.
[0037] In one embodiment, the test system (102) employs an actual vehicle ECU corresponding to the main ECU (104) of the software defined vehicle for testing functionalities of the containerized applications (106A-N) when such an ECU is readily available. In this embodiment, the main ECU (104) includes a system-on-chip (108), a hypervisor (110), a plurality of operating systems (112A-N) running on top of the hypervisor (110), and a plurality of containers (114A-N) generated, for example, using a docker platform. Each of the containers (114A-N) in the main ECU (104) includes an application to be tested by the test system (102). For example, the container (114A) includes a lane departure warning application, the container (114B) includes a human-machine interface application, and the container (114N) includes a battery state-of-charge (SOC) determination application, each to be tested by the test system (102).
[0038] In certain other embodiments, the test system (102) employs a simulated model of the main ECU (104) when an actual main ECU of a software defined vehicle is not available for testing the containerized applications (106A-N). In one embodiment, the simulated model of the main ECU (104) includes an architecture that closely mimics an architecture of the actual main ECU. Further, the simulated model includes the same set of components that are present in the actual main ECU (104) except for the fact that those components are simulated components. Specifically, the simulated model of the main ECU (104) includes a simulated system-on-chip (108), a simulated hypervisor (110), one or more simulated operating systems (112A-N), and one or more simulated containers (114A-N), which host the containerized applications (106A-N) to be tested. It is to be noted that the term “main ECU (104)” used herein throughout various embodiments of the present disclosure is not restricted only to an actual main ECU of a vehicle. The term “main ECU (104)” refers to both the actual main ECU and a simulated model of the actual main ECU.
[0039] In one embodiment, the test system (102) includes an ECU validation system (116) that validates if components of the main ECU (104) such as the system-on-chip (108), the hypervisor (110), and the operating systems (112A-N) are functioning properly before the containerized applications (106A-N) are added into the containers (114A-N). A specific approach by which the ECU validation system (116) validates if the components of the main ECU (104) are functioning properly is described in a greater detail with reference to FIG. 3.
[0040] Post successful validation, the test system (102) requests a development and operations (DevOps) system (118) to transmit the latest versions of the containerized applications (106A-N) to the main ECU (104). In one embodiment, the DevOps system (118) resides in a cloud server (120) and stores the latest versions of the containerized applications (106A-N) that are transmitted to the main ECU (104) via a communications link (122). Examples of the communications link (122) include a scalable service-oriented middleware over IP (SOME/IP), a transmission control protocol, a controller area network, a local interconnect network, a FlexRay, a Wi-Fi network, an Ethernet, and a cellular data network. The main ECU (104) then receives the latest versions of the containerized applications (106A-N) from the cloud server (120) via the communications link (122), and further adds the received containerized applications (106A-N) into the containers (114A-N), respectively, for testing functionalities of those applications (106A-N).
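By way of a non-limiting illustration only, the following Python sketch shows how the latest containerized application images might be pulled and started on the main ECU (104) using a docker platform. The registry address and image names are placeholders and are not part of the present disclosure.

# Hypothetical sketch: pull and start the latest application containers on the
# (real or simulated) main ECU. The registry URL and image names are placeholders.
import subprocess

def deploy_latest(image):
    name = image.split("/")[-1].split(":")[0]
    subprocess.run(["docker", "pull", image], check=True)
    subprocess.run(["docker", "run", "-d", "--name", name, image], check=True)

for image in ["registry.example.com/sdv/ldw:latest", "registry.example.com/sdv/hmi:latest"]:
    deploy_latest(image)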
[0041] In certain embodiments, the test system (102) includes a scenario generation system (124) that resides in the test system (102) itself. Alternatively, the scenario generation system (124) may reside outside of the test system (102) in a remote location such as in a cloud and is communicatively coupled to the test system (102) via the communications link (122). The scenario generation system (124) generates a plurality of test cases, each of which defines a test scenario for testing functionalities of the containerized applications (106A-N). Examples of the scenario generation system (124) that may be used for generating the test cases include a CARLA simulator, a CarMaker simulator, a quantum geographic information system (QGIS) platform, and a generative artificial intelligence (AI)-based scenario simulator.
[0042] When using the CARLA or CarMaker simulator, the scenario generation system (124) generates the test cases needed for testing the containerized applications (106A-N) based on user inputs. Particularly, a user inputs all necessary information required for generating the test cases such as waypoint information, environmental condition information, traffic condition information, road condition information, and location information to the CARLA or CarMaker simulator. Based on these inputs provided by the user, the CARLA or CarMaker simulator automatically generates the necessary test cases defining test scenarios under which functionalities of the containerized applications (106A-N) are to be tested.
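By way of a non-limiting illustration only, the following Python sketch shows how such user inputs might be applied through the public CARLA Python API. The host, port, map name, weather values, and vehicle blueprint are assumptions made for illustration.

# Hypothetical sketch using the public CARLA Python API; host, port, and map are assumptions.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.load_world("Town03")

# Environmental condition information supplied as user input.
world.set_weather(carla.WeatherParameters(cloudiness=30.0, precipitation=0.0,
                                          sun_altitude_angle=60.0))

# Spawn an ego vehicle at a waypoint derived from the user input.
blueprint = world.get_blueprint_library().filter("vehicle.tesla.model3")[0]
spawn_point = world.get_map().get_spawn_points()[0]
ego_vehicle = world.spawn_actor(blueprint, spawn_point)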
[0043] In another embodiment, the user may input all necessary information needed for generating the test cases to the QGIS platform instead of inputting the information directly to the CARLA or CarMaker simulator. In this embodiment, the QGIS platform generates scenario simulation files based on the information provided as inputs by the user to the QGIS platform. Further, the QGIS platform provides the generated scenario simulation files as inputs to the CARLA simulator. Based on the received scenario simulation files, the CARLA simulator generates the necessary test cases defining test scenarios under which functionalities of the containerized applications (106A-N) are to be tested.
[0044] In yet another embodiment, the scenario generation system (124) generates the test cases needed for testing the containerized applications (106A-N) using generative AI technology. In this embodiment, the test system (102) includes a generative AI based prompt system (126), as depicted in FIG. 1. The generative AI based prompt system (126) includes one or more associated graphical user interfaces and a chatbox (not shown). A user may submit a suitably designed input prompt via the chatbox to the scenario generation system (124) to generate the test cases needed for testing the containerized applications (106A-N). Subsequently, the scenario generation system (124) generates the test cases based on the received input prompt, as described in greater detail with reference to FIG. 4.
[0045] In certain embodiments, the scenario generation system (124) also generates a scenario generation file corresponding to each of the generated test cases, for example, in an OpenSCENARIO standard format, as described further with reference to FIG. 4. The scenario generation system (124) then provides the generated scenario generation file corresponding to each of the generated test cases to a scenario execution system (128). Examples of the scenario execution system (128) include a CARLA simulator and a CarMaker simulator.
[0046] In one embodiment, each of the test cases generated by the scenario generation system (124) includes a set of test steps which are tagged with metadata tags generated, for example, using Robot Framework. Subsequently, the scenario execution system (128) executes all test steps in all of the test cases generated by the scenario generation system (124). Alternatively, the scenario execution system (128) omits execution of test steps that are commonly present in all of the generated test cases, and selectively executes only unique test steps in each of the generated test cases. Avoiding execution of the common test steps and executing only the unique test steps in each of the generated test cases saves a significant amount of overall time taken to test the containerized applications (106A-N), as described in a greater detail with reference to FIGS. 5A-B.
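By way of a non-limiting illustration only, the following Python sketch indicates how test steps might be selected for execution based on their metadata tags in response to the first or second command. The step names and tag values mirror the exemplary test steps listed later with reference to FIG. 4, while the command structure itself is an assumption made for illustration.

# Hypothetical sketch: execute only the steps whose metadata tags match the command.
test_case = [
    {"name": "Precondition_Test_Infra",         "tags": ["Environment"]},
    {"name": "Precondition_Applications_Setup", "tags": ["Precondition"]},
    {"name": "Integration_TestStep_1",          "tags": ["LDW"]},
    {"name": "Clean Up",                        "tags": ["Clean_up"]},
]

def select_steps(steps, command):
    if command["type"] == "first":           # first command: run every test step
        return steps
    selected_tags = command["tags"]          # second command: run only the tagged steps
    return [s for s in steps if any(t in selected_tags for t in s["tags"])]

for step in select_steps(test_case, {"type": "second", "tags": ["LDW"]}):
    print("executing", step["name"])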
[0047] In certain embodiments, the test system (102) further includes a data extraction system (130) that extracts simulated data from the scenario execution system (128), for example, from the CARLA simulator while executing each of the generated test cases. In one embodiment, the data extraction system (130) includes one or more relevant application programming interfaces (APIs) such as python APIs. Using the associated APIs, the data extraction system (130) extracts the simulated data, such as one or more of environmental data, traffic data, ego-vehicle data, ambient vehicle data, ambient infrastructure data, lane markings data, and one or more images of test scenarios, from the scenario execution system (128) during execution of each of the test cases.
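By way of a non-limiting illustration only, the following Python sketch shows how simulated camera frames might be extracted using the public CARLA Python API. The sensor placement and output path are assumptions made for illustration.

# Hypothetical sketch: attach a camera to the ego vehicle and extract simulated frames.
import carla

def attach_camera(world, ego_vehicle, out_dir="/tmp/frames"):
    camera_bp = world.get_blueprint_library().find("sensor.camera.rgb")
    transform = carla.Transform(carla.Location(x=1.5, z=2.4))
    camera = world.spawn_actor(camera_bp, transform, attach_to=ego_vehicle)
    # Each received frame is written to disk here; it could equally be forwarded
    # to the ground truth generation system (132) and to the application under test.
    camera.listen(lambda image: image.save_to_disk(f"{out_dir}/{image.frame:06d}.png"))
    return camera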
[0048] The data extraction system (130) then transmits the simulated data extracted during execution of each of the test cases to one or more of a ground truth generation system (132) and a containerized application (106A) whose functionalities need to be tested. Upon receiving the simulated data, the ground truth generation system (132) generates ground truth information from the simulated data using one or more algorithms that are generally known to output accurate results, as described in detail with reference to FIGS. 5A-B. Further, the containerized application (106A), whose associated functionalities need to be tested, also independently generates an output from the simulated data received from the data extraction system (130), as described in detail with reference to FIGS. 5A-B.
[0049] The test system (102) then determines if the generated ground truth information matches with the output of the containerized application (106A). In one embodiment, the test system (102) successfully validates functionalities of the containerized application (106A) when the generated ground truth information matches with the output of the containerized application (106A). Otherwise, the test system (102) identifies that the containerized application (106A) is not functioning properly when the output of the containerized application (106A) mismatches with the generated ground truth information.
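By way of a non-limiting illustration only, the following Python sketch shows how the match between the generated ground truth information and the output of the containerized application (106A) might be evaluated. The log structure and field values are assumptions made for illustration.

# Hypothetical sketch: pass/fail verdict from ground truth vs. application output.
def validate(ground_truth_log, output_log):
    # Both logs are assumed to map frame identifiers to a detected deviation type.
    mismatches = [frame for frame, expected in ground_truth_log.items()
                  if output_log.get(frame) != expected]
    return len(mismatches) == 0, mismatches

passed, mismatches = validate({"frame_0042": "Right"}, {"frame_0042": "Right"})
print("LDW functionality validated" if passed else f"Mismatch at {mismatches}")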
[0050] In one embodiment, the test system (102) and associated systems including the ECU validation system (116), scenario generation system (124), scenario execution system (128), data extraction system (130), and ground truth generation system (132), for example, may include one or more of general-purpose processors and specialized processors to test and validate functionalities of one or more of the containerized applications (106A-N). In certain embodiments, the test system (102), ECU validation system (116), scenario generation system (124), scenario execution system (128), data extraction system (130), and ground truth generation system (132) may include one or more of graphical processing units, microprocessors, programmable logic arrays, field programmable gate arrays, integrated circuits, systems-on-chips, and/or other suitable computing devices. Additionally, certain operations of the test system (102), ECU validation system (116), scenario generation system (124), scenario execution system (128), data extraction system (130), and ground truth generation system (132) may be implemented by suitable code on a processor-based system, such as a general-purpose or a special-purpose computer.
[0051] In certain embodiments, post successfully testing functionalities of the containerized applications (106A-N), the test system (102) requests the cloud server (120) to transmit one or more of the successfully tested containerized applications (106A-N) from the cloud server (120) to one or more software defined systems (202) such as one or more software defined vehicles (202A-N), as depicted in FIG. 2. The software defined vehicles (202A-N) then receive the containerized applications from the cloud server (120) via the communications link (122) and deploy those applications in associated containers for executing various automotive related functionalities.
[0052] In one embodiment, the test system (102) depicted and described with reference to FIG. 1 is capable of testing a wide variety of containerized applications before deploying such applications in real-world software defined vehicles (202A-N). Examples of such containerized applications include a lane departure warning application, a lane keep assist application, an intersection collision warning application, a vehicle dynamics controller application, a powertrain management application, and an anti-lock braking application. Additional examples of the containerized applications that can be tested using the test system (102) include a traction control application, a connected vehicle application, a green light optimal speed advisory (GLOSA) application, an advanced driver assistance system (ADAS) application, and an electric vehicle related application. However, for the sake of simplicity, the test system (102) used for testing and validating functionalities of the lane departure warning (LDW) application (106A) is described subsequently with reference to FIGS. 3-7. In certain embodiments, the ECU validation system (116) identifies if the components of the main ECU (104) such as the system-on-chip (108), the hypervisor (110), the operating systems (112A-N), and the containers (114A-N) are functioning properly before adding the LDW application (106A) to be tested into the container (114A). For the sake of simplicity, the container (114A) is referred to hereinafter as the LDW container (114A) throughout various embodiments of the present disclosure.
[0053] FIG. 3 illustrates a flow diagram depicting an exemplary method (300) for validating if the components of the main ECU (104) are functioning properly before adding the LDW application (106A) into the LDW container (114A). The order in which the exemplary method is described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary method disclosed herein, or an equivalent alternative method. Additionally, certain blocks may be deleted from the exemplary method or augmented by additional blocks with added functionality without departing from the claimed scope of the subject matter described herein.
[0054] At step (302), the scenario generation system (124) generates a first simulated signal including a simulated image of a vehicle deviating from a current travelling lane based on a user input. In one embodiment, the simulated image includes timestamp information indicating a time at which the simulated image is generated by the scenario generation system (124). Subsequently, at step (304), the data extraction system (130) extracts the first simulated signal from the scenario generation system (124), and transmits the first simulated signal to the LDW container (114A) via the communications link (122).
[0055] At step (306), the ECU validation system (116) performs a first level validation to identify if the components of the main ECU (104) are functioning properly based on the first simulated signal transmitted to the LDW container (114A). Specifically, the ECU validation system (116) performs the first level validation using the main ECU (104), which determines if the LDW container (114A) has successfully received the first simulated signal from the data extraction system (130). Further, the main ECU (104) determines an overall time taken by the LDW container (114A) to receive the first simulated signal based on the time at which the simulated image is generated and the time at which the simulated image is received by the LDW container (114A). Additionally, the main ECU (104) determines an associated memory usage when the first simulated signal is received by the LDW container (114A).
[0056] Subsequently, the main ECU (104) transmits the determined information that indicates if the LDW container (114A) has successfully received the first simulated signal, the determined overall time taken, and the determined memory usage back to the ECU validation system (116). The ECU validation system (116) then identifies that the components of the main ECU (104) are functioning properly only when the first simulated signal is successfully received by the LDW container (114A), the determined overall time taken is within a particular time limit, and the determined memory usage is within a designated threshold. In one embodiment, the ECU validation system (116) includes an associated database (not shown in FIGS. 1-7) that previously stores values associated with the particular time limit and the designated threshold. In certain embodiments, the ECU validation system (116) may also additionally use parameters such as accessibility of interfaces and system calls to perform the first level validation and to identify if the components of the main ECU (104) are functioning properly.
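By way of a non-limiting illustration only, the following Python sketch summarizes the first level validation checks described above. The particular time limit and the designated memory threshold shown are assumed example values, not values prescribed by the present disclosure.

# Hypothetical sketch of the first level validation; limit values are assumptions.
TIME_LIMIT_S = 0.5          # assumed maximum allowed transfer time
MEMORY_THRESHOLD_MB = 256   # assumed maximum allowed memory usage

def first_level_validation(received, generated_at_s, received_at_s, memory_used_mb):
    transfer_time = received_at_s - generated_at_s
    return (received
            and transfer_time <= TIME_LIMIT_S
            and memory_used_mb <= MEMORY_THRESHOLD_MB)

print(first_level_validation(received=True, generated_at_s=10.00,
                             received_at_s=10.12, memory_used_mb=180))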
[0057] At step (308), the LDW container (114A) generates a second simulated signal including a lane departure warning (LDW) message post successfully performing the first level validation. At step (310), the LDW container (114A) transmits the generated lane departure warning message to another container such as a human-machine interface (HMI) container (114B) in which an HMI application (106B) is yet to be deployed. At step (312), the main ECU (104) performs a second level validation by identifying if the HMI container (114B) has correctly received the lane departure warning message from the LDW container (114A). At step (314), the ECU validation system (116) identifies that the components of the main ECU (104) are functioning properly when the HMI container (114B) has correctly received the lane departure warning message from the LDW container (114A).
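By way of a non-limiting illustration only, the following Python sketch shows how the transmission of the lane departure warning message from the LDW container (114A) to the HMI container (114B) might be exercised for the second level validation. A plain TCP socket is used as a stand-in for the communications link (122); the host name, port, message format, and acknowledgment format are assumptions made for illustration.

# Hypothetical sketch: a plain TCP socket stands in for the inter-container link.
import socket

def send_ldw_warning(host="hmi-container", port=5000, message=b"LDW_WARNING:RIGHT"):
    # Returns True only when the HMI side acknowledges receipt of the warning,
    # which is the condition checked during the second level validation.
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(message)
        return conn.recv(16) == b"ACK"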
[0058] It may be noted that identification of issues, if any, in the functioning of the components of the main ECU (104) will be extremely difficult once the containerized applications (106A-N) are added into the containers (114A-N). This is because it would be difficult to identify whether the issues have occurred due to improper functioning of the components of the main ECU (104) or due to improper functioning of the containerized applications (106A-N). Hence, the test system (102) described in the present disclosure identifies issues, if any, in the functioning of the components of the main ECU (104) upfront, before adding the containerized applications (106A-N) into the containers (114A-N), which saves a lot of time and manual effort that otherwise would have been wasted in identifying root causes of the issues.
[0059] In certain embodiments, post successfully validating functionalities of the components of the main ECU (104), the test system (102) may request the cloud server (120) to transmit latest versions of the LDW and HMI applications (106A and 106B) to the LDW and HMI containers (114A and 114B), respectively for testing and validating functionalities of the LDW application (106A). Further, the scenario generation system (124) generates a plurality of test cases required for testing functionalities of the LDW application (106A), as described subsequently with reference to FIG. 4.
[0060] FIG. 4 illustrates a flow diagram depicting an exemplary method (400) for generating a plurality of test cases required for testing functionalities of the LDW application (106A). At step (402), the scenario generation system (124) receives an input prompt in natural language from the test system (102) for automatically generating test cases related to testing functionalities of the LDW application (106A). In one embodiment, the scenario generation system (124) corresponds to a generative AI based system that uses a large language model such as Bard-3.5 and GPT-4, for automatically generating the test cases. A user may submit the input prompt required for generating the test cases to the scenario generation system (124) via a chatbox in the generative AI based prompt system (126). An example of such an input prompt submitted by the user to the scenario generation system (124) includes “Generate test cases for testing lane departure warning functionalities of a vehicle moving on a road network in a Barcelona town.”
[0061] Subsequently, at step (404), the scenario generation system (124) retrieves domain specific information required for generating the test cases from one or more standard automotive databases. It may be noted that existing large language models (LLMs) include very limited domain specific knowledge that is not sufficient for generating the test cases specifically required for testing the LDW application (106A). In addition, training and fine-tuning the existing LLMs to automatically generate the test cases required for testing the LDW application (106A) would require a significant amount of time, resources, and human effort. In order to address this issue, the scenario generation system (124) employs one or more approaches such as retrieval-augmented generation and/or in-context learning approaches. These approaches enable the scenario generation system (124) to retrieve automotive specific information such as automotive specific files, documents, and standards from standard automotive databases such as POLARION and rational quality manager (RQM). The scenario generation system (124) then uses the retrieved automotive specific information for generating the test cases required for testing the LDW application (106A).
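By way of a non-limiting illustration only, the following Python sketch outlines a retrieval-augmented prompting flow of the kind described above. The toy embedding, similarity score, document snippets, and prompt template are placeholders made for illustration and do not represent the POLARION or RQM interfaces.

# Hypothetical retrieval-augmented generation sketch; embed() and the document
# store are placeholders for whatever retrieval backend is actually used.
def embed(text):
    return [float(ord(c)) for c in text[:16]]            # toy embedding stand-in

def retrieve(query, documents, top_k=2):
    def score(doc):
        q, d = embed(query), embed(doc)
        return -sum((a - b) ** 2 for a, b in zip(q, d))   # toy similarity measure
    return sorted(documents, key=score, reverse=True)[:top_k]

automotive_docs = ["LDW shall warn within 300 ms of lane crossing",
                   "GLOSA message format requirements"]
context = retrieve("lane departure warning test cases", automotive_docs)
prompt = ("Using the following requirements:\n" + "\n".join(context) +
          "\nGenerate test cases for LDW on a Barcelona road network.")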
[0062] At step (406), the scenario generation system (124) generates the test cases based on the input prompt and the domain specific information retrieved from the standard automotive databases. In one embodiment, the scenario generation system (124) generates multiple test cases in a keyword format for testing functionalities of the LDW application (106A). Table 1, for example, includes details of 3 exemplary test cases generated by the scenario generation system (124).

Table 1: Exemplary test cases generated by the scenario generation system (124)

Test Case ID    Test Scenario
TC_LDW_001      Verify if a lane departure (LD) warning in an HMI shows a right deviation when a vehicle deviates towards a right lane
TC_LDW_002      Verify if an LD warning in an HMI shows a left deviation when a vehicle deviates towards a left lane
TC_LDW_003      Verify if an LD warning in an HMI fails to show a left deviation when a vehicle deviates towards a left lane with a left indicator on

[0063] It may be noted from the previously noted Table 1 that each of the test cases generated by the scenario generation system (124) defines a test scenario under which functionalities of the LDW application (106A) have to be tested. For example, the test case “TC_LDW_001” defines a test scenario related to testing and verifying if the LDW application (106A) displays a right deviation warning message on a human-machine interface (138) when a vehicle deviates towards a right lane. The test case “TC_LDW_002” defines another test scenario related to testing and verifying if the LDW application (106A) displays a left deviation warning message on the human-machine interface (138) when the vehicle deviates towards a left lane. The test case “TC_LDW_003” defines yet another test scenario related to testing and verifying if the LDW application (106A) displays no warning messages on the human-machine interface (138) when the vehicle deviates towards the left lane with an associated left indicator signal turned on.
[0064] Further, it may be noted that each of the three exemplary test cases “TC_LDW_001”, “TC_LDW_002”, and “TC_LDW_003” generated by the scenario generation system (124) includes a set of associated test steps that are all tagged with metadata tags, as indicated below in subsequent paragraphs.
[0065] Test steps related to exemplary test cases:
[0066] Precondition_Test_Infra:
[Documentation] Setting up test infrastructure
[Metadata Tags] Environment
Run CARLA client and verify
Boot up CentOS and verify
[0067] Precondition_Applications_Setup:
[Documentation] Checking pre-condition for testing applications
[Metadata Tags] Precondition
Pull LDW container if not loaded
Run and verify HMI
Run and verify LDW application
[0068] Integration_TestStep_1
[Documentation] Checking the LDW right deviation
[Metadata Tags] LDW
Running the CARLA scenario “Rightdeviation.xosc”
Verify deviation “Right”
[0069] Integration_TestStep_2
[Documentation] Checking the LDW left deviation
[Metadata Tags] LDW
Running the CARLA scenario “Leftdeviation.xosc”
Verify deviation “Left”
[0070] Integration_TestStep_3
[Documentation] Checking the LDW left deviation with left indicator
[Metadata Tags] LDW
Running the CARLA scenario “LeftdeviationIndicator.xosc”
Verify deviation with indicator “Left”
[0071] Clean Up
[Documentation] Bringing back infrastructure to initial state
[Metadata Tags] Clean_up
Clear
[0072] For example, each of the three exemplary test cases noted previously includes a first test step related to ‘setting up an infrastructure required for testing the LDW application (106A)’, a second test step related to ‘checking certain conditions prior to testing the LDW application (106A)’, and a third test step related to ‘cleaning up and bringing back the infrastructure to an initial state.’ In one embodiment, these first, second, and third test steps are common to all of the generated test cases.
[0073] In addition to the common steps, each of these generated test cases also includes an associated unique test step. For example, the test case “TC_LDW_001” includes a unique test step “Integration_TestStep_1”, which is applicable only to the test case “TC_LDW_001” and is not applicable to other two test cases “TC_LDW_002” and “TC_LDW_003.” Similarly, the test cases “TC_LDW_002” and “TC_LDW_003” include corresponding unique test steps “Integration_TestStep_2” and “Integration_TestStep_3”, which are applicable only to the test cases “TC_LDW_002” and “TC_LDW_003”, respectively.
[0074] At step (408), the scenario generation system (124) generates scenario generation files including information required for recreating the generated test cases in a simulated environment. In one embodiment, the scenario generation system (124) generates the scenario generation files in an ASAM OpenSCENARIO format such as the .XOSC format. For example, for the test case “TC_LDW_001” noted previously in Table 1, the scenario generation system (124) generates a scenario generation file in an ASAM OpenSCENARIO format, which includes all necessary information required for recreating a test scenario of a vehicle deviating towards a right lane in a simulated environment. For instance, in the previously noted example, the scenario generation file generated by the scenario generation system (124) includes information such as the real-world location of a road network found in the Barcelona town, specified in an ASAM OpenDRIVE format such as the .XODR format. In addition, the scenario generation file generated by the scenario generation system (124) includes information such as a location coordinate of a left lane in the road network, a location coordinate of a right lane in the road network, and a location coordinate of a vehicle defined in such a way that one of the vehicle’s wheels touches the right lane, which are all required for recreating the test scenario of the vehicle deviating towards the right lane in the simulated environment.
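By way of a non-limiting illustration only, the following Python sketch writes a highly simplified OpenSCENARIO-style skeleton for such a scenario generation file. The element names, attribute values, and the .xodr file name are placeholders made for illustration and do not constitute a complete ASAM-conformant file.

# Hypothetical sketch: write a simplified OpenSCENARIO-style skeleton; element names
# and values are illustrative only and not a complete ASAM-conformant file.
import xml.etree.ElementTree as ET

root = ET.Element("OpenSCENARIO")
ET.SubElement(root, "RoadNetwork", {"logicFile": "barcelona_town.xodr"})       # placeholder path
ego = ET.SubElement(root, "Entity", {"name": "ego_vehicle"})
ET.SubElement(ego, "WorldPosition", {"x": "12.5", "y": "-3.2", "h": "0.05"})   # wheel on right lane
ET.ElementTree(root).write("TC_LDW_001.xosc", xml_declaration=True, encoding="utf-8")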
[0075] Though it is not described in detail, it is to be understood that the scenario generation system (124) similarly generates scenario generation files required for recreating the other test cases “TC_LDW_002” and “TC_LDW_003” in a simulated environment. It may be noted that conventional scenario generation systems generating test cases for application testing require a driver to manually drive a vehicle. While the driver is driving the vehicle, the conventional scenario generation systems record ambient data using associated on-board sensors and generate necessary test cases based on the recorded ambient data. However, this conventional approach requires a huge amount of manual effort, cost, and time. For example, conventional scenario generation systems require the driver to drive the vehicle for 1 hour, and further require 1.5 hours for processing the ambient data captured by the on-board sensors and for generating relevant test cases. In contrast to conventional scenario generation systems, the scenario generation system (124), using the power of generative AI, generates such test cases required for testing applications in a few seconds to minutes upon receiving an input prompt from a user.
[0076] In one embodiment, the scenario generation system (124) transmits the test cases and the scenario generation files generated using the scenario generation system (124) to the scenario execution system (128) via the communications link (122). The scenario execution system (128) then executes the generated test cases using the scenario generation files, and tests the functionalities of the LDW application (106A), as described in greater detail subsequently with reference to FIGS. 5A-B.
[0077] FIGS. 5A-B illustrate a flow diagram depicting an exemplary method (500) for testing functionalities of the LDW application (106A) using the test system (102) of FIG. 1. At step (502), the main ECU (104) receives a latest version of the LDW application (106A) to be tested from the cloud server (120) via the communications link (122) and deploys the received LDW application (106A) in the LDW container (114A). At step (504), the scenario execution system (128) receives the test cases and the corresponding scenario generation files generated previously for testing the LDW application (106A) from the scenario generation system (124).
[0078] In certain embodiments, the test system (102) includes a web portal (134) that allows a developer and/or a test executor to add any other test cases to be executed in addition to the test cases received from the scenario generation system (124). The web portal (134) also allows the developer and/or test executor to schedule execution of the test cases and to view test results via an associated graphical user interface.
[0079] At step (506), the scenario execution system (128) selectively executes one or more test steps in each of the test cases based on a command received from the DevOps system (118). For example, the test executor may provide a first command via the DevOps system (118) when all test steps in all the test cases have to be executed. Otherwise, the test executor may provide a second command via the DevOps system (118) when only a few selected test steps have to be executed in each of the test cases. For instance, the test executor may provide the following exemplary first command, represented herein using equation (1), to the scenario execution system (128) when all test steps in all the previously noted exemplary test cases “TC_LDW_001”, “TC_LDW_002”, and “TC_LDW_003” have to be executed.

robot LDW_TC.robot (1)

[0080] Upon receiving the previously noted first command, the scenario execution system (128) executes all test steps in all the test cases. For example, the scenario execution system (128) executes all test steps related to the test case “TC_LDW_001” upon receiving the first command. In particular, the scenario execution system (128) first executes the test step related to setting up an infrastructure required for testing the LDW application (106A). In one embodiment, setting up the infrastructure entails running and verifying if the scenario execution system (128) such as the CARLA simulator is working properly, and/or booting up and verifying if the one or more operating systems (112A-N) associated with the main ECU (104) are working properly. Upon successfully executing the test step related to setting up the infrastructure, the scenario execution system (128) subsequently executes the test step related to checking certain pre-conditions prior to testing functionalities of the LDW application (106A). In certain embodiments, checking those pre-conditions entails verifying if the LDW and HMI applications (106A and 106B) are properly loaded in the LDW and HMI containers (114A and 114B), respectively.
[0081] Upon successfully executing both the test steps related to setting up the infrastructure and checking the pre-conditions, the scenario execution system (128) executes the test step “Integration_TestStep_1”, which is unique to the test case “TC_LDW_001.” The test step “Integration_TestStep_1” entails generating a simulated image of a vehicle deviating towards a right lane based on the scenario generation file related to the test case “TC_LDW_001,” and providing the simulated image as input to the LDW application (106A) to check if the LDW application (106A) triggers a right lane deviation warning (RLDW) message based on the received simulated image. Finally, the scenario execution system (128) executes the test step related to clean up and brings the infrastructure back to an initial state post executing the test step related to checking the RLDW message generating functionality of the LDW application (106A).
[0082] Though not described in detail, it is to be understood that the scenario execution system (128) executes each test step in each of the other test cases “TC_LDW_002” and “TC_LDW_003” similarly based on the first command received from the DevOps system (118). For example, the scenario execution system (128) executes all test steps in the test case “TC_LDW_002” by first re-executing the test steps related to setting up the infrastructure and checking the pre-conditions, followed by executing the test step “Integration_TestStep_2”, and finally by re-executing the test step related to bringing the infrastructure back to the initial state. Similarly, the scenario execution system (128) executes all test steps in the test case “TC_LDW_003” by first re-executing the test steps related to setting up the infrastructure and checking the pre-conditions, followed by executing the test step “Integration_TestStep_3”, and finally by re-executing the test step related to bringing the infrastructure back to the initial state.
[0083] In certain embodiments, the test executor may provide the following exemplary second command, represented herein using equations (2) and (3), to the scenario execution system (128) when only a few selected test steps in each of the test cases “TC_LDW_001”, “TC_LDW_002”, and “TC_LDW_003” have to be executed.

robot -e Clean_up LDW_TC.robot (2)
robot -e Environment -e Precondition LDW_TC.robot (3)

[0084] In one embodiment, the minus symbol “-” in the previously noted equations (2) and (3) indicates to the scenario execution system (128) the test steps that need not be repeatedly executed every time during execution of each of the test cases. Specifically, the minus symbol “-” in the equations (2) and (3) indicates to the scenario execution system (128) that the test steps such as setting up the infrastructure, checking the pre-conditions, and cleaning up and bringing back the infrastructure to the initial state do not need to be repeated every single time during execution of each of the test cases “TC_LDW_001”, “TC_LDW_002”, and “TC_LDW_003”. Accordingly, upon receiving the second command represented herein using equations (2) and (3), the scenario execution system (128) executes the test steps related to setting up the infrastructure and checking the pre-conditions only once at the beginning, while executing a first test case. The scenario execution system (128) does not re-execute these test steps while executing any of the subsequent test cases. The scenario execution system (128) also executes the test step related to bringing the infrastructure back to the initial state only once, while executing the last test case in a pipeline of multiple test cases to be executed. In other words, the scenario execution system (128) skips execution of the test step related to bringing the infrastructure back to the initial state for all the test cases except for the last test case.
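A minimal Python sketch of the selection logic described above is given below, assuming each test step carries the metadata tags illustrated earlier; the data structures and the function are hypothetical models of the behaviour and are not part of the robot command itself.

    # Hypothetical representation of the tagged steps of the three exemplary test cases.
    TEST_CASES = {
        "TC_LDW_001": ["Environment", "Precondition", "LDW", "Clean_up"],
        "TC_LDW_002": ["Environment", "Precondition", "LDW", "Clean_up"],
        "TC_LDW_003": ["Environment", "Precondition", "LDW", "Clean_up"],
    }

    RUN_ONCE_AT_START = {"Environment", "Precondition"}   # tags excluded by equation (3)
    RUN_ONCE_AT_END = {"Clean_up"}                        # tag excluded by equation (2)

    def steps_to_run(case_index, total_cases, tags):
        """Return the tags of the steps to execute for one test case in the pipeline."""
        selected = []
        for tag in tags:
            if tag in RUN_ONCE_AT_START and case_index != 0:
                continue                                  # setup/precondition only for the first case
            if tag in RUN_ONCE_AT_END and case_index != total_cases - 1:
                continue                                  # clean-up only for the last case
            selected.append(tag)
        return selected

    for index, (name, tags) in enumerate(TEST_CASES.items()):
        print(name, "->", steps_to_run(index, len(TEST_CASES), tags))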
[0085] For example, while executing the exemplary test cases “TC_LDW_001”, “TC_LDW_002”, and “TC_LDW_003” based on the second command, the scenario execution system (128) first executes the test steps related to setting up the infrastructure and checking the pre-conditions only once while executing the first exemplary test case “TC_LDW_001.” The scenario execution system (128) then executes the test step that is unique to the first test case “TC_LDW_001.” However, the scenario execution system (128) does not execute the test step related to bringing the infrastructure back to the initial state while executing the first test case “TC_LDW_001.” Upon executing the first test case “TC_LDW_001”, the scenario execution system (128) skips re-execution of the test steps related to setting up the infrastructure and checking the pre-conditions. The scenario execution system (128) directly completes execution of the second test case “TC_LDW_002” by only executing the test step “Integration_TestStep_2”, which is unique to the second test case “TC_LDW_002.”
[0086] Subsequently, the scenario execution system (128) executes the third and last test case “TC_LDW_003” by directly executing the test step “Integration_TestStep_3” that is unique to the third test case “TC_LDW_003” and further by executing the test step related to bringing the infrastructure back to the initial state. By executing only a few selected test steps in each of the test cases rather than all test steps in all of the test cases, the scenario execution system (128) saves a significant amount of associated scenario execution time and computing resources without affecting the testing performance and coverage. For example, the following Table 2 tabulates an approximate amount of time required for executing all test steps in a particular test case.

Table 2: Exemplary amount of time required for executing one particular test case

Test steps related to a particular test case “TC_LDW_001”                      Time Required
Test step related to setting up the infrastructure                             45 seconds
Test step related to checking the pre-conditions                               60 seconds
Test step related to testing functionality of an application                   30 seconds
Test step related to bringing the infrastructure back to the initial state     30 seconds
Total execution time required for 1 test case                                  165 seconds

[0087] In real-world scenarios, the scenario execution system (128) typically needs to execute several thousand test cases to adequately and thoroughly test every aspect of an application, for example, the LDW application (106A), before such an application (106A) is deployed in the main ECU (104) of real-world vehicles. However, assuming the scenario execution system (128) needs to execute only 10 test cases, the scenario execution system (128) requires approximately 1650 seconds for executing all test steps in all 10 test cases based on the received first command. It is to be understood that the scenario execution system (128) is also capable of executing all those 10 test cases in significantly less time, for example, in 435 seconds, based on the received second command and the metadata tags added to those 10 test cases. In this implementation, the scenario execution system (128) identifies the test steps that are not to be repeated every time using the second command and the metadata tags added to the test steps in the test cases. The scenario execution system (128) then skips re-executing the identified test steps during execution of each test case, which saves a significant amount of time and helps in releasing a tested application into the market in an expedited manner.
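The approximate figures above follow directly from the step durations listed in Table 2, as the short calculation below illustrates for the assumed pipeline of 10 test cases.

    setup, precondition, functional, cleanup = 45, 60, 30, 30              # seconds, from Table 2
    cases = 10

    full_run = cases * (setup + precondition + functional + cleanup)       # first command
    selective_run = setup + precondition + cases * functional + cleanup    # second command

    print(full_run, selective_run)   # 1650 seconds versus 435 seconds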
[0088] At step (508), the data extraction system (130) extracts simulated data from the scenario execution system (128) when the scenario execution system (128) executes each of the test cases, and further transmits the extracted simulated data to one or more of a ground truth generation system (132) and the LDW application (106A) residing in the LDW container (114A). While executing the test case “TC_LDW_001”, the scenario execution system (128) generates the test scenario of a vehicle deviating towards a specific lane corresponding to, for example, a right lane in a simulated environment based on the scenario generation file related to the test case “TC_LDW_001” generated previously using the scenario generation system (124).
[0089] In one embodiment, the generated scenario generation file related to the test case “TC_LDW_001” includes information such as a real-world location of a road network found in the Barcelona town specified in an ASAM openDRIVE format, and location coordinates of each of a left lane and a right lane in the road network, as noted previously. Further, the scenario generation file includes a location coordinate of a vehicle defined in such a way that one of the vehicle’s wheels touches the specific lane corresponding to the right lane. Based on the information specified in the scenario generation file, the scenario execution system (128) generates a simulated image (600) (depicted in FIG. 6) of a vehicle (602) traversing along a road network found in the Barcelona town and deviating towards the specific lane corresponding to a right lane (604). In one embodiment, the data extraction system (130) extracts the simulated image (600) generated by the scenario execution system (128) from the scenario execution system (128). Further, the data extraction system (130) transmits the simulated image (600) to the LDW application (106A) residing in the LDW container (114A).
[0090] Subsequently, at step (510), the ground truth generation system (132) generates a ground truth log file while executing each of the test cases based on an associated scenario generation file. For example, while executing the test case “TC_LDW_001”, the ground truth generation system (132) receives the scenario generation file related to the test case “TC_LDW_001” from the scenario generation system (124). Subsequently, the ground truth generation system (132) extracts the location coordinates of the left and right lanes of the road network, and the location coordinate of the vehicle (602) from the received scenario generation file. The ground truth generation system (132) then identifies that the vehicle (602) is actually deviating towards the right lane (604) of the road network based on the extracted location coordinates of the right lane (604) and the vehicle (602). Further, the ground truth generation system (132) generates the ground truth log file that stores data indicative of deviation of the vehicle (602) towards the right lane (604) of the road network.
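Under simplified assumptions, the ground truth generation step can be illustrated with the following Python sketch; the coordinate handling, file layout, and file name are hypothetical simplifications of what the ground truth generation system (132) performs.

    import json

    def generate_ground_truth(vehicle_y, left_lane_y, right_lane_y,
                              log_path="ground_truth_TC_LDW_001.json"):
        """Decide which lane the vehicle deviates towards from lateral coordinates taken
        out of the scenario generation file, and write a ground truth log entry."""
        if abs(vehicle_y - right_lane_y) < abs(vehicle_y - left_lane_y):
            deviation = "Right"
        elif abs(vehicle_y - left_lane_y) < abs(vehicle_y - right_lane_y):
            deviation = "Left"
        else:
            deviation = "None"
        with open(log_path, "w") as log_file:
            json.dump({"test_case": "TC_LDW_001", "deviation": deviation}, log_file)
        return deviation

    # Hypothetical lateral coordinates: the vehicle's wheel touches the right lane marking.
    print(generate_ground_truth(vehicle_y=1.75, left_lane_y=-1.75, right_lane_y=1.75))   # "Right"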
[0091] At step (512), the LDW application (106A) residing in the LDW container (114A) processes the simulated data received from the data extraction system (130) and generates an output log file using the test system (102). For example, the LDW application (106A) receives the simulated image (600) generated by the scenario execution system (128) from the data extraction system (130). The LDW application (106A) then identifies if the vehicle (602) in the simulated image (600) deviates towards any lanes in the simulated road network by processing the simulated image (600) using one or more associated image processing or lane departure identification algorithms. For example, the LDW application (106A) identifies the vehicle (602) to be deviating towards the right lane (604) in the simulated road network based on processing of the simulated image (600) using one or more associated image processing or lane departure identification algorithms. In this example, the LDW application (106A) subsequently generates a right lane deviation (RLD) message.
[0092] Further, the LDW application (106A) transmits the generated RLD message to the HMI container (114B) in which the HMI application (106B) resides. Upon receiving the RLD message, the HMI application (106B) generates a right lane deviation (RLD) visual warning (702) (depicted in FIG. 7), which is displayed on a display unit (139) of the human-machine interface (138). In certain embodiments, the in-lab test setup (100) includes an imaging system (140) that captures an image of the RLD visual warning (702) thus displayed on the display unit (139) of the human-machine interface (138). Further, the imaging system (140) transmits the captured image of the RLD visual warning (702) to the test system (102). Subsequently, the test system (102) processes the image received from the imaging system (140), for example, using image processing techniques, optical character recognition techniques, and/or template matching techniques. Further, the test system (102) identifies that the vehicle (602) in the simulated image (600) is actually deviating towards the right lane (604) of the road network based on the processing of the image of the RLD visual warning (702). Subsequently, the test system (102) generates the output log file that stores data indicative of deviation of the vehicle (602) towards the right lane (604) of the road network.
[0093] At step (514), the test system (102) successfully verifies functionalities of the LDW application (106A) when information in the ground truth log file generated by the ground truth generation system (132) matches with information in the output log file generated by the test system (102). For instance, with respect to the previously noted examples, the test system (102) identifies that the ground truth log file generated by the ground truth generation system (132) exactly matches the output log file generated by the test system (102), as both these log files include the same data indicative of deviation of the vehicle (602) towards the right lane (604) of the road network. Accordingly, in this example, the test system (102) successfully verifies a right lane deviation warning generation functionality of the LDW application (106A) as the information in the ground truth log file and the information in the output log file match each other.
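A minimal sketch of this comparison, assuming both log files follow the simple JSON layout used in the previous sketch, could look as follows; the file and field names are illustrative only.

    import json

    def verify_functionality(ground_truth_path, output_log_path):
        """Report a pass only when the deviation recorded in the ground truth log file
        matches the deviation recorded in the output log file."""
        with open(ground_truth_path) as gt_file, open(output_log_path) as out_file:
            ground_truth = json.load(gt_file)
            output = json.load(out_file)
        return ground_truth["deviation"] == output["deviation"]

    # Example usage with hypothetical file names:
    # passed = verify_functionality("ground_truth_TC_LDW_001.json", "output_TC_LDW_001.json")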
[0094] Though not described in detail, it is to be understood that the test system (102) similarly verifies “left lane deviation warning generation functionality of the LDW application (106A)” and “no lane deviation warning generation functionality of the LDW application (106A) when a vehicle’s indicator is turned on” defined in the other exemplary test cases “TC_LDW_002” and “TC_LDW_003,” respectively. Specifically, the scenario execution system (128) simulates an image of a vehicle deviating towards a left lane based on a scenario generation file related to the test case “TC_LDW_002”, and provides the simulated image as an input to the LDW application (106A) via the data extraction system (130). The test system (102) then successfully verifies the left lane deviation warning generation functionality of the LDW application (106A) when the LDW application (106A) generates an output log file including information that matches with information in a ground truth log file generated by the ground truth generation system (132).
[0095] Similarly, the scenario execution system (128) simulates an image of a vehicle deviating towards a left lane with an associated left indicator turned on based on a scenario generation file related to the test case “TC_LDW_003,” and provides the simulated image as an input to the LDW application (106A) via the data extraction system (130). The test system (102) then verifies if the LDW application (106A) generates any alert message or indicator based on the received simulated image. Subsequently, the test system (102) successfully verifies “no lane deviation warning generation functionality of the LDW application (106A)” when the LDW application (106A) has not generated any alert message or indicator based on the received simulated image.
[0096] At step (516), the cloud server (120) transmits the LDW container (114A) including the latest version of the LDW application (106A) that is successfully tested to one or more real-world vehicles such as to one or more software defined vehicles (202A-N). The software defined vehicles (202A-N) then receive the LDW container (114A) including the LDW application (106A) as over-the-air (OTA) updates from the cloud server (120), and further deploy the received LDW container in their main ECUs for enabling the LDW application (106A) to perform various intended functionalities.
[0097] In certain embodiments, in addition to testing functionalities of the LDW application (106A), the test system (102) is also capable of testing functionalities of various other containerized applications. For example, the test system (102) is also capable of testing functionalities of a battery SOC determination application (106N) deployed in the container (114N) of the main ECU (104). For testing functionalities of the battery SOC determination application (106N), the scenario execution system (128) in the test system (102) first simulates battery characteristics information such as a battery’s voltage, current, and temperature information. Subsequently, the data extraction system (130) extracts this information from the scenario execution system (128), and provides this information as an input to both a digital twin framework (136) and the battery SOC determination application (106N) whose functionality has to be tested.
[0098] In one embodiment, the digital twin framework (136) includes a battery model that is generally known to output accurate results. Using the battery model, the digital twin framework (136) determines an SOC of the battery based on the simulated battery characteristics information received from the data extraction system (130). The digital twin framework (136) then transmits the determined SOC of the battery as ground truth information to the test system (102).
[0099] In certain embodiments, the battery SOC determination application (106N), whose functionality has to be tested, also independently determines the SOC of the battery upon receiving the simulated battery characteristics information from the data extraction system (130). The battery SOC determination application (106N) then transmits the determined SOC of the battery as output information to the test system (102). Subsequently, the test system (102) compares and identifies if the ground truth information including the SOC of the battery determined by the digital twin framework (136) matches with the output information including the SOC of the battery determined by the battery SOC determination application (106N). The test system (102) then identifies the battery SOC determination application (106N) to be functioning properly when the ground truth information determined by the digital twin framework (136) matches with the output information determined by the battery SOC determination application (106N).
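The comparison between the digital twin output and the application output could, for instance, be expressed as in the brief Python sketch below; the tolerance value and the function name are assumptions rather than details taken from the present disclosure.

    def verify_battery_soc(soc_digital_twin, soc_application, tolerance=0.5):
        """Treat the battery SOC determination application as functioning properly when its
        SOC estimate (in percent) agrees with the digital twin ground truth within a tolerance."""
        return abs(soc_digital_twin - soc_application) <= tolerance

    # Hypothetical values: digital twin reports 78.2 %, application under test reports 78.4 %.
    print(verify_battery_soc(78.2, 78.4))   # True -> functionality verified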
[00100] In certain embodiments, the test system (102) includes an associated dashboard (not shown) that displays, in real time, a count of test requirements traced to test cases, a number of test cases executed, a number of test cases passed, and a number of test cases failed. In one embodiment, the test system (102) adds all applications (106A-N) to be tested to the corresponding containers (114A-N) and tests functionalities of all those applications (106A-N) simultaneously. Alternatively, the test system (102) adds the applications (106A-N) to the containers (114A-N) sequentially, one by one. For example, the test system (102) first adds a first container including a first application to the main ECU (104) and tests functionalities of the first application before adding a second container including a second application to be tested to the main ECU (104). Also, the test system (102) monitors utilization of resources of the main ECU (104), such as central processing unit (CPU) utilization, bus utilization, and/or memory usage of the main ECU (104), when the test system (102) tests the functionalities of the first application. Further, the test system (102) transmits an alert message to the cloud server (120) and alerts software developers responsible for developing the first application when the resources of the main ECU (104) utilized during testing of the first application exceed corresponding predefined threshold values. Subsequently, the software developers may take necessary actions, such as modifying code associated with the first application, such that the consumption of the resources of the main ECU (104) remains within the corresponding predefined threshold values while testing functionalities of the first application.
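Resource monitoring of this kind can be sketched as follows, assuming the test host exposes its utilisation through the psutil library; the threshold values and the alert transport shown are placeholders.

    import psutil

    CPU_THRESHOLD_PERCENT = 80.0        # hypothetical predefined threshold values
    MEMORY_THRESHOLD_PERCENT = 75.0

    def check_ecu_resources(send_alert):
        """Sample CPU and memory utilisation and raise an alert when a threshold is exceeded."""
        cpu = psutil.cpu_percent(interval=1.0)
        memory = psutil.virtual_memory().percent
        if cpu > CPU_THRESHOLD_PERCENT or memory > MEMORY_THRESHOLD_PERCENT:
            send_alert("Resource usage exceeded during test: CPU %.1f%%, memory %.1f%%" % (cpu, memory))

    # Example usage with a placeholder alert mechanism (for instance, a message to the cloud server):
    # check_ecu_resources(send_alert=lambda message: print("ALERT:", message))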
[00101] In certain embodiments, the cloud server (120) provides only an updated version of an application that is successfully tested by the test system (102) as an OTA update directly to an SDV (202A). The cloud server (120) does not generate a software package by bundling the updated version of the application with other SDV applications that are not updated. Further, the cloud server (120) may not transmit such a software package including both updated and non-updated applications as the OTA update to the SDV (202A), thereby reducing application installation time and data consumption at the SDV (202A) end. For example, the cloud server (120) transmits an OTA update including a latest version of the containerized application (106A) successfully tested to the software defined vehicle (202A). Upon receiving the latest version of the containerized application (106A), the software defined vehicle (202A) updates only a previous version of the containerized application (106A) deployed in the container (114A) to the latest version based on the received OTA update without updating other containerized applications (106B-N) deployed in other containers (114B-N) of the main ECU (104) of the software defined vehicle (202A).
[00102] In certain other embodiments, the SDV (202A) described in the present disclosure includes a custom designed memory unit (142). The memory unit (142) includes two memory banks: an inactive memory bank (144), and an active memory bank (146) that is currently used by the SDV (202A). The active memory bank (146) initially includes one or more blocks related to all containerized applications (106A-N). The inactive memory bank (144) does not initially include any blocks related to any of the containerized applications (106A-N). When the cloud server (120) transmits an updated version of an application, for example, an updated version of the LDW application (106A), to the SDV (202A), the inactive memory bank (144) first receives blocks related to the updated version of the LDW application (106A).
[00103] Subsequently, the SDV (202A) reboots the operating systems (112A-N) of the associated main ECU (104), during which all blocks related to all the containerized applications (106B-N) except blocks related to the LDW application (106A) are copied from the active memory bank (146) to the inactive memory bank (144). Thus, the inactive memory bank (144) will include blocks related to the updated version of the LDW application (106A) and all other blocks copied from the active memory bank (146) after rebooting of the operating systems (112A-N) by the SDV (202A). In one embodiment, the SDV (202A) then identifies and uses the inactive memory bank (144), which includes blocks related to the updated version of the LDW application (106A) and all other blocks copied from the active memory bank (146), as a new active memory bank. Further, the SDV (202A) deactivates and identifies the active memory bank (146), from which blocks related to all the applications (106B-N) were previously copied, as a new inactive memory bank. The technical advantage provided by the double-bank memory unit (142) is that, when the updated version of the LDW application (106A) includes any anomaly and is not functioning properly, the SDV (202A) can easily roll back to a previous version of the LDW application (106A) by deactivating the new active memory bank and activating the new inactive memory bank, that is, by updating the new inactive memory bank back to the active memory bank (146).
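The bank-swap behaviour described above can be summarised in the simplified Python model below; the block-level copy, reboot, and activation mechanisms are hardware specific, so this is only an illustrative model of the logic rather than an implementation of the memory unit (142).

    def apply_ota_update(active_bank, inactive_bank, app_name, new_blocks):
        """Model of the double-bank update: stage the updated application in the inactive bank,
        copy every other application across, then swap which bank is treated as active."""
        inactive_bank[app_name] = new_blocks                  # stage the updated blocks
        for name, blocks in active_bank.items():              # copy the remaining blocks on reboot
            if name != app_name:
                inactive_bank[name] = blocks
        return inactive_bank, active_bank                     # new active bank, new inactive bank

    # Hypothetical example: the active bank holds LDW, HMI, and SOC applications; LDW is updated.
    active = {"LDW": "v1", "HMI": "v1", "SOC": "v1"}
    inactive = {}
    new_active, new_inactive = apply_ota_update(active, inactive, "LDW", "v2")
    print(new_active)     # {'LDW': 'v2', 'HMI': 'v1', 'SOC': 'v1'}
    # A rollback simply re-activates new_inactive, which still holds the previous LDW version.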
[00104] In certain embodiments, the SDV (202A) includes an application performance monitoring system (not shown) that continuously monitors if an application, deployed in the main ECU (104), is functioning properly. For example, the application performance monitoring system continuously monitors functioning of the LDW application (106A) to identify if the LDW application (106A) fails to generate a lane departure warning in a particular scenario in which the lane departure warning should have been generated by the LDW application (106A). Upon identifying such a failure scenario, the application performance monitoring system collects information such as vehicle information, one or more images of ambient environment, vehicle state data, and the SDV’s (202A) resource usage status at the time of failure of the LDW application (106A). The application performance monitoring system then transmits the collected information to the cloud server (120) such that the software developers can retest performance of the LDW application (106A) in the failure scenario and develop a suitable fix for the issue.
[00105] The test system (102) described in the present disclosure accurately and efficiently tests functionalities of the containerized applications (106A-N) to be deployed in the software defined vehicles (202A-N) though such testing is generally complex and involves execution of multiple test cases. Further, the test system (102) facilitates testing of the containerized applications (106A-N) early in their product development lifecycles using a simulated model of the main ECU (104). In addition, the test system (102) allows testing of an application effectively at various stages of its product development lifecycle to ensure any issues in the application are identified and resolved while the application is being developed.
[00106] For example, software developers generally develop a Simulink model of an application first in MATLAB before further developing such a model into a matured production grade containerized application. At a first stage of the product development lifecycle, the test system (102) automatically tests and identifies if such a Simulink model of the containerized application developed by the software developers includes any issues. At a second stage of the product development lifecycle, the test system (102) tests functionalities of the containerized application with production grade software code using virtualized operating systems (112A-N) of the main ECU (104). At a third stage of the product development lifecycle, the test system (102) tests functionalities of the production grade containerized application along with a production grade operating system using a virtualized hypervisor (110) of the main ECU (104).
[00107] At a fourth and final stage of the product development lifecycle, the test system (102) tests functionalities of the application container along with an operating system and the hypervisor (110) including actual binary files related to a target hardware in which the application container is ultimately to be deployed using the virtualized system-on-chip (108) of the main ECU (104). Thus, the test system (102) tests functionalities of the containerized application at various stages of its product development lifecycle and thereby allows developers to fix issues, if any, in the application then and there during the product development stage itself. As a result, the test system (102) allows the developers to quickly deploy the containerized application that is free from any major issues in real-world vehicles without any delay.
[00108] Although specific features of various embodiments of the present systems and methods may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments shown in the different figures.
[00109] While only certain features of the present systems and methods have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes.

LIST OF NUMERAL REFERENCES:

100 Test setup
102 Test system
104 Edge device
106A-N Containerized applications
108 System-on-chip
110 Hypervisor
112A-N Operating systems
114A-N Containers
116 ECU validation system
118 DevOps system
120 Cloud server
122 Communications link
124 Scenario generation system
126 Generative AI based prompt system
128 Scenario execution system
130 Data extraction system
132 Ground truth generation system
134 Web portal
136 Digital twin framework
138 Human-machine interface
139 Display unit
140 Imaging system
142 Memory unit
144 Inactive memory bank
146 Active memory bank
202 Software defined system
202A-N Software defined vehicles
300-314 Steps of a method for validating functionalities of components of a main ECU
400-408 Steps of a method for generating test cases
500-516 Steps of a method for testing functionalities of a containerized application
600 Simulated image
602 Simulated vehicle
604 Right lane
702 Visual warning

Claims:
We claim:

1. A test system (102), comprising:
an edge device (104) comprising a containerized application (106A), one or more of whose associated functionalities are to be tested by the test system (102);
a scenario generation system (124) that generates a plurality of test cases required for testing the functionalities of the containerized application (106A), wherein each of the generated test cases comprises one or more common test steps that are present in all of the generated test cases, one or more unique test steps that are unique to a respective test case selected from the plurality of test cases, and one or more corresponding metadata tags;
a cloud server (120) that generates a first command when all test steps in all the generated test cases are to be executed, and alternatively generates a second command different from the first command when only one or more selected test steps in each of the generated test cases are to be executed;
a scenario execution system (128) that is communicatively coupled to one or more of the edge device (104), the cloud server (120), and the scenario generation system (124), wherein the scenario execution system (128) tests the functionalities of the containerized application (106A) by executing all test steps in all the generated test cases upon receiving the first command from the cloud server (120), and
wherein the scenario execution system (128) tests the functionalities of the containerized application (106A) by identifying the one or more selected test steps to be executed in each of the generated test cases based on one or more of the corresponding metadata tags associated with the second command and by executing only the identified test steps.

2. The test system (102) claimed in claim 1, wherein the cloud server (120) deploys the containerized application (106A) that is successfully tested to a software defined system (202), wherein the software defined system (202) corresponds to one or more of a software defined vehicle (202A), a software defined network system, a mobile phone, a laptop, and a set-top-box.

3. The test system (102) as claimed in claim 1, wherein the edge device (104) corresponds to one of an actual main electronic control unit (104) and a virtual main electronic control unit (104) of a software defined vehicle (202A), wherein the edge device (104) comprises a set of components comprising one or more of a system-on-chip (108), a hypervisor (110), one or more operating systems (112A-N), and a plurality of containers (114A-N), wherein the containerized application (106A) corresponds to one of a lane departure warning application, a lane keep assist application, an intersection collision warning application, a vehicle dynamics controller application, a powertrain management application, an anti-lock braking application, a traction control application, a connected vehicle application, a green light optimal speed advisory application, an advanced driver assistance system application, and an electric vehicle related application to be deployed in the software defined vehicle (202A).

4. The test system (102) as claimed in claim 1, wherein the test system (102) is configured to one or more of:
test one or more functionalities of a simulated model of the containerized application (106A) using the edge device (104);
test the one or more functionalities of the containerized application (106A) of a production grade using one or more virtualized operating systems (112A-N) of the edge device (104);
test the one or more functionalities of the production grade containerized application (106A) and one or more production grade operating systems using a virtualized hypervisor (110) of the edge device (104); and
test the one or more functionalities of the production grade containerized application (106A), the one or more production grade operating systems, and a hypervisor comprising binary files of the edge device (104) using a virtualized system-on-chip (108) of the edge device (104).

5. A method for testing functionalities of a containerized application (106A) using a test system (102), comprising:
generating a plurality of test cases required for testing the functionalities of the containerized application (106A) using a scenario generation system (124), wherein each of the generated test cases comprises one or more common test steps that are present in all of the generated test cases, one or more unique test steps that are unique to a respective test case selected from the plurality of test cases, and one or more corresponding metadata tags;
generating one of a first command when all test steps in all the generated test cases are to be executed, and alternatively a second command when only one or more selected test steps in each of the generated test cases are to be executed using a cloud server (120), wherein the first command is different from the second command;
executing all test steps in all the generated test cases upon receiving the first command from the cloud server (120) for testing the functionalities of the containerized application (106A); and
identifying the one or more selected test steps to be executed in each of the generated test cases based on one or more of the corresponding metadata tags associated with the second command and executing only the identified test steps for testing the functionalities of the containerized application (106A).

6. The method claimed in claim 5, wherein generating the plurality of test cases required for testing the functionalities of the containerized application (106A) comprises:
generating a test scenario comprising a simulated image of a vehicle using the scenario generation system (124);
transmitting the simulated image from the scenario generation system (124) to a container (114A) in which the containerized application (106A) is yet to be deployed using a data extraction system (130);
determining if the container (114A) has successfully received the simulated image from the data extraction system (130), an overall time taken for the container (114A) to receive the simulated image based on a time at which the simulated image is generated by the scenario generation system (124) and a time at which the simulated image is received by the container (114A), and a memory usage of the edge device (104) while receiving the simulated image from the data extraction system (130);
performing a first level validation and successfully validating that all components of the edge device (104) are functioning properly when the container (114A) is determined to have successfully received the simulated image from the data extraction system (130), the determined overall time taken is within a particular time limit, and the determined memory usage is within a designated threshold, wherein the components of the edge device (104) comprise one or more of a system-on-chip (108), a hypervisor (110), one or more operating systems (112A-N), and a plurality of containers (114A-N);
generating a warning message post successfully performing the first level validation using the container (114A), and transmitting the warning message from the container (114A) to another container (114B) via a communications link (122);
identifying if the another container (114B) has successfully received the warning message from the container (114A); and
performing a second level validation and successfully validating that all the components of the edge device (104) are functioning properly when the another container (114B) is identified to have successfully received the warning message from the container (114A).

7. The method claimed in claim 6, wherein generating the plurality of test cases required for testing the functionalities of the containerized application (106A) comprises:
receiving an input prompt comprising a request to generate the plurality of test cases from the test system (102) by the scenario generation system (124), wherein the scenario generation system (124) corresponds to a generative artificial intelligence based system;
retrieving domain specific information required for generating the plurality of test cases from one or more automotive databases;
automatically generating the plurality of test cases required for testing functionalities of the containerized application (106A) based on the input prompt received from the test system (102) and the domain specific information retrieved from the one or more automotive databases; and
automatically generating a scenario generation file corresponding to each of the plurality of generated test cases, wherein the scenario generation file related to each of the plurality of test cases comprises information required for recreating a test scenario defined by a respective test case in a simulated environment.

8. The method as claimed in claim 7, wherein testing the functionalities of the containerized application (106A) comprises:
receiving a latest version of the containerized application (106A) to be tested from the cloud server (120) and deploying the latest version in the container (114A) post successfully testing the functionalities of the components of the edge device (104);
receiving a test case selected from the plurality of test cases and the scenario generation file corresponding to the selected test case from the scenario generation system (124) by a scenario execution system (128) in the test system (102), wherein the selected test case defines a test scenario in which a functionality of the containerized application (106A) has to be tested;
executing all test steps in the selected test case upon receiving the first command from the cloud server (120), alternatively executing only the one or more selected test steps in the selected test case upon receiving the second command from the cloud server (120); and
recreating the test scenario defined by the selected test case in a simulated environment using the scenario generation file corresponding to the selected test case.

9. The method as claimed in claim 8, wherein recreating the test scenario defined by the selected test case in the simulated environment comprises:
generating a simulated image of a vehicle deviating towards a specific lane by the scenario execution system (128);
extracting the simulated image, generated by the scenario execution system (128), using a data extraction system (130), and providing the simulated image as an input to one or more of a ground truth generation system (132) and the containerized application (106A) corresponding to a lane departure warning application (106A) to be tested;
identifying if the vehicle in the simulated image deviates towards the specific lane using the ground truth generation system (132) based on a location coordinate of the specific lane and a location coordinate of the vehicle, and generating a ground truth log file that specifies if the vehicle has deviated towards the specific lane;
processing the simulated image and identifying if the vehicle in the simulated image deviates towards the specific lane using the lane departure warning application (106A);
generating and transmitting a lane deviation message from the lane departure warning application (106A) to a human-machine interface application (106B) deployed in another container (114B) when the lane departure warning application (106A) identifies that the vehicle in the simulated image deviates towards the specific lane;
generating a lane departure visual warning by the human-machine interface application (106B) upon receiving the lane deviation message from the lane departure warning application (106A), and further displaying the generated lane departure visual warning in a display unit (139) of a human-machine interface (138);
capturing an image of the lane departure visual warning displayed on the display unit (139) of the human-machine interface (138) by an imaging system (140);
processing the captured image using the test system (102) and identifying a specific type of lane deviation of the vehicle, wherein the specific type of lane deviation corresponds to one of a right lane deviation, a left lane deviation and no lane deviation;
generating an output log file using the test system (102) based on the specific type of lane deviation identified from processing of the captured image; and
successfully verifying a lane departure warning generation functionality of the lane departure warning application (106A) when information in the generated ground truth log file matches with information in the generated output log file.

10. The method as claimed in claim 8, wherein recreating the test scenario defined by the selected test case in the simulated environment comprises:
simulating battery characteristics information comprising one or more of a simulated voltage, current, and temperature of a battery using the scenario execution system (128);
extracting the simulated battery characteristics information from the scenario execution system (128) using a data extraction system (130), and providing the simulated battery characteristics information as an input to both a ground truth generation system (132) and the containerized application (106A) that corresponds to a battery state-of-charge (SOC) determination application (106A) to be tested;
determining ground truth information comprising a state-of-charge of the battery based on the received simulated battery characteristics information using a digital twin framework (136) residing in the cloud server (120);
determining output information comprising the state-of-charge of the battery based on the received simulated battery characteristics information using the battery SOC determination application (106A); and
successfully verifying a functionality of the battery SOC determination application (106A) when the ground truth information comprising the state-of-charge of the battery determined using the digital twin framework (136) matches with the output information comprising the state-of-charge of the battery determined using the battery SOC determination application (106A).

11. The method as claimed in claim 9, wherein testing the functionalities of the containerized application (106A) comprises deploying the latest version of the containerized application (106A) whose associated functionalities are successfully tested by the test system (102) from the cloud server (120) to a software defined system (202), wherein the software defined system (202) corresponds to one or more of a software defined vehicle (202A), a software defined network system, a mobile phone, a laptop, and a set-top-box.

12. The method as claimed in claim 11, wherein deploying the latest version of the containerized application (106A) from the cloud server (120) to the software defined system (202) comprises:
receiving an over-the-air update comprising the latest version of the containerized application (106A) successfully tested from the cloud server (120) by the software defined system (202), wherein the software defined system (202) comprises a plurality of associated containers (114A-N) comprising a plurality of containerized applications (106A-N); and
updating only a previous version of the containerized application (106A) deployed in a particular container (114A) of the software defined system (202) to the latest version based on the received over-the-air update without updating other containerized applications (106B-N) selected from the containerized applications (106A-N) deployed in other containers (114B-N) selected from the containers (114A-N).

13. The method as claimed in claim 12, wherein transmitting the latest version of the containerized application (106A) from the cloud server (120) to the software defined system (202) comprises:
dividing a memory unit (142) of the software defined system (202) into an inactive memory bank (144) and an active memory bank (146), wherein the inactive memory bank (144) comprises no information related to the containerized applications (106A-N) of the software defined system (202) initially, wherein the active memory bank (146) comprises one or more blocks related to all of the containerized applications (106A-N) of the software defined system (202) initially;
receiving blocks related to the latest version of the containerized application (106A) transmitted from the cloud server (120) to the software defined system (202) by the inactive memory bank (144);
rebooting one or more operating systems of the software defined system (202) upon receiving the latest version of the containerized application (106A) from the cloud server (120);
copying the one or more blocks related to all of the containerized applications (106B-N) except the blocks related to the containerized application (106A) from the active memory bank (146) to the inactive memory bank (144) during rebooting of the one or more operating systems of the software defined system (202);
using the inactive memory bank (144) comprising the blocks related to the latest version of the containerized application (106A) and the blocks related to the containerized applications (106B-N) copied from the active memory bank (146) as a new active memory bank;
deactivating the active memory bank (146) to a new inactive memory bank; and
updating the new inactive memory bank back to the active memory bank (146) when the blocks related to the latest version of the containerized application (106A) in the new active memory bank are not functioning properly.

14. The method as claimed in claim 5, wherein testing the functionalities of the containerized application (106A) using the test system (102) comprises one or more of:
testing one or more functionalities of a simulated model of the containerized application (106A) using an edge device (104);
testing the one or more functionalities of the containerized application (106A) of a production grade using one or more virtualized operating systems (112A-N) of the edge device (104);
testing the one or more functionalities of the production grade containerized application (106A) and one or more production grade operating systems using a virtualized hypervisor (110) of the edge device (104); and
testing the one or more functionalities of the production grade containerized application (106A), the one or more production grade operating systems, and a hypervisor comprising binary files of the edge device (104) using a virtualized system-on-chip (108) of the edge device (104).

Documents

Application Documents

# Name Date
1 202441072148-POWER OF AUTHORITY [24-09-2024(online)].pdf 2024-09-24
2 202441072148-FORM-9 [24-09-2024(online)].pdf 2024-09-24
3 202441072148-FORM 3 [24-09-2024(online)].pdf 2024-09-24
4 202441072148-FORM 18 [24-09-2024(online)].pdf 2024-09-24
5 202441072148-FORM 1 [24-09-2024(online)].pdf 2024-09-24
6 202441072148-FIGURE OF ABSTRACT [24-09-2024(online)].pdf 2024-09-24
7 202441072148-DRAWINGS [24-09-2024(online)].pdf 2024-09-24
8 202441072148-COMPLETE SPECIFICATION [24-09-2024(online)].pdf 2024-09-24
9 202441072148-FORM-26 [01-10-2024(online)].pdf 2024-10-01