
System And Method For Validating Functionalities Of An Embedded System And One Or More Sensors Thereof

Abstract: A system (100) and method for validating functionalities of a DUT (103) with an apparatus (101) is provided. The apparatus (101) moves an object, (102) accurately across various dimensions, velocity and trajectories for emulating testing scenarios under which functionalities of the device under test (103) is tested. The apparatus (101) includes first set of axial guiding structures (110,120), second set of axial guiding structures (133,143) and a fifth guiding structure (153). First axial moving assemblies (113,123) are moveably secured to the first set of axial guiding structures (110,120) for moving the object (102) along a first axis (115). Second axial moving assemblies (136,146) are moveably secured to the second set of axial guiding structures (133,143) for moving the object (102) along a second axis (138). A fifth moving assembly (154) is moveably secured to the fifth guiding structure (153) for moving the object (102) along a third axis (156).


Patent Information

Application #
Filing Date
23 August 2016
Publication Number
09/2018
Publication Type
INA
Invention Field
PHYSICS
Status
Email
shery.nair@tataelxsi.co.in
Parent Application
Patent Number
Legal Status
Grant Date
2022-02-02
Renewal Date

Applicants

TATA ELXSI LIMITED
ITPB Road, Whitefield, Bangalore

Inventors

1. JIHAS KHAN
TATA ELXSI LIMITED ITPB Road, Whitefield, Bangalore – 560048

Specification

BACKGROUND

[0001] Embodiments of the present specification relate generally to emulators. More particularly, the present specification relates to a system and method for automatic verification and validation of a Device Under Test (DUT), which includes an embedded system and one or more sensors, by emulating associated functionalities using an efficient three-dimensional (3D) positioning system.
[0002] Techniques for efficiently detecting and tracking stationary or moving objects find use in a wide range of applications. For example, these techniques may be used in military and defense, for detecting flying objects, for nautical surveillance and for missile guidance. In the automotive industry, these techniques may be used for detecting nearby vehicles, passengers, traffic, and physical objects in the locality of a vehicle. In the avionics industry, these techniques may be used for detecting nearby airplanes and flying creatures. In the marine industry, these techniques may be used for detecting objects under water. In addition, in the aerospace industry, these techniques may be used for detecting space debris, space vehicles, and satellites.
[0003] Generally, such detection and tracking functionality may be achieved using different types of sensors. For example, one way of detecting and tracking an object includes using one or more laser sensors that emit signals towards the object and receive a feedback signal from a surface of the object. Generally, the type of sensor may vary based on the context, the distance to be covered, the nature of the environment, the noise involved, and the type of application. Additionally, the sensors may be connected to an embedded system that is configured to process the feedback signal received by the sensor from the surface of the object. Particularly, the embedded system may be configured to process the feedback signal, for example, to determine a location, calculate velocity and acceleration, and/or to determine a direction of travel and/or shape of the object. In addition to object detection and tracking, the embedded system may also be capable of providing further functionality to an associated system. For example, an embedded system used in an automobile may be configured to provide Adaptive Cruise Control (ACC) functionality by controlling various vehicle components to automatically and continually maintain a safe distance (or a range of velocity) from a leading vehicle.
[0004] As decisions made by such embedded systems and associated sensors impact human lives, accurate and consistent performance of the embedded systems is a safety-critical aspect. Hence, every function of the embedded system and the associated sensors needs to be thoroughly validated before deployment in the associated system.
[0005] Unfortunately, in existing approaches, a validation study of the sensors and the embedded system is performed only after associated hardware, namely a real object at which the sensors and the embedded system are to be placed and a target object that is to be detected and tracked, is available for production. For example, in the avionics industry, a RADAR target detection system is typically tested only after the whole airplane is ready. Specifically, the airplane with the RADAR target detection system is taken for test flights for validating a RADAR sensor and an associated embedded system. However, if any issue is found with the sensors and the embedded system at this stage, significant effort, cost, and time are needed to fix the issue. In addition, it is difficult to generate all test scenarios when testing the sensors and the embedded system using the real object. Hence, it may be preferable to validate functionality of a DUT that includes the sensors, the embedded system, and associated software early during an embedded product development lifecycle without requiring the real object in place for the validation study.
[0006] Another validation approach obviates a need for the actual target object by validating the sensors and the embedded system using a target simulator. The target simulator creates a virtual target object that is to be detected and tracked. To that end, the target simulator is placed adjacent to the sensors and the embedded system to receive signals emanating from the sensors. The target simulator then applies suitable delay and signal processing to the received signals before transmitting the processed signals back to the DUT to mimic presence of the target object at a user-programmed distance. Particularly, the target simulator may be configured to process the signals and subsequently transmit the processed signals such that the sensors and the embedded system in the DUT receive signals indicating presence of the target object at the specified distance. Additionally, certain target simulators may also be capable of simulating movement of the target object with a user-specified velocity.
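By way of illustration, the distance emulation described above amounts to adding a round-trip propagation delay to the received signal. A minimal Python sketch of that relationship is given below; the constant name, function name, and free-space assumption are illustrative only and are not taken from any particular target simulator.

    # Illustrative only: round-trip delay a target simulator would apply to
    # mimic a target at a user-programmed distance (free-space propagation).
    SPEED_OF_LIGHT_M_S = 299_792_458.0

    def round_trip_delay_s(programmed_distance_m):
        return 2.0 * programmed_distance_m / SPEED_OF_LIGHT_M_S

    # Example: a target emulated at 50 m requires roughly 0.33 microseconds of delay.
    print(round_trip_delay_s(50.0))  # ~3.34e-07 s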
[0007] However, such target simulators are very expensive, thereby limiting their application and usage. Further, simulation of the virtual target object entails complicated signal processing. Typically, even a small mistake in processing the signals causes the sensors and the embedded system to behave in an unpredictable manner. In addition, currently available target simulators are specifically configured for a particular type of sensor. Accordingly, the same target simulator cannot be used for different types of sensors. For example, variants of RADAR sensors available in the market include Frequency Modulated Continuous Wave (FMCW) RADAR, Pulsed RADAR, and Ultra Wide Band (UWB) RADAR. Each of these variants employs a specific target simulator, as the required signal processing is different for each type of sensor, thus limiting the reusability and scalability of the currently available target simulators.
[0008] Another conventional approach for validating the sensors and the embedded system uses motion devices that are capable of emulating 3D motion of the target object. For example, 3D motion may be emulated using motion devices that are capable of moving dummy target objects placed on the motion devices for emulating movements of the target object. However, systems and methods that use such conventional approaches require real objects having sensors and an embedded system for detecting and tracking the movement of the dummy target objects. As previously noted, testing the sensors and the embedded system in the presence of the real objects has many issues. In addition, these validation approaches have limited applicability, are restricted to a limited number of testing scenarios, and thus may not be able to recreate various real-world scenarios with great accuracy.
[0009] By way of example, a European published patent reference EP2228781A2 describes a motion device that includes four ropes, each having an electrically controlled winding and unwinding device for winding and unwinding each of the four ropes. A test body is fastened at a holding device, and connects the four ropes together. A control device is provided for controlling the winding and unwinding device for each of the ropes. However, with this particular device, the stability, accuracy, and precision involved in moving the test body using the ropes may be unreliable, thereby limiting its applicability, especially in areas where detecting a target object accurately and precisely is of utmost importance.
[0010] Additionally, the German published patent reference DE102007035474A1 describes another motion device used for testing a driver assistance system. This driver assistance system is configured for detecting information about the vehicle environment and/or for processing information about the vehicle environment received from another system. The motion device includes a rail device that supports a dummy object that can be moved to simulate a real traffic situation on the test track. However, the rail device allows only one-dimensional movement of the dummy object, and there is no option for automated movement in the other two dimensions.
[0011] Further, the European patent reference EP2781904A1 describes a test apparatus having a test piece, which is connected with a sliding carriage by a suspension, where a rail system is movable relative to a track of a vehicle along a guide rail. The test piece is connected to the sliding carriage via the suspension in such a way that the test object is approachable from the vehicle. A support structure and the guide rail in the test apparatus include beam deflector plates for reducing the radar cross-section of the test apparatus for radar radiation. However, the suspension with the sliding carriage is capable of moving only along the guide rail, thus restricting the motion of the test object to only one dimension.
[0012] Accordingly, there remains a need for an efficient device for moving an object automatically with any cross section, across any dimensions and trajectories, with any velocity for emulating various testing conditions. Additionally, there is a need for a system and method for validating various functions of an embedded system and associated sensors automatically without requiring presence of a real object.

BRIEF DESCRIPTION

[0013] According to an exemplary aspect of the present specification, an apparatus for moving an object is provided. The apparatus includes a first set of axial guiding structures, a second set of axial guiding structures, and a fifth guiding structure that are all positioned axially with respect to each other. The fifth guiding structure supports the object. The apparatus further includes a first set of axial moving assemblies, a second set of axial moving assemblies, and a fifth moving assembly. The first set of axial moving assemblies is moveably secured to the first set of axial guiding structures and is adapted to move the second set of axial guiding structures, the fifth guiding structure, and the object along a first axis. Each of the first set of axial guiding structures is stationary and is positioned at a specified distance from the other axial guiding structure in the first set of axial guiding structures. The second set of axial moving assemblies is moveably secured to the second set of axial guiding structures and is adapted to move the fifth guiding structure and the object along a second axis. The fifth moving assembly is moveably secured to the fifth guiding structure and is adapted to support the object. The fifth moving assembly is capable of moving orthogonally with respect to the first set of axial guiding structures and the second set of axial guiding structures for moving the object along a third axis.
[0014] According to another exemplary aspect of the present specification, a system for validating functionality of a device under test is provided. The system includes a first set of axial guiding structures, a second set of axial guiding structures, and a fifth guiding structure that are all positioned axially with respect to each other. Each of the first set of axial guiding structures is stationary and is positioned at a specified distance from the other axial guiding structure in the first set of axial guiding structures. The fifth guiding structure supports a pseudo object. The system further includes one or more axial moving assemblies; one or more sensors and an embedded system that are placed at a designated distance from the pseudo object; and an embedded simulator. The one or more axial moving assemblies are moveably secured to each of the first set of axial guiding structures, the second set of axial guiding structures, and the fifth guiding structure for moving the pseudo object along one or more axes. The embedded simulator is configured to simulate one or more operating parameters associated with a virtual object. The one or more operating parameters are provided as an input to the embedded system. The virtual object represents a model of a real object at which the one or more sensors and the embedded system are adapted to be placed. The embedded simulator subsequently adjusts the one or more operating parameters associated with the virtual object to one or more specified values based on one or more parameters associated with the pseudo object to obtain one or more updated operating parameters. The embedded simulator then validates at least one functionality of the one or more sensors and the embedded system based on the one or more updated operating parameters.
[0015] According to yet another exemplary aspect of the present specification, a method for validating a device under test is provided. The method includes the steps of placing a pseudo object on an apparatus at a specified distance from the device under test. The device under test includes one or more sensors and an embedded system that are communicatively coupled to an embedded simulator. One or more operating parameters associated with a virtual object are simulated using the embedded simulator and are provided as an input to the embedded system. The pseudo object is placed outside a designated area surrounding the device under test such that the one or more sensors fail to detect a presence of the pseudo object. The pseudo object is moved by the embedded simulator within the designated area surrounding the device under test until the one or more sensors detect the presence of the pseudo object based on a reflected feedback signal received by the one or more sensors from the pseudo object. The embedded system determines one or more parameters associated with the pseudo object based on the reflected feedback signal. The one or more operating parameters associated with the virtual object are adjusted, using the embedded simulator, based on the one or more parameters determined by the embedded system to validate at least one functionality associated with the one or more sensors and the embedded system. The embedded simulator also controls motion of the pseudo object along different axes for realizing different test scenarios. Based on one or more parameters of the virtual object, the motion of the pseudo object is controlled by the embedded simulator using a closed-loop control for recreating different real-world test scenarios.

DRAWINGS

[0016] These and other features, aspects, and advantages of the claimed subject matter will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0017] FIG. 1 is a schematic diagram illustrating an exemplary system that efficiently emulates 3D motion of a pseudo object for automatic validation of various functions of an embedded system and associated sensors in a DUT;
[0018] FIG. 2 is a flow diagram illustrating a method for validating functionalities associated with the DUT of FIG. 1;
[0019] FIG. 3 is a flow diagram illustrating a method for validating the Adaptive Cruise Control (ACC) functionality implemented by the DUT in a first vehicle with respect to a second vehicle;
[0020] FIG. 4 is a schematic view of a test system for automatically validating a device under test by emulating scenarios related to a parking slot detector, a park assistance, and a distance alert system using one or more ultrasonic sensors; and
[0021] FIG. 5 is a schematic diagram illustrating a test system for validating a sensor fusion system having more than one type of sensor and an embedded system for executing associated functionalities.

DETAILED DESCRIPTION

[0022] The following description presents exemplary systems and methods including an apparatus for moving an object with respect to a device under test across various dimensions, trajectories, and velocities to emulate various testing scenarios. The present systems and methods are further configured for validating various functions of the device under test including detecting and tracking the object by emulating a plurality of real-world scenarios for testing and validation without need for a real object or a target object in place.
[0023] The term “object,” used in the various embodiments described herein, broadly refers to any physical thing that is perceptible by vision or touch. The term “device under test,” used in the various embodiments described herein, broadly refers to one or more sensors, an embedded system, and/or one or more associated algorithms or software routines that configure the device under test to detect and track objects in the surroundings, and to execute various functionalities. The term “real object,” used in the various embodiments described herein, broadly refers to a physical thing (e.g., an automobile, an aircraft, a watercraft, a spacecraft, military vehicles) within which the device under test is placed in order to detect and track objects and execute various functionalities.
[0024] The term “virtual object,” used in the various embodiments described herein, broadly refers to a simulated model of the real object (e.g., an automobile, an aircraft, a watercraft, a spacecraft, and military vehicles) that enables testing of the device under test without requiring the real object in place. The term “target object,” used in the various embodiments described herein, broadly refers to a physical thing (e.g., nearby vehicles, pedestrians, poles, flying objects, missiles, rocks, snow) that is detected and/or tracked by the device under test placed within the real object. Further, the term “pseudo object,” used in the various embodiments described herein, broadly refers to a dummy model of the target object that is detected and/or tracked by the device under test. An exemplary system that emulates the real object and the target object using the virtual object and the pseudo object to test for various functionalities of the device under test is described in greater detail with reference to FIG. 1.
[0025] FIG. 1 is a schematic diagram illustrating a test system 100 for automatic validation of various functionalities of a device under test (DUT). It may be noted that different embodiments of the test system 100 may be used for automatic verification and validation of functionalities of embedded systems and associated sensors used in devices including, but not limited to, an automotive vehicle, an aircraft, a watercraft, a consumer electronic device, a motion detector, a satellite, an air traffic controller, a traffic monitoring system, a military device, a defense application module, an industrial device, a non-destructive testing device, an intrusion sensing system, an obstacle detection system, an obstacle tracking system, and a spacecraft. However, for clarity of explanation, the test system 100 will be described herein with reference to an automatic verification and validation system that is configured to validate various functionalities of an embedded system and associated sensors within a first vehicle in presence of at least a second vehicle. For example, the test system 100 used for testing functionalities of an embedded system and associated sensors that are to be placed in the first vehicle for operating the first vehicle in an Adaptive Cruise Control (ACC) mode in presence of the second vehicle is described in subsequent sections. Here, the first vehicle is considered as a real object, the second vehicle is considered as a target object, and the embedded system along with the sensors implementing ACC is considered as the device under test. Although the test system 100 is described herein with reference to automotive applications as an example of its potential and working capabilities, this does not mean that the test system 100 can be applied only to automotive applications.
[0026] Accordingly, in one embodiment, the test system 100 includes an apparatus 101 for moving an object 102 across one or more axes or in a specific trajectory and velocity. Structure of the apparatus 101 will be described in greater detail in the following paragraphs. In one embodiment, the object 102 is a dummy model that mimics or represents the target object (e.g., second vehicle) to be detected and tracked by the device under test 103 placed in the first vehicle in a real-world scenario. The dummy model has physical characteristics such as a shape, dimensions, a material composition, and a surface profile that are similar to the target object to be detected. For example, the dummy model has physical characteristics that are similar to the second vehicle. Additionally, the physical characteristics of the dummy model are selected such that properties of feedback signals that are reflected from a surface of the dummy model are identical to properties of feedback signals reflected from a surface of the target object (e.g., the second vehicle) in a real-world scenario.
[0027] Additionally, the test system 100 includes a device under test 103, one or more functionalities of which are to be tested. In certain embodiments, the device under test 103 includes one or more sensors 103A and an embedded system 103B having one or more associated algorithms to detect the feedback signals reflected back from the object 102 for tracking the object 102, for determining one or more parameters of the object 102, and for implementing specific functional requirements. Examples of the sensors 103A associated with the device under test 103 include, but are not limited to, a RADAR sensor, an ultrasonic sensor, a LIDAR sensor, and an infrared sensor. For a particular validation study, the device under test 103 is placed at a designated distance in front of the object 102. The apparatus 101 moves the object 102 across one or more axes simultaneously or in a specific trajectory and velocity in front of the device under test 103 for emulating a motion of the target object (e.g., the second vehicle) moving ahead of the real object (e.g., the first vehicle) in a real-world scenario.
[0028] For example, the real-world scenario of the first vehicle following the second vehicle is emulated by placing the object 102 in front of the sensors 103A and the embedded system 103B on the apparatus 101, and by simulating a speed of the first vehicle using a virtual object. Here, the virtual object is a simulated model of the first vehicle. The sensors 103A and the embedded system 103B receive the simulated speed of the virtual object and determine that they are placed in the first vehicle, which is moving at the simulated speed. The object 102 is subsequently moved by an embedded simulator into an area of coverage of the sensors 103A such that the sensors 103A and the embedded system 103B determine a presence of the object 102. Further, the object 102 is moved across one or more axes by the embedded simulator in front of the sensors 103A and the embedded system 103B using the apparatus 101 for emulating the motion of the second vehicle in front of the first vehicle. In one embodiment, the one or more sensors 103A and the embedded system 103B detect and track the object 102 while stationary or moving, and determine one or more parameters associated with the object 102, which may be correlated to the motion of the target object (e.g., the second vehicle). For example, the sensors 103A and the embedded system 103B determine a distance between the object 102 and the device under test 103 and a velocity of the object 102, which may be correlated to the distance between the first vehicle and the second vehicle and the velocity of the second vehicle, respectively, in a real-world scenario. Particularly, the one or more sensors 103A and the embedded system 103B are made to track the object 102 in various test scenarios to test a performance of the device under test 103 in response to the emulated motion of the target object.
[0029] The test system 100 further includes an embedded simulator 104 that simulates scenarios that mimic dynamics of the real object (e.g., the first vehicle) in a real-world scenario, controls the motion of the object 102, and enables automatic verification and validation of the expected performance of the device under test 103 in response to each of the simulated scenarios. For example, the embedded simulator 104 may simulate an Adaptive Cruise Control (ACC) functionality in the first vehicle by simulating a presence of the first vehicle, a motion, a direction of travel, a velocity of motion, engine parameters, and brake information associated with the first vehicle, a driver behavior, etc. The embedded simulator 104 enables automatic validation of the device under test 103 that is adapted to detect and track the second vehicle that is moving ahead of the first vehicle. The device under test 103 is further configured to regulate a speed of the first vehicle with respect to a speed of the second vehicle for maintaining a minimum distance between these vehicles, and to regulate a velocity of the first vehicle based on a velocity of the second vehicle, as described in greater detail with reference to FIG. 3. Thus, the test system 100 eliminates a requirement of the real object in place for testing the device under test 103.
[0030] In certain embodiments, the embedded simulator 104 is communicatively coupled to the device under test 103 bi-directionally. The bi-directional coupling allows the embedded simulator 104 to test the Adaptive Cruise Control (ACC) functionality of the device under test 103. For example, the bi-directional coupling allows the embedded simulator 104 to read the distance and the velocity values associated with the object 102, which are calculated by the device under test 103. When the embedded system 103B finds that the distance between the object 102 and the device under test 103 is less than a minimum distance to be maintained between the first vehicle and the second vehicle, the embedded system 103B generates a request. Consequently, the embedded simulator 104 receives the request from the embedded system 103B and reduces the speed of the virtual object to a designated speed value. In the real-world scenario, reducing the speed of the first vehicle causes an increase in the distance between the first vehicle and the second vehicle if the second vehicle is still moving at the same speed or a higher speed. Increasing the distance between the first vehicle and the second vehicle is emulated by moving the pseudo object 102 appropriately away from the device under test 103. The functionality of the sensors 103A and the embedded system 103B is thus validated by verifying that the computed designated speed value is equal to a specified expected value. Similarly, when the embedded system 103B finds that the object 102 is currently moving at a reduced velocity compared to an initial velocity, the embedded system 103B generates a request. The embedded simulator 104 receives the request and reduces the velocity of the virtual object in order to match the reduced velocity of the object 102. In the real-world scenario, reducing the velocity of the first vehicle to the velocity of the second vehicle causes both vehicles to move at the same velocity. The functionality of the sensors 103A and the embedded system 103B is thus validated by verifying whether the embedded simulator 104 reduces or increases the velocity of the virtual object based on the velocity of the object 102 reported by the device under test 103. The distance and the velocity of the pseudo object 102 computed by the device under test 103 are also validated by the embedded simulator 104 for accuracy and precision.
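A minimal Python sketch of this closed-loop ACC check is given below. The helper objects dut and simulator, their method names, and the pass criterion are assumptions introduced only to illustrate the flow described above, not interfaces disclosed herein.

    def acc_validation_step(dut, simulator, min_distance_m, tolerance_kmph=1.0):
        # Values computed by the device under test from the reflected signal.
        distance_m, object_velocity_kmph = dut.read_distance_and_velocity()
        if distance_m < min_distance_m:
            # The DUT requests a speed reduction; the simulator slows the
            # virtual first vehicle towards the velocity reported for the object.
            commanded_kmph = simulator.set_virtual_object_speed(object_velocity_kmph)
            # The widening gap is recreated by moving the pseudo object away.
            simulator.move_pseudo_object_away(min_distance_m - distance_m)
            # Pass if the commanded speed matches the expected designated value.
            return abs(commanded_kmph - object_velocity_kmph) <= tolerance_kmph
        return True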
[0031] Although the present embodiment describes validating the ACC functionality, the test system 100 may be used to validate various other functionalities of the embedded system 103B, such as a car ECU, via suitable control and configuration of the apparatus 101. As previously noted, the apparatus 101 may be configured to move the object 102 in front of the device under test 103 at a specific speed or a specific velocity across one or more axes simultaneously or in a specific trajectory to emulate a desired position, speed, and direction of movement of the object 102 in a real-world scenario. To that end, the apparatus 101 includes one or more structural assemblies including a first assembly 105, a second assembly 106, a third assembly 107, a fourth assembly 108, and a fifth assembly 109 for moving the object 102 across various axes and trajectories. Additionally, in one embodiment, the apparatus 101, including all the assemblies 105, 106, 107, 108, and 109, is made to absorb sensor signals such that the sensors 103A receive reflected feedback signals only from the object 102 and not the apparatus 101. This is achieved by covering the assemblies 105-109 of the apparatus 101 with a sensor signal absorbent material (e.g., RF-absorbing foam in case a RADAR sensor is used to detect and track the object 102). The apparatus 101, the object 102, and the device under test 103 are enclosed within a chamber (not shown in FIG. 1) that absorbs all sensor signals such that the sensors 103A receive reflected feedback signals only from the object 102 and not from the external environment, including walls, persons standing near the test space, the embedded simulator, wires, and the like.
[0032] Each of the structural assemblies 105-109 of the apparatus 101 includes a guiding structure, a moving assembly, and at least one actuator. In one embodiment, the guiding structure is an aluminum extrusion profile having a slot that extends approximately over the entire length of the guiding structure. The moving assembly is a metal plate having an upper flat surface and a bottom-elongated surface. In certain embodiments, the moving assembly uses the guiding structure to move along a predetermined axis. Particularly, the bottom-elongated surface of the moving assembly is moveably coupled to the slot of the guiding structure in order to move the moving assembly through the guiding structure along the predetermined axis. To that end, at least one actuator is coupled to each of the structural assemblies at one end of the guiding structure for driving the moving assembly along the predetermined axis. In one embodiment, the actuator is a brushed DC motor having a position feedback sensor (e.g., a hall sensor). In certain embodiments, the apparatus 101 utilizes another type of actuator instead of the brushed DC motor, for example an actuator selected from the group including a brushless DC motor, a permanent magnet synchronous motor, an alternating current (AC) motor, a stepper motor, and an induction motor, with slight modifications to the apparatus 101 setup. The motor facilitates movement of the moving assembly through the guiding structure, and controls a position, a speed, and a velocity associated with the moving assembly. The motor includes a motor shaft (not shown in FIG. 1) that utilizes a circular thread-like metal rod for driving the moving assembly through the guiding structure. Additionally, the motor includes a position identifying sensor, for example a hall sensor, for identifying a present position associated with the motor.
[0033] In certain embodiments, the motor further includes four pins including a motor control plus pin, a motor control minus pin, a position sensor plus pin, and a position sensor minus pin. A high voltage at the motor control plus pin and a low voltage at the motor control minus pin make the motor rotate in a clockwise direction. Alternatively, a low voltage at the motor control plus pin and a high voltage at the motor control minus pin make the motor rotate in an anti-clockwise direction. In addition, when the difference in voltage applied across the motor control plus pin and the motor control minus pin is equal to the rated motor voltage, the motor runs at a maximum supported Revolutions Per Minute (RPM). Similarly, when the difference in voltage applied across the motor control plus pin and the motor control minus pin is less than the rated motor voltage, the motor runs at lower RPM values. Thus, the motor can be made to rotate at any desired RPM value less than the maximum supported RPM value based on the voltages fed at the motor control plus pin and the motor control minus pin. In addition, the direction and the speed of the moving assembly that moves through the guiding structure are controlled by applying suitable voltages at the motor control plus pin and the motor control minus pin. Furthermore, the position sensor of the motor outputs motor position data through the position sensor plus pin and the position sensor minus pin, and a current position of the motor can be decoded based on the received outputs to validate whether the motor has moved to a desired position. Thus, the structural assemblies 105-109 include a plurality of guiding structures, moving assemblies, and actuators for enabling motion of the object 102 at a specific speed or a specific velocity across one or more axes simultaneously or in a specific trajectory. Each of the structural assemblies 105-109 of the apparatus 101 is described in detail in subsequent paragraphs.
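The mapping from a desired rotation to the two motor control pin voltages described above can be sketched as follows; the linear scaling between the pin voltage difference and RPM, and the function name, are assumptions for illustration.

    def motor_pin_voltages(desired_rpm, max_rpm, rated_voltage):
        # Positive desired_rpm -> clockwise, negative -> anti-clockwise.
        magnitude = min(abs(desired_rpm), max_rpm) / max_rpm * rated_voltage
        if desired_rpm >= 0:
            return magnitude, 0.0   # high on the plus pin, low on the minus pin
        return 0.0, magnitude       # reversed polarity for anti-clockwise rotation

    # Example: half of a 3000 RPM maximum on a 12 V rated motor.
    print(motor_pin_voltages(1500, 3000, 12.0))   # (6.0, 0.0)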
[0034] The first assembly 105 includes a first guiding structure 110 having a first end 111 and a second end 112, and a first moving assembly 113. Additionally, the first assembly 105 includes a first actuator 114 for moving the first moving assembly 113 through the first guiding structure 110 along a first axis 115. In one embodiment, the first actuator 114 is a brushed DC motor, and is coupled to the first end 111 of the first guiding structure 110. In one embodiment, the first actuator 114 in the first assembly 105 includes a motor control plus pin 116, a motor control minus pin 117, a position sensor plus pin 118, and a position sensor minus pin 119. In certain embodiments, the first assembly 105 is placed in parallel and at a specified distance with respect to the second assembly 106. In an embodiment, the specified distance between the assemblies 105-106 is determined based on a testing application, or more specifically based on a range of distance over which the object 102 is to be moved along a specific axis.
[0035] Similarly, the second assembly 106 includes a second guiding structure 120 having a first end 121 and a second end 122, a second moving assembly 123, and a second actuator 124 for moving the second moving assembly 123 through the second guiding structure 120 along the first axis 115. The second actuator 124 includes a motor control plus pin 125, a motor control minus pin 126, a position sensor plus pin 127, and a position sensor minus pin 128. The first guiding structure 110 and the second guiding structure 120 are referred to herein as a first set of axial guiding structures. The first moving assembly 113 and the second moving assembly 123 are referred to herein as a first set of axial moving assemblies, and the first actuator 114 and the second actuator 124 are referred to herein as a first set of axial actuators. In one embodiment, the first set of axial guiding structures 110 and 120 are stationary and act as a stable base structure for the apparatus 101 and, in turn, the object 102. In one embodiment, the object 102 is safely fixed on a fifth moving assembly (described in greater detail in the following sections) using one or more screws.
[0036] The first set of axial guiding structures 110 and 120, the first set of axial moving assemblies 113 and 123 and the first set of axial actuators 114 and 124 are adapted to move the object 102 placed on the apparatus 101 along the first axis 115 for emulating motion of the object 102 in a real-world scenario. The object 102, which is supported on the apparatus 101, is moved along the first axis 115 by feeding the same motor control signals to the first set of axial actuators 114 and 124 from the embedded simulator 104. The first set of axial actuators 114 and 124 move the first set of axial moving assemblies 113 and 123 in unison through the first set of axial guiding structures 110 and 120. In addition, the first set of axial actuators 114 and 124 move the object 102 along the first axis 115 at a desired speed and velocity as inputted and/or controlled by the embedded simulator 104 to simulate a selected real-world driving scenario.
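As a brief sketch of the unison drive just described, assuming actuator objects that expose a set_pin_voltages interface (an assumption, not a disclosed API):

    def move_along_first_axis(actuator_114, actuator_124, v_plus, v_minus):
        # Feeding identical pin voltages keeps the two moving assemblies, and
        # hence the object carried above them, aligned along the first axis 115.
        for actuator in (actuator_114, actuator_124):
            actuator.set_pin_voltages(v_plus, v_minus)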
[0037] Further, the first assembly 105 and the second assembly 106 are coupled to each other using connecting members for moving the object 102 along the first axis 115. For example, the first assembly 105 and the second assembly 106 are coupled to each other using a first connecting member 129, a second connecting member 130, and a third connecting member 131. In one embodiment, the first connecting member 129, the second connecting member 130, and the third connecting member 131 are metal rods having a rectangular shape. In certain embodiments, the first moving assembly 113 is coupled to a first end of the first connecting member 129, whereas the second moving assembly 123 is coupled to another end of the first connecting member 129. Specifically, the first connecting member 129 may be coupled to the first moving assembly 113 and the second moving assembly 123 by coupling means, including but are not limited to, welding, fasteners, rivets, bolts and nuts, or adhesives. In certain embodiments, the first connecting member 129 is adapted to move the first moving assembly 113 and the second moving assembly 123 simultaneously for moving the object 102 along the first axis 115.
[0038] Further, the second connecting member 130 is adapted to interconnect the first actuator 114 and the second actuator 124 for providing additional rigidity and strength to the apparatus 101 by connecting the guiding structures 110 and 120 so that they stay as a single unit. Each of the actuators 114 and 124 includes a motor shaft, and the two motor shafts always move simultaneously for moving the moving assemblies 113 and 123 through the guiding structures 110 and 120, respectively, for guiding the object 102 along the first axis 115. In addition, the first actuator 114 and the second actuator 124 are provided with the same motor control signals from the embedded simulator 104 such that both the first actuator 114 and the second actuator 124 rotate in unison. Moreover, the third connecting member 131 interconnects the second end 112 of the first guiding structure 110 and the second end 122 of the second guiding structure 120 for providing additional stability to the structure of the apparatus 101. Additionally, the third connecting member 131 is moveably coupled to a front shaft 132 in the apparatus 101. The device under test 103, including the one or more sensors 103A and the embedded system 103B, is placed on the front shaft 132. A position of the front shaft 132 may be adjusted in order to place the device under test 103 at a desired distance with respect to the object 102 as per a test requirement.
[0039] With reference to the first assembly 105 and the second assembly 106 that move the object 102 along the first axis 115, the third assembly 107 and the fourth assembly 108 are provided for moving the object 102 along a second axis. In one embodiment, the third assembly 107 is placed axially with respect to the first assembly 105 and is mounted on one end of the first connecting member 129. The first assembly 105 supports and bears the weight of the third assembly 107. The third assembly 107 includes a third guiding structure 133 having a first end 134 and a second end 135, a third moving assembly 136, and a third actuator 137. The third moving assembly 136 moves through the third guiding structure 133 along a second axis 138. The third actuator 137 is coupled to the first end 134 of the third guiding structure 133. The third actuator 137 includes a motor control plus pin 139, a motor control minus pin 140, a position sensor plus pin 141, and a position sensor minus pin 142. The second end 135 of the third guiding structure 133 is mounted on the first connecting member 129 that is coupled to the first moving assembly 113, which in turn is moveably placed at the first guiding structure 110.
[0040] Similar to the third assembly 107, the fourth assembly 108 is placed axially with respect to the second assembly 106 and is mounted on another end of the first connecting member 129 that is coupled to the second moving assembly 123. The second assembly 106 supports and bears weight of the fourth assembly 108. In addition, the fourth assembly 108 is placed in parallel and at a specified distance with respect to the third assembly 107. In an embodiment, the specified distance between the assemblies 107-108 is determined based on a testing application, or more specifically based on a range of distance over which the object 102 is to be moved along a specific axis.
[0041] Additionally, the fourth assembly 108 includes a fourth guiding structure 143 having a first end 144 and a second end 145, a fourth moving assembly 146, and a fourth actuator 147. The fourth moving assembly 146 moves through the fourth guiding structure 143 along the second axis 138. The fourth actuator 147 includes a motor control plus pin 148, a motor control minus pin 149, a position sensor plus pin 150, and a position sensor minus pin 151. The third guiding structure 133 and the fourth guiding structure 143 are referred to herein as a second set of axial guiding structures.
[0042] Similarly, the third moving assembly 136 and the fourth moving assembly 146 are referred to herein as a second set of axial moving assemblies. Moreover, the third actuator 137 and the fourth actuator 147 are referred to herein as a second set of axial actuators. In one embodiment, the second set of axial guiding structures 133 and 143, the second set of axial moving assemblies 136 and 146, and the second set of axial actuators 137 and 147 are adapted to move the object 102 placed on the apparatus 101 along the second axis 138. The object 102, which is supported on the apparatus 101, is moved along the second axis 138 by feeding the same motor control signals to the second set of axial actuators 137 and 147. The second set of axial actuators 137 and 147 move the second set of axial moving assemblies 136 and 146 in unison through the second set of axial guiding structures 133 and 143 in order to move the object 102 along the second axis 138. In addition, the second set of axial actuators 137 and 147 move the object 102 along the second axis 138 at a desired speed and velocity as inputted and/or controlled by the embedded simulator 104.
[0043] Further, the third assembly 107 and the fourth assembly 108 are coupled to each other using two connecting members. The connecting members include the first connecting member 129 and a fourth connecting member 152. In one embodiment, the fourth connecting member 152 is a metal rod having a rectangular shape. The fourth connecting member 152 is adapted to interconnect the third actuator 137 and the fourth actuator 147 for providing additional rigidity and strength to the apparatus 101. Each of the actuators 137 and 147 includes a motor shaft, and the two motor shafts always move simultaneously for moving the moving assemblies 136 and 146 through the guiding structures 133 and 143, respectively, for moving the object 102 along the second axis 138. In addition, the third actuator 137 and the fourth actuator 147 are provided with the same motor control signals from the embedded simulator 104 such that the shafts of both the third actuator 137 and the fourth actuator 147 move in unison.
[0044] The fifth assembly 109 is placed axially with respect to all other structural assemblies including the first assembly 105, the second assembly 106, the third assembly 107, and the fourth assembly 108. The fifth assembly 109 includes a fifth guiding structure 153, a fifth moving assembly 154, and a fifth actuator 155 for moving the fifth moving assembly 154 through the fifth guiding structure 153 along a third axis 156. The fifth guiding structure 153 is mounted on a second set of axial moving assemblies 136 and 146. The second set of axial moving assemblies 136 and 146 move simultaneously in unison for moving the fifth guiding structure 153 along the second axis 138. The fifth moving assembly 154 supports the object 102 to be detected and tracked by the device under test 103. The fifth actuator 155 includes a motor control plus pin 157, a motor control minus pin 158, a position sensor plus pin 159, and a position sensor minus pin 160. The object 102 is moved along the third axis 156 by providing a corresponding motor control signal to the fifth actuator 155. The fifth actuator 155 moves the fifth moving assembly 154 orthogonally with respect to the first set of axial guiding structures 110 and 120 and the second set of axial guiding structures 133 and 143 and through the fifth guiding structure 153 in order to move the object 102 along the third axis 156. The fifth actuator 155 moves the object 102 along the third axis 156 at a desired speed and velocity as inputted and/or controlled by the embedded simulator 104.
[0045] The fifth assembly 109 carrying the object 102 is secured to the assemblies 107 and 108 that are secured to the first connecting member 129, which in turn, is coupled to the first set of axial moving assemblies 113 and 123. Therefore, driving the first set of axial moving assemblies 113 and 123 along the first axis 115 causes the assemblies 107, 108, and 109 and the object 102 to move along the first axis 115. More specifically, the second set of axial guiding structures 133 and 143, the fifth guiding structure 153, and the object 102 move along the first axis 115 when the first set of axial moving assemblies 113 and 123 is moved along the first axis 115. Also, as previously noted, the fifth assembly 109 carrying the object 102 is secured to the second set of axial moving assemblies 136 and 146. Hence, driving the second set of axial moving assemblies 136 and 146 along the second axis 138 also moves the object 102 along the second axis 138. More specifically, the fifth guiding structure 153 along with the object 102 moves along the second axis 138 when the second set of axial moving assemblies 136 and 146 are moved along the second axis 138.
[0046] The exemplary embodiment presented herein describes the apparatus 101 as including two horizontal assemblies 105 and 106, two vertical assemblies 107 and 108, and an assembly 109 for moving the object 102 along the first axis 115, the second axis 138, and the third axis 156, respectively. However, in an alternative embodiment, the apparatus 101 may include any number of such assemblies. For example, instead of the two assemblies 105 and 106, the apparatus 101 may require only one assembly 105 for moving the object 102 along the first axis 115. Similarly, the apparatus 101 may require only one assembly (e.g., the assembly 107) for moving the object 102 along the second axis 138. In one example embodiment, the apparatus 101 setup can be modified based on a required dimensional movement of the object 102. For example, if a testing scenario requires only one-dimensional movement of the object 102 (e.g., along the first axis 115 alone), the apparatus 101 setup is modified to have only the first assembly 105 and optionally the second assembly 106, or vice versa. In this scenario, the object 102 may be mounted on the moving assemblies 113 or 123, or the first connecting member 129. In another example, if a testing scenario requires two-dimensional movement of the object 102 (e.g., along the first axis 115 and the second axis 138), the apparatus 101 setup is modified to have only the assemblies 105 and 107, the assemblies 106 and 108, the assemblies 105, 106, and 107, the assemblies 105, 106, and 108, or the assemblies 105-108. In this scenario, the object 102 may be mounted on the moving assemblies 136 or 146, or a bar that interconnects the moving assemblies 136 and 146. In yet another example, if a testing scenario requires three-dimensional movement of the object 102, the apparatus 101 with all the assemblies 105-109 is preferably used, as shown in FIG. 1. The embedded simulator 104 controls motion of the object 102 across any desired axes (e.g., the first axis 115, the second axis 138, and/or the third axis 156) by controlling motor control lines associated with the actuators 114, 124, 137, 147, and 155.
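One way of expressing these configuration choices is sketched below, assuming the full pair of assemblies is retained for each axis; the labels mirror the reference numerals of FIG. 1 and the selection logic is illustrative only.

    def assemblies_for(required_axes):
        # required_axes: subset of {'first', 'second', 'third'} (axes 115, 138, 156).
        selected = {'105', '106'}              # horizontal pair (106 optional for 1D tests)
        if 'second' in required_axes or 'third' in required_axes:
            selected.update({'107', '108'})    # vertical pair for the second axis 138
        if 'third' in required_axes:
            selected.add('109')                # fifth assembly for the third axis 156
        return sorted(selected)

    print(assemblies_for({'first'}))                      # ['105', '106']
    print(assemblies_for({'first', 'second', 'third'}))   # all five assemblies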
[0047] The embedded simulator 104 also controls the motion of the object 102 at a desired speed, as the RPM of the actuators 114, 124, 137, 147, and 155 is controllable by the voltage fed from the embedded simulator 104. The embedded simulator 104 can also control the power fed to the actuators 114, 124, 137, 147, and 155 individually. Each of the actuators 114, 124, 137, 147, and 155 includes a brake assembly (not shown in FIG. 1) for stopping the motion of a moving assembly through a guiding structure, and the embedded simulator 104 controls the brake assembly.
[0048] Particularly, the embedded simulator 104 stores algorithms developed using mathematical models for controlling operation of the actuators for validating the functionalities implemented by the device under test 103, and verifying accuracy of associated sensors and sensor fusion technology. Additionally, the embedded simulator 104 also stores a dynamics model for simulating dynamic properties of the real target.
[0049] Accordingly, the embedded simulator 104 includes a processing unit 161, an input and output unit 162, a signal conditioning unit 163, and a monitoring device 164. The processing unit 161 includes a mathematical model that controls the actuators 114, 124, 137, 147, and 155 in order to move the object 102 across one or more desired axes or desired trajectories at a desired speed. The processing unit 161 further receives sensor signals from position identifying sensors associated with each of the actuators 114, 124, 137, 147, and 155 via the input and output unit 162 when the object 102 is in motion. The processing unit 161 uses the mathematical model inside it to calculate positional data associated with each of the actuators 114, 124, 137, 147, and 155 based on the sensor signals. Subsequently, the processing unit 161 corrects errors that occur in controlling positions of the actuators 114, 124, 137, 147, and 155 based on the calculated positional data.
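A minimal sketch of such a correction step for a single actuator is shown below; the hall-count resolution and the proportional gain are assumptions chosen for illustration, not values taken from the present disclosure.

    COUNTS_PER_MM = 40.0    # assumed hall-sensor resolution (counts per millimetre)
    KP = 0.8                # assumed proportional gain of the correction loop

    def position_correction(target_mm, hall_counts):
        # Decode the current position from the position sensor output.
        measured_mm = hall_counts / COUNTS_PER_MM
        error_mm = target_mm - measured_mm
        # Issue a corrective speed command proportional to the remaining error,
        # clamped to a normalised command range.
        command = max(-1.0, min(1.0, KP * error_mm))
        return command, error_mm

    # Example: the actuator has reached 9.5 mm while 10 mm was commanded.
    print(position_correction(10.0, 380))   # (0.4, 0.5)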
[0050] In one embodiment, the processing unit 161 controls and accurately sets velocity components of the object 102 along the first axis 115, the second axis 138, and the third axis 156 by controlling RPM of the actuators 114, 124, 137, 147, and 155 individually and independently, for example, in accordance with equations (1)-(3).

Velocity component in the first axis 115 = Resultant velocity expected * Cosine of an angle between the first axis 115 and a straight line connecting an origin and a resultant velocity (1)
Velocity component in the second axis 138 = Resultant velocity expected * Cosine of an angle between the second axis 138 and a straight line connecting an origin and a resultant velocity (2)
Velocity component in the third axis 156 = Resultant velocity expected * Cosine of an angle between the third axis 156 and a straight line connecting an origin and a resultant velocity (3)
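Equations (1)-(3) can be restated directly in code, as in the sketch below; the example angles are assumptions chosen so that the three direction cosines are mutually consistent.

    import math

    def axis_velocity_components(resultant_velocity, angle_first_rad,
                                 angle_second_rad, angle_third_rad):
        v_first = resultant_velocity * math.cos(angle_first_rad)     # equation (1)
        v_second = resultant_velocity * math.cos(angle_second_rad)   # equation (2)
        v_third = resultant_velocity * math.cos(angle_third_rad)     # equation (3)
        return v_first, v_second, v_third

    # Example: a 2 m/s resultant lying in the plane of the first and second axes,
    # inclined at 30 degrees to the first axis 115.
    print(axis_velocity_components(2.0, math.radians(30),
                                   math.radians(60), math.radians(90)))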

[0051] The ability to simulate different velocity components of the object 102 along three axes for detection by the device under test 103 provides increased test coverage by enabling emulation of complex real-world scenarios that may not be replicated in a lab environment using conventional emulators, or even using real objects and real target objects in the actual intended testing environment. Specifically, the processing unit 161 communicates with the input and output unit 162 to generate hardware control signals for controlling motion of the actuators based on a command from the mathematical model. The input and output unit 162 also receives signals from the one or more position identifying sensors and communicates the received signals to the mathematical model running in the processing unit 161. The signal conditioning unit 163 performs a bidirectional translation that is required at voltage and current levels, as the processing unit 161 is capable of handling only transistor–transistor logic (TTL) voltage levels and a current of a few milliamperes, whereas real-world currents and voltages can take much larger values. For example, automotive embedded systems normally operate around 13 volts and may consume greater than 5 amperes of current. Since the processing unit 161 can handle only up to 5 volts and a few milliamperes of current, the signal conditioning unit 163 is required for up-conversion and down-conversion of current and voltage values for enabling communication between the input and output unit 162 and the processing unit 161. The monitoring device 164 of the embedded simulator 104 is capable of receiving electrical outputs of the one or more sensors 103A of the device under test 103, and validates accuracy and robustness of the one or more sensors 103A, as described in greater detail in the following sections. In one embodiment, a physical communication channel is available between the one or more sensors 103A and the embedded system 103B that executes sensor signal processing and functional requirements. This physical communication channel is tapped and fed into the monitoring device 164. Extreme care is taken while tapping and feeding the electrical outputs to the monitoring device 164 to ensure that the electrical stability of the device under test 103, and especially a stable communication between the one or more sensors 103A and the embedded system 103B, is not disturbed. For example, a park aid system used in automobiles includes one or more ultrasonic sensors (represented by 103A) and a park aid electronic control unit (represented by 103B). A Pulse Width Modulation (PWM) communication exists between the one or more ultrasonic sensors and the park aid electronic control unit. These PWM lines are tapped and fed into the monitoring device 164 to check accuracy and robustness of the PWM output of the one or more ultrasonic sensors in a real-world scenario. Thus, FIG. 1 describes the test system 100, including the components of the apparatus 101 and the embedded simulator 104 that are involved in validating functionalities of the device under test 103. An associated method for validating functionalities of the device under test 103 with the test system 100 is described in greater detail with reference to FIG. 2.
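Referring to the park aid example above, the kind of accuracy check the monitoring device 164 could perform on a tapped PWM line is sketched below; the assumption that the pulse width encodes the echo time-of-flight, and the tolerance value, are illustrative only and may differ for a given sensor.

    SPEED_OF_SOUND_M_S = 343.0   # assumed speed of sound in air at room temperature

    def distance_from_pwm(pulse_width_s):
        # Time-of-flight assumption: the pulse width spans the echo round trip.
        return SPEED_OF_SOUND_M_S * pulse_width_s / 2.0

    def pwm_accuracy_ok(pulse_width_s, physical_distance_m, tolerance_m=0.05):
        reported_m = distance_from_pwm(pulse_width_s)
        return abs(reported_m - physical_distance_m) <= tolerance_m

    # Example: a 5.83 ms pulse corresponds to roughly 1.0 m.
    print(pwm_accuracy_ok(0.00583, 1.0))   # True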
[0052] FIG. 2 is a flow diagram 200 illustrating a method for validating functionalities associated with the device under test 103 of FIG. 1. At step 202, the object 102 and the device under test 103 are positioned on the apparatus 101. As described above, the device under test 103 is placed at a designated distance in front of the object 102 as required by a testing scenario. At step 204, the embedded simulator 104 receives a user input including one or more operating parameters associated with a virtual object that mimics dynamics of a real object (e.g., a first vehicle) in a real-world scenario. In one exemplary testing scenario, the virtual object corresponds to a simulated model of the real object. In this testing scenario, the operating parameters associated with the virtual object include, but are not limited to, an initial speed, a desired speed, a maximum speed limit, a minimum speed limit, and/or a minimum distance to be maintained with respect to the object 102. Additionally, the operating parameters may also include instructions for enabling an advanced driver assistance system (ADAS) mode (e.g., Adaptive Cruise Control) or disabling the ADAS mode for vehicular applications. According to an aspect of the test system 100, values of the one or more parameters define a behavior of the virtual object for verifying a desired functionality of the embedded system 103B in a specified testing scenario. The desired functionality, for example, may correspond to detecting pedestrians while a vehicle is in motion.
[0053] Accordingly, at step 206, the embedded simulator 104 simulates the one or more operating parameters to mimic the behavior of the virtual object in the specified testing scenario. The embedded simulator 104 then provides the one or more simulated operating parameters to the embedded system 103B of the device under test 103 as an input through specific protocols (e.g., Controller Area Network – CAN) for verification and/or validation of the desired functionality. Providing the simulated operating parameters imitates the real-world scenario in which the embedded system 103B receives a speed of a real object (e.g., an automotive vehicle) from an Engine Control Module (ECM). The embedded system 103B receives the one or more operating parameters associated with the virtual object and determines that it is placed in the real object (e.g., the first vehicle), which is moving at an initial speed along a defined path, where the speed value is received as an input from the embedded simulator 104. In order to validate the desired functionality, the object 102 is initially placed outside an area of coverage of the one or more sensors 103A such that the device under test 103 detects no other objects in the defined path while the real object is moving at the initial speed.
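The snippet below is a hedged sketch of feeding a simulated vehicle speed to an embedded system over CAN using the third-party python-can package. The arbitration ID, data encoding, scaling, and the use of a virtual bus are placeholders chosen so the example is self-contained; none of these values come from the specification.

```python
# Hedged sketch of providing a simulated vehicle speed to the embedded system
# 103B over CAN. Arbitration ID, scaling and channel are illustrative only.
import can  # pip install python-can

def send_simulated_speed(bus, speed_mph, arbitration_id=0x123):
    raw = int(speed_mph * 100)                        # hypothetical 0.01 mph/bit scaling
    data = raw.to_bytes(2, byteorder="big") + bytes(6)
    msg = can.Message(arbitration_id=arbitration_id, data=data, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # A virtual bus keeps the sketch runnable without CAN hardware.
    with can.Bus(interface="virtual", channel="test") as bus:
        send_simulated_speed(bus, speed_mph=60.0)
```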
[0054] At step 208, the object 102 is moved in front of the device under test 103 along one or more axes within the area of coverage of the one or more sensors 103A such that the one or more sensors 103A detect and subsequently continually track the object 102. The one or more sensors 103A transmit signals (e.g., RADAR signals) that reflect back from a surface of the object 102. As previously described, the apparatus 101 is composed of one or more materials that absorb sensor signals. Therefore, the one or more sensors 103A receive feedback signals that are reflected back only from the surface of the object 102. The embedded system 103B processes the feedback signals to determine one or more parameters associated with the object 102. The one or more parameters associated with the object 102 include, for example, a position, a distance, a change in speed of the object 102 with respect to the device under test 103, a trajectory, a speed, a velocity, an acceleration, a class, one or more physical characteristics, and a shape of the object 102.
[0055] The embedded simulator 104, the input and output unit 162, and the monitoring device 164 read out the one or more parameters of the object 102 from the embedded system 103B and the one or more sensors 103A using specific protocols for validating functionalities of the embedded system 103B, as well as accuracy and robustness of the one or more sensors 103A. In one embodiment related to automotive testing, the embedded simulator 104 and the input and output unit 162 read out the parameters of the object 102 computed by the embedded system 103B via automotive-specific protocols (e.g., Controller Area Network – CAN). This mimics a real-world scenario of sending signals from the device under test 103 to other electronic control units (ECUs) in the first vehicle, such as an Engine Control Module (ECM). The embedded simulator 104 and the monitoring device 164 are both individually capable of validating functionalities of the device under test 103. The embedded simulator 104 simulates actions that mimic functions to be performed by components in an object (e.g., other ECUs, such as the ECM, present in the first vehicle) in a real-world scenario for validating functionalities of the DUT 103. In contrast, the monitoring device 164 validates functionalities of the sensors 103A alone, without simulating any actions, by comparing output signals from the one or more sensors 103A representing a distance between the object 102 and the DUT 103 with a physical distance between the object 102 and the DUT 103, as described in greater detail in the following sections.
[0056] At step 210, the embedded simulator 104 simulates actions including adjusting the one or more operating parameters associated with the virtual object to obtain one or more updated operating parameters based on the one or more parameters of the object 102. The actions mimic one or more functions to be performed by the components in an object (e.g., Other ECUs like ECM present in the first vehicle) in a real-world scenario. For example, an action includes reducing a speed of a virtual object that mimics the real-world scenario of reducing a speed of a first car following a second car when a distance between these cars is less than a specified value. Thus, the embedded simulator 104 obviates a need for use of the real object for validating various expected actions of the device under test 103 in different testing scenarios. At step 212, the embedded simulator 104 validates one or more functionalities of the device under test 103 by verifying that the one or more updated operating parameters are within a range of specified values. An exemplary method for verifying and validating the various functionalities of the device under test 103 is described in detail with reference to FIG. 3.
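As a minimal sketch of the adjust-and-validate logic of steps 210 and 212, assuming a simple speed-reduction rule and user-specified speed limits, the following Python snippet illustrates the idea. The function names and the fixed step size are illustrative assumptions, not the method of the specification.

```python
# Sketch of steps 210-212: adjust the virtual object's speed from the measured
# gap to the pseudo object, then verify the updated speed is within the limits.
def adjust_speed(current_speed_mph, measured_gap_m, clearance_gap_m, step_mph=2.0):
    """Reduce speed while the gap is too small, otherwise hold it (assumed rule)."""
    if measured_gap_m < clearance_gap_m:
        return current_speed_mph - step_mph
    return current_speed_mph

def validate(updated_speed_mph, v_min_mph, v_max_mph):
    return v_min_mph <= updated_speed_mph <= v_max_mph

speed = adjust_speed(current_speed_mph=60, measured_gap_m=25, clearance_gap_m=30)
print("updated speed:", speed, "valid:", validate(speed, v_min_mph=20, v_max_mph=80))
```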
[0057] FIG. 3 is a flow diagram 300 illustrating a method for validating the device under test 103 having the embedded system 103B and one or more associated sensors 103A. For clarity, the present method is described with reference to automotive testing. However, it may be noted that the test system 100 is capable of testing any device under test that is adapted to detect and track any kind of object in its surroundings and consequently trigger one or more predefined actions. For example, the test system 100 can be used for testing a device under test that is adapted to detect and track underwater objects and trigger an action (e.g., activate a brake system of a watercraft). In another example, the test system 100 can be used for testing a device under test that is adapted to detect and track nearby airplanes or flying creatures and trigger an alert system.
[0058] In the present embodiment, the test system 100 is used to test the device under test 103 to be placed within a first vehicle, which is operated in an ACC mode, for detecting and tracking a second vehicle that is moving ahead of the first vehicle in a real-world scenario. In order to provide the ACC functionality, the device under test 103 is adapted to maintain a minimum distance between the first vehicle and the second vehicle and to control the speed of the first vehicle based on a determined distance to, and a determined speed of, the second vehicle. Implementation of the ACC functionality by the device under test 103 is tested using the test system 100, without requiring the presence of the actual first vehicle and the actual second vehicle during the tests, by performing the following steps.
[0059] At step 302, the object 102 having the same material composition and dimensions as a rear side of the second vehicle is placed on or supported by the fifth moving assembly 154 of the apparatus 101. At step 304, the device under test 103, including a RADAR sensor along with an embedded system implementing the ACC functionality, is placed on the front shaft 132 of the apparatus 101. Particularly, the device under test 103 is placed at an initial distance (e.g., ‘D_initial meters’) from the object 102. One or more of the actuators 114, 124, 137, 147, and 155 are used for moving the object 102 to a right or a left extreme of the apparatus 101 such that, initially, the object 102 is outside a coverage area of the RADAR sensor, and signals from the RADAR sensor do not fall on the object 102. The embedded simulator 104 simulates the electrical and automotive protocol signals for imitating the ACC mode of the first vehicle.
[0060] At step 306, the embedded simulator 104 receives, from a user (e.g., a test performer), one or more operating parameters associated with a virtual object that imitates dynamics of the first vehicle in which the device under test 103 is placed in a real-world scenario. The one or more operating parameters associated with the virtual object include, for example, a speed, a minimum distance to be maintained with respect to the second vehicle (hereinafter referred to as a clearance gap), a maximum speed, a minimum speed, and a toggled state of a switch for enabling or disabling the ACC mode. At step 308, the embedded simulator 104 simulates the one or more operating parameters. For example, the embedded simulator 104 simulates a first vehicle speed of ‘V Mph’, a clearance gap of ‘D meters’, which is different from ‘D_initial meters’, a maximum speed of ‘V_Max Mph’, and a minimum speed of ‘V_Min Mph’.
[0061] The embedded simulator 104 provides the one or more simulated operating parameters to the embedded system 103B of the device under test 103 as an input through specific protocols (e.g., Controller Area Network – CAN). The device under test 103 receives the one or more operating parameters associated with the virtual object and determines that the device under test 103 is in the first vehicle that is moving at the speed of ‘V Mph’.
[0062] At step 310, the object 102 is moved within the area of coverage of the RADAR sensor using actuators under the control of the processing unit 161. The object 102 is moved along one or more axes at a desired speed in front of the device under test 103 in order to emulate three dimensional motion of the second vehicle such that the RADAR sensor detects presence of the object 102 and subsequently tracks the object 102 over a specified period of time. The embedded simulator 104 stores one or more trajectories, 3D motion information and speed required across three axes for realizing different test scenarios.
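The stored trajectories mentioned above have to be converted into per-axis set-points at a fixed control period before they can drive the actuators. The following Python sketch shows one straightforward way to do that with linear interpolation; the waypoint format, units, and sampling period are assumptions made for the example.

```python
# Illustrative sketch of turning a stored 3D test trajectory into per-axis
# set-points sampled at a fixed control period (linear interpolation assumed).
def interpolate_trajectory(waypoints, period_s):
    """Yield (x, y, z) set-points along straight segments between waypoints.

    waypoints: list of (time_s, x_mm, y_mm, z_mm)
    """
    for (t0, *p0), (t1, *p1) in zip(waypoints, waypoints[1:]):
        steps = max(1, int((t1 - t0) / period_s))
        for i in range(steps):
            f = i / steps
            yield tuple(a + f * (b - a) for a, b in zip(p0, p1))

trajectory = [(0.0, 0, 0, 0), (1.0, 200, 0, 0), (2.0, 200, 100, 50)]
for setpoint in interpolate_trajectory(trajectory, period_s=0.25):
    print(setpoint)   # each tuple would be converted into actuator commands
```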
[0063] In order to test the ACC functionality implemented by the device under test 103, at step 312, the dummy object 102 is placed at the initial distance ‘D_initial meters’ from the device under test 103 such that the object 102 remains stationary. Placing the object 102 in a stationary position with respect to the device under test 103 emulates the motion of the second vehicle moving at a speed ‘V Mph’ in a real-world scenario. This is because feedback RADAR signals reflected from the surface of the object 102 indicate a presence of the object 102 at the distance of ‘D_initial meters’, and subsequently, the device under test 103 continually detects the presence of the object 102 at the same distance. As the operating parameters input to the embedded system 103B make it appear as if the embedded system 103B is within the first vehicle moving at the speed ‘V Mph’, continuous detection of the object 102 at the same distance makes the embedded system 103B determine that the object 102 is also moving at the speed ‘V Mph’. In addition, the RADAR sensor determines a relative velocity between the first vehicle and the second vehicle as zero due to the continuous detection of the object 102 at the same distance ‘D_initial meters’.
[0064] At step 314, the embedded system 103B requests the embedded simulator 104 to reduce the simulated speed ‘V Mph’ of the virtual object to an updated speed ‘U Mph’ upon determining that the distance ‘D_initial meters’ is less than the required clearance gap of ‘D meters’. Particularly, in one embodiment, the embedded system 103B requests the embedded simulator 104 to reduce the simulated speed ‘V Mph’ of the virtual object to the updated speed ‘U Mph’ by actuating a simulated brake control system such that the clearance gap is increased to ‘D meters’. The updated speed ‘U Mph’ is less than the simulated speed ‘V Mph’. In a real vehicle, the DUT 103 sends a vehicle speed request signal to an Electronic Control Unit (ECU) such as an Engine Control Module (ECM) to increase the speed if required, and a brake pressure request signal to an Anti-Lock Brake System (ABS) ECU to decrease the speed of the first vehicle.
[0065] In certain embodiments, the embedded simulator 104 simulates the updated speed ‘U Mph’ of the virtual object, which imitates the speed required of the first vehicle in a real-world scenario in order to maintain the clearance gap with respect to the second vehicle, using a vehicle dynamics model. The vehicle dynamics model of the embedded simulator 104 also simulates a required brake pressure in order to adjust the simulated speed ‘V Mph’ of the virtual object to the updated speed ‘U Mph’. The vehicle dynamics model is a multi-paradigm mathematical model that simulates physical properties of the real object (e.g., the first vehicle) using standard theoretical equations that are well known in the art for calculating an actual speed and/or an actual brake pressure associated with the first vehicle. In a real-world scenario, the embedded system 103B placed within the first vehicle requests an engine control unit to increase a vehicle speed, or an Anti-Lock Brake System ECU to apply a specific brake pressure value, through CAN messages. However, although the ECUs associated with the first vehicle attempt to attain the specific value, the specific value may not be accurately attained due to vehicle dynamics associated with the first vehicle. Since the embedded simulator 104 simulates such vehicle dynamics, the vehicle dynamics model of the embedded simulator 104 is capable of calculating the actual updated speed (which may or may not differ from the updated speed ‘U Mph’ that is supposed to be maintained) at which the first vehicle, as simulated by the virtual object, is currently moving. The actual updated speed associated with the first vehicle is reported to the embedded system 103B over CAN, such that the embedded system 103B sends a new request to the embedded simulator 104 for bringing the actual updated speed to the required speed ‘U Mph’ in order to minimize the error incurred in vehicle speed control in a next iteration. In one embodiment, the vehicle dynamics model calculates the actual vehicle speed of the first vehicle using a mathematical model that takes one or more parameters into consideration, including, but not limited to, a current speed, a weight, an inertia, tire friction properties, aerodynamic drag, a driver behavior, a current brake pressure, and engine parameters of the first vehicle.
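To illustrate why the achieved speed can differ from the requested speed, the following Python sketch steps a simple longitudinal vehicle model with mass, aerodynamic drag, rolling resistance, and brake force. It is a minimal example in the spirit of the vehicle dynamics model described above; all coefficients and the function name are illustrative assumptions, not values from the specification.

```python
# Minimal longitudinal vehicle-dynamics sketch: the achieved speed differs from
# the requested speed because mass, drag, rolling resistance and brake force
# are modelled. All coefficients are illustrative.
def step_speed(v_mps, engine_force_n, brake_force_n, dt_s,
               mass_kg=1500.0, drag_coeff=0.35, frontal_area_m2=2.2,
               air_density=1.225, rolling_coeff=0.015, g=9.81):
    drag = 0.5 * air_density * drag_coeff * frontal_area_m2 * v_mps ** 2
    rolling = rolling_coeff * mass_kg * g
    accel = (engine_force_n - brake_force_n - drag - rolling) / mass_kg
    return max(0.0, v_mps + accel * dt_s)

# Requesting a speed reduction: apply brake force and watch the actual speed
# converge; this "actual updated speed" would be reported back over CAN.
v = 26.8   # roughly 60 mph expressed in m/s
for _ in range(10):
    v = step_speed(v, engine_force_n=0.0, brake_force_n=3000.0, dt_s=0.1)
print(f"actual updated speed after braking: {v:.1f} m/s")
```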
[0066] At step 316, the ACC functionality of the DUT 103 is validated by verifying that the updated speed ‘U Mph’ is within a range of specified values (i.e., between ‘V_Min Mph’ and ‘V_Max Mph’), such that, in real time, the embedded system 103B adjusts the distance ‘D_initial meters’ to the required clearance gap of ‘D meters’ between the first and second vehicles. At step 318, the object 102 placed stationary on the apparatus 101 is moved away from the device under test 103 for increasing the distance between the object 102 and the DUT 103. Increasing the distance between the object 102 and the DUT 103 causes the DUT 103 to determine that the second vehicle is now moving at an increased speed, as the simulated velocity of the first vehicle is kept constant at a nominal value, and therefore the distance between the first vehicle and the second vehicle increases. Thus, the function of maintaining the clearance gap by the device under test 103 is validated using the test system 100. In an alternative embodiment, a real-world scenario of reducing a clearance gap between the first vehicle and the second vehicle is emulated by moving the object 102 towards the device under test 103. Moving the object 102 towards the DUT 103 reduces an intermediate distance between the object 102 and the DUT 103 such that the embedded system 103B determines that the second vehicle is now moving at a reduced speed, and therefore the embedded system 103B requests the embedded simulator 104 to reduce the speed of the virtual object. In certain embodiments, a real-world scenario of reducing the distance between the first vehicle and the second vehicle is simulated by increasing an initial speed of the virtual object, which makes it appear to the embedded system 103B as if the first vehicle is now moving at an increased speed. Subsequently, the object 102 is moved towards the DUT 103, which makes the embedded system 103B determine that the distance between the first and second vehicles is decreasing due to the increased speed of the first vehicle, and therefore the embedded simulator 104 reduces the speed of the virtual object based on the request from the embedded system 103B.
[0067] As previously noted, in an embodiment, the device under test 103 is also adapted to control the speed of the first vehicle with respect to a determined speed of the second vehicle for providing the ACC functionality to the first vehicle. The embedded simulator 104 simulates the forward speed ‘V Mph’ of the virtual object using a specific automotive protocol (e.g., CAN) that imitates the speed of the first vehicle, and provides the simulated speed ‘V Mph’ as an input to the device under test 103. Maintaining the object 102 at a fixed position from the device under test 103 emulates a real-world scenario where the second vehicle is also moving forward at the same speed of ‘V Mph’. The object 102 is subsequently moved towards the device under test 103 at a very low speed ‘V_1 Mph’, where V_1 is much less than the speed ‘V Mph’ of the first or second vehicle. Moving the object 102 at the low speed ‘V_1 Mph’ towards the device under test 103 simulates a real-world scenario where the device under test 103 detects the second vehicle to be moving at a reduced speed of ‘(V – V_1) Mph’.
[0068] The embedded system 103B generates a request, received by the embedded simulator 104, to reduce the simulated speed ‘V Mph’ of the virtual object to an updated speed ‘(V – V_1) Mph’, based on the emulated reduced speed ‘(V – V_1) Mph’ of the second vehicle, by requesting a high brake pressure value, which is used for validating the DUT 103. Thus, the ACC functionality of the device under test 103 is validated by verifying that the simulated speed ‘V Mph’ of the virtual object is reduced to the updated speed ‘(V – V_1) Mph’. Reducing the speed associated with the virtual object from the simulated speed ‘V Mph’ to the updated speed ‘(V – V_1) Mph’ simulates a real-world scenario of a gradual increase in the distance between the first vehicle and the second vehicle. This gradual increase in the distance between the first vehicle and the second vehicle is emulated by the embedded simulator 104 by moving the object 102 appropriately, using mathematical calculations, away from the device under test 103. When the speed of the virtual object is simulated as being ‘(V – V_1) Mph’, the embedded system 103B requests the embedded simulator 104 to stop moving the object 102 further, such that the device under test 103 determines both the first vehicle and the second vehicle to be moving at the same speed ‘(V – V_1) Mph’.
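The arithmetic behind this relative-speed emulation is simple and can be sketched as follows; the variable names and the 0.5 mph step used to ramp the ego speed down are assumptions made for the example, not part of the specification.

```python
# Sketch of the relative-speed emulation: moving the object towards the DUT at
# V_1 makes the second vehicle appear to travel at (V - V_1), and the simulated
# ego speed is stepped down until both appear to move at the same speed.
def apparent_second_vehicle_speed(ego_speed_mph, object_closing_speed_mph):
    return ego_speed_mph - object_closing_speed_mph

V, V_1 = 60.0, 3.0
target = apparent_second_vehicle_speed(V, V_1)        # 57 mph perceived
ego_speed = V
while ego_speed > target:                             # simulator steps ego speed down
    ego_speed = max(target, ego_speed - 0.5)
print("object motion stopped; both vehicles emulated at", ego_speed, "mph")
```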
[0069] A similar method can be used for validating a real world scenario, in which the speed of the second vehicle is increasing, by moving the object 102 away from the device under test 103. To that end, the embedded system 103B requests the embedded simulator 104 to increase the simulated speed of the virtual object in accordance with movement of the object 102. When the embedded system 103B requests the embedded simulator 104 to reduce the simulated speed ‘V Mph’ of the virtual object to the updated speed ‘(V – V_1) Mph’, the embedded simulator 104 continually monitors the updated speed to be within a set maximum speed ‘V_Max Mph’ and a set minimum speed ‘V_Min Mph’. Thus, various actions of the device under test 103 for implementing the ACC functionality in the first vehicle with greater reliability can be easily validated using the test system 100 without needing presence of any actual objects.
[0070] Another functionality of the device under test 103 that can be validated using the test system 100 is an Advanced Emergency Brake Assist (AEBS) system in an automotive vehicle. To that end, the device under test 103 includes a RADAR sensor and an embedded system 103B storing requisite instructions and protocols for implementing the AEBS system. The object 102, which imitates a side part of an automotive vehicle or a dummy human model, is placed on the apparatus 101. One or more of the actuators 114, 124, 137, 147, and 155 are used to ensure that the object 102 is not initially placed within an area of coverage of the device under test 103. A desired speed associated with a virtual object is simulated by the embedded simulator 104 and is provided to the device under test 103 to make it appear as if the device under test 103 is placed within a vehicle that is moving at the desired speed. In the present embodiment, the virtual object is a simulated model of a vehicle. The object 102 is then moved within the area of coverage of the device under test 103 from either the left or the right extreme of the apparatus 101 using one or more of the actuators 114, 124, 137, 147, and 155. The device under test 103 detects and tracks the object 102, and determines a presence of an obstacle at a designated distance in the path of the vehicle. The embedded system 103B requests the embedded simulator 104 to simulate a high brake pressure value, based on the designated distance, and the same is used by the embedded simulator 104 for validating functionality of the device under test 103. In a real vehicle scenario, the device under test 103 detects and tracks a target object, and then sends a high brake pressure request to a brake electronic control unit (ECU) such that the vehicle stops without hitting the target object.
[0071] The embedded system 103B requests the embedded simulator 104 to simulate the high brake pressure value and to adjust the simulated speed of the virtual object to an updated simulated speed using the vehicle dynamics model. In one embodiment, the updated simulated speed is used to control linear motion of one or more of the actuators 114, 124, 137, 147, and 155, and in turn, of the object 102 for simulating desired testing conditions. Reducing the simulated speed, for example, causes the object 102 to move towards the device under test 103 at an appropriate speed such that the device under test 103 determines that the first vehicle is slowing down based on the high brake pressure value. In addition to maintaining a clearance gap and braking, the test system 100 is also capable of emulating speeding up and slowing down of the object 102, sudden braking to stop the object 102 at a desired position, and a sudden arrival or removal of the object 102 from the area of coverage of the DUT 103. Thus, in an automotive application space, the test system 100 is adapted to validate devices under test implementing many Advanced Driver Assistance Systems (ADAS), including, for example, the ACC system and the AEBS system.
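One conceivable mapping from the updated simulated speed to an actuator velocity command is sketched below. The conversion factor and the lab scale factor are illustrative assumptions; the specification does not state how this mapping is implemented.

```python
# Illustrative mapping from the change in simulated ego speed to a linear
# actuator velocity for the object 102, so that braking of the virtual vehicle
# appears to the DUT as the obstacle closing in more slowly.
MPH_TO_MPS = 0.44704
SCALE = 0.01   # hypothetical lab scale factor: 1 m real-world = 1 cm on the rig

def actuator_velocity_mmps(ego_speed_mph, obstacle_speed_mph):
    """Relative closing speed, scaled to the apparatus, in mm/s (sign = direction)."""
    closing_mps = (ego_speed_mph - obstacle_speed_mph) * MPH_TO_MPS
    return closing_mps * SCALE * 1000.0

print(actuator_velocity_mmps(ego_speed_mph=40.0, obstacle_speed_mph=0.0))
```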
[0072] Another application of the test system 100 that is adapted to validate the device under test 103 implementing various functionalities related to avionics and/or aerospace technology is described in the following sections in greater detail. When used in an aircraft, the device under test 103 may include one or more sensors 103A such as a RADAR sensor, a LIDAR sensor, and an IR sensor. The device under test 103 further includes an embedded system along with the one or more sensors for implementing specific avionics and/or aerospace applications. In one embodiment, when validating an aerospace application, the object 102 may be a dummy model of a rear side, a right side, a left side, or a front side of an aircraft, a bird, space debris, or a space station. The device under test 103 can be validated as described above by emulating motion of a target object to be detected using the object 102 placed on the apparatus 101, and verifying actions to be taken by the device under test 103 in the real world scenario using the embedded simulator 104.
[0073] Unlike conventional emulators that employ rails or ropes, which restrict the emulated motion to two dimensions or render the emulated motion shaky and unreliable, the test system 100 allows emulation of three-dimensional movement of the object 102 with great accuracy and stability. Accordingly, a large number of real-world scenarios can be imitated inside a laboratory setup such that costs, time, and effort spent in testing the device under test 103 with a real aircraft can be significantly reduced. Another advantage of the test system 100 includes simulating test scenarios that cannot be created easily even in the real world for testing (e.g., creating three-dimensional movements of space debris in front of a spacecraft). The test system 100 can be advantageously used to emulate the motion of the space debris with respect to the spacecraft by using scaled-down or scaled-up versions of the object 102 and the apparatus 101 without affecting the requirements to be evaluated. Additionally, the test system 100 can validate the functioning of the device under test 103 to be installed within the spacecraft to detect and track the space debris present in a projected path of the spacecraft using one or more sensors and to adjust the speed and direction of the spacecraft to avoid collision. Certain other testing and validation studies that may be efficiently conducted in a laboratory environment using the test system 100 are described in greater detail with reference to FIGs. 4-5.
[0074] FIG. 4 depicts a schematic view of a test system 400 for automatically validating a device under test by emulating scenarios related to a parking slot detector, park assistance, and a distance alert system using one or more ultrasonic sensors 402. For the sake of clarity, an application of the test system 400 for validating a device under test implementing the parking slot detector is described. In one embodiment, the test system 400 includes more than one apparatus 101 to emulate stationary obstacles and a parking slot of desired shapes and dimensions. The test system 400 further includes a vehicle frame 404 on which the one or more ultrasonic sensors 402 are placed at an appropriate position similar to their position in a real vehicle. The device under test to be tested includes the one or more ultrasonic sensors 402 and a parking Electronic Control Unit (ECU) 405.
[0075] In one exemplary implementation, a single stationary obstacle is emulated using a single apparatus 101, where the object 102 with an appropriate shape is moved to different relative positions with respect to the device under test for simulating different obstacle distances. Alternatively, parking slot emulation, parking slot dimension control, and the introduction of an obstacle while simulating parking of a vehicle may be executed using more than one apparatus 101. Generally, a position of the apparatus 101 may be varied based on a feature to be evaluated. For example, when obstacle detection is to be performed at a rear side of the vehicle, the apparatus 101 is placed at the rear side of the vehicle frame 404. The embedded simulator 406 can validate whether the device under test is capable of detecting different parking slots created using the test system 400. An embedded system of the device under test requests the embedded simulator 406 to simulate a steering wheel angle of a virtual object according to positions of stationary objects 408 placed at a plurality of apparatuses that are similar to the apparatus 101. The embedded simulator 406 then transmits the steering wheel angle of the virtual object to the parking ECU 405 and controls one or more actuators to make it appear to the device under test that the vehicle is actually changing its steering angle with respect to the stationary objects 408. Subsequently, the embedded simulator 406 verifies whether the device under test is operating as expected by analyzing the reported steering wheel angle values and the functionalities implemented in the simulated environment with reference to corresponding predefined values.
[0076] While the test system 100 enables validation of the device under test 103, the test system 400 may also be used to validate accuracy and robustness of a sensor such as an ultrasonic sensor. For validating an ultrasonic sensor, an object is placed on the apparatus 101 at different distances from the ultrasonic sensor, and the sensor output is monitored using the monitoring device 164 of the embedded simulator 406 to confirm whether the ultrasonic sensor is reporting accurate values of the physical distance. In one embodiment, the monitoring device 164 analyzes and validates whether the electrical signal output from the one or more sensors 103A is in accordance with the physical distance between the object 102 and the DUT 103. For example, a cathode-ray oscilloscope (CRO) analyzes and validates whether a duty cycle of the Pulse Width Modulation (PWM) output of an ultrasonic sensor is in accordance with the physical distance between the object 102 and the DUT 103. In one embodiment, the ultrasonic sensor validation may be combined with the parking slot validation to ensure that the device under test 103 provides reliable vehicle parking functionality. Similarly, the test system 400 may be used to emulate testing scenarios where more than one sensor is to be arranged in a specific manner for the combined data to be used for a collective action by the device under test 103.
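The duty-cycle check described above can be sketched as a simple comparison between the measured PWM duty cycle and the duty cycle expected for the known physical distance. The linear duty-cycle-to-distance relation, the 5 m range, and the tolerance used below are assumptions for illustration, not sensor specifications.

```python
# Hedged sketch of the monitoring check: compare the duty cycle on the
# ultrasonic sensor's PWM line with the duty cycle expected for the known
# physical distance (a linear relation is assumed here).
def expected_duty_cycle(distance_m, max_range_m=5.0):
    return min(1.0, max(0.0, distance_m / max_range_m))

def pwm_output_matches(measured_duty_cycle, physical_distance_m, tolerance=0.02):
    return abs(measured_duty_cycle - expected_duty_cycle(physical_distance_m)) <= tolerance

print(pwm_output_matches(measured_duty_cycle=0.41, physical_distance_m=2.0))  # True
```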
[0077] According to certain aspects described herein, the device under test 103 is adapted to detect and track the target object in the real-world scenario by moving the object 102 in front of the device under test 103 along one or more axes, or in a specific trajectory that emulates motion of the target object in the real-world scenario. The movement of the object 102 may be detected and stored by the device under test 103 having the one or more sensors 103A and the embedded system 103B. The monitoring device 164 compares output signals from the one or more sensors 103A, representing a distance between the object 102 and the DUT 103, with a physical distance measured between the object 102 and the device under test 103 at a particular time in order to validate accuracy of the measurements acquired by the sensors. In one embodiment, the monitoring device 164 determines the physical distance between the object 102 in motion and the device under test 103 at any particular time based on an initial distance between the object 102 and the device under test 103, the actuators’ shaft positions before subjecting the object 102 to movement, and the actuators’ shaft positions at that particular time.
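A minimal sketch of that ground-truth reconstruction and comparison follows, assuming motion along a single axis towards the DUT and a tolerance chosen only for the example; the function names are hypothetical.

```python
# Sketch of the physical-distance reconstruction: the ground truth at time t is
# derived from the initial separation and the change in the actuator's shaft
# position, then compared with the sensor-reported distance.
def physical_distance_m(initial_distance_m, shaft_start_mm, shaft_now_mm):
    travel_towards_dut_m = (shaft_now_mm - shaft_start_mm) / 1000.0
    return initial_distance_m - travel_towards_dut_m

def within_tolerance(reported_m, truth_m, tolerance_m=0.05):
    return abs(reported_m - truth_m) <= tolerance_m

truth = physical_distance_m(initial_distance_m=3.0, shaft_start_mm=0.0, shaft_now_mm=500.0)
print(truth, within_tolerance(reported_m=2.47, truth_m=truth))   # 2.5 m ground truth
```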
[0078] For example, the object 102 supported on the apparatus 101 is moved at different velocities with respect to a RADAR sensor in the device under test 103. The RADAR sensor detects the object 102 and measures distances between the object 102 and the device under test 103 at different instants of time. Sensor outputs from the RADAR sensor are monitored using the monitoring device 164 of the embedded simulator 406. The monitoring device 164 then compares the sensor outputs from the RADAR sensor with a physical distance value determined between the object 102 and the DUT 103 at a particular time for validating accuracy and robustness of the one or more sensors 103A. The object 102 can be replaced with objects of different sizes, dimensions, and physical compositions for verifying robustness of the RADAR sensor. Thus, the test system 100 is capable of validating accuracy and robustness of the RADAR sensor. Similarly, the test system 100 is also capable of validating other sensors, such as an ultrasonic sensor and an infrared sensor, and a device under test 103 that makes use of such other sensors. In addition, the test system 100 is also capable of testing a device under test using a LIDAR sensor that is generally used for 3D mapping of the environment.
[0079] Further, FIG. 5 illustrates a schematic diagram depicting a test system 500 for validating a sensor fusion system having more than one type of sensor and an embedded system for executing associated functionalities. A device under test may include a detection and tracking system that uses a combination of more than one type of sensor, for example, any combination of a RADAR sensor, a LIDAR sensor, an IR sensor, and/or an ultrasonic sensor, along with an associated embedded system that implements sensor fusion and functional execution. FIG. 5 depicts a sensor fusion system having two different sensors 502 and 504, and an associated embedded system 506, that are supported on a mechanical frame 508. In one embodiment, positions of the sensors 502 and 504 placed on the mechanical frame 508 are similar to their positions in a real vehicle.
[0080] The apparatus 101 having an object 510 is placed at a specified distance with respect to the mechanical frame 508 having the sensors 502 and 504 and the embedded system 506. In one embodiment, the sensors 502 and 504 are connected to the embedded system 506 using wires 514 for controlling operations of the sensors and for retrieving data from the sensors 502 and 504. Movement of the object 510, based on a control signal received from an embedded simulator 512, makes it appear to both the sensors 502 and 504 and the embedded system 506 that an obstacle is actually moving within their area of coverage. Detection of the object 510 by the sensors 502 and 504 is used for validation of the sensor fusion system, where a fusion of data coming from the sensors 502 and 504, exposed to the same set of physical target information, should cause the device under test to execute the required actions. The intuitive architecture and the protocols employed by the test system 100 aid in easy assessment of the performance of the sensors 502 and 504, the embedded system 506, and thereby the sensor fusion system using the combination of data measured by the sensors 502 and 504.
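Purely as an illustration of the idea that both sensors observe the same physical target, the following sketch cross-checks two range estimates and fuses them with a simple weighted average. The weights, threshold, and function names are assumptions; the specification does not prescribe a fusion algorithm.

```python
# Illustrative consistency check and fusion for a two-sensor arrangement: both
# sensors see the same target, so their range estimates are cross-checked and
# combined (simple weighted average assumed) before any action is taken.
def fuse_ranges(radar_m, lidar_m, w_radar=0.4, w_lidar=0.6):
    return w_radar * radar_m + w_lidar * lidar_m

def sensors_consistent(radar_m, lidar_m, max_disagreement_m=0.3):
    return abs(radar_m - lidar_m) <= max_disagreement_m

radar, lidar = 2.55, 2.40
if sensors_consistent(radar, lidar):
    print("fused range:", round(fuse_ranges(radar, lidar), 2), "m")
else:
    print("sensor disagreement flagged for the monitoring device")
```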
[0081] Thus, the embodiments described herein present the test system 100 that includes the apparatus 101 including various structural assemblies and actuators that are capable of accurately moving an object in three-dimensional space with precise control of a position and a velocity or a speed of the object. Specifically, the test system 100 emulates accurate three dimensional motion of objects for emulating a plurality of testing scenarios and environments for verifying and validating different types of devices under test that include one or more sensors for detecting and tracking objects and an embedded system that performs functional execution based on the detection and tracking of the objects. The embedded system 103B receives reflected sensor signals from a surface of an object, performs signal processing, determines one or more parameters of the object from the processed signals, and controls the one or more sensors 103A based on the parameters. The test system 100, thus, obviates a need of a real object on which the device under test 103 is adapted to be placed and a real target object for validating functionalities associated with the device under test (103).
[0082] Furthermore, the test system 100 is also capable of validating sensor fusion technology, where inputs from various kinds of sensors are used for detecting and tracking objects, thus allowing the test system 100 to validate functionalities of all such sensors at once. The test system 100 is also capable of emulating, inside a lab or lab-like environment, test scenarios that are very difficult to create even in a real-world setting. More specifically, the test system 100 can be used to move the object 102 in a specific trajectory repeatedly for refining the actions of the embedded system 103B based on the results from a previous test run when a desired functionality is found to be not properly implemented in the device under test 103. Thus, costs, time, and effort spent in testing the device under test 103 with a real object in the real-world scenario are reduced significantly. Although the test system 100 provides accurate 3D motion, the test system 100 is also capable of emulating only a two-dimensional motion or a one-dimensional motion of an object, if required, by disabling actuators that are not required for moving the object in one or more specific directions.
[0083] Although specific features of various embodiments of the present systems and methods may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments shown in the different figures.
[0084] While only certain features of the present systems and methods have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the claimed invention.
Description:
1. An apparatus (101) for moving an object (102), comprising:
a first set of axial guiding structures (110,120), a second set of axial guiding structures (133,143) and a fifth guiding structure (153) that are all positioned axially with respect to each other, wherein the fifth guiding structure (153) supports the object (102);
a first set of axial moving assemblies (113,123) that are moveably secured to the first set of axial guiding structures (110,120), wherein the first set of axial moving assemblies (113,123) are adapted to move the second set of axial guiding structures (133,143), the fifth guiding structure (153) and the object (102) along a first axis (115), wherein each of the first set of axial guiding structures (110,120) is stationary and is positioned at a specified distance from other axial guiding structure in the first set of axial guiding structures (110,120);
a second set of axial moving assemblies (136,146) that is moveably secured to the second set of axial guiding structures (133,143) and is adapted to move the fifth guiding structure (153) and the object (102) along a second axis (138); and
a fifth moving assembly (154) that is moveably secured to the fifth guiding structure (153) and is adapted to support the object (102), wherein the fifth moving assembly (154) is capable of moving orthogonally with respect to the first set of axial guiding structures (110,120) and the second set of axial guiding structures (133,143) for moving the object (102) along a third axis (156).

2. A system for validating functionality of a device under test, comprising:
a first set of axial guiding structures (110,120), a second set of axial guiding structures (133,143) and a fifth guiding structure (153) that are all positioned axially with respect to each other, wherein each of the first set of axial guiding structures (110,120) is stationary and is positioned at a specified distance from the other axial guiding structure in the first set of axial guiding structures (110,120), wherein the fifth guiding structure (153) supports a pseudo object (102);
one or more axial moving assemblies that are moveably secured to each of the first set of axial guiding structures (110,120), the second set of axial guiding structures (133,143) and the fifth guiding structure (153) for moving the pseudo object along one or more axes;
one or more sensors (103A) and an embedded system (103B) that are placed at a designated distance from the pseudo object (102); and
an embedded simulator (104) that is configured to
simulate one or more operating parameters associated with a virtual object, wherein the one or more operating parameters are provided as an input to the embedded system (103B), wherein the virtual object represents a model of a real object at which the one or more sensors (103A) and the embedded system (103B) are adapted to be placed;
adjust the one or more operating parameters associated with the virtual object to one or more specified values based on one or more parameters associated with the pseudo object (102) to obtain one or more updated operating parameters; and
validate at least one functionality of the one or more sensors (103A) and the embedded system (103B) based on the one or more updated operating parameters.

3. The system as claimed in claim 2, wherein the one or more axial moving assemblies comprise:
a first set of axial moving assemblies (113,123) that are moveably secured to the first set of axial guiding structures (110,120) and are adapted to move the second set of axial guiding structures (133,143), the fifth guiding structure (153) and the object (102) along a first axis;
a second set of axial moving assemblies (136,146) that are moveably secured to the second set of axial guiding structures (133,143) and are adapted to move the fifth guiding structure (153) and the object (102) along a second axis; and
a fifth moving assembly (154) that is moveably secured to the fifth guiding structure (153) and is adapted to move orthogonally with respect to the first set of axial guiding structures (110,120) and the second set of axial guiding structures (133,143) for moving the object (102) along a third axis, wherein the fifth guiding structure (153) is mounted on to the second set of axial moving assemblies (136,146) and is adapted to move with the second set of axial moving assemblies (136,146) along the second axis.

4. The system as claimed in claim 3, wherein the second set of axial moving assemblies (136,146) supports the fifth guiding structure (153) that supports the fifth moving assembly (154) and the pseudo object (102) mounted thereon such that a movement of the second set of axial moving assemblies (136,146) along the second axis also moves the fifth moving assembly (154) and pseudo object (102) along the second axis, and wherein the first set of axial moving assemblies (113,123) supports the second set of axial guiding structures (133,143) and the second set of axial moving assemblies (136,146) that support the fifth guiding structure (153) having the pseudo object (102) mounted thereon, such that, a movement of the first set of axial moving assemblies (113,123) along the first axis also moves the pseudo object (102) along the first axis.

5. The system as claimed in claim 3, further comprising: a first set of axial actuators (114,124) for simultaneously moving the first set of axial moving assemblies (113,123) through the first set of axial guiding structures (110,120); a second set of axial actuators (137,147) for simultaneously moving the second set of axial moving assemblies (136,146) through the second set of axial guiding structures (133,143); a fifth actuator (155) for moving the fifth moving assembly (154) through the fifth guiding structure (153); a first connecting member (129) that is coupled to the first set of axial moving assemblies (113,123) and is adapted to simultaneously move the first set of axial moving assemblies (113,123) together along the first axis; a second connecting member (130) that is adapted to interconnect the first set of axial actuators (114,124) that are coupled to the first set of axial guiding structures (110,120); a third connecting member (131) that is adapted to interconnect the first set of axial guiding structures (110,120); a fourth connecting member (152) that is adapted to interconnect the second set of axial actuators (137,147) that are coupled to the second set of axial guiding structures (133,143); and a front shaft (132) moveably coupled to the third connecting member (131) for positioning the one or more sensors (103A) and the embedded system (103B) at the designated distance from the pseudo object (102), wherein the embedded system (103B) is communicatively coupled to the one or more sensors (103A) and the embedded simulator (104),
wherein each of the first set of axial actuators (114,124), the second set of axial actuators (137,147), and the fifth actuator (155) comprises motor control lines (116, 117, 125, 126, 139, 140, 148, 149, 157, 158) and position identifying sensor output pins (118, 119, 127, 128, 141, 142, 150, 151, 159, 160) for identifying a corresponding position of each actuator.

6. The system as claimed in claim 5, wherein the embedded simulator (104) is configured to control positions of one or more actuators selected from the first set of axial actuators (114,124), the second set of axial actuators (137,147) and the fifth actuator (155) for moving the pseudo object (102) across at least one axis and across a desired trajectory at a desired speed, and wherein the embedded simulator (104) is further configured to control the first set of axial actuators (114,124) together for moving the first set of axial moving assemblies (113,123) through the first set of axial guiding structures (110,120), the second set of axial actuators (137,147) together for moving the second set of axial moving assemblies (136,146) through the second set of axial guiding structures (133,143), and the fifth actuator (155) separately for moving the fifth moving assembly (154) through the fifth guiding structure (153).

7. The system as claimed in claim 6, wherein the embedded simulator (104) is further configured to continuously receive positional data of the one or more actuators from the position identifying sensor output pins (118, 119, 127, 128, 141, 142, 150, 151, 159, 160) when the pseudo object (102) is under motion, and wherein the embedded simulator (104) is configured to correct one or more errors that occur in controlling positions of the one or more actuators while moving the pseudo object (102) across a three dimensional trajectory at the desired speed based on the positional data.

8. The system as claimed in claim 7, wherein the embedded simulator (104) is further configured to compare, output signals from the one or more sensors (103A) and the embedded system (103B), representing a distance between the one or more sensors (103A) and the pseudo object (102), a relative velocity and a direction of motion of the pseudo object (102) at a particular time when the pseudo object (102) is in motion, with, a physical distance between the one or more sensors (103A) and the pseudo object (102), a physical relative velocity and an actual direction of motion respectively at the particular time, for validating performance of the one or more sensors (103A).

9. The system as claimed in claim 5, wherein the first set of axial guiding structures (110,120), the first set of axial moving assemblies (113,123), the second set of axial guiding structures (133,143), the second set of axial moving assemblies (136,146), the fifth guiding structure (153), the fifth moving assembly (154), the first connecting member (129), the second connecting member (130), the third connecting member (131), and the fourth connecting member (152) are made to absorb a signal emitted from the one or more sensors (103A) such that the one or more sensors (103A) receive a corresponding reflected feedback signal only from the pseudo object (102), and wherein the embedded system (103B) is adapted to determine the one or more parameters associated with the pseudo object (102) that is supported at the fifth guiding structure (153) based on the reflected feedback signal.

10. The system as claimed in claim 9, wherein the one or more parameters associated with the pseudo object (102) comprise a position, a distance, a trajectory, a speed, a velocity, an acceleration, a change in the speed of the pseudo object (102) with respect to the device under test (103), a class, one or more physical characteristics, and a shape of the pseudo object (102), wherein the one or more operating parameters associated with the virtual object comprise one or more of a desired speed, a maximum speed limit, a minimum speed limit, a minimum distance that is to be maintained with respect to the pseudo object (102), and an advanced driver assistance system enabling mode and an advanced driver assistance system disabling mode for vehicular applications, and wherein the system is adapted to validate the one or more sensors (103A) and the embedded system (103B) that are configured to implement one or more advanced driver assistance system functionalities, the advanced driver assistance system functionalities comprising adaptive cruise control, advance emergency brake assist, and automatic parking functionality.

11. The system as claimed in claim 2, further comprising a vehicle frame (404) on which one or more sensors (402) and an embedded system (405) are placed for validating automatic parking functionality of a vehicle, wherein the embedded system (103B) corresponds to an electronic control unit in the vehicle, wherein the vehicle frame (404) is placed adjacent to an apparatus (101) that is representative of another vehicle such that the one or more sensors (402) determine relative position data corresponding to the apparatus (101) with respect to the vehicle frame (404), and wherein the embedded simulator (406) is configured to validate the automatic parking functionality of the embedded system (405) by moving one or more moving assemblies (113, 123, 136, 146, 154) of the apparatus (101) according to a steering wheel angle requested by the embedded system (405) and the relative position data so as to park the vehicle without colliding with the apparatus (101).

12. A method for validating a device under test (103), comprising:
placing a pseudo object (102) on an apparatus (101) at a specified distance from the device under test (103), wherein the device under test (103) comprises one or more sensors (103A) and an embedded system (103B) that are communicatively coupled to an embedded simulator (104);
simulating one or more operating parameters associated with a virtual object using the embedded simulator (104), wherein the one or more operating parameters are provided as an input to the embedded system (103B), wherein the pseudo object (102) is placed outside a designated area surrounding the device under test (103) such that the one or more sensors (103A) fail to detect a presence of the pseudo object (102);
moving the pseudo object (102) within the designated area surrounding the device under test (103) until the one or more sensors (103A) detect the presence of the pseudo object (102) based on a reflected feedback signal received by the one or more sensors (103A) from the pseudo object (102), wherein the embedded system (103B) determines one or more parameters associated with the pseudo object (102) based on the reflected feedback signal; and
adjusting the one or more operating parameters associated with the virtual object, using the embedded simulator (104), based on the one or more parameters determined by the embedded system (103B) to validate at least one functionality associated with the one or more sensors (103A) and the embedded system (103B).

13. The method as claimed in claim 12, wherein simulating one or more operating parameters associated with the virtual object comprises simulating a speed of the virtual object, wherein the embedded system (103B) determines if a distance between the pseudo object (102) and the device under test (103) corresponds to a predefined value, and wherein adjusting the one or more operating parameters comprises adjusting the speed of the virtual object when the distance does not correspond to the predefined value for validating a functionality of the device under test (103).

14. The method as claimed in claim 13, further comprising: providing the speed or a velocity of the virtual object as an input to the embedded system (103B) such that the embedded system (103B) determines that the device under test (103) is placed within a real object that is moving at the speed or the velocity as set in the input; and maintaining the pseudo object (102) at the specified distance from the device under test (103) such that the embedded system (103B) determines that the pseudo object (102) is also moving at a same speed or a same velocity as set in the input by a continuous detection of the pseudo object (102) at the specified distance.

15. The method as claimed in claim 12, wherein simulating one or more operating parameters associated with the virtual object comprises simulating a velocity of the virtual object, wherein the embedded system (103B) determines a change in a velocity of the pseudo object (102) emulated by actuating one or more actuators of the apparatus, and wherein adjusting the one or more operating parameters comprises adjusting the velocity of the virtual object based on the change in the velocity of the pseudo object (102) for validating a functionality of the device under test (103).

Documents

Application Documents

# Name Date
1 Power of Attorney [23-08-2016(online)].pdf 2016-08-23
2 Form 5 [23-08-2016(online)].pdf 2016-08-23
3 Form 3 [23-08-2016(online)].pdf 2016-08-23
4 Form 18 [23-08-2016(online)].pdf 2016-08-23
5 Form 18 [23-08-2016(online)].pdf_187.pdf 2016-08-23
6 Description(Complete) [23-08-2016(online)].pdf 2016-08-23
7 abstract 201641028663.jpg 2016-09-28
8 Form1_As Filed_17-04-2017.pdf 2017-04-17
9 Form5_After Filed_17-04-2017.pdf 2017-04-17
10 Power of Attorney_After Filed_17-04-2017.pdf 2017-04-17
11 Correspondence by Agent_Form1,Form5,GPOA_17-04-2017.pdf 2017-04-17
12 Form1_After Filing_13-07-2018.pdf 2018-07-13
13 Correspondence by Agent_Form1_13-07-2018.pdf 2018-07-13
14 201641028663-FER.pdf 2021-10-17
15 201641028663-CLAIMS [20-12-2021(online)].pdf 2021-12-20
16 201641028663-ENDORSEMENT BY INVENTORS [20-12-2021(online)].pdf 2021-12-20
17 201641028663-FER_SER_REPLY [20-12-2021(online)].pdf 2021-12-20
18 201641028663-FORM 3 [20-12-2021(online)].pdf 2021-12-20
19 201641028663-PETITION UNDER RULE 137 [20-12-2021(online)].pdf 2021-12-20
20 201641028663-PatentCertificate02-02-2022.pdf 2022-02-02
21 201641028663-IntimationOfGrant02-02-2022.pdf 2022-02-02

Search Strategy

1 2021-07-1013-11-17E_11-07-2021.pdf

ERegister / Renewals