
Improved Radar-Camera Calibration Without Rotation and Translation Matrices

Abstract: The present invention discloses a method for radar-camera calibration that eliminates the need for intrinsic and extrinsic calibration parameters such as focal length, principal axis points, rotation, and translation matrices. The method involves positioning a calibration setup comprising a trihedral aluminum corner reflector and a whiteboard, capturing radar data points and corresponding camera pixel coordinates, normalizing the collected data to minimize numerical precision issues, and applying a Direct Linear Transformation (DLT) algorithm to derive a single transformation matrix. The transformation matrix converts radar coordinates into camera pixel coordinates, enabling real-time synchronization and accurate sensor fusion. The method streamlines intrinsic camera calibration and ensures precise sensor orientation, contributing to enhanced object detection and localization in autonomous vehicles. (Figure 1)


Patent Information

Application #
Filing Date
22 December 2023
Publication Number
26/2025
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application

Applicants

NMICPS - Technology Innovation Hub on Autonomous Navigation Foundation
C/o Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.
Indian Institute of Technology (IIT) Hyderabad
A417, Academic Block A, Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India

Inventors

1. Mr. Nitish Kumar
C/o Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.
2. Prof. Rajalakshmi Pachamuthu
C/o Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.

Specification

DESC:PRIORITY CLAIM:

[0001] This application claims priority from the provisional application number 202341079326, filed with the Indian Patent Office, Chennai, on 22nd November 2023 and postdated to 22nd December 2023, entitled “Improved Radar-Camera Calibration Without Rotation and Translation Matrices”, the entirety of which is expressly incorporated herein by reference.
PREAMBLE TO THE DESCRIPTION:

[0002] The following specification particularly describes the invention and the manner in which it is to be performed:
DESCRIPTION OF THE INVENTION
Technical field of the invention
[0003] The present invention generally relates to the field of advanced driver assistance systems for autonomous vehicles. More particularly, it pertains to a method to enhance the accuracy of the radar-camera calibration process without introducing any rotation and translation matrices.
Background of the invention
[0004] Numerous driving assistance systems, commonly referred to as assistants, are utilized in motor vehicles. These systems rely on video images captured by a single camera or multiple cameras positioned, for instance, at the front and/or rear of the vehicle. Their primary functions include detecting road lane markings, road boundaries, obstacles, other road users, and surveying or displaying the frontal and/or rear areas, especially during tasks like parking. However, a challenge arises due to the inherent lack of absolute fixation of the camera position. Tolerances exist during camera manufacturing and assembly into the vehicle. Crucially, the camera's position may change during the vehicle's lifespan, influenced by real-world conditions such as vibrations from rough roads, door impacts, car washes, repairs, part replacements, and the movement of pivoting side-view mirror housings. These factors collectively contribute to variations in camera position, including angular orientation, which can result in camera decalibration.
[0005] As the automotive industry progresses towards autonomy, precise sensor calibration becomes a fundamental aspect of the vehicle perception system. The calibration of radar and camera systems in autonomous and commercial vehicles is a critical step towards achieving a reliable and robust perception framework. The seamless integration of data from these sensors enhances the accuracy of object detection, localization, and tracking, ultimately contributing to the safety and efficiency of the vehicle.
[0006] The existing state-of-the-art method for radar-camera calibration is extrinsic calibration. This process involves coordinating the transformation from one sensor to another, conducting intrinsic calibration for each individual sensor, determining precise mounting parameters, calculating rotation and translation parameters for each axis, and subsequently developing matrices that handle rotation, translation, and the conversion of radar points to pixel coordinates. However, this method is tedious, and the re-projection error is notably significant, representing a major drawback.
[0007] Intrinsic camera calibration involves utilizing a set of checkerboard images captured in various orientations. These images are then processed using the camera calibrator toolbox in MATLAB to derive the intrinsic calibration matrix. Additionally, obtaining the relative orientation of the sensors requires precise mounting parameters for each sensor. Given that the camera and radar possess distinct coordinate frames, it becomes essential to determine suitable rotation and translation shifts.
[0008] Several attempts have been made to address these challenges. For instance, reference has been made to "Extrinsic parameter calibration of 2D radar-camera using point matching and generative optimization" by Deokkyu Kim and Sungho Kim, disclosing a method of calibration between a 2D radar and a camera using point matching. A corner-reflector calibration target is used to focus the radar signal at the center of the target. This method estimates the extrinsic parameters using point matching and a top-down method.
[0009] Reference has also been made to "Accurate and Automatic Extrinsic Calibration for a Monocular Camera and Heterogeneous 3D LiDARs" by Xingxing Li, Feiyang He, Shengyu Li, and Yuxuan Zhou, disclosing a calibration board with checkerboard grids and circular holes, through which the proper extrinsic parameters can be obtained automatically by matching the circular hole centers extracted from both the images and the heterogeneous LiDAR scans. The proposed calibration method is designed for LiDARs with different precision and scanning modes, such as Velodyne LiDAR, Ouster LiDAR, and Livox LiDAR.
[0010] Reference has further been made to "A Method of Spatial Calibration for Camera and Radar" by Dezhi Gao, Jianmin Duan, Xining Yang, and Banggui Zheng, disclosing a method for the spatial calibration of the radar and the camera. To facilitate this transformation, coordinates pertinent to the radar and camera are introduced as constraints. Subsequently, by transforming features between the corresponding image coordinates and relative radar coordinates, calibration parameters are determined through a least-square error function.
[0011] Reference has further been made to "3D Radar and Camera Co-Calibration: A Flexible and Accurate Method for Target-based Extrinsic Calibration" by Lei Cheng, Arindam Sengupta, and Siyang Cao, disclosing a method for the extrinsic calibration of a 3D radar and a camera. This entails employing a single corner reflector (CR) on the ground to iteratively capture radar and camera data concurrently using the Robot Operating System (ROS). The process involves acquiring radar-camera point correspondences based on their timestamps, utilizing these correspondences as input to solve the perspective-n-point (PnP) problem, and ultimately deriving the extrinsic calibration matrix.
[0012] The Patent Application No. CN115641380A, entitled "Camera and radar multi-angle combined external reference calibration method and system under rotation condition", discloses a method and a system for calibrating the multi-angle combined external parameters of a camera and a radar under a rotating condition. The method comprises the following steps: acquiring target images from the camera at horizontal angles of 0 and 180 degrees, together with the corresponding target point clouds at those two angles; converting the coordinates of the target point clouds at the two horizontal angles so that they are projected into the target images at the corresponding angles, and superimposing the point-cloud projection points on the image pixel points to obtain the coordinate conversion matrices of the camera and the radar at the two horizontal angles, namely the calibrated extrinsic matrices; and, after the camera has rotated by a certain angle, acquiring a translation matrix and a rotation matrix relative to the horizontal angle and obtaining the extrinsic parameter matrix at any rotation angle from the calibrated extrinsic matrix, the translation matrix, and the rotation matrix, thereby realizing multi-angle joint calibration of the camera and the radar.
[0013] In this context, the previously discussed technical challenge of camera decalibration during real-world use leads to an inaccurate or incorrect camera position relative to the vehicle. This, in turn, may lead to imprecise or erroneous input data for driver assistance system applications. Therefore, there is a need for an improved method that simplifies the radar and camera calibration process without introducing rotation and translation matrices, providing a straightforward approach to generate a calibration matrix with minimal effort.
SUMMARY OF THE INVENTION
[0014] The present invention addresses the limitations of the prior art by disclosing a method for calibrating radar and camera sensors in a vehicle that overcomes the challenge of decalibration and ensures accurate data for driver assistance systems. The method simplifies the calibration process by eliminating the need for complex rotation and translation matrices, employing instead a single transformation matrix derived through a Direct Linear Transformation (DLT) algorithm.
[0015] The calibration process involves the use of a trihedral aluminum corner reflector for precise radar and camera data alignment. Radar data points are matched with corresponding camera image points, which are manually determined and used as ground truth. These matched points are normalized to adjust scale and prevent numerical errors, enhancing the reliability of the calibration process. The Direct Linear Transformation (DLT) algorithm, enhanced through normalization, generates a 3x4 transformation matrix that directly maps radar coordinates to corresponding camera pixel coordinates.
[0016] The method ensures real-time radar-to-camera synchronization, enabling seamless integration of data for object detection, tracking, and localization. The method significantly reduces calibration complexity, minimizes re-projection errors, and improves the accuracy and reliability of radar-camera sensor fusion, contributing to safer and more efficient driver assistance systems.
BRIEF DESCRIPTION OF THE DRAWINGS:
[0017] The foregoing and other features of embodiments will become more apparent from the following detailed description of embodiments when read in conjunction with the accompanying drawings. In the drawings, like reference numerals refer to like elements.
[0001] FIG 1 illustrates a flowchart of a method for improved radar-camera calibration without the need for rotation and translation matrices, in accordance with an embodiment of the invention.

[0002] FIG. 2 illustrates a calibration setup utilized for radar-camera calibration, in accordance with an embodiment of the invention.

[0003] FIG 3 illustrates a plot of ground truth for radar point projection vs predictions from calibration matrix, in accordance with an embodiment of the invention.

[0004] FIG 4 illustrates an average Euclidean distance error comparison analysis for the Direct Linear Transformation (DLT) method and the Normalized Direct Linear Transformation (NDLT) method, in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF THE INVENTION:
[0018] In order to describe and point out the subject matter of the invention, the following definitions are provided for specific terms, which are used in the following written description more clearly and concisely.
[0019] The term "Radar Sensor" refers to a radar system capable of detecting objects by emitting radio waves and interpreting the reflected signals to determine the position and movement of objects in the environment.
[0020] The term "Corner Reflector" refers to a reflector specifically designed to reflect radar signals back to their source with high accuracy.
[0021] The term "Direct Linear Transformation (DLT)" refers to a mathematical algorithm that establishes a direct relationship between radar coordinates and camera pixel coordinates using a transformation matrix, without requiring rotation or translation matrices.
[0022] The term "Normalization" refers to a preprocessing step that adjusts the scale and centers the radar and camera data points to optimize their numerical properties, reducing errors during the transformation process.
[0023] The term "Transformation Matrix" refers to a single (3x4) matrix derived from the calibration process, which converts radar coordinates into camera pixel coordinates in real time.
[0024] The term "Ground Truth" refers to manually identified pixel coordinates in the camera image that correspond to radar points on the corner reflector, providing a reliable reference for aligning radar and camera data.
[0025] The present invention discloses a method to improve the accuracy of the radar-camera calibration process without the need for rotation and translation matrices. A single projection matrix is employed for both radar and camera calibration, facilitating real-time projection. The calibration process is based upon the Direct Linear Transformation (DLT) algorithm, preceded by a normalization or pre-conditioning step applied to the radar data points that are to be projected onto the camera image. The employment of DLT simplifies the calibration process, yielding a single (3x4) matrix that directly translates radar points into camera pixel coordinates. The method minimizes the effort required for the calibration of radar and camera, providing an easy and straightforward approach to generate a calibration matrix that seamlessly performs the calibration task.
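By way of a non-limiting, illustrative sketch (not part of the original disclosure), the projection described above reduces to a single homogeneous matrix multiplication: a radar point (X, Y, Z) is augmented to (X, Y, Z, 1), multiplied by the (3x4) calibration matrix, and de-homogenized to obtain the pixel (u, v). The following Python sketch assumes NumPy and uses placeholder matrix values:

import numpy as np

# Hypothetical (3x4) calibration matrix for a camera looking along the radar X-axis.
# The numerical values are placeholders, not taken from the disclosure.
P = np.array([
    [640.0, -800.0,    0.0, 0.0],
    [360.0,    0.0, -800.0, 0.0],
    [  1.0,    0.0,    0.0, 0.0],
])

def radar_to_pixel(P, radar_point):
    """Project a radar point (X, Y, Z) to camera pixel coordinates (u, v)."""
    X = np.append(np.asarray(radar_point, dtype=float), 1.0)  # homogeneous radar point
    u, v, w = P @ X                                            # homogeneous pixel coordinates
    return u / w, v / w                                        # de-homogenize

# Example: project a corner-reflector detection roughly 12 m ahead of the radar.
print(radar_to_pixel(P, (12.0, -1.5, 0.4)))  # -> approximately (740.0, 333.3)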
[0026] The present invention eliminates the need for intrinsic calibrations of individual sensors, along with rotation and translation vectors. The normalization of DLT further reduces the re-projection error significantly, enhancing the accuracy and reliability of the calibration methodology for sensor fusion. The seamless integration of data from the radar sensors and cameras enhances the accuracy of object detection, localization, and tracking, ultimately contributing to the safety and efficiency of the vehicle.
[0027] The calibration method involves the utilization of trihedral aluminum corner reflectors, renowned for their efficacy with automotive-grade 77 GHz radar. The world coordinates in the Direct Linear Transformation (DLT) process are derived from the radar reflector points generated by the corner reflectors. The calibration setup comprises a corner reflector and a whiteboard, with the latter facilitating the camera's identification of the corner reflector at extended distances. The pixel coordinates corresponding to the corner reflector in the camera image are manually determined and serve as the ground truth for radar point transformation. The radar and camera data points are normalized and subsequently fed into the DLT algorithm to generate a single projection matrix employed for radar-to-camera projection.
[0028] FIG 1 illustrates a flowchart of a method for improved radar-camera calibration without the need for rotation and translation matrices, in accordance with an embodiment of the invention. The method (100) comprises, in step (101), positioning a calibration setup within an environment, the setup comprising a trihedral aluminum corner reflector and a whiteboard. A radar sensor and a camera are directed towards the corner reflector, such that the corner reflector reflects multiple radar signals and data from both the radar sensor and the camera is captured from a common spatial location, thereby ensuring synchronized data collection. In step (102), the radar data points corresponding to specific points on the corner reflector are captured by the radar sensor. The captured radar data points are used to generate radar point coordinates in a radar coordinate system. In step (103), multiple camera images are captured, corresponding to the same points on the corner reflector. Further, pixel coordinates within the captured camera images are manually identified. These pixel coordinates serve as the ground truth for the transformation of radar coordinates into camera pixel coordinates, providing a reference for aligning the radar data with the camera data. In step (104), the radar data points and camera pixel coordinates are preprocessed by normalizing the coordinates. The normalization process minimizes numerical precision issues, thereby enhancing the accuracy of the transformation process. In step (105), the normalized radar data points and camera pixel coordinates are fed into a Direct Linear Transformation (DLT) algorithm. The DLT algorithm generates a transformation matrix that directly converts radar coordinates into camera pixel coordinates. The resulting transformation matrix is a 3x4 matrix that facilitates real-time calibration and synchronization of radar and camera data. Further, in step (106), the transformation matrix derived from the DLT algorithm is applied in real time, enabling the conversion of radar data into camera pixel coordinates. More particularly, the real-time conversion allows for synchronized tracking and detection of objects by the radar sensors and camera, thereby improving sensor fusion for object detection, localization, and tracking in applications such as autonomous vehicles and Advanced Driver Assistance Systems (ADAS).
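A minimal sketch of step (105), assuming the standard linear-algebra formulation of the DLT: each radar-pixel correspondence contributes two rows to a homogeneous system A·p = 0, which is solved via Singular Value Decomposition (SVD), and the right singular vector associated with the smallest singular value is reshaped into the (3x4) matrix. The function and variable names are illustrative and not taken from the disclosure:

import numpy as np

def dlt_calibration(radar_pts, pixel_pts):
    """Estimate a (3x4) matrix P such that [u, v, 1]^T ~ P @ [X, Y, Z, 1]^T.

    radar_pts : (N, 3) radar coordinates of the corner reflector
    pixel_pts : (N, 2) manually identified pixel coordinates (ground truth)
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(radar_pts, pixel_pts):
        # Two equations per correspondence, obtained by requiring the projected
        # point and the observed pixel to coincide up to scale.
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)        # (2N, 12) homogeneous linear system
    _, _, Vt = np.linalg.svd(A)              # least-squares solution of A p = 0
    P = Vt[-1].reshape(3, 4)                 # singular vector of the smallest singular value
    return P / np.linalg.norm(P)             # fix the arbitrary overall scale

At least six non-degenerate correspondences are required to constrain the eleven degrees of freedom of the matrix; the example in this disclosure uses on the order of 120-125 radar-camera pairs.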
[0029] FIG. 2 illustrates a calibration setup utilized for radar-camera calibration, in accordance with an embodiment of the invention. The calibration setup comprises a trihedral aluminum corner reflector with a base length in the range of 5 to 6 inches, optimized for automotive-grade 77 GHz radar systems. Further, the calibration setup comprises a whiteboard positioned in the calibration environment to facilitate long-range detection and identification of the corner reflector by the camera sensor. The corner reflector is mounted on a tripod stand to ensure stability and precise positioning within the calibration setup. The radar sensors and cameras are mounted on a vehicle and are directed towards the corner reflector to capture radar signals and corresponding camera images. The calibration setup ensures accurate calibration by generating radar point coordinates and image pixel coordinates, which serve as input for the Direct Linear Transformation (DLT) algorithm.
[0030] FIG 3 illustrates a plot of ground truth for radar point projection versus predictions from the calibration matrix, in accordance with an embodiment of the invention. The plot shows the image pixel points (represented as grey points) that serve as the ground truth for radar point projection, and the radar points obtained by multiplying the radar data with the derived calibration matrix (represented as black points). The figure clearly shows that the radar points (black points) closely align with the image pixel points (grey points), indicating an almost perfect projection of radar coordinates onto the corresponding camera image, thereby demonstrating the accuracy and effectiveness of the calibration matrix generated using the Direct Linear Transformation (DLT) algorithm with normalization.
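A plotting sketch of the kind of overlay FIG 3 describes, assuming Matplotlib is available; the data arrays and function name are illustrative only:

import numpy as np
import matplotlib.pyplot as plt

def plot_projection_overlay(gt_pixels, projected_pixels):
    """Overlay ground-truth pixels (grey) and projected radar points (black), as in FIG 3."""
    gt = np.asarray(gt_pixels, dtype=float)
    pr = np.asarray(projected_pixels, dtype=float)
    plt.scatter(gt[:, 0], gt[:, 1], c="grey", label="ground truth (image pixels)")
    plt.scatter(pr[:, 0], pr[:, 1], c="black", marker="x", label="projected radar points")
    plt.gca().invert_yaxis()                 # image convention: origin at the top-left corner
    plt.xlabel("u (pixels)")
    plt.ylabel("v (pixels)")
    plt.legend()
    plt.show()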
[0031] According to the present invention, the radar points are accurately projected onto the object in the camera image. Further, the normalization comprises centering and scaling the radar data points and camera pixel coordinates using the mean vectors of the radar data points and camera pixel coordinates for accurate transformation.
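A sketch of this normalization step, assuming the customary similarity-transform pre-conditioning: each point set is centered on its mean vector and scaled so that the average distance from the origin is the square root of the point dimensionality. The scaling constant and helper names are assumptions, not specified in the disclosure:

import numpy as np

def normalization_transform(points):
    """Build a homogeneous similarity transform that centers the points on their
    mean vector and scales them to a unit-order average distance from the origin."""
    pts = np.asarray(points, dtype=float)
    dim = pts.shape[1]                                # 2 for pixel points, 3 for radar points
    mean = pts.mean(axis=0)                           # mean vector used for centering
    scale = np.sqrt(dim) / np.mean(np.linalg.norm(pts - mean, axis=1))
    T = np.eye(dim + 1)
    T[:dim, :dim] *= scale
    T[:dim, dim] = -scale * mean
    return T                                          # (dim+1) x (dim+1) transform

def apply_transform(T, points):
    """Apply a homogeneous transform to a set of points."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    out = (T @ homog.T).T
    return out[:, :-1] / out[:, -1:]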
[0032] In one embodiment of the present invention, the calibration process involves a corner reflector setup placed in front of the radar, using aluminum corner radar reflectors with a base length of 6 inches and a whiteboard for long-range calibration. A tripod stand is employed to hold the corner reflector in position. In the test calibration setup and the test vehicle setup, the camera and radar sensors are mounted on the vehicle.

[0033] FIG 4 illustrates the average Euclidean distance error comparison analysis for the Direct Linear Transformation (DLT) method and the Normalized Direct Linear Transformation (NDLT) method, in accordance with an embodiment of the invention. The horizontal axis represents the number of radar-camera data pairs (K), while the vertical axis indicates the Mean Absolute Error (MAE), measured as the average Euclidean distance between the ground truth points and the projected points. The graph demonstrates that the DLT method exhibits higher error values across the dataset, particularly when fewer radar-camera data pairs are employed for calibration. This elevated error is attributed to the absence of normalization in the DLT method, which results in numerical instability and reduced precision during the derivation of the transformation matrix. While the DLT method shows a gradual reduction in error as the number of data pairs increases, its performance remains inferior to the NDLT method throughout the analysis. Conversely, the NDLT method achieves consistently lower error values, irrespective of the number of radar-camera data pairs utilized. This improved performance is facilitated by the incorporation of normalization and optimization techniques, which address numerical precision issues and enhance the accuracy of the calibration process.
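The error metric of FIG 4 can be expressed compactly; the following sketch assumes that the metric is exactly the mean Euclidean distance between corresponding ground-truth and projected pixel points:

import numpy as np

def mean_euclidean_error(gt_pixels, projected_pixels):
    """Average Euclidean distance between ground-truth and projected pixel points (the MAE of FIG 4)."""
    gt = np.asarray(gt_pixels, dtype=float)
    pr = np.asarray(projected_pixels, dtype=float)
    return float(np.mean(np.linalg.norm(gt - pr, axis=1)))

# Evaluated for increasing numbers K of radar-camera pairs, once with the plain DLT matrix and
# once with the normalized (NDLT) matrix, this metric yields the comparison curves of FIG 4.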

[0034] Having generally described this invention, a further understanding can be obtained by reference to a specific example, which is provided herein for the purpose of illustration only and is not intended to be limiting unless otherwise specified.
Example 1: Analysis of Radar-to-Camera Calibration Using the Proposed Invention

[0035] As an illustrative example, consider an efficient radar-camera calibration method utilizing a single transformation matrix to project radar coordinates into camera pixel coordinates. The calibration setup comprises a trihedral aluminum corner reflector and a whiteboard, wherein radar and camera sensors mounted on a vehicle capture radar data points and corresponding camera images from a common spatial location. The corner reflector facilitates precise identification of points in the camera image, which are manually selected as ground truth for alignment.

[0036] The radar and camera data points undergo preprocessing through normalization to minimize numerical precision issues. The normalized data points are processed using the Direct Linear Transformation (DLT) algorithm, solved via Singular Value Decomposition (SVD), to derive a 3x4 transformation matrix. The transformation matrix directly translates radar coordinates into camera pixel coordinates, enabling accurate and real-time object detection, localization, and tracking without requiring separate rotation and translation matrices.
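Combining the earlier illustrative sketches, a hedged end-to-end outline of the normalized DLT pipeline described in this example: both point sets are pre-conditioned, the DLT is solved via SVD on the normalized data, and the resulting matrix is mapped back to the original coordinates by composing it with the inverse image transform and the radar transform. The helper functions are the hypothetical ones introduced above, not functions named in the disclosure:

import numpy as np

def normalized_dlt(radar_pts, pixel_pts):
    """Normalized DLT (NDLT): precondition both point sets, solve the DLT via SVD, denormalize."""
    T_radar = normalization_transform(radar_pts)      # 4x4 similarity transform (3D radar points)
    T_pixel = normalization_transform(pixel_pts)      # 3x3 similarity transform (2D pixel points)
    P_norm = dlt_calibration(apply_transform(T_radar, radar_pts),
                             apply_transform(T_pixel, pixel_pts))
    P = np.linalg.inv(T_pixel) @ P_norm @ T_radar     # map back to the original coordinate frames
    return P / np.linalg.norm(P)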

[0037] The method demonstrates high accuracy, validated through iterative calibration with 120 to 125 radar-camera data pairs at varying ranges and azimuths. The optimized projection of radar points onto camera coordinates using the Normalized Direct Linear Transformation (NDLT) ensures minimal error, as depicted through the average Euclidean distance error comparison. The invention significantly enhances real-time sensor fusion for applications in autonomous vehicles and Advanced Driver Assistance Systems (ADAS), facilitating precise and synchronized detection and tracking of objects.

[0038] The present method reduces the re-projection error and also offers a faster and more efficient alternative to existing extrinsic calibration methods. The efficiency of the radar-to-camera calibration process is underscored by the requirement for only a single projection matrix to seamlessly convert radar points into corresponding camera pixels. The method can be employed with ease for different sets of sensors, necessitating only the adjustment of the corner reflector points from the radar and their associated pixel coordinates. The present method eliminates the necessity for intrinsic camera calibration to determine focal length and principal axis points, as well as extrinsic calibration for obtaining rotation and translation matrices. Further, by normalizing the collected radar and camera data and applying an optimization algorithm, the method achieves a substantial reduction in re-projection error, thereby enhancing the overall accuracy and precision of the calibration process.

CLAIMS: We claim:
1. A method for improved radar-camera calibration without a rotation matrix and a translation matrix, the method (100) comprising the steps of:
a. positioning a calibration setup comprising a trihedral aluminum corner reflector and a whiteboard in an environment, wherein one or more radar sensors and one or more cameras are directed towards the trihedral aluminum corner reflector to reflect a plurality of radar signals, and data from both the one or more radar sensors and the one or more cameras is captured from a common spatial location;
b. capturing a plurality of radar data points corresponding to the points on the trihedral aluminum corner reflector, thereby generating a plurality of radar point coordinates in a radar coordinate system;
c. capturing a plurality of camera images corresponding to the radar data points on the trihedral aluminum corner reflector and manually identifying pixel coordinates within the camera images;
d. preprocessing the radar data points and the camera pixel coordinates by normalizing, thereby minimizing numerical precision issues;
e. applying a Direct Linear Transformation (DLT) algorithm to the normalized radar data points and the camera pixel coordinates to derive a transformation matrix, wherein the transformation matrix converts radar coordinates into camera pixel coordinates, enabling real-time calibration and synchronization.

2. The method (100) as claimed in claim 1, wherein the camera pixel coordinates corresponding to the trihedral aluminum corner reflector in the camera images act as a ground truth for transformation of radar data points.

3. The method (100) as claimed in claim 1, wherein the normalization comprises centering and scaling the radar data points and camera pixel coordinates using mean vectors of the radar data points and camera pixel coordinates for accurate transformation.

4. The method (100) as claimed in claim 1, wherein the transformation matrix is applied in real-time to project radar data points onto corresponding objects in the camera image, facilitating synchronized and accurate object detection, localization, and tracking for vehicles.

5. The method (100) as claimed in claim 1, wherein the positioning of the calibration setup comprises capturing a plurality of radar-camera data pairs at different ranges and azimuths during multiple calibration iterations, thereby ensuring the acquisition of a comprehensive data set for accurate calibration.

Documents

Application Documents

# Name Date
1 202341079326-PROVISIONAL SPECIFICATION [22-11-2023(online)].pdf 2023-11-22
2 202341079326-PROOF OF RIGHT [22-11-2023(online)].pdf 2023-11-22
3 202341079326-POWER OF AUTHORITY [22-11-2023(online)].pdf 2023-11-22
4 202341079326-FORM 1 [22-11-2023(online)].pdf 2023-11-22
5 202341079326-DRAWINGS [22-11-2023(online)].pdf 2023-11-22
6 202341079326-APPLICATIONFORPOSTDATING [18-11-2024(online)].pdf 2024-11-18
7 202341079326-FORM-5 [20-12-2024(online)].pdf 2024-12-20
8 202341079326-FORM 3 [20-12-2024(online)].pdf 2024-12-20
9 202341079326-DRAWING [20-12-2024(online)].pdf 2024-12-20
10 202341079326-COMPLETE SPECIFICATION [20-12-2024(online)].pdf 2024-12-20
11 202341079326-FORM 18 [10-04-2025(online)].pdf 2025-04-10
12 202341079326-RELEVANT DOCUMENTS [18-11-2025(online)].pdf 2025-11-18
13 202341079326-POA [18-11-2025(online)].pdf 2025-11-18
14 202341079326-FORM 13 [18-11-2025(online)].pdf 2025-11-18