
Online Calibration Between Camera and LiDAR With Spatial-Temporal Photometric Consistency


Abstract:

The fusion of 3D LiDAR and 2D camera data has gained popularity in the field of robotics in recent years. Extrinsic calibration is a critical issue in sensor data fusion: poor calibration can lead to corrupt data and system failure. This letter introduces a method based on photometric consistency for detecting and recalibrating camera-LiDAR miscalibrations in arbitrary environments, online and without the need for calibration targets or manual work. We make the assumption that, with correct extrinsic parameters and accurate LiDAR pose estimation, the projections of each LiDAR point onto different camera images will have similar photometric values. By utilizing covisibility information, an error term based on this photometric consistency assumption is proposed, enabling the detection and correction of miscalibration. Multiple experiments were conducted using real-world data sequences.
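The core assumption above can be turned into a concrete cost: project each LiDAR point into every camera frame that observes it and penalize disagreement among the sampled intensities. A minimal sketch follows; the function names, the per-point variance aggregation, and the nearest-neighbor pixel sampling are illustrative assumptions, not the letter's exact formulation.

```python
import numpy as np

def project(points_cam, K):
    """Project 3D points in the camera frame to pixels (pinhole model)."""
    uv = (K @ points_cam.T).T
    return uv[:, :2] / uv[:, 2:3]

def photometric_error(points_world, T_cam_lidar, poses, images, K):
    """Sum of per-point photometric variance across covisible frames.

    points_world : (N, 3) LiDAR points expressed in the world frame
    T_cam_lidar  : 4x4 extrinsic (LiDAR -> camera) under test
    poses        : 4x4 LiDAR poses T_world_lidar (LiDAR -> world), one per frame
    images       : grayscale images aligned with `poses`
    """
    total = 0.0
    for p in points_world:
        values = []
        p_h = np.append(p, 1.0)                      # homogeneous coordinates
        for T_world_lidar, img in zip(poses, images):
            # world -> LiDAR frame -> camera frame
            p_cam = T_cam_lidar @ np.linalg.inv(T_world_lidar) @ p_h
            if p_cam[2] <= 0:                        # behind the camera
                continue
            u, v = project(p_cam[None, :3], K)[0]
            h, w = img.shape
            if 0 <= u < w and 0 <= v < h:
                values.append(img[int(v), int(u)])   # nearest-neighbor sample
        if len(values) >= 2:                         # point is covisible
            total += np.var(values)                  # low variance = consistent
    return total
```

With correct extrinsics the sampled intensities of a static point agree across frames and the error stays near zero; a miscalibrated `T_cam_lidar` shifts the projections onto unrelated pixels and inflates the variance, which is what makes the term usable both for detection and, by minimizing it, for correction.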
Published in: IEEE Robotics and Automation Letters ( Volume: 9, Issue: 2, February 2024)
Page(s): 1027 - 1034
Date of Publication: 12 December 2023


I. Introduction

In recent years, the combination of 3D LiDAR (Light Detection and Ranging) and cameras has become common in the fields of autonomous driving and mobile robotics. LiDAR sensors, with their good 3D ranging capability, are extensively used in applications involving area mapping [1], object tracking [2], obstacle avoidance [3], [4], [5] and other range-critical tasks, but they usually have lower angular resolution than cameras. Cameras, conversely, offer good angular resolution but cannot easily capture range information, because monocular cameras discard range during the imaging process. When multiple cameras are used to recover range through methods like triangulation, the range quality is typically lower than LiDAR's, particularly in large outdoor scenarios. Consequently, the fusion of 3D LiDAR data and 2D camera data has become an attractive solution in various applications, leveraging the complementary strengths of cameras and LiDAR sensors [6], [7].
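The fusion step this introduction motivates reduces, at its simplest, to transforming LiDAR points through the extrinsic into the camera frame and sampling the image. The sketch below illustrates that single-frame operation under assumed conventions (points in the LiDAR frame, a pinhole intrinsic matrix `K`); the function name and return format are hypothetical.

```python
import numpy as np

def colorize_lidar(points_lidar, T_cam_lidar, K, image):
    """Fuse LiDAR geometry with camera data: map each LiDAR point into
    the camera frame via the extrinsic T_cam_lidar, project it with the
    pinhole intrinsics K, and sample the pixel it lands on.

    Returns (kept_points, values) for points visible in the image.
    """
    n = points_lidar.shape[0]
    p_h = np.hstack([points_lidar, np.ones((n, 1))])   # homogeneous coords
    p_cam = (T_cam_lidar @ p_h.T).T[:, :3]             # LiDAR -> camera frame
    in_front = p_cam[:, 2] > 0                         # drop points behind camera
    uv = (K @ p_cam[in_front].T).T
    uv = uv[:, :2] / uv[:, 2:3]                        # perspective division
    h, w = image.shape[:2]
    u = uv[:, 0].astype(int)
    v = uv[:, 1].astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)       # inside image bounds
    return points_lidar[in_front][ok], image[v[ok], u[ok]]
```

Everything downstream, from colorized maps to the photometric consistency check proposed in this letter, depends on `T_cam_lidar` being correct, which is why online detection of extrinsic drift matters.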

