
Online Calibration Between Camera and LiDAR With Spatial-Temporal Photometric Consistency

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, 2024-02, Vol. 9 (2), pp. 1027-1034
Main Authors: Jing, Yonglin; Yuan, Chongjian; Hong, Xiaoping
Format: Article
Language:English
Description
Summary: The fusion of 3D LiDAR and 2D camera data has gained popularity in the field of robotics in recent years. Extrinsic calibration is a critical issue in sensor data fusion, and poor calibration can lead to corrupt data and system failure. This letter introduces a method based on photometric consistency for detecting and recalibrating camera-LiDAR miscalibration in arbitrary environments, online and without the need for calibration targets or manual work. The method assumes that, with correct extrinsic parameters and accurate LiDAR pose estimation, the projections of each LiDAR point onto different camera images will have similar photometric values. By utilizing covisibility information, an error term based on this photometric consistency assumption is proposed, enabling the detection and correction of miscalibration. Multiple experiments were conducted using real-world data sequences.
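
To illustrate the photometric consistency idea described in the summary, the sketch below computes a per-point residual: a LiDAR point is projected into every covisible camera frame, and the variance of the sampled intensities serves as the error term, which should be small when the camera-LiDAR extrinsic is correct. This is a minimal illustration under stated assumptions (NumPy, grayscale images as arrays, 4x4 homogeneous transforms, a simple pinhole model, nearest-neighbor sampling), not the authors' actual formulation or implementation.

```python
import numpy as np

def project(point_cam, K):
    """Pinhole projection of a 3D point in the camera frame to pixel coordinates."""
    x, y, z = point_cam
    u = K[0, 0] * x / z + K[0, 2]
    v = K[1, 1] * y / z + K[1, 2]
    return u, v

def photometric_consistency_error(point_world, frames, T_cam_lidar, K):
    """Variance of image intensities observed for one LiDAR point across
    covisible frames; expected to be small when T_cam_lidar is correct.

    point_world : 3-vector, LiDAR point in the world frame.
    frames      : list of (gray_image, T_world_lidar) pairs (hypothetical structure).
    T_cam_lidar : 4x4 LiDAR-to-camera extrinsic being evaluated.
    K           : 3x3 camera intrinsic matrix.
    """
    intensities = []
    p_h = np.append(point_world, 1.0)                 # homogeneous world point
    for gray, T_world_lidar in frames:
        p_lidar = np.linalg.inv(T_world_lidar) @ p_h  # world -> LiDAR frame
        p_cam = T_cam_lidar @ p_lidar                 # LiDAR -> camera frame
        if p_cam[2] <= 0:                             # skip points behind the camera
            continue
        u, v = project(p_cam[:3], K)
        ui, vi = int(round(u)), int(round(v))
        h, w = gray.shape
        if 0 <= ui < w and 0 <= vi < h:               # keep only in-image projections
            intensities.append(float(gray[vi, ui]))
    if len(intensities) < 2:                          # not enough covisible observations
        return 0.0
    return float(np.var(intensities))                 # photometric consistency residual
```

In practice, such a residual would be summed over many LiDAR points and minimized (or monitored for drift) with respect to the extrinsic parameters; interpolated intensity sampling and robust weighting would typically replace the nearest-pixel lookup shown here.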
ISSN: 2377-3766
DOI: 10.1109/LRA.2023.3341768