I. Introduction
In mobile robotics and autonomous driving, precise positioning and navigation are crucial. Cameras provide rich color and texture information, while LiDAR offers direct 3D range measurements. Fusing these sensors leverages their complementary strengths and improves overall system performance. LiDAR and optical cameras are therefore widely used together in SLAM [1], autonomous driving [2], and object detection [3]. However, inaccurate extrinsic parameters can severely degrade these downstream tasks, which makes precise extrinsic calibration essential.