I. INTRODUCTION
Multi-sensor fusion has become an essential trend in robotics. In particular, the LiDAR sensor is widely employed for its accuracy, robustness, and reliability in tasks such as localization, semantic mapping, object tracking, and detection [1]–[5]. Despite these advantages, LiDAR samples a succession of 3D points at different times, so the motion of the sensor carrier distorts the scan, an effect analogous to the rolling shutter in cameras. To address this issue, the Inertial Measurement Unit (IMU), which provides high-frequency ego-motion estimates, can serve as a complementary sensor to correct the distortion. In general, a LiDAR-IMU system combines the strengths of its component sensors and enables reliable perception in a wide range of scenarios.
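To make the distortion-correction idea concrete, the following is a minimal de-skewing sketch, assuming the IMU propagation already yields the sensor pose at the start and end of a sweep; each point is then re-expressed in the scan-end frame using a pose interpolated at the point's sample time. All names here (e.g., deskew_scan) are hypothetical illustrations, not the method of any cited work.

    import numpy as np
    from scipy.spatial.transform import Rotation, Slerp

    def deskew_scan(points, timestamps, R_start, t_start, R_end, t_end):
        """Project each LiDAR point into the scan-end frame.

        points:     (N, 3) raw points, each in the sensor frame at its own
                    sample time
        timestamps: (N,) per-point times, normalized to [0, 1] over the sweep
        R_*, t_*:   sensor orientation (scipy Rotation) and position at the
                    start/end of the sweep, expressed in a common world frame
        """
        # Interpolate the pose at each point's sample time: SLERP for the
        # rotation, linear interpolation for the translation.
        slerp = Slerp([0.0, 1.0], Rotation.concatenate([R_start, R_end]))
        R_i = slerp(timestamps)
        t_i = (1 - timestamps)[:, None] * t_start + timestamps[:, None] * t_end

        # Map each point into the world frame at its sample time, then
        # re-express it in the pose at the end of the sweep.
        p_world = R_i.apply(points) + t_i
        return R_end.inv().apply(p_world - t_end)

In practice the start and end poses come from integrating the IMU measurements over the sweep; the linear interpolation above is only a first-order approximation of the continuous trajectory between them.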