I. Introduction
More than a decade after the first self-driving car won the DARPA Challenge, the interest in developing and deploying fully autonomous vehicles is in full swing. An autonomous vehicle requires reliable solutions to accurately map its surroundings, which is only possible with multi-sensor perception systems relying on a combination of radar, cameras, and light detection and ranging (LiDAR) sensors [1]–[4]. Working together, these sensors can detect the distance and speed of nearby obstacles, as well as their appearance, allowing the vehicle to safely navigate the environment and contributing to the different Society of Automotive Engineers (SAE) levels of driving automation. While Levels 0, 1, and 2 require the driver to monitor the surroundings, at higher levels the automated system monitors the entire driving environment.

The utilization of LiDAR sensors in the automotive sector is relatively recent, but LiDAR is already regarded as a key technology toward fully driverless vehicles, since it can provide high-resolution 3D representations of the surroundings in real time [5]–[7]. A LiDAR sensor works by illuminating a target with an optical pulse and measuring the characteristics of the return signal, where the target's distance is obtained from the round-trip delay of the reflected light. Although this principle is simple, applying it is not straightforward, since the technology is quite sensitive to several external disturbances.
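The time-of-flight principle described above can be sketched as follows. This is a minimal illustration only, assuming an ideal pulsed LiDAR with a directly measured round-trip delay; the function name and example value are hypothetical, and real sensors additionally apply calibration, filtering, and return-signal processing.

```python
# Illustrative sketch of the LiDAR time-of-flight principle:
# distance = (speed of light * round-trip delay) / 2.
# All names and values here are illustrative, not a real sensor API.

SPEED_OF_LIGHT = 299_792_458.0  # m/s, in vacuum

def distance_from_round_trip(delay_s: float) -> float:
    """Target distance (m) from the round-trip delay (s) of the pulse.

    The optical pulse travels to the target and back, so the
    one-way distance is half the total path length.
    """
    return SPEED_OF_LIGHT * delay_s / 2.0

# Example: a return pulse detected 100 ns after emission
# corresponds to a target roughly 15 m away.
print(f"{distance_from_round_trip(100e-9):.2f} m")
```

In practice, the measured delay is perturbed by the external disturbances mentioned above (e.g., ambient light, atmospheric conditions, and target reflectivity), which is why robust return-signal processing is required on top of this simple relation.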