
R²LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping


Abstract:

In this letter, we propose a robust, real-time, tightly-coupled multi-sensor fusion framework that fuses measurements from a LiDAR, an inertial sensor, and a visual camera to achieve robust and accurate state estimation. Our proposed framework is composed of two parts: filter-based odometry and factor graph optimization. To guarantee real-time performance, we estimate the state within the framework of an error-state iterated Kalman filter, and further improve the overall precision with our factor graph optimization. Taking advantage of measurements from all individual sensors, our algorithm is robust to various visual-failure and LiDAR-degenerated scenarios, and is able to run in real time on an on-board computation platform, as shown by extensive experiments conducted in indoor, outdoor, and mixed environments of different scales (see the attached video: https://youtu.be/9lqRHmlN_MA). Moreover, the results show that our proposed framework can improve the accuracy of state-of-the-art LiDAR-inertial or visual-inertial odometry. To share our findings and contribute to the community, we open-source our code on GitHub: https://github.com/hku-mars/r2live.
Published in: IEEE Robotics and Automation Letters (Volume: 6, Issue: 4, October 2021)
Page(s): 7469 - 7476
Date of Publication: 08 July 2021


I. Introduction

With the capacity of estimating ego-motion in six degrees of freedom (DOF) and simultaneously building dense, high-precision maps of the surrounding environment, LiDAR-based SLAM has been widely applied in the fields of autonomous driving vehicles [1], drones [2], [3], etc. With the development of LiDAR technologies, the emergence of low-cost LiDARs (e.g., Livox LiDAR [4]) makes LiDAR more accessible. Following this trend, a number of related works [5]–[9] have drawn the attention of the community to this field of research. However, the accuracy of LiDAR-based SLAM methods degrades significantly, or the methods even fail, in scenarios with few available geometric features, a problem that is especially critical for LiDARs with a small FoV [10]. In such scenarios, adding visual features can increase the system's robustness and accuracy. In this work, we propose a LiDAR-inertial-visual fusion framework to obtain state estimation of higher robustness and accuracy. The main contributions of our work are:

We develop a tightly-coupled LiDAR-inertial-visual system for real-time state estimation and mapping. Building on several key techniques from current state-of-the-art LiDAR-inertial and visual-inertial navigation systems, the system consists of a high-rate filter-based odometry and a low-rate factor graph optimization. The filter-based odometry fuses the measurements of LiDAR, inertial, and camera sensors within an error-state iterated Kalman filter to achieve real-time performance (a generic sketch of this iterated update is given after this list). The factor graph optimization refines a local map of keyframe poses and visual landmark positions.

We conduct extensive experiments showing that the developed system runs reliably in challenging scenarios with aggressive motion, sensor failure, and even in narrow tunnel-like environments with a large number of moving objects and a small LiDAR field of view. It achieves more accurate and robust results than existing baselines and is accurate enough to reconstruct large-scale, indoor-outdoor dense 3D maps of building structures (see Fig. 1).

We open-source the system, which could benefit the whole robotic community and serve as a baseline for comparison in this field of research.
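To make the filter-based odometry concrete: the iterated measurement update of an error-state iterated Kalman filter can be interpreted as a Gauss-Newton iteration on the state [29]. The snippet below is a minimal, illustrative sketch of such an iterated update for a generic vector-valued state, written with NumPy; it is not the released R²LIVE implementation, which operates on the error state of a manifold-valued state (attitude, position, velocity, IMU biases, extrinsics) and stacks LiDAR plane residuals and visual reprojection residuals into the measurement model. All function names and signatures here are assumptions made for illustration.

```python
import numpy as np

def iterated_kf_update(x_prior, P_prior, z, h, H_jac, R, max_iters=5, tol=1e-6):
    """Iterated Kalman-filter measurement update (Gauss-Newton view, cf. [29]).

    x_prior : (n,)   predicted state mean (e.g., from IMU propagation)
    P_prior : (n, n) predicted covariance
    z       : (m,)   stacked measurement vector
    h       : callable, x -> (m,) predicted measurement
    H_jac   : callable, x -> (m, n) measurement Jacobian evaluated at x
    R       : (m, m) measurement noise covariance
    """
    x = x_prior.copy()
    K, H = None, None
    for _ in range(max_iters):
        H = H_jac(x)                              # re-linearize at the current iterate
        S = H @ P_prior @ H.T + R                 # innovation covariance
        K = P_prior @ H.T @ np.linalg.inv(S)      # Kalman gain
        # Iterated update: residual referred back to the prior linearization point.
        dx = (x_prior - x) + K @ (z - h(x) - H @ (x_prior - x))
        x = x + dx
        if np.linalg.norm(dx) < tol:              # stop once the iterate converges
            break
    P = (np.eye(len(x)) - K @ H) @ P_prior        # posterior covariance at convergence
    return x, P
```

In the actual system, the additive "+" would be replaced by a manifold "boxplus" operation on the error state, following the manifold treatments in [27], [28].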

References

[1] J. Levinson et al., "Towards fully autonomous driving: Systems and algorithms," Proc. IEEE Intell. Veh. Symp. (IV), pp. 163-168, 2011.
[2] A. Bry, A. Bachrach, and N. Roy, "State estimation for aggressive flight in GPS-denied environments using onboard sensing," Proc. IEEE Int. Conf. Robot. Automat., pp. 1-8, 2012.
[3] F. Gao, W. Wu, W. Gao, and S. Shen, "Flying on point clouds: Online trajectory generation and autonomous navigation for quadrotors in cluttered environments," J. Field Robot., vol. 36, no. 4, pp. 710-733, 2019.
[4] Z. Liu, F. Zhang, and X. Hong, "Low-cost retina-like robotic LiDARs based on incommensurable scanning," IEEE Trans. Mechatronics, 2021.
[5] W. Xu and F. Zhang, "FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter," IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 3317-3324, Apr. 2021.
[6] J. Lin, X. Liu, and F. Zhang, "A decentralized framework for simultaneous calibration, localization and mapping with multiple LiDARs," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 4870-4877, 2020.
[7] Z. Liu and F. Zhang, "BALM: Bundle adjustment for LiDAR mapping," IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 3184-3191, Apr. 2021.
[8] X. Liu and F. Zhang, "Extrinsic calibration of multiple LiDARs of small FoV in targetless environments," IEEE Robot. Automat. Lett., vol. 6, no. 2, pp. 2036-2043, Apr. 2021.
[9] C. Yuan, X. Liu, X. Hong, and F. Zhang, "Pixel-level extrinsic self calibration of high resolution LiDAR and camera in targetless environments," 2021.
[10] J. Lin and F. Zhang, "Loam Livox: A fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV," Proc. IEEE Int. Conf. Robot. Automat., pp. 3126-3131, 2020.
[11] J. Zhang and S. Singh, "LOAM: Lidar odometry and mapping in real-time," Proc. Robotics: Sci. Syst., vol. 2, no. 9, 2014.
[12] K.-L. Low, "Linear least-squares optimization for point-to-plane ICP surface registration," vol. 4, no. 10, pp. 1-3, 2004.
[13] T. Shan and B. Englot, "LeGO-LOAM: Lightweight and ground-optimized LiDAR odometry and mapping on variable terrain," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 4758-4765, 2018.
[14] J. Lin and F. Zhang, "A fast, complete, point cloud based loop closure for LiDAR odometry and mapping," 2019.
[15] H. Ye, Y. Chen, and M. Liu, "Tightly coupled 3D LiDAR inertial odometry and mapping," Proc. IEEE Int. Conf. Robot. Automat., pp. 3144-3150, 2019.
[16] T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, "LIO-SAM: Tightly-coupled LiDAR inertial odometry via smoothing and mapping."
[17] K. Li, M. Li, and U. D. Hanebeck, "Towards high-performance solid-state-LiDAR-inertial odometry and mapping."
[18] C. Qin, H. Ye, C. E. Pranata, J. Han, S. Zhang, and M. Liu, "LINS: A LiDAR-inertial state estimator for robust and efficient navigation," Proc. IEEE Int. Conf. Robot. Automat., pp. 8899-8906, 2020.
[19] J. Zhang and S. Singh, "Laser-visual-inertial odometry and mapping with high robustness and low drift," J. Field Robot., vol. 35, no. 8, pp. 1242-1264, 2018.
[20] W. Shao, S. Vijayarangan, C. Li, and G. Kantor, "Stereo visual inertial LiDAR simultaneous localization and mapping."
[21] T. Laidlow, M. Bloesch, W. Li, and S. Leutenegger, "Dense RGB-D-inertial SLAM with map deformations," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 6741-6748, 2017.
[22] Y. Zhu, C. Zheng, C. Yuan, X. Huang, and X. Hong, "CamVox: A low-cost and accurate LiDAR-assisted visual SLAM system."
[23] R. Mur-Artal and J. D. Tardós, "ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras," IEEE Trans. Robot., vol. 33, no. 5, pp. 1255-1262, Oct. 2017.
[24] X. Zuo, P. Geneva, W. Lee, Y. Liu, and G. Huang, "LIC-Fusion: LiDAR-inertial-camera odometry," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 5848-5854, 2019.
[25] X. Zuo, Y. Yang, J. Lv, Y. Liu, G. Huang, and M. Pollefeys, "LIC-Fusion 2.0: LiDAR-inertial-camera odometry with sliding-window plane-feature tracking," Proc. IROS, pp. 5112-5119, 2020.
[26] T. Qin, P. Li, and S. Shen, "VINS-Mono: A robust and versatile monocular visual-inertial state estimator," IEEE Trans. Robot., vol. 34, no. 4, pp. 1004-1020, Aug. 2018.
[27] C. Hertzberg, R. Wagner, U. Frese, and L. Schröder, "Integrating generic sensor fusion algorithms with sound state representations through encapsulation of manifolds," Inf. Fusion, vol. 14, no. 1, pp. 57-77, 2013.
[28] D. He, W. Xu, and F. Zhang, "Embedding manifold structures into Kalman filters."
[29] B. M. Bell and F. W. Cathey, "The iterated Kalman filter update as a Gauss-Newton method," IEEE Trans. Autom. Control, vol. 38, no. 2, pp. 294-297, Feb. 1993.
[30] Z. Zhang and D. Scaramuzza, "A tutorial on quantitative trajectory evaluation for visual(-inertial) odometry," Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 7244-7251, 2018.