
Evaluation of Monocular Visual-Inertial SLAM: Benchmark and Experiment


Abstract:

Simultaneous Localization and Mapping (SLAM) is an active research topic in computer vision and is the core of self-localization and autonomous navigation in robotics and unmanned vehicles. Visual-Inertial SLAM, in particular, is a popular strategy for attaining highly accurate 6-DOF state estimation. However, such systems are vulnerable to extreme motions and texture-less environments, and they sometimes fail under these circumstances. In this paper, a tightly-coupled, optimization-based monocular Visual-Inertial SLAM system is proposed that tackles scale ambiguity, a problem that arises from poor initialization. To this end, ORB-SLAM, a reliable feature-based monocular SLAM algorithm, is selected as the base of our study. To improve accuracy, a Visual-Inertial Odometry (VIO) front end fuses camera information with Inertial Measurement Unit (IMU) data. We evaluate the performance of our system on the European Robotics Challenge (EuRoC) dataset and compare it with state-of-the-art algorithms, achieving better accuracy on some sequences owing to the improved initialization. Furthermore, we conduct a real-world indoor experiment with a monocular-inertial camera to demonstrate the practical performance of our system.
Date of Conference: 20-21 November 2019
Date Added to IEEE Xplore: 20 April 2020
Conference Location: Tehran, Iran

I. Introduction

Visual SLAM (VSLAM) is a variant of SLAM that uses a camera to simultaneously estimate the camera's position and build a map of the environment. There is an extensive diversity of monocular Visual SLAM algorithms, including Parallel Tracking and Mapping (PTAM) [1], MonoSLAM [2], RatSLAM [3], LSD-SLAM [4], ORB-SLAM [5], DTAM [6], etc. PTAM is among the most well-known Visual SLAM algorithms: its tracking thread estimates the camera motion in real time, while its mapping thread estimates accurate 3D positions of feature points at a higher computational cost, as sketched below.
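The following is a minimal sketch of this two-thread split, not the actual PTAM or ORB-SLAM implementation: a real-time tracking thread hands selected keyframes to a slower mapping thread through a shared queue. The types Frame, Keyframe, and Map are hypothetical placeholders, and the pose estimation and bundle adjustment steps are stubbed out.

#include <atomic>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame    { /* image + timestamp */ };
struct Keyframe { /* selected frame with its estimated 6-DOF pose */ };
struct Map      { std::vector<Keyframe> keyframes; /* 3D points omitted */ };

std::queue<Keyframe> keyframe_queue;   // tracking -> mapping hand-off
std::mutex queue_mutex;
std::condition_variable queue_cv;
std::atomic<bool> running{true};

// Tracking thread: estimates the camera pose for every incoming frame in
// real time and occasionally promotes a frame to a keyframe.
void tracking_thread() {
    while (running) {
        Frame frame{};                    // grab the next camera frame (stubbed)
        (void)frame;
        // ... match features against the map and estimate the camera pose ...
        bool is_keyframe = true;          // keyframe decision (stubbed)
        if (is_keyframe) {
            std::lock_guard<std::mutex> lock(queue_mutex);
            keyframe_queue.push(Keyframe{});
            queue_cv.notify_one();
        }
    }
}

// Mapping thread: refines 3D point positions (e.g., by bundle adjustment)
// at a lower rate, independently of the per-frame tracking loop.
void mapping_thread(Map& map) {
    while (running) {
        std::unique_lock<std::mutex> lock(queue_mutex);
        queue_cv.wait(lock, [] { return !keyframe_queue.empty() || !running; });
        while (!keyframe_queue.empty()) {
            map.keyframes.push_back(keyframe_queue.front());
            keyframe_queue.pop();
        }
        lock.unlock();
        // ... run (local) bundle adjustment over map.keyframes ...
    }
}

int main() {
    Map map;
    std::thread tracker(tracking_thread);
    std::thread mapper(mapping_thread, std::ref(map));
    // ... run until shutdown is requested ...
    running = false;
    queue_cv.notify_all();
    tracker.join();
    mapper.join();
}

The key design choice this illustrates is decoupling: tracking must keep up with the camera frame rate, whereas map refinement can run asynchronously on a small set of keyframes, which is what allows the expensive optimization without stalling pose estimation.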
