
Topology-Guided Perception-Aware Receding Horizon Trajectory Generation for UAVs



Abstract:

Perception-aware motion planning methods based on localization uncertainty have the potential to improve the localization accuracy of robot navigation. However, most existing perception-aware methods pre-build a global feature map and cannot generate perception-aware trajectories in real time. This paper proposes a topology-guided perception-aware receding horizon trajectory generation method, which consists of topology-guided position trajectory generation and perception-aware yaw angle trajectory generation. Specifically, a memorable active map is built by selectively storing visual landmarks. A library of candidate topological trajectories is then generated and evaluated in terms of perception quality based on the active map, smoothness, collision possibility, and feasibility. In addition, the yaw angle trajectory is obtained through a front-end multiple refined path search and a back-end path-guided trajectory optimization. Comparative simulation and real-world experiments confirm that the proposed method keeps more visual features in view and reduces the localization error.
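To make the candidate-evaluation step described above concrete, the following sketch (not taken from the paper; all function names, weights, and signatures are illustrative assumptions) scores a library of candidate trajectories with a weighted sum of the four criteria named in the abstract: perception quality against an active landmark map, smoothness, collision possibility, and dynamic feasibility.

# Illustrative sketch only: scoring candidate topological trajectories.
# All names, weights, and helper signatures are hypothetical assumptions.
import numpy as np

def perception_cost(traj, active_map_landmarks, fov_deg=80.0):
    """Penalize poses that keep few active-map landmarks inside the camera FOV."""
    half_fov = np.deg2rad(fov_deg) / 2.0
    total_visible = 0
    for pos, yaw in traj:  # traj: list of (position (3,), yaw) samples
        fwd = np.array([np.cos(yaw), np.sin(yaw), 0.0])
        for lm in active_map_landmarks:  # landmark positions, shape (3,)
            d = lm - pos
            n = np.linalg.norm(d)
            if n > 1e-6 and np.arccos(np.clip(fwd @ d / n, -1.0, 1.0)) < half_fov:
                total_visible += 1
    return -total_visible / max(len(traj), 1)  # more visible landmarks -> lower cost

def smoothness_cost(traj):
    """Sum of squared second differences of the sampled positions."""
    p = np.array([pos for pos, _ in traj])
    return float(np.sum(np.diff(p, n=2, axis=0) ** 2)) if len(p) > 2 else 0.0

def evaluate_candidates(candidates, active_map_landmarks,
                        collision_cost, feasibility_ok,
                        w_perc=1.0, w_smooth=0.1, w_coll=10.0):
    """Return the lowest-cost candidate among those judged dynamically feasible."""
    best, best_cost = None, np.inf
    for traj in candidates:
        if not feasibility_ok(traj):          # e.g., velocity/acceleration limits
            continue
        cost = (w_perc * perception_cost(traj, active_map_landmarks)
                + w_smooth * smoothness_cost(traj)
                + w_coll * collision_cost(traj))
        if cost < best_cost:
            best, best_cost = traj, cost
    return best, best_cost

In the paper the collision and feasibility terms would come from the mapping and dynamics modules; here they are simply passed in as user-supplied callables.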
Date of Conference: 01-05 October 2023
Date Added to IEEE Xplore: 13 December 2023
Conference Location: Detroit, MI, USA



I. Introduction

Unmanned aerial vehicles (UAVs) are widely used for tasks such as autonomous exploration [1], mapping [2], photography [3], goods transportation [4] and rescue [5]. In these applications, the state estimation module is crucial for obtaining the position, velocity and rotation of the UAV. Cameras and inertial measurement units (IMUs) are appropriate sensors for UAV state estimation due to their low cost and weight. Typical visual-inertial navigation systems track multiple visual features to estimate the state of the robot. The motion planning module affects the observation of visual features; meanwhile, the quantity and quality of the visual features in view also affect the localization accuracy. However, most existing motion planning methods focus on the time-optimal performance and the safety of the position trajectory [6]–[11] and do not consider the perception constraint.
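As a concrete, purely illustrative example of how the heading choice changes which features are observed, the sketch below greedily picks, at each waypoint of a given position trajectory, the yaw from a coarse grid that keeps the most known map features inside an assumed horizontal camera field of view. It is a simplification under assumed names and parameters, not the multi-stage yaw planning proposed in this paper.

# Illustrative sketch only: a discretized yaw search over a fixed position path.
# Names and parameters are assumptions made for illustration.
import numpy as np

def visible_count(pos, yaw, features, half_fov):
    """Count map features inside the horizontal FOV for a given heading."""
    fwd = np.array([np.cos(yaw), np.sin(yaw)])
    count = 0
    for f in features:                      # features: iterable of 2-D map points
        d = f - pos[:2]
        n = np.linalg.norm(d)
        if n > 1e-6 and np.arccos(np.clip(fwd @ d / n, -1.0, 1.0)) < half_fov:
            count += 1
    return count

def greedy_yaw_schedule(positions, features, fov_deg=80.0, n_yaw=36):
    """For each waypoint, return the yaw (from a coarse grid) seeing the most features."""
    half_fov = np.deg2rad(fov_deg) / 2.0
    yaw_grid = np.linspace(-np.pi, np.pi, n_yaw, endpoint=False)
    return [max(yaw_grid, key=lambda y: visible_count(p, y, features, half_fov))
            for p in positions]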

Fig. 1. Illustration of different motion planning methods for a quadrotor flying from the red starting point to the green ending point. The blue trajectory, planned with perception awareness, enables the quadrotor to look at regions with richer textures. In contrast, the yellow trajectory, lacking perception awareness, is prone to crossing areas with fewer textures, resulting in inaccurate localization.

