Abstract:
Simultaneous localization and mapping (SLAM) and navigation systems are pivotal to realizing autonomous robotic capabilities. Research toward these systems will have a direct impact on the already large and rapidly growing autonomous robotics market. Typically, research efforts in SLAM and navigation isolate one of the two systems when benchmarking performance. As a result, potential confounding issues that arise when the full navigation stack is integrated with pose estimates from a SLAM algorithm, such as the compounded latency of the two systems, never surface. This paper addresses that shortcoming by providing a cost-effective closed-loop benchmarking framework for several navigation stacks integrated with several SLAM algorithms on a physical platform (a Turtlebot2 in an office environment). A frugal vision-based motion capture method is used to collect the localization data; its low cost removes the barrier to real-world testing often posed by expensive commercial motion capture systems. Within the benchmarking framework, conventional performance metrics from SLAM and navigation research, together with novel metrics, were collected to characterize closed-loop performance. Because the framework tests complete mobile autonomy stacks in a task-driven, closed-loop environment, analysis of these metrics supports the selection of algorithms for real-world applications.
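For context on the conventional SLAM metrics the abstract alludes to, the sketch below illustrates one such metric, absolute trajectory error (ATE), computed between a SLAM trajectory and motion-capture ground truth after rigid alignment. It is not drawn from the paper; all function names and data are illustrative.

```python
import numpy as np

def absolute_trajectory_error(gt_xy, est_xy):
    """RMSE of translational error between time-aligned ground-truth and
    estimated 2D trajectories (both N x 2), after rigid (Kabsch) alignment."""
    gt_mean, est_mean = gt_xy.mean(axis=0), est_xy.mean(axis=0)
    gt_c, est_c = gt_xy - gt_mean, est_xy - est_mean
    # Best-fit rotation mapping the centered estimate onto the centered ground truth.
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    aligned = est_c @ R.T + gt_mean
    return np.sqrt(np.mean(np.sum((gt_xy - aligned) ** 2, axis=1)))

# Toy example: ground truth from a motion capture system, estimate from a SLAM stack.
gt = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1], [3.0, 0.3]])
est = np.array([[0.05, 0.0], [1.02, -0.03], [2.10, 0.12], [3.05, 0.28]])
print(f"ATE (RMSE): {absolute_trajectory_error(gt, est):.3f} m")
```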
Published in: 2024 IEEE Opportunity Research Scholars Symposium (ORSS)
Date of Conference: 15 April 2024 - 15 July 2024
Date Added to IEEE Xplore: 02 October 2024