
Neural Network-Based Pose Estimation for Noncooperative Spacecraft Rendezvous


Abstract:
This article presents the Spacecraft Pose Network (SPN), the first neural network-based method for on-board estimation of the pose, i.e., the relative position and attitude, of a known noncooperative spacecraft using monocular vision. In contrast to other state-of-the-art pose estimation approaches for spaceborne applications, the SPN method does not require the formulation of hand-engineered features and only requires a single grayscale image to determine the pose of the spacecraft relative to the camera. The SPN method uses a convolutional neural network (CNN) with three branches to solve the problem of relative attitude estimation. The first branch of the CNN bootstraps a state-of-the-art object detection algorithm to detect a 2-D bounding box around the target spacecraft in the input image. The region inside the 2-D bounding box is then used by the other two branches of the CNN to determine the relative attitude by initially classifying the input region into discrete coarse attitude labels before regressing to a finer estimate. The SPN method then estimates the relative position by using the constraints imposed by the detected 2-D bounding box and the estimated relative attitude. Further, with the detection of 2-D bounding boxes of subcomponents of the target spacecraft, the SPN method is easily generalizable to estimate the pose of multiple target geometries. Finally, to facilitate integration with navigation filters and perform continuous pose tracking, the SPN method estimates the uncertainty associated with the estimated pose. The secondary contribution of this article is the generation of the Spacecraft PosE Estimation Dataset (SPEED), which is used to train and evaluate the performance of the SPN method. SPEED consists of synthetic as well as actual camera images of a mock-up of the Tango spacecraft from the PRISMA mission. 
The synthetic images are created by fusing OpenGL-based renderings of the spacecraft's 3-D model with actual images of the Earth captu...
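The abstract describes two key ideas: refining a discrete (coarse) attitude classification into a finer estimate, and recovering relative position from the constraints of the detected 2-D bounding box. The following is a minimal numpy sketch of those two ideas, not the paper's actual implementation: the candidate quaternions, classifier probabilities, and camera parameters below are illustrative stand-ins for the CNN branch outputs and the real calibration.

```python
import numpy as np

# Hypothetical coarse attitude labels: unit quaternions (w, x, y, z).
# SPN discretizes SO(3) into many such labels; four suffice to illustrate.
candidate_quats = np.array([
    [1.0000, 0.0000, 0.0000, 0.0000],  # identity
    [0.9239, 0.3827, 0.0000, 0.0000],  # 45 deg about x
    [0.9239, 0.0000, 0.3827, 0.0000],  # 45 deg about y
    [0.7071, 0.7071, 0.0000, 0.0000],  # 90 deg about x
])

def refine_attitude(probs, quats, k=2):
    """Coarse-to-fine step: blend the k most probable coarse attitude
    candidates, weighted by the classifier's confidence, then renormalize
    to a unit quaternion. This naive averaging is only meaningful when
    the top candidates are close together on SO(3)."""
    top = np.argsort(probs)[-k:]           # indices of the k largest probs
    w = probs[top] / probs[top].sum()      # renormalized weights
    q = (w[:, None] * quats[top]).sum(axis=0)
    return q / np.linalg.norm(q)

def range_from_bbox(focal_px, target_span_m, bbox_span_px):
    """Pinhole-camera range constraint: a target of known physical span
    subtending bbox_span_px pixels lies at roughly f * L / l."""
    return focal_px * target_span_m / bbox_span_px

# Stand-in for the softmax output of the classification branch.
probs = np.array([0.05, 0.60, 0.10, 0.25])
q_est = refine_attitude(probs, candidate_quats)
print(q_est)  # a unit quaternion between the 45 and 90 deg candidates

# Illustrative numbers: 2000 px focal length, 0.8 m target span, 200 px box.
print(range_from_bbox(2000.0, 0.8, 200.0))
```

The blended estimate lands between the two most probable coarse bins, which is the intuition behind classifying first and regressing second: the classifier narrows the search to a neighborhood on SO(3), and the refinement resolves the remainder.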
Published in: IEEE Transactions on Aerospace and Electronic Systems ( Volume: 56, Issue: 6, December 2020)
Page(s): 4638 - 4658
Date of Publication: 02 June 2020


I. Introduction

Close-range proximity operations between spacecraft have a rich history dating back to the Apollo program [1]. Since then, examples of close-range proximity operations have included the assembly and resupply of the International Space Station and five on-orbit repair missions of the Hubble Space Telescope [2]. The close-range navigation in these missions was made possible by the presence of onboard crew, cooperation, and active interspacecraft communication. Current and future generations of space robotics missions, such as the RemoveDEBRIS mission by Surrey Space Centre [3], the Phoenix program by DARPA [4], and the Restore-L mission by NASA [5], also require close-range navigation. These missions include technology demonstrators for applications such as repair, refueling, and deorbiting of end-of-life and nonfunctional satellites. The main challenge in performing close-range navigation in actual on-orbit servicing and debris removal missions is that the target spacecraft may be uncooperative [6]. Here, "uncooperative" implies that the target spacecraft may not be equipped with an active communication link or identifiable markers, such as light-emitting diodes or corner cube reflectors, for distance and attitude estimation. Further, ground-based estimates of the motion of the target spacecraft may be affected by significant uncertainties, and frequent inputs from the ground station may not be possible for these missions. Therefore, both the relative position and orientation of the target spacecraft must be estimated onboard the servicer spacecraft using the onboard sensor suite.

