Robust On-Manifold Optimization For Uncooperative Space Relative Navigation With A Single Camera

Journal of Guidance, Control, and Dynamics (2021)

Abstract
Optical cameras are gaining popularity as a suitable sensor for relative navigation in space due to their attractive size, power, and cost properties when compared with conventional flight hardware or costly laser-based systems. However, a camera cannot infer depth information on its own, a limitation often addressed by introducing complementary sensors or a second camera. In this paper, an innovative model-based approach is demonstrated to estimate the six-dimensional pose of a target relative to the chaser spacecraft using solely a monocular setup. Determining the observed facet of the target is tackled as a classification problem, where the three-dimensional shape is learned offline using Gaussian mixture modeling. The estimate is refined by minimizing two different robust loss functions based on local feature correspondences. The resulting pseudomeasurements are processed and fused with an extended Kalman filter. The entire optimization framework is designed to operate directly on the SE(3) manifold, uncoupling the process and measurement models from the global attitude state representation. It is validated on realistic synthetic and laboratory datasets of a rendezvous trajectory with the complex spacecraft Envisat, demonstrating estimation of the relative pose with high accuracy over full tumbling motion. Further evaluation is performed on the open-source SPEED dataset.
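To make the on-manifold refinement step concrete, the listing below is a minimal sketch (not the authors' code) of robust pose refinement on SE(3): given putative 2D-3D feature correspondences, a chaser-to-target pose T is refined by iteratively re-weighted Gauss-Newton on a Huber-robustified reprojection error, with each update applied through a local se(3) perturbation. It uses a single generic robust loss rather than the paper's two specific loss functions, and all names (K, huber_weights, refine_pose, etc.) are illustrative assumptions rather than the paper's notation.

import numpy as np

def hat(w):
    # so(3) hat operator: 3-vector -> skew-symmetric matrix
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_se3(xi):
    # Exponential map se(3) -> SE(3); xi = (rho, phi) as a 6-vector
    rho, phi = xi[:3], xi[3:]
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        R, V = np.eye(3), np.eye(3)
    else:
        a = phi / theta
        A = hat(a)
        R = np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * A @ A
        V = (np.eye(3) + (1.0 - np.cos(theta)) / theta * A
             + (theta - np.sin(theta)) / theta * A @ A)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, V @ rho
    return T

def project(T, X, K):
    # Pinhole projection of Nx3 model points X through pose T and intrinsics K
    Xc = (T[:3, :3] @ X.T).T + T[:3, 3]
    uv = (K @ Xc.T).T
    return uv[:, :2] / uv[:, 2:3]

def huber_weights(r, delta):
    # IRLS weights for the Huber loss applied to residual magnitudes r
    w = np.ones_like(r)
    big = r > delta
    w[big] = delta / r[big]
    return w

def refine_pose(T, X, uv_obs, K, iters=20, delta=2.0):
    # Robust Gauss-Newton on SE(3): solve for a local perturbation xi each
    # iteration and compose it back onto the pose, T <- exp(xi) * T
    for _ in range(iters):
        res = project(T, X, K) - uv_obs                      # N x 2 residuals
        w = np.repeat(huber_weights(np.linalg.norm(res, axis=1), delta), 2)
        res = res.reshape(-1)
        J = np.zeros((res.size, 6))                          # numerical Jacobian
        eps = 1e-6
        for k in range(6):
            dxi = np.zeros(6); dxi[k] = eps
            res_k = (project(exp_se3(dxi) @ T, X, K) - uv_obs).reshape(-1)
            J[:, k] = (res_k - res) / eps
        H = J.T @ (w[:, None] * J)
        g = J.T @ (w * res)
        xi = -np.linalg.solve(H + 1e-9 * np.eye(6), g)
        T = exp_se3(xi) @ T                                  # update on the manifold
        if np.linalg.norm(xi) < 1e-10:
            break
    return T

Because the update is parameterized by a local tangent-space perturbation and composed back onto SE(3), the refinement never touches a global attitude parameterization, which mirrors the decoupling described in the abstract; the refined pose would then serve as a pseudomeasurement for an extended Kalman filter.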
Key words
uncooperative space relative navigation, robust, on-manifold