Robust Feature Detection, Acquisition and Tracking for Relative Navigation in Space with a Known Target
Holzinger, Marcus J.
Recent advances in robotics and computer vision have enabled the implementation of sophisticated vision-based relative navigation algorithms for robotic spacecraft using a single calibrated monocular camera. These techniques, initially developed for ground robots, show great promise for robotic spacecraft applications. However, several challenges remain that hinder the direct use of these approaches in the space environment without further modification. For example, the use of a monocular camera for robotic spacecraft operations with respect to a known target configuration may be limited owing to abrupt illumination changes in low-Earth orbit, long-duration target tracking requirements across large changes in target image scale, background outliers, and the need to perform (semi-)autonomous relative navigation with limited resources (fuel, onboard computing hardware, etc.). This paper proposes a relative navigation scheme for space that combines three components. First, two different feature detectors are used to ensure reliable feature detection over diverse distances, and fast feature selection/filtering is then applied to detect the visual features of the fiducial marker. Next, a feature-pattern matching algorithm based on robust affine registration is used to achieve robust automated re-acquisition when the target is lost. Finally, probabilistic graphical-model-based fixed-lag smoothing over a factor graph is used to accurately propagate the relative 6-DOF translation and orientation state estimates and their velocities. The proposed approach is validated on a hardware-in-the-loop 5-DOF spacecraft simulation facility at Georgia Tech.
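The abstract does not give implementation details for the robust affine registration step. As a rough illustration only, the sketch below estimates a 2-D affine transform between a known fiducial point pattern and noisy, outlier-contaminated observations using a simple RANSAC loop over minimal 3-point samples; all function names, parameter values, and the synthetic data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine fit: dst ~ src @ A.T + t (6 parameters)."""
    n = src.shape[0]
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = src   # rows constraining the x-coordinates
    X[0::2, 4] = 1.0
    X[1::2, 2:4] = src   # rows constraining the y-coordinates
    X[1::2, 5] = 1.0
    p, *_ = np.linalg.lstsq(X, dst.reshape(-1), rcond=None)
    return p[:4].reshape(2, 2), p[4:]

def ransac_affine(src, dst, iters=200, tol=0.5, seed=0):
    """Robust affine registration via RANSAC over minimal 3-point samples."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), size=3, replace=False)
        A, t = fit_affine(src[idx], dst[idx])
        resid = np.linalg.norm(src @ A.T + t - dst, axis=1)
        inliers = resid < tol
        if inliers.sum() > best.sum():
            best = inliers
    A, t = fit_affine(src[best], dst[best])   # refit on the consensus set
    return A, t, best

# Synthetic check: known marker pattern, similarity transform, 25% outliers.
rng = np.random.default_rng(1)
marker = rng.uniform(0, 10, size=(20, 2))            # fiducial model points
th, s = 0.3, 1.2
A_true = s * np.array([[np.cos(th), -np.sin(th)],
                       [np.sin(th),  np.cos(th)]])
t_true = np.array([4.0, -2.0])
obs = marker @ A_true.T + t_true + rng.normal(0, 0.01, (20, 2))
obs[:5] += rng.uniform(20, 40, (5, 2))               # corrupt 5 matches
A_est, t_est, inliers = ransac_affine(marker, obs)
```

In this toy setup the RANSAC loop rejects the five corrupted matches and recovers the transform from the remaining consensus set, which is the behavior the paper relies on for automated re-acquisition after a lost target.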