This paper presents the development and implementation of vision-based state estimation algorithms that enable a free-flyer robotic spacecraft to navigate and explore an asteroid environment. A hybrid state estimation approach is developed, composed of two distinct extended Kalman filter implementations that fuse IMU data with vision-based measurements. The first identifies known landmark features in the image plane; the locations of these landmarks, which may correspond to a marked take-off and landing zone, are assumed known, providing GPS-like measurements for the navigation filter. The second addresses the scenario in which the vehicle flies away from known landmarks to explore an unknown environment. In this case, a homography-based filter is employed that uses tracked planar feature points to estimate the frame-to-frame translation and rotation of the vehicle. When the vehicle returns to an area with known landmarks, the landmark-based filter corrects the accumulated drift in the vehicle state estimates, improving accuracy. The performance of the hybrid state estimation algorithm is studied in a quadcopter simulation and then validated experimentally using monocular camera images and IMU data from a quadcopter UAV. The simulation and experimental results demonstrate that, for scenarios in which landmarks are not always in view of the camera, the hybrid filter yields more accurate state estimation than either the landmark-based or homography-based filter alone.
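The landmark-based filtering idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: it is a one-dimensional Kalman filter under assumed noise levels, where noisy IMU acceleration drives the prediction step and an intermittent landmark sighting supplies a GPS-like position measurement for the update step, mimicking the hybrid scenario in which landmarks are only sometimes in view.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's filter): 1D state
# [position, velocity], IMU acceleration as the control input, and a
# known-landmark sighting providing a direct position measurement.

rng = np.random.default_rng(0)
dt = 0.05

F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-acceleration kinematics
B = np.array([[0.5 * dt**2], [dt]])     # maps acceleration into the state
H = np.array([[1.0, 0.0]])              # landmark measures position only
Q = 1e-3 * np.eye(2)                    # process noise (assumed)
R = np.array([[0.05**2]])               # measurement noise (assumed)

x_true = np.array([0.0, 1.0])           # true initial state
x_est = np.array([0.0, 0.0])            # filter starts with wrong velocity
P = np.eye(2)

for k in range(200):
    a_true = 0.2 * np.sin(0.1 * k)                   # true acceleration
    a_meas = a_true + 0.05 * rng.standard_normal()   # noisy IMU reading

    # Propagate the truth and the filter prediction.
    x_true = F @ x_true + (B * a_true).ravel()
    x_est = F @ x_est + (B * a_meas).ravel()
    P = F @ P @ F.T + Q

    # Landmark in view only intermittently, as in the hybrid scenario.
    if k % 4 == 0:
        z = x_true[0] + 0.05 * rng.standard_normal()  # GPS-like measurement
        y = z - H @ x_est                             # innovation
        S = H @ P @ H.T + R                           # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
        x_est = x_est + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P

pos_err = abs(x_true[0] - x_est[0])
print(f"final position error: {pos_err:.3f} m")
```

Between landmark sightings the estimate drifts on IMU propagation alone, and each sighting pulls it back toward truth; the full system in the paper plays the same correction role for the homography-based filter, whose frame-to-frame motion estimates accumulate drift over time.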