|dc.description.abstract||The reliance of unmanned aerial vehicles (UAVs) on GPS and other external navigation aids has become a limiting factor for many missions. The shrinking size and weight of electronics and other systems now allow UAVs to fly in many enclosed or obstructed environments, such as urban canyons, yet these same environments often degrade or deny external signals. Furthermore, many of the most valuable potential missions for UAVs lie in hostile or disaster areas, where navigation infrastructure may be damaged, denied, or actively used against the vehicle. Developing alternative, independent navigation techniques will therefore expand the operating envelope of UAVs and make them more useful.
This thesis presents work on the development of reliable monocular vision-aided inertial navigation for UAVs, focusing on a stable and accurate navigation solution under a variety of realistic conditions. First, a vision-aided inertial navigation algorithm is developed which assumes uncorrelated feature and vehicle states. Flight test results on an 80 kg UAV demonstrate that vision aiding can bound the horizontal drift. Additionally, a novel implementation method is developed for integration with a variety of navigation systems. Finally, a vision-aided navigation algorithm is derived within a Bierman-Thornton factored extended Kalman filter (BTEKF) framework, using fully correlated vehicle and feature states. This algorithm improves consistency and accuracy by two to three orders of magnitude over the previous implementation, in both simulation and flight testing. Flight test results of the BTEKF on large (80 kg) and small (600 g) vehicles show accurate navigation over numerous tests.||
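For context on the BTEKF framework named above: the Bierman half of the factorization replaces the standard Kalman measurement update with a numerically stable update of the factors of P = U D Uᵀ (U unit upper triangular, D diagonal), which is what gives the filter its improved consistency. The sketch below shows the textbook Bierman scalar measurement update, not the thesis's actual implementation; the function name and interface are illustrative assumptions.

```python
import numpy as np

def bierman_update(U, d, x, h, z, R):
    """Textbook Bierman UD-factored scalar measurement update (illustrative sketch).

    Prior covariance is P = U @ diag(d) @ U.T with U unit upper triangular.
    Measurement model: z = h @ x + v, with scalar noise variance R.
    Returns the updated (U, d, x) without ever forming P explicitly.
    """
    n = len(x)
    U = U.copy()
    d = d.copy()
    f = U.T @ h          # f = U^T h
    v = d * f            # v_j = d_j * f_j (D is diagonal)
    alpha = R            # running innovation variance
    b = np.zeros(n)      # accumulates the unscaled Kalman gain
    for k in range(n):
        alpha_new = alpha + f[k] * v[k]
        d[k] *= alpha / alpha_new       # rescale diagonal factor
        p = -f[k] / alpha
        for j in range(k):
            Ujk = U[j, k]               # use the pre-update entry
            U[j, k] = Ujk + b[j] * p    # update U column by column
            b[j] += Ujk * v[k]
        b[k] = v[k]
        alpha = alpha_new
    K = b / alpha                        # Kalman gain
    x = x + K * (z - h @ x)              # state update
    return U, d, x
```

Because D is updated multiplicatively and U stays unit triangular, the factored covariance cannot lose positive definiteness to round-off, which is the usual failure mode of the conventional covariance update when vehicle and feature states are strongly correlated.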