A miniature vision-based localization system for indoor blimps
The blimp, a self-floating airship, has received increasing attention in the robotics community. In the past decade, most research has focused on blimp structure and control system design, while few researchers have addressed the blimp localization problem. Here I propose developing an incremental vision-based localization system that enables blimps to localize themselves autonomously in an indoor environment. The localization system estimates a camera trajectory from input video sequences and a prebuilt map. Before running the system, I first reconstruct the indoor environment by applying Structure from Motion with SuperPoint visual features. Then, using the previously built sparse point cloud map, the system generates camera poses by continuously performing pose estimation on visual features matched against the map. In this project, the blimp serves only as a reference mobile platform that constrains the weight of the perception system. The perception system consists of a monocular camera and a Wi-Fi adapter that captures and transmits visual data to a ground PC station, where the algorithms are executed. The success of this project will transform remote-controlled indoor blimps into autonomous indoor blimps, which can be used in applications such as entertainment, surveillance, and advertisement.
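The core estimation step described above, recovering a camera pose from 2D image features matched to 3D map points, can be illustrated with a minimal sketch. This is not the thesis implementation (which matches SuperPoint features against an SfM point cloud); it is a generic Direct Linear Transform (DLT) pose solver on synthetic, noise-free correspondences, with the camera intrinsics, pose, and point cloud all invented for the example.

```python
import numpy as np

def estimate_pose_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix from 2D-3D matches via DLT.

    Each match contributes two linear equations; the solution is the
    right singular vector of the stacked system (P is up to scale).
    """
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

def project(P, points_3d):
    """Project 3D points into the image with projection matrix P."""
    Xh = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]

# Synthetic "map": random 3D points in front of a camera with a known pose
# (all values below are assumptions made up for this sketch).
rng = np.random.default_rng(0)
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # intrinsics
R, t = np.eye(3), np.array([[0.2], [-0.1], [0.5]])           # camera pose
P_true = K @ np.hstack([R, t])
pts2d = project(P_true, pts3d)  # "observed" features

P_est = estimate_pose_dlt(pts3d, pts2d)
err = np.abs(project(P_est, pts3d) - pts2d).max()
print(f"max reprojection error: {err:.2e} px")  # near machine precision
```

In a real pipeline such as the one proposed, the correspondences come from feature matching against the map, the matches contain outliers, and a robust solver (e.g. PnP inside a RANSAC loop) would replace this plain least-squares DLT.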