dc.contributor.author | Ta, Duy-Nguyen | |
dc.contributor.author | Ok, Kyel | |
dc.contributor.author | Dellaert, Frank | |
dc.date.accessioned | 2014-04-10T20:02:25Z | |
dc.date.available | 2014-04-10T20:02:25Z | |
dc.date.issued | 2013-11 | |
dc.identifier.citation | Ta, D.-N., Ok, K., & Dellaert, F. (2013). “Monocular Parallel Tracking and Mapping with Odometry Fusion for MAV Navigation in Feature-Lacking Environments.” IEEE/RSJ International Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments (IROS 2013), November 7, 2013. | en_US |
dc.identifier.uri | http://hdl.handle.net/1853/51585 | |
dc.description | ©2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US |
dc.description | Presented at the IEEE/RSJ International Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments (IROS 2013), November 7, 2013, Tokyo, Japan. | |
dc.description.abstract | Despite recent progress, autonomous navigation on Micro Aerial Vehicles with a single frontal camera is still a challenging problem, especially in feature-lacking environments. On a mobile robot with a frontal camera, monoSLAM can fail when there are not enough visual features in the scene, or when the robot, with rotationally dominant motions, yaws away from a known map toward unknown regions. To overcome such limitations and increase responsiveness, we present a novel parallel tracking and mapping framework that is suitable for robot navigation by fusing visual data with odometry measurements in a principled manner. Our framework can cope with a lack of visual features in the scene, and maintain robustness during pure camera rotations. We demonstrate our results on a dataset captured from the frontal camera of a quadrotor flying in a typical feature-lacking indoor environment. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | Georgia Institute of Technology | en_US |
dc.subject | Autonomous navigation | en_US |
dc.subject | Indoor environment | en_US |
dc.subject | MAV | en_US |
dc.subject | Monocular | en_US |
dc.subject | MonoSLAM | en_US |
dc.subject | Parallel tracking and mapping | en_US |
dc.subject | Quadrotor | en_US |
dc.subject | SLAM | en_US |
dc.subject | Wall-floor intersection features | en_US |
dc.title | Monocular Parallel Tracking and Mapping with Odometry Fusion for MAV Navigation in Feature-Lacking Environments | en_US |
dc.type | Poster | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. College of Computing | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. Institute for Robotics and Intelligent Machines | en_US |
dc.publisher.original | Institute of Electrical and Electronics Engineers | |