
dc.contributor.author: Ta, Duy-Nguyen
dc.contributor.author: Ok, Kyel
dc.contributor.author: Dellaert, Frank
dc.date.accessioned: 2014-04-10T20:02:25Z
dc.date.available: 2014-04-10T20:02:25Z
dc.date.issued: 2013-11
dc.identifier.citation: Ta, D.-N., Ok, K., & Dellaert, F. (2013). “Monocular Parallel Tracking and Mapping with Odometry Fusion for MAV Navigation in Feature-Lacking Environments.” IEEE/RSJ International Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments (IROS 2013), November 7, 2013.
dc.identifier.uri: http://hdl.handle.net/1853/51585
dc.description: ©2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.description: Presented at the IEEE/RSJ International Workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments (IROS 2013), November 7, 2013, Tokyo, Japan.
dc.description.abstract: Despite recent progress, autonomous navigation on Micro Aerial Vehicles with a single frontal camera is still a challenging problem, especially in feature-lacking environments. On a mobile robot with a frontal camera, monoSLAM can fail when there are not enough visual features in the scene, or when the robot, with rotationally dominant motions, yaws away from a known map toward unknown regions. To overcome such limitations and increase responsiveness, we present a novel parallel tracking and mapping framework that is suitable for robot navigation by fusing visual data with odometry measurements in a principled manner. Our framework can cope with a lack of visual features in the scene and maintain robustness during pure camera rotations. We demonstrate our results on a dataset captured from the frontal camera of a quadrotor flying in a typical feature-lacking indoor environment.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Autonomous navigation
dc.subject: Indoor environment
dc.subject: MAV
dc.subject: Monocular
dc.subject: MonoSLAM
dc.subject: Parallel tracking and mapping
dc.subject: Quadrotor
dc.subject: SLAM
dc.subject: Wall-floor intersection features
dc.title: Monocular Parallel Tracking and Mapping with Odometry Fusion for MAV Navigation in Feature-Lacking Environments
dc.type: Poster
dc.contributor.corporatename: Georgia Institute of Technology. College of Computing
dc.contributor.corporatename: Georgia Institute of Technology. Institute for Robotics and Intelligent Machines
dc.publisher.original: Institute of Electrical and Electronics Engineers
dc.embargo.terms: null
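The abstract above describes fusing visual data with odometry measurements "in a principled manner." The poster itself does not publish an implementation, so the following is only a minimal illustrative sketch of that general idea as a factor-graph optimization, written against GTSAM's Python bindings (an assumption; GTSAM comes from Dellaert's group, but its use here is not stated in the record). All poses, noise values, and measurements below are invented for illustration.

```python
# Minimal sketch (not the authors' code): fuse odometry and sparse
# visual pose measurements as factors over a 2D pose chain.
# Assumes recent GTSAM Python bindings (`pip install gtsam`).
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X

graph = gtsam.NonlinearFactorGraph()
initial = gtsam.Values()

# Noise models (invented values): odometry is smooth but drifts;
# visual fixes are sparse and may vanish in feature-lacking stretches.
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.02]))
vis_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.20, 0.20, 0.10]))

# Anchor the first pose at the origin.
graph.add(gtsam.PriorFactorPose2(
    X(0), gtsam.Pose2(0, 0, 0),
    gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-3, 1e-3, 1e-3]))))
initial.insert(X(0), gtsam.Pose2(0, 0, 0))

# Odometry between-factors constrain every consecutive pose pair,
# so tracking keeps a solution even when the camera sees no features.
odometry = [gtsam.Pose2(1.0, 0.0, 0.0)] * 3
for i, delta in enumerate(odometry):
    graph.add(gtsam.BetweenFactorPose2(X(i), X(i + 1), delta, odom_noise))
    initial.insert(X(i + 1), gtsam.Pose2(i + 1.0, 0.0, 0.0))

# A visual pose measurement (e.g., from tracking against the map)
# arrives only at pose 3; it is fused as one more factor rather than
# overriding the odometry estimate.
graph.add(gtsam.PriorFactorPose2(X(3), gtsam.Pose2(2.9, 0.1, 0.05), vis_noise))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
for i in range(4):
    print(result.atPose2(X(i)))
```

The design point this toy captures is the one the abstract emphasizes: because both sensors enter as factors with their own noise models, the estimate degrades gracefully (falling back on odometry) rather than failing outright when visual features are absent or the camera rotates purely.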

