
dc.contributor.author	Scaramuzza, Davide
dc.date.accessioned	2017-10-03T19:17:01Z
dc.date.available	2017-10-03T19:17:01Z
dc.date.issued	2017-09-22
dc.identifier.uri	http://hdl.handle.net/1853/58819
dc.description	Presented on September 22, 2017, from 12:15 p.m. to 1:15 p.m. in the Technology Square Research Building (TSRB) Banquet Hall, Georgia Tech.	en_US
dc.description	Davide Scaramuzza is an associate professor of robotics and perception in the Departments of Informatics (University of Zurich) and Neuroinformatics (University of Zurich and ETH Zurich), where he conducts research at the intersection of robotics, computer vision, and neuroscience. He completed his Ph.D. in robotics and computer vision at ETH Zurich under the direction of Roland Siegwart and held a postdoctoral position at the University of Pennsylvania, working with Vijay Kumar and Kostas Daniilidis. From 2009 to 2012, he led the European project sFly, which introduced the PX4 autopilot and pioneered visual-SLAM-based autonomous navigation of micro drones. For his research contributions, he was awarded the IEEE Robotics and Automation Society Early Career Award, the Misha Mahowald Neuromorphic Engineering Award, an SNSF-ERC Starting Grant (equivalent to an NSF CAREER Award), a Google Research Award, the European Young Researcher Award, and several conference paper awards. He coauthored the book Introduction to Autonomous Mobile Robots (MIT Press) and more than 100 papers on robotics and perception. In 2015, he co-founded Zurich-Eye, a venture dedicated to the commercialization of visual-inertial navigation solutions for mobile robots, which later became Facebook-Oculus VR.	en_US
dc.description	Runtime: 63:09 minutes	en_US
dc.description.abstract	Autonomous quadrotors will soon play a major role in search-and-rescue and remote-inspection missions, where a fast response is crucial. Quadrotors have the potential to navigate quickly through unstructured environments, enter and exit buildings through narrow gaps, and fly through collapsed buildings. However, their speed and maneuverability are still far from those of birds. Indeed, agile navigation through unknown indoor environments poses a number of challenges for robotics research in terms of perception, state estimation, planning, and control. In this talk, I will show that active vision is crucial for planning trajectories that improve the quality of perception. I will also present our recent results on event-based vision, which enables low-latency sensorimotor control and navigation in low-light and high-dynamic-range environments where traditional vision sensors fail.	en_US
dc.format.extent	63:09 minutes
dc.language.iso	en_US	en_US
dc.publisher	Georgia Institute of Technology	en_US
dc.relation.ispartofseries	IRIM Seminar Series	en_US
dc.subject	Drone	en_US
dc.subject	Robotics	en_US
dc.subject	Vision	en_US
dc.title	Autonomous, Agile, Vision-Controlled Drones: From Frame to Event Vision	en_US
dc.type	Lecture	en_US
dc.type	Video	en_US
dc.contributor.corporatename	Georgia Institute of Technology. Institute for Robotics and Intelligent Machines	en_US
dc.contributor.corporatename	Eidgenössische Technische Hochschule Zürich (ETH Zurich)	en_US
dc.contributor.corporatename	University of Zurich	en_US


This item appears in the following Collection(s)

  • IRIM Seminar Series [110]
    Each semester a core seminar series is announced featuring guest speakers from around the world and from varying backgrounds in robotics.
