Show simple item record

dc.contributor.author: Schuster, Martin J.
dc.contributor.author: Okerman, Jason
dc.contributor.author: Nguyen, Hai
dc.contributor.author: Rehg, James M.
dc.contributor.author: Kemp, Charles C.
dc.date.accessioned: 2011-03-04T16:05:56Z
dc.date.available: 2011-03-04T16:05:56Z
dc.date.issued: 2010-12
dc.identifier.citation: Martin J. Schuster, Jason Okerman, Hai Nguyen, James M. Rehg, and Charles C. Kemp, "Perceiving Clutter and Surfaces for Object Placement in Indoor Environments," Proceedings of the 10th IEEE-RAS International Conference on Humanoid Robots, 2010.
dc.identifier.isbn: 978-1-4244-8689-2
dc.identifier.isbn: 978-1-4244-8688-5
dc.identifier.uri: http://hdl.handle.net/1853/37072
dc.description: ©2010 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
dc.description: Presented at the 2010 IEEE-RAS International Conference on Humanoid Robots, Nashville, TN, USA, December 6-8, 2010.
dc.description.abstract: Handheld manipulable objects can often be found on flat surfaces within human environments. Researchers have previously demonstrated that perceptually segmenting a flat surface from the objects resting on it can enable robots to pick and place objects. However, methods for performing this segmentation can fail when applied to scenes with natural clutter. For example, low-profile objects and dense clutter that obscures the underlying surface can complicate the interpretation of the scene. As a first step towards characterizing the statistics of real-world clutter in human environments, we have collected and hand labeled 104 scans of cluttered tables using a tilting laser range finder (LIDAR) and a camera. Within this paper, we describe our method of data collection, present notable statistics from the dataset, and introduce a perceptual algorithm that uses machine learning to discriminate surface from clutter. We also present a method that enables a humanoid robot to place objects on uncluttered parts of flat surfaces using this perceptual algorithm. In cross-validation tests, the perceptual algorithm achieved a correct classification rate of 78.70% for surface and 90.66% for clutter, and outperformed our previously published algorithm. Our humanoid robot succeeded in 16 out of 20 object placing trials on 9 different unaltered tables, and performed successfully in several high-clutter situations. 3 out of 4 failures resulted from placing objects too close to the edge of the table.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Cameras
dc.subject: Humanoid robots
dc.subject: Image segmentation
dc.subject: Laser ranging
dc.subject: Learning (artificial intelligence)
dc.subject: Manipulators
dc.subject: Optical radar
dc.subject: Radar clutter
dc.subject: Robot vision
dc.title: Perceiving Clutter and Surfaces for Object Placement in Indoor Environments
dc.type: Proceedings
dc.type: Post-print
dc.contributor.corporatename: Georgia Institute of Technology. Healthcare Robotics Lab
dc.contributor.corporatename: Technische Universität München
dc.contributor.corporatename: Georgia Institute of Technology. Center for Robotics and Intelligent Machines
dc.publisher.original: Institute of Electrical and Electronics Engineers

