
dc.contributor.author: Dantam, Neil
dc.contributor.author: Stilman, Mike
dc.date.accessioned: 2012-07-19T20:04:56Z
dc.date.available: 2012-07-19T20:04:56Z
dc.date.issued: 2011-06
dc.identifier.citation: Dantam, N. & Stilman, M. (2011). "The Motion Grammar: Linguistic Perception, Planning, and Control". Proceedings of the 2011 Robotics: Science and Systems Conference VII (RSS), 27-30 June 2011. Online.
dc.identifier.uri: http://hdl.handle.net/1853/44342
dc.description: Presented at the 2011 Robotics: Science and Systems Conference VII (RSS), 27-30 June 2011, Los Angeles, CA.
dc.description.abstract: We present and analyze the Motion Grammar: a novel unified representation for task decomposition, perception, planning, and control that provides both fast online control of robots in uncertain environments and the ability to guarantee completeness and correctness. The grammar represents a policy for the task, which is parsed in real time based on perceptual input. Branches of the syntax tree form the levels of a hierarchical decomposition, and individual robot sensor readings are given by tokens. We implement this approach in the interactive game of Yamakuzushi on a physical robot, resulting in a system that repeatably competes with a human opponent in sustained gameplay for the roughly six-minute duration of each match.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Control
dc.subject: Formal methods
dc.subject: Grammars
dc.subject: Manipulation
dc.subject: Robotics
dc.title: The Motion Grammar: Linguistic Perception, Planning, and Control
dc.type: Proceedings
dc.contributor.corporatename: Georgia Institute of Technology. Center for Robotics and Intelligent Machines
dc.contributor.corporatename: Georgia Institute of Technology. School of Interactive Computing
dc.publisher.original: MIT
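
As a rough illustration of the parsing idea summarized in the abstract above: sensor readings are mapped to tokens, and a context-free grammar is parsed online, with control actions attached to its productions. The sketch below is a minimal, hypothetical Python rendering of that idea under assumed names (Token, sense, act, and the toy two-rule grammar itself); it is not the authors' implementation, which controls a physical robot playing Yamakuzushi.

    from enum import Enum, auto

    class Token(Enum):
        CLEAR = auto()      # no contact sensed
        CONTACT = auto()    # gripper touching a piece
        DONE = auto()       # task termination condition

    # Simulated stream of tokenized sensor readings (illustrative only).
    _readings = iter([Token.CLEAR, Token.CONTACT, Token.CLEAR, Token.DONE])

    def sense() -> Token:
        """Tokenize the next sensor reading; here, drawn from a canned sequence."""
        return next(_readings)

    def act(command: str) -> None:
        """Stand-in for issuing a control command to the robot."""
        print(f"act: {command}")

    def parse_task() -> None:
        """<task> -> <move>* DONE : parse the token stream online, acting as it is consumed."""
        tok = sense()
        while tok is not Token.DONE:
            parse_move(tok)
            tok = sense()

    def parse_move(tok: Token) -> None:
        """<move> -> CLEAR 'approach' | CONTACT 'retract'"""
        if tok is Token.CLEAR:
            act("approach")   # semantic action attached to this production
        elif tok is Token.CONTACT:
            act("retract")
        else:
            # A reading outside the language is a syntax error, i.e. a policy violation.
            raise SyntaxError(f"unexpected token {tok}")

    if __name__ == "__main__":
        parse_task()

Running parse_task() on the canned token sequence prints the commands the parser issues as it consumes each reading, mirroring how branches of the syntax tree correspond to levels of the task hierarchy while tokens correspond to individual sensor readings.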

