Decoupling Behavior, Perception, and Control for Autonomous Learning of Affordances

Date
2013-05
Author
Hermans, Tucker
Rehg, James M.
Bobick, Aaron F.
Abstract
A novel behavior representation is introduced that permits a robot to systematically explore the best methods for successfully executing an affordance-based behavior on a particular object. The approach decomposes affordance-based behaviors into three components. We first define controllers that specify how to achieve a desired change in object state through changes in the agent's state. For each controller we develop at least one behavior primitive that determines how the controller outputs translate to specific movements of the agent. Additionally, we provide multiple perceptual proxies that define the object representation computed as input to the controller during execution. A variety of proxies may be selected for a given controller, and a given proxy may provide input to more than one controller. When developing an appropriate affordance-based behavior strategy for a given object, the robot can systematically vary these elements as well as note the impact of additional task variables such as location in the workspace. We demonstrate the approach using a PR2 robot that explores different combinations of controller, behavior primitive, and proxy to perform a push or pull positioning behavior on a selection of household objects, learning which methods work best for each object.
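
The sketch below illustrates one way the decomposition described above could be organized in code: a perceptual proxy computes the object representation, a controller maps that representation and a goal to a desired change in agent state, and a behavior primitive turns the controller output into robot motion, with an outer loop that systematically tries combinations for one object. All class, method, and parameter names here (PerceptualProxy, Controller, BehaviorPrimitive, robot.sense, score_outcome) are illustrative assumptions, not the authors' implementation or the PR2 API.

# Hypothetical sketch of the controller / behavior primitive / perceptual proxy
# decomposition; names are assumptions made for illustration only.
from abc import ABC, abstractmethod
from itertools import product


class PerceptualProxy(ABC):
    """Computes the object representation fed to a controller."""
    @abstractmethod
    def compute_state(self, sensor_data):
        ...


class Controller(ABC):
    """Maps current and goal object states to a desired change in agent state."""
    @abstractmethod
    def control(self, object_state, goal_state):
        ...


class BehaviorPrimitive(ABC):
    """Translates controller output into concrete agent movements."""
    @abstractmethod
    def execute(self, control_output, robot):
        ...


def explore_behaviors(robot, obj, goal, controllers, primitives, proxies, score_outcome):
    """Systematically try (controller, primitive, proxy) combinations on one
    object and record how well each combination moves it toward the goal."""
    results = {}
    for ctrl, prim, proxy in product(controllers, primitives, proxies):
        sensor_data = robot.sense(obj)                  # assumed robot interface
        object_state = proxy.compute_state(sensor_data)
        control_output = ctrl.control(object_state, goal)
        prim.execute(control_output, robot)
        key = (type(ctrl).__name__, type(prim).__name__, type(proxy).__name__)
        results[key] = score_outcome(robot.sense(obj), goal)
    return results

Keeping the three components behind separate interfaces is what allows the exploration loop to swap any one of them independently, which mirrors the paper's point that a proxy may serve several controllers and a controller may admit several primitives.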