Bio-inspired Assistive Robotics: Service Dogs as a Model for Human-Robot Interaction and Mobile Manipulation
(Georgia Institute of Technology, 2008-10)
Service dogs have successfully provided assistance to thousands of motor-impaired people worldwide. As a step towards the creation of robots that provide comparable assistance, we present a biologically inspired robot ...
EL-E: An Assistive Mobile Manipulator that Autonomously Fetches Objects from Flat Surfaces
(Georgia Institute of Technology, 2008-03-12)
Objects within human environments are usually found on flat surfaces that are orthogonal to gravity, such as floors, tables, and shelves. We first present a new assistive robot that is explicitly designed to take advantage ...
PPS-Tags: Physical, Perceptual and Semantic Tags for Autonomous Mobile Manipulation
(Georgia Institute of Technology, 2009-10)
For many promising application areas, autonomous mobile manipulators do not yet exhibit sufficiently robust performance. We propose the use of tags applied to task-relevant locations in human environments in order to help ...
Autonomous Active Learning of Task-Relevant Features for Mobile Manipulation
(Georgia Institute of Technology, 2011)
We present an active learning approach that enables a mobile manipulator to autonomously learn task-relevant features. For a given behavior, our system trains a Support Vector Machine (SVM) that predicts the 3D locations ...
RF vision: RFID receive signal strength indicator (RSSI) images for sensor fusion and mobile manipulation
(Georgia Institute of Technology, 2009-10)
In this work we present a set of integrated methods that enable an RFID-enabled mobile manipulator to approach and grasp an object to which a self-adhesive passive (battery-free) UHF RFID tag has been affixed. Our primary ...
A Point-and-Click Interface for the Real World: Laser Designation of Objects for Mobile Manipulation
(Georgia Institute of Technology, 2008-03)
We present a novel interface for human-robot interaction that enables a human to intuitively and unambiguously select a 3D location in the world and communicate it to a mobile robot. The human points at a location of ...
A Clickable World: Behavior Selection Through Pointing and Context for Mobile Manipulation
(Georgia Institute of Technology, 2008-09)
We present a new behavior selection system for human-robot interaction that maps virtual buttons overlaid on the physical environment to the robot's behaviors, thereby creating a clickable world. The user clicks on a ...
The complex structure of simple devices: A survey of trajectories and forces that open doors and drawers
(Georgia Institute of Technology, 2010-09)
Instrumental activities of daily living (IADLs) involve physical interactions with diverse mechanical systems found within human environments. In this paper, we describe our efforts to capture the everyday mechanics of ...
Robots for Humanity: A Case Study in Assistive Mobile Manipulation
(Georgia Institute of Technology, 2013-03)
Assistive mobile manipulators have the potential to one day serve as surrogates and helpers for people with disabilities, giving them the freedom to perform tasks such as scratching an itch, picking up a cup, or socializing ...
Autonomously learning to visually detect where manipulation will succeed
(Georgia Institute of Technology, 2013-09)
Visual features can help predict if a manipulation behavior will succeed at a given location. For example, the success of a behavior that flips light switches depends on the location of the switch. We present methods that ...