An Architecture for Gesture-Based Control of Mobile Robots

Date: 1999-10
Authors:
Iba, Soshi
Vande Weghe, J. Michael
Paredis, Chris
Khosla, Pradeep K.
Abstract
Gestures provide a rich and intuitive form of interaction for controlling robots. This paper presents an approach for controlling a mobile robot with hand gestures. The system uses Hidden Markov Models (HMMs) to spot and recognize gestures captured with a data glove. To spot gestures from a sequence of hand positions that may include non-gestures, we have introduced a "wait state" in the HMM. The system is currently capable of spotting six gestures reliably. These gestures are mapped to robot commands under two different modes of operation: local and global control. In the local control module, the gestures are interpreted in the robot's local frame of reference, allowing the user to accelerate, decelerate, and turn. In the global control module, the gestures are interpreted in the world frame, allowing the robot to move to the location at which the user is pointing.
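The two modes of operation described above could be sketched as a simple gesture-to-command dispatch. This is a minimal illustration, not the authors' implementation: the gesture names, command vocabulary, and `interpret` function are hypothetical, and the HMM spotting stage is assumed to have already produced a recognized gesture label.

```python
from dataclasses import dataclass

@dataclass
class RobotCommand:
    kind: str          # e.g. "accelerate", "turn", "goto"
    value: tuple = ()  # optional parameters (direction, world-frame goal, ...)

# Local-control mode: gestures are interpreted in the robot's own frame.
# These gesture names are illustrative placeholders.
LOCAL_MAP = {
    "open_hand_forward": RobotCommand("accelerate"),
    "closed_fist": RobotCommand("decelerate"),
    "tilt_left": RobotCommand("turn", ("left",)),
    "tilt_right": RobotCommand("turn", ("right",)),
}

def interpret(gesture, mode, pointing_target=None):
    """Map a spotted gesture to a robot command.

    In 'local' mode the gesture selects a motion in the robot's frame;
    in 'global' mode a pointing gesture is resolved to a world-frame
    goal position that the robot should drive to.
    Returns None for gestures with no mapping (e.g. spotted non-gestures).
    """
    if mode == "local":
        return LOCAL_MAP.get(gesture)
    if mode == "global" and gesture == "point":
        return RobotCommand("goto", pointing_target)
    return None
```

A rejected or unmapped gesture yields `None`, mirroring the role of the "wait state": input that is not one of the six recognized gestures produces no robot motion.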