The Contextual Computing Group's research creates computational interfaces and agents for use in everyday mobile environments, combining wearable and ubiquitous computing technologies with techniques from artificial intelligence (AI), pattern recognition, and human-computer interaction (HCI).

Recent Submissions

  • The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring 

    Starner, Thad; Auxier, Jake; Ashbrook, Daniel; Gandy, Maribeth (Georgia Institute of Technology, 2000-10)
    In this paper we present a wearable device for control of home automation systems via hand gestures. This solution has many advantages over traditional home automation interfaces in that it can be used by those with ...
  • Mobile Capture for Wearable Computer Usability Testing 

    Lyons, Kent; Starner, Thad (Georgia Institute of Technology, 2001-10)
    The mobility of wearable computers makes usability testing difficult. In order to fully understand how a user interacts with the wearable, the researcher must examine both the user’s direct interactions with the computer, ...
  • Using Multiple Sensors for Mobile Sign Language Recognition 

    Brashear, Helene; Starner, Thad; Lukowicz, Paul; Junker, Holger (Georgia Institute of Technology, 2003-10)
    We build upon a constrained, lab-based Sign Language recognition system with the goal of making it a mobile assistive technology. We examine using multiple sensors for disambiguation of noisy data to improve recognition ...
  • Towards a One-Way American Sign Language Translator 

    Brashear, Helene; Henderson, Valerie; Hernandez-Rebollar, Jose; McGuire, R. Martin; Ross, Danielle S.; Starner, Thad (Georgia Institute of Technology, 2004-05)
    Inspired by the Defense Advanced Research Projects Agency's (DARPA) recent successes in speech recognition, we introduce a new task for sign language recognition research: a mobile one-way American Sign Language ...
  • GART: The Gesture and Activity Recognition Toolkit 

    Brashear, Helene; Kim, Jung Soo; Lyons, Kent; Starner, Thad; Westeyn, Tracy (Georgia Institute of Technology, 2007-07)
    The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction to machine learning algorithms suitable for ... (see the sketch after this list)
  • The Gesture Watch: A Wireless Contact-Free Gesture Based Wrist Interface 

    Kim, Jungsoo; He, Jiasheng; Lyons, Kent; Starner, Thad (Georgia Institute of Technology, 2007-10)
    We introduce the Gesture Watch, a mobile wireless device worn on a user’s wrist that allows hand gesture control of other devices. The Gesture Watch utilizes an array of infrared proximity sensors to sense hand gestures ...
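The GART entry above describes a toolkit that hides machine-learning details behind a gesture-application interface. Below is a minimal, hypothetical Python sketch of that kind of abstraction; the class and method names, and the simple nearest-neighbor matcher, are illustrative assumptions only and do not reflect GART's actual API.

```python
# Hypothetical sketch of a toolkit-style abstraction for gesture recognition.
# Names and structure are illustrative assumptions, not GART's actual API.
from dataclasses import dataclass, field
from math import dist
from typing import List, Sequence, Tuple


@dataclass
class GestureLibrary:
    """Stores labeled example gestures as fixed-length feature vectors."""
    examples: List[Tuple[str, tuple]] = field(default_factory=list)

    def add_example(self, label: str, features: Sequence[float]) -> None:
        # Applications supply labeled training examples...
        self.examples.append((label, tuple(features)))

    def recognize(self, features: Sequence[float]) -> str:
        # ...and get back a label, without touching the matcher directly.
        if not self.examples:
            raise ValueError("no training examples")
        label, _ = min(self.examples, key=lambda ex: dist(ex[1], features))
        return label


if __name__ == "__main__":
    lib = GestureLibrary()
    lib.add_example("swipe_left", [-1.0, 0.0])
    lib.add_example("swipe_right", [1.0, 0.0])
    print(lib.recognize([0.8, 0.1]))  # -> "swipe_right"
```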
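The Gesture Watch entry above describes sensing hand gestures with an array of infrared proximity sensors worn on the wrist. The sketch below shows one way such readings could be turned into a directional gesture label; the sensor layout, threshold value, and the first-to-last "swipe" rule are assumptions for illustration, not the method from the paper.

```python
# Minimal sketch (assumed details): classify a swipe from the order in which
# sensors in a small infrared proximity array are crossed by the hand.
from typing import List, Optional

THRESHOLD = 0.5  # assumed normalized reading that counts as "hand present"


def active_sensor(frame: List[float]) -> Optional[int]:
    """Index of the most strongly triggered sensor in one frame, if any."""
    idx = max(range(len(frame)), key=lambda i: frame[i])
    return idx if frame[idx] >= THRESHOLD else None


def classify_swipe(frames: List[List[float]]) -> str:
    """Label a gesture by the order in which sensors are crossed."""
    sequence: List[int] = []
    for frame in frames:
        idx = active_sensor(frame)
        if idx is not None and (not sequence or sequence[-1] != idx):
            sequence.append(idx)
    if len(sequence) >= 2 and sequence[0] < sequence[-1]:
        return "swipe_forward"
    if len(sequence) >= 2 and sequence[0] > sequence[-1]:
        return "swipe_backward"
    return "no_gesture"


if __name__ == "__main__":
    # A hand passing over sensor 0, then 1, then 2 reads as a forward swipe.
    frames = [[0.9, 0.1, 0.0], [0.2, 0.8, 0.1], [0.0, 0.3, 0.9]]
    print(classify_swipe(frames))  # -> "swipe_forward"
```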