Show simple item record

dc.contributor.author: Brashear, Helene
dc.contributor.author: Kim, Jung Soo
dc.contributor.author: Lyons, Kent
dc.contributor.author: Starner, Thad
dc.contributor.author: Westeyn, Tracy
dc.date.accessioned: 2009-04-27T16:22:03Z
dc.date.available: 2009-04-27T16:22:03Z
dc.date.issued: 2007-07
dc.identifier.uri: http://hdl.handle.net/1853/27819
dc.description: Presented at the 12th International Conference on Human-Computer Interaction, Beijing, China, July 2007.
dc.description: The original publication is available at www.springerlink.com
dc.description.abstract: The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction to machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for the data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Gesture recognition
dc.subject: User interface toolkit
dc.title: GART: The Gesture and Activity Recognition Toolkit
dc.type: Proceedings
dc.contributor.corporatename: Georgia Institute of Technology. College of Computing
dc.contributor.corporatename: Georgia Institute of Technology. Graphics, Visualization and Usability Center

