Show simple item record

dc.contributor.author: Rouben, Anna
dc.contributor.author: Terveen, Loren
dc.date.accessioned: 2013-12-29T02:47:19Z
dc.date.available: 2013-12-29T02:47:19Z
dc.date.issued: 2007-06
dc.identifier.uri: http://hdl.handle.net/1853/50039
dc.description.abstract: Cell phones and other mobile devices let people receive information anywhere, anytime. Navigation information – directions and distance to a destination, interesting nearby locations, etc. – is especially promising. However, there are challenges to delivering information on a cell phone, particularly with a GUI. GUIs aren't ideal when a person's visual attention is elsewhere, e.g., scanning for landmarks, assessing safety, etc. And they don't work at all for blind people, who particularly need navigation assistance. Our work responds to this challenge. We investigate the use of two non-visual techniques for delivering navigation information, speech and sonification [3]. We conducted an experiment to compare user performance with and preference for the two techniques, in both single task (navigate to a target) and dual task (navigate to a target and respond to an auditory stimulus) conditions. Users performed better with and preferred sonification in both conditions. We discuss the implications of these results for the design of navigation aids.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Auditory display
dc.subject: Navigation
dc.subject: Sonification
dc.subject: Cognitive load
dc.subject: Secondary task
dc.title: Speech and Non-Speech Audio: Navigational Information and Cognitive Load
dc.type: Proceedings
dc.contributor.corporatename: Research in Motion
dc.contributor.corporatename: University of Minnesota. Department of Computer Science and Engineering
dc.publisher.original: International Community on Auditory Display
dc.embargo.terms: null

