Contextual audio in haptic graph browsing
Abstract
This paper presents a ``think-aloud'' study investigating the ability of visually impaired participants to make comparisons between haptic and audio line graphs. Graphs with two data series were presented. One data series was explored with a PHANTOM haptic device, whilst the other was sonified using one of two data-to-sound mappings. The results show that participants can make comparisons between the two lines. However, some cross-modal interference makes it difficult to extract detailed information about the data series presented in audio.