Interacting with an information space using sound: Accuracy and patterns
Human auditory perception is well suited to receiving and interpreting information from the environment, yet this capability has not been used extensively in designing computer-based information-exploration tools. It is not known how accurately humans can navigate an auditory display, nor whether listeners conform to known pattern-search techniques in a search task using sound alone. An auditory display was created using Pure Data (Pd), a graphical programming language used primarily to manipulate digital sound. The visual interface for the auditory display was a blank window, and the auditory interface was based on ground-level ozone concentration data. As the cursor is moved around this window, the sound generated changes according to the underlying data value at that point. An experiment was conducted to determine how accurately subjects could locate the highest concentration level using the auditory display. The four sound attributes tested were frequency (sine waveform), frequency (sawtooth waveform), loudness, and tempo. Results indicate that sonic display of data yields less resolution than visual display. It is also shown that people generally employ recognizable search patterns when exploring the information space.
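The core mechanism described above, mapping the data value under the cursor onto a sound attribute, can be sketched as follows. This is a minimal illustration, not the study's Pd patch: the ozone range, parameter ranges, and the linear mapping are all assumptions chosen for the example.

```python
def map_value(value, vmin, vmax, out_min, out_max):
    """Linearly rescale a data value into a sound-parameter range."""
    t = (value - vmin) / (vmax - vmin)
    return out_min + t * (out_max - out_min)

# Hypothetical ground-level ozone range (ppb); the actual dataset's
# range is not given in the abstract.
OZONE_MIN, OZONE_MAX = 0.0, 120.0

def sonify(ozone_ppb):
    """Map one ozone reading onto the four tested sound attributes.

    All output ranges below are illustrative assumptions.
    """
    return {
        "sine_freq_hz":     map_value(ozone_ppb, OZONE_MIN, OZONE_MAX, 220.0, 880.0),
        "sawtooth_freq_hz": map_value(ozone_ppb, OZONE_MIN, OZONE_MAX, 220.0, 880.0),
        "loudness_db":      map_value(ozone_ppb, OZONE_MIN, OZONE_MAX, -30.0, 0.0),
        "tempo_bpm":        map_value(ozone_ppb, OZONE_MIN, OZONE_MAX, 60.0, 240.0),
    }

# A higher concentration yields a higher pitch, louder level, and faster tempo,
# so moving the cursor toward the maximum makes all four cues increase.
print(sonify(60.0)["sine_freq_hz"])   # midpoint of the assumed ranges
```

In a display like the one described, a callback on cursor motion would look up the data value at the cursor's coordinates and feed it through a mapping of this kind before synthesis.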