Use of personalized binaural audio and interactive distance cues in an auditory goal-reaching task
While the angular spatialization of sound sources through individualized head-related transfer functions (HRTFs) has been investigated extensively in auditory display research, leading to effective real-time rendering of these functions, the interactive simulation of egocentric distance information has received less attention. Real-time rendering solutions for distance are in fact lacking, partly because the literature on the perception of dynamic distance cues is still sparse. By coupling a binaural rendering system based on HRTF selection techniques with a virtual environment built on a digital waveguide mesh (DWM) model of a small tubular shape, we obtained an auditory display that affords interactive selection of absolute 3D spatial cues, conveying both angular direction and egocentric distance. The tube metaphor, in particular, minimizes loudness changes with distance, so the display provides mainly direct-to-reverberant and spectral cues. A goal-reaching experiment assessed the proposed display: participants explored a virtual map with a pen tablet and were asked to reach a sound source (the target) using auditory information alone; each participant's time to reach the target and traveled distance were then analyzed. Results suggest that participants achieved a first level of spatial knowledge, i.e., knowledge about a point in space, performing comparably to when they relied on more robust, although relative, loudness cues. Further work is needed to make the proposed auditory display fully physically consistent.
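As an illustration of the rendering principle behind the tube metaphor, the sketch below implements a minimal 1-D digital waveguide (two counter-propagating delay rails with lossy boundary reflections), the building block that a DWM generalizes to a mesh. This is a hedged sketch under stated assumptions, not the paper's actual model: the tube length, reflection coefficients, and pickup position are illustrative parameters, and the function name `waveguide_tube` is hypothetical.

```python
import numpy as np

def waveguide_tube(excitation, length=20, r_left=-0.95, r_right=-0.95, n_samples=200):
    """Propagate an excitation along a lossy 1-D digital waveguide (tube).

    Two rails hold the right-going and left-going traveling waves;
    each sample step shifts them by one position and reflects the
    wave arriving at each boundary with coefficient |r| < 1.
    All parameters here are illustrative, not taken from the paper.
    """
    right = np.zeros(length)  # right-going traveling wave
    left = np.zeros(length)   # left-going traveling wave
    out = np.zeros(n_samples)
    for n in range(n_samples):
        # inject the excitation at the left end of the tube
        x = excitation[n] if n < len(excitation) else 0.0
        # shift each rail by one sample (wave propagation)
        new_right = np.roll(right, 1)
        new_left = np.roll(left, -1)
        # boundary reflections couple the two rails
        new_right[0] = r_left * left[0] + x
        new_left[-1] = r_right * right[-1]
        right, left = new_right, new_left
        # listener tap: pressure is the sum of both rails at a pickup point
        out[n] = right[length // 2] + left[length // 2]
    return out

# impulse response of the tube at the pickup point
impulse = np.array([1.0])
response = waveguide_tube(impulse)
```

Because the boundary losses accumulate with each round trip, a source placed farther along such a structure reaches the listener with a weaker direct component relative to the reverberant tail, which is the kind of direct-to-reverberant distance cue the abstract describes.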