2000 Proceedings of the International Conference on Auditory Display. The annual ICAD conference is the prime forum for academia and industry to exchange ideas and discuss developments in the field of auditory display.

Recent Submissions

  • Websound: A generic web sonification tool, and its application to an auditory web browser for blind and visually impaired users 

    Petrucci, Lori Stefano; Harth, Eric; Roth, Patrick; Assimacopoulos, Andre; Pun, Thierry (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    The inherent visual nature of Internet browsers makes the Web inaccessible to the visually impaired. Although several nonvisual browsers have been developed, they usually transform the visual content of HTML documents into ...
  • The NAVE: Design and implementation of a 3D audio system for a low cost spatially immersive display 

    Wilson, Jeff; Pair, Jarrell (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    The NAVE is a low cost spatially immersive display system developed by the Georgia Tech Virtual Environments Group. The NAVE audio environment uses two independent speaker systems driven by a dedicated audio PC with two ...
  • Tools for auditory display research 

    Tucker, Timothy; Mann, David; Wilson, Willard (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    TDT will demonstrate its Power sDAC Convolver and RP2 Real-Time Processor systems for generating real-time 3D auditory displays based on head-related transfer functions (HRTFs) and head-tracker data. These tools are designed ...
  • The effect of earcons on reaction times and error-rates in a dual-task vs. a single-task experiment 

    Lemmens, Paul M.C.; Bussemakers, Myra P.; de Haan, Abraham (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    An experiment with two picture categorization tasks with auditory distracters containing redundant information was carried out to investigate the effects distracters, in this case earcons, have on categorization. In the ...
  • Teaching orientation and mobility skills to blind children using computer generated 3D sound environments 

    Inman, Dean P.; Loge, Ken; Cram, Aaron (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper describes a computer program designed to teach orientation and mobility skills to visually impaired persons. The system utilizes off-the-shelf computer hardware and a proprietary virtual reality authoring library ...
  • Sonification of particle systems via de Broglie's Hypothesis 

    Sturm, Bob L. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Quantum mechanics states a particle can behave as either a particle or a wave. Thus systems of particles might be likened to a complex superposition of dynamic waves. Motivated by this, the author develops methods for the ...
  • Spatial auditory displays for use within attack rotary wing aircraft 

    Shilling, Russell D.; Letowski, Tomasz; Storms, Russell (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Spatial auditory displays were designed, flown, and evaluated using the Simulation Training Research Advanced Testbed for Aviation (STRATA), an AH-64A Apache simulator, located at the Army Research Institute (ARI) at Fort ...
  • Sonification and the interaction of perceptual dimensions: Can the data get lost in the map? 

    Neuhoff, John G.; Kramer, Gregory; Wayand, Joseph (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Many sonification techniques use acoustic attributes such as frequency, intensity, and timbre to represent different characteristics of multidimensional data. Here we demonstrate a perceptual interaction between changes ...
  • Subjective testing of the performance of reverberation enhancement using virtual reality environments 

    Larsson, Pontus; Kleiner, Mendel; Vastfjall, Daniel; Olsson, Conny; Dalenback, Bengt-Inge (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Various systems for the purpose of performing subjective audiovisual tests have been evaluated. Auralizations and visualizations of two different halls in the Göteborg University School of Music have been made using ...
  • Psychophysical scaling of sonification mappings 

    Walker, Bruce N.; Kramer, Gregory; Lane, David M. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    We determined preferred data-to-display mappings by asking experiment participants directly and then examined the psychophysical scaling functions relating perceived data values to underlying acoustic parameters. Presently, ...
  • Principal curve sonification 

    Hermann, T.; Meinicke, P.; Ritter, H. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper describes a new approach to render sonifications for high-dimensional data, allowing the user to perceive the "main" structure of the data distribution. This is achieved by computing the principal curve of the ...
  • Music monitor: Dynamic data display 

    Tran, Quan T.; Mynatt, Elizabeth (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    In this demo, we present an interface prototype of Music Monitor, an application targeted for home use that dynamically conducts music in real time to reflect parallel activities in disparate locations (e.g., preparing ...
  • Musical phrase-structured audio communication 

    Hankinson, John C.K.; Edwards, Alistair D.N. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    It has previously been shown that musical grammars can impose structural constraints upon the design of earcons, thereby providing a grammatical basis to earcon combinations. In this paper, more complex structural combinations ...
  • Learning reverberation: Considerations for spatial auditory displays 

    Shinn-Cunningham, Barbara (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Reverberation has both beneficial and detrimental effects on auditory localization. This paper reviews evidence that listeners adapt to the reverberation in a room. Results show that reverberation degrades perception of ...
  • Hear & there: An augmented reality system of linked audio 

    Rozier, Joseph; Karahalios, Karrie; Donath, Judith (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper presents an augmented reality system using audio as the primary interface. Using the authoring component of this system, individuals can leave "audio imprints," consisting of several layers of music, sound ...
  • Haptic music: Non-musicians collaboratively creating music 

    Gandy, Maribeth; Quay, Andrew (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    The Haptic Music system allows up to three people to collaboratively create music in a jazz style. The users can have varying musical backgrounds, and even a novice can use his/her creativity to create a novel composition. ...
  • Guided by voices: An audio augmented reality system 

    Lyons, Kent; Gandy, Maribeth; Starner, Thad (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper presents an application of a low cost, lightweight audio-only augmented reality infrastructure. The system uses a simple wearable computer and an RF-based location system to play digital sounds corresponding to ...
  • Experiments in computer-assisted annotation of audio 

    Tzanetakis, George; Cook, Perry R. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    Advances in digital storage technology and the wide use of digital audio compression standards like MPEG have made possible the creation of large archives of audio material. In order to work efficiently with these large ...
  • Designing non-speech sounds to support navigation in mobile phone menus 

    Leplatre, Gregory; Brewster, Stephen A. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper describes a framework for integrating non-speech audio to hierarchical menu structures where the visual feedback is limited. In the first part of this paper, emphasis is put on how to extract sound design ...
  • A software-based system for interactive spatial sound synthesis 

    Wenzel, Elizabeth M.; Miller, Joel D.; Abel, Jonathan S. (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
    This paper discusses development issues for a software-based, real-time virtual audio rendering system, Sound Lab (SLAB), designed to work in the personal computer environment using a standard signal-processing library. ...
