Now showing items 1-20 of 27

    • Abstract sound objects to expand the vocabulary of sound design for visual and theatrical media 

      Somers, Eric (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      In this design paper, the author explores the kinds of sound objects typically used in designing sound for theatre and media, then proposes to expand the "vocabulary" of traditional sound design through the use ...
    • Designing non-speech sounds to support navigation in mobile phone menus 

      Leplatre, Gregory; Brewster, Stephen A (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper describes a framework for integrating non-speech audio into hierarchical menu structures where the visual feedback is limited. In the first part of this paper, emphasis is put on how to extract sound design ...
    • A software-based system for interactive spatial sound synthesis 

      Wenzel, Elizabeth M; Miller, Joel D; Abel, Jonathan S (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper discusses development issues for a software-based, real-time virtual audio rendering system, Sound Lab (SLAB), designed to work in the personal computer environment using a standard signal-processing library. ...
    • A model for interaction in exploratory sonification displays 

      Saue, Sigurd (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper presents a general model for sonification of large spatial data sets (e.g. seismic data, medical data) based on ideas from ecological acoustics. The model incorporates not only what we hear (the sounds), but ...
    • Guided by voices: An audio augmented reality system 

      Lyons, Kent; Gandy, Maribeth; Starner, Thad (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper presents an application of a low-cost, lightweight audio-only augmented reality infrastructure. The system uses a simple wearable computer and an RF-based location system to play digital sounds corresponding to ...
    • Experiments in computer-assisted annotation of audio 

      Tzanetakis, George; Cook, Perry R (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      Advances in digital storage technology and the wide use of digital audio compression standards like MPEG have made possible the creation of large archives of audio material. In order to work efficiently with these large ...
    • Hear & there: An augmented reality system of linked audio 

      Rozier, Joseph; Karahalios, Karrie; Donath, Judith (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper presents an augmented reality system using audio as the primary interface. Using the authoring component of this system, individuals can leave "audio imprints," consisting of several layers of music, sound ...
    • Auditory-visual cross-modal perception 

      Storms, Russell L (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      The quality of realism in virtual environments is typically considered to be a function of visual and audio fidelity treated independently of each other. However, the virtual environment participant, being human, is multi-modal ...
    • Haptic music: Non-musicians collaboratively creating music 

      Gandy, Maribeth; Quay, Andrew (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      The Haptic Music system allows up to three people to collaboratively create music in a jazz style. The users can have varying musical backgrounds, and even a novice can use his/her creativity to create a novel composition. ...
    • An auditory display system for aiding interjoint coordination 

      Ghez, Claude; Dubois, R Luke; Rikakis, Thanassis; Cook, Perry R (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      Patients who lack proprioception are unable to build and maintain "internal models" of their limbs or to monitor their limb movements because they do not receive the appropriate information from muscles and ...
    • Active sensory tuning for immersive spatialized audio 

      Runkle, Paul; Yendiki, Anastasia; Wakefield, Gregory H (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      Unlike their visual counterparts, immersive spatialized audio displays are highly sensitive to individual differences in the signal processing parameters associated with source placement in azimuth and elevation. We introduce ...
    • A case study of auditory navigation in virtual acoustic environments 

      Lokki, Tapio; Grohn, Matti; Savioja, Lauri; Takala, Tapio (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      We report the results of an auditory navigation experiment. In auditory navigation, sound is employed as a navigational aid in a virtual environment. In our experiment, the test task was to find a sound source in a dynamic ...
    • An extensible toolkit for creating virtual sonic environments 

      Fouad, Hesham; Ballas, James A; Brock, Derek (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      The Virtual Audio Server (VAS) is a toolkit designed for exploring problems in the creation of realistic Virtual Sonic Environments (VSE). The architecture of VAS ...
    • The effect of earcons on reaction times and error-rates in a dual-task vs. a single-task experiment 

      Lemmens, Paul M.C; Bussemakers, Myra P; de Haan, Abraham (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      An experiment involving two picture categorization tasks with auditory distracters containing redundant information was carried out to investigate the effects that distracters, in this case earcons, have on categorization. In the ...
    • Teaching orientation and mobility skills to blind children using computer generated 3D sound environments 

      Inman, Dean P; Loge, Ken; Cram, Aaron (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      This paper describes a computer program designed to teach orientation and mobility skills to visually impaired persons. The system utilizes off-the-shelf computer hardware and a proprietary virtual reality authoring library ...
    • Websound: A generic web sonification tool, and its application to an auditory web browser for blind and visually impaired users 

      Petrucci, Lori Stefano; Harth, Eric; Roth, Patrick; Assimacopoulos, Andre; Pun, Thierry (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      The inherent visual nature of Internet browsers makes the Web inaccessible to the visually impaired. Although several nonvisual browsers have been developed, they usually transform the visual content of HTML documents into ...
    • The NAVE: Design and implementation of a 3D audio system for a low cost spatially immersive display 

      Wilson, Jeff; Pair, Jarrell (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      The NAVE is a low cost spatially immersive display system developed by the Georgia Tech Virtual Environments Group. The NAVE audio environment uses two independent speaker systems driven by a dedicated audio PC with two ...
    • Tools for auditory display research 

      Tucker, Timothy; Mann, David; Wilson, Willard (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      TDT will demonstrate its Power sDAC Convolver and RP2 Real-Time Processor systems for generating real-time 3D auditory displays based on head-related transfer functions (HRTFs) and head-tracker data. These tools are designed ...
    • Musical phrase-structured audio communication 

      Hankinson, John CK; Edwards, Alistair DN (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      It has previously been shown that musical grammars can impose structural constraints upon the design of earcons, thereby providing a grammatical basis to earcon combinations. In this paper, more complex structural combinations ...
    • Learning reverberation: Considerations for spatial auditory displays 

      Shinn-Cunningham, Barbara (Georgia Institute of Technology; International Community for Auditory Display, 2000-04)
      Reverberation has both beneficial and detrimental effects on auditory localization. This paper reviews evidence that listeners adapt to the reverberation in a room. Results show that reverberation degrades perception of ...