SMARTech Home › International Conference on Auditory Display (ICAD) › International Conference on Auditory Display, 2005

    Wireless gesture controllers to affect information sonification

File: Beilharz2005.pdf (4.007 MB)
Date: 2005-07
Author: Beilharz, Kirsty
    Abstract
This paper proposes a framework for gestural interaction with information sonification, in order both to monitor data aurally and to interact with, transform, and even modify the source data in a two-way communication model (Figure 1). Typical data sonification uses automatically generated computational modelling of information, represented in parameters of auditory display, to convey data in an informative representation. It is essentially a one-way data-to-display process, and interpretation by users is usually a passive experience. In contrast, gesture controllers, spatial interaction, and gesture-recognition hardware and software are used by musicians and in augmented-reality systems to affect, manipulate, and perform with sounds. Numerous installation and artistic works arise from motion-generated audio. The framework developed in this paper aims to conflate these technologies into a single environment in which gestural controllers allow interactive participation with the data that generates the sonification, exploiting the parallel between spatial audio and spatial (gestural) interaction. Converging representation and interaction processes bridges a significant gap in current sonification models. A bi-modal generative sonification and visualisation example from the author's sensate laboratory illustrates mappings between socio-spatial human activity and display. The sensor-cow project, using wireless gesture controllers fixed to a calf, exemplifies some real-time computation and representation issues in conveying spatial motion in an easily recognised sonification, suitable for ambient display or intuitive interaction.
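The two-way model the abstract describes — data mapped onto auditory-display parameters, with gestures feeding back into the source data rather than only steering the display — can be sketched minimally as follows. This is an illustrative assumption of how such a loop might look, not the author's system: the class and function names (`InteractiveSonifier`, `sonify`) and the linear pitch mapping are hypothetical.

```python
def sonify(value, lo, hi, pitch_lo=220.0, pitch_hi=880.0):
    """Map a data value in [lo, hi] linearly onto a pitch range (Hz) -
    one illustrative auditory-display parameter."""
    t = (value - lo) / (hi - lo)
    return pitch_lo + t * (pitch_hi - pitch_lo)

class InteractiveSonifier:
    """Hypothetical sketch: one-way sonification plus a gestural
    back-channel that modifies the source data itself."""
    def __init__(self, data):
        self.data = list(data)

    def render(self):
        # One-way display step: every datum becomes a pitch.
        lo, hi = min(self.data), max(self.data)
        return [sonify(v, lo, hi) for v in self.data]

    def gesture(self, index, delta):
        # Two-way step: a gesture event writes back to the source data,
        # so the next rendering reflects the interaction.
        self.data[index] += delta

s = InteractiveSonifier([0.0, 0.5, 1.0])
print(s.render())     # lowest datum maps to 220 Hz, highest to 880 Hz
s.gesture(1, 0.5)     # gestural interaction raises the middle datum
print(s.render()[1])  # its pitch rises on the next rendering
```

The point of the sketch is only the direction of the arrows: `render` is the conventional one-way data-to-display path, while `gesture` mutates `self.data`, closing the loop between interaction and representation.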
URI: http://hdl.handle.net/1853/50198
Collections: International Conference on Auditory Display, 2005

    Georgia Tech

    © Georgia Institute of Technology
