
dc.contributor.author: Beilharz, Kirsty
dc.date.accessioned: 2014-01-13T01:41:07Z
dc.date.available: 2014-01-13T01:41:07Z
dc.date.issued: 2005-07
dc.identifier.citation: Proceedings of ICAD 05, the Eleventh Meeting of the International Conference on Auditory Display, Limerick, Ireland, July 6-9, 2005. Ed. Eoin Brazil. International Community for Auditory Display, 2005.
dc.identifier.uri: http://hdl.handle.net/1853/50198
dc.description: Presented at the 11th International Conference on Auditory Display (ICAD2005)
dc.description.abstract: This paper proposes a framework for gestural interaction with information sonification, in order both to monitor data aurally and to interact with, transform, and even modify the source data in a two-way communication model (Figure 1). Typical data sonification uses automatically generated computational modelling of information, represented in parameters of auditory display, to convey data in an informative representation. It is essentially a one-way data-to-display process, and interpretation by users is usually a passive experience. In contrast, gesture controllers, spatial interaction, and gesture-recognition hardware and software are used by musicians and in augmented-reality systems to affect, manipulate, and perform with sounds. Numerous installation and artistic works arise from motion-generated audio. The framework developed in this paper aims to conflate those technologies into a single environment in which gestural controllers allow interactive participation with the data that is generating the sonification, making use of the parallel between spatial audio and spatial (gestural) interaction. Converging representation and interaction processes bridges a significant gap in current sonification models. A bi-modal generative sonification and visualisation example from the author's sensate laboratory illustrates mappings between socio-spatial human activity and display. The sensor-cow project, using wireless gesture controllers fixed to a calf, exemplifies some real-time computation and representation issues in conveying spatial motion in an easily recognised sonification, suitable for ambient display or intuitive interaction.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Auditory display
dc.subject: Sonification
dc.subject: Wireless gesture control
dc.title: Wireless gesture controllers to affect information sonification
dc.type: Proceedings
dc.contributor.corporatename: University of Sydney. Faculty of Architecture, Key Centre of Design Computing and Cognition
dc.publisher.original: International Community for Auditory Display
dc.embargo.terms: null
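
The abstract describes mapping wireless gesture-controller data (such as three-axis motion sensed on the calf in the sensor-cow project) to auditory display parameters in real time. The sketch below is a minimal, hypothetical illustration of that kind of motion-to-sound mapping, not the paper's published method: the parameter names, ranges, and mapping curves (motion energy to pitch and loudness, lateral acceleration to stereo pan) are all assumptions made for the example.

```python
import math
from dataclasses import dataclass


@dataclass
class SonificationParams:
    pitch_hz: float   # fundamental frequency of the display tone
    amplitude: float  # 0.0 (silent) .. 1.0 (full scale)
    pan: float        # -1.0 (hard left) .. 1.0 (hard right)


def map_motion_to_sound(ax: float, ay: float, az: float,
                        base_hz: float = 220.0,
                        octaves: float = 2.0) -> SonificationParams:
    """Map one 3-axis accelerometer sample (in g) to display parameters.

    Hypothetical mapping for illustration only: overall motion energy
    drives pitch and loudness, lateral acceleration drives stereo
    position. The paper does not publish its mapping functions.
    """
    # Motion energy: magnitude with gravity (~1 g) removed, clipped to [0, 1].
    energy = min(max(math.sqrt(ax * ax + ay * ay + az * az) - 1.0, 0.0), 1.0)
    # Exponential pitch curve so equal energy steps sound perceptually even.
    pitch = base_hz * 2.0 ** (octaves * energy)
    # Keep a quiet ambient floor so the display never falls fully silent.
    amplitude = 0.2 + 0.8 * energy
    # Lateral acceleration steers the sound left or right.
    pan = max(-1.0, min(1.0, ax))
    return SonificationParams(pitch, amplitude, pan)


if __name__ == "__main__":
    # Three fabricated samples: resting, walking, and a sudden turn.
    for sample in [(0.0, 0.0, 1.0), (0.3, 0.1, 1.2), (-0.9, 0.2, 1.5)]:
        print(sample, "->", map_motion_to_sound(*sample))
```

In a two-way model like the one the abstract proposes, the same mapping would run in the opposite direction as well, with deliberate gestures interpreted as commands that filter or transform the source data rather than merely sounding it.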

