
dc.contributor.author: Diniz, Nuno
dc.contributor.author: Deweppe, Alexander
dc.contributor.author: Demey, Michiel
dc.contributor.author: Leman, Marc
dc.date.accessioned: 2013-12-04T16:50:46Z
dc.date.available: 2013-12-04T16:50:46Z
dc.date.issued: 2010-06
dc.identifier.citation: Proceedings of the 16th International Conference on Auditory Display (ICAD2010), Washington, DC, USA, 9-15 June 2010. Ed. Eoin Brazil. International Community for Auditory Display, 2010.
dc.identifier.isbn: 0-9670904-3-1
dc.identifier.uri: http://hdl.handle.net/1853/49768
dc.description: Presented at the 16th International Conference on Auditory Display (ICAD2010), June 9-15, 2010, in Washington, DC.
dc.description.abstract: In this paper, a framework for interactive sonification is introduced. It is argued that electroacoustic composition techniques can provide a methodology for structuring and presenting multivariable data through sound. Furthermore, an interface driven by embodied music cognition is applied to provide interactive exploration of the generated music-based output. The motivation and theoretical foundation for this work are presented, as well as the framework's implementation and an exploratory use case.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Auditory display
dc.subject: Sonification
dc.title: A Framework for Music-based Interactive Sonification
dc.type: Proceedings
dc.contributor.corporatename: Ghent University. Institute of Psychoacoustics and Electronic Music
dc.publisher.original: International Community on Auditory Display
dc.embargo.terms: null

