
dc.contributor.author: Barrett, Natasha
dc.contributor.author: Nymoen, Kristian
dc.date.accessioned: 2015-11-19T20:36:50Z
dc.date.available: 2015-11-19T20:36:50Z
dc.date.issued: 2015-07
dc.identifier.citation: Barrett, N. & Nymoen, K., "Investigations in coarticulated performance gestures using interactive parameter-mapping 3D sonification". Extended Abstract. In K. Vogt, A. Andreopoulou, & V. Goudarzi, eds. Proceedings of the 21st International Conference on Auditory Display (ICAD 2015). July 6-10, 2015, Graz, Styria, Austria.
dc.identifier.isbn: 978-3-902949-01-1
dc.identifier.uri: http://hdl.handle.net/1853/54174
dc.description: Presented at the 21st International Conference on Auditory Display (ICAD 2015), July 6-10, 2015, Graz, Styria, Austria.
dc.description.abstract: Spatial imagery is one focus of electroacoustic music, more recently advanced by 3D audio, which furnishes new avenues for exploring spatio-musical structures and for addressing what can be called a tangible acousmatic experience. In this paper we present new insights into spatial, temporal, and sounding coarticulated (contextually smeared) gestures by applying interactive parameter-mapping sonification in three-dimensional high-order ambisonics, numerical analysis, and spatial composition. 3D motion gestures and audio performance data are captured and then explored in sonification. Spatial motion combined with spatial sound is then numerically analyzed to isolate gestural objects and smaller coarticulated atoms in time, space, and sound. The results are used to explore the acousmatic coarticulated image and as building blocks for a composed dataset embodying the original gestural performance. This new data is then interactively sonified in 3D to create acousmatic compositions embodying tangible gestural imagery.
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/
dc.subject: Auditory display
dc.subject: Electroacoustic composition
dc.subject: Spatial imagery
dc.title: Investigations in coarticulated performance gestures using interactive parameter-mapping 3D sonification
dc.type: Proceedings
dc.contributor.corporatename: University of Oslo. Department of Musicology
dc.publisher.original: University of Music and Performing Arts Graz. Institute of Electronic Music and Acoustics
dc.embargo.terms: null
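The parameter-mapping sonification named in the abstract (captured motion data driving sound parameters) can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the mapping choices (z-height to pitch, movement speed to amplitude), and all numeric ranges are assumptions chosen only for demonstration.

```python
# Illustrative parameter-mapping sonification sketch (assumed mappings, not
# the authors' method): each (x, y, z) point of a motion trajectory controls
# a sine tone, with height -> pitch and speed between points -> loudness.
import math

SAMPLE_RATE = 8000  # Hz; kept low for a compact example


def sonify(trajectory, seconds_per_point=0.05):
    """Render a list of (x, y, z) points as mono audio samples in [-1, 1]."""
    samples = []
    phase = 0.0
    prev = trajectory[0]
    for point in trajectory:
        # Pitch: linear map of z in [0, 1] onto 220-880 Hz (assumed range).
        freq = 220.0 + 660.0 * point[2]
        # Amplitude: faster movement between consecutive points -> louder.
        speed = math.dist(point, prev)
        amp = min(1.0, speed * 10.0)
        prev = point
        # Hold this pitch/amplitude for a fixed duration, phase-continuous.
        for _ in range(int(SAMPLE_RATE * seconds_per_point)):
            phase += 2 * math.pi * freq / SAMPLE_RATE
            samples.append(amp * math.sin(phase))
    return samples


# A tiny three-point gesture rising from the floor to head height.
audio = sonify([(0.0, 0.0, 0.0), (0.1, 0.0, 0.5), (0.3, 0.1, 1.0)])
```

In practice the same mapping idea would be fed by motion-capture streams and rendered to a higher-order ambisonics bus rather than a mono sine tone; the sketch only shows the data-to-parameter step.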

