Investigations in coarticulated performance gestures using interactive parameter-mapping 3D sonification
Spatial imagery is a central concern of electroacoustic music, recently advanced by 3D audio, which furnishes new avenues for exploring spatio-musical structures and for pursuing what may be called a tangible acousmatic experience. In this paper we present new insights into spatial, temporal and sounding coarticulated (contextually smeared) gestures by combining interactive parameter-mapping sonification in three-dimensional higher-order ambisonics with numerical analysis and spatial composition. 3D motion gestures and audio performance data are captured and then explored through sonification. Spatial motion, combined with spatial sound, is numerically analyzed to isolate gestural objects and smaller coarticulated atoms in time, space and sound. The results are used both to explore the acousmatic coarticulated image and as building blocks for a composed dataset embodying the original gestural performance. This new data is then interactively sonified in 3D to create acousmatic compositions embodying tangible gestural imagery.
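As a rough illustration of the parameter-mapping step, the sketch below maps a captured 3D motion trajectory to sound parameters. The specific mapping (x to pitch, y to loudness, z to a stereo pan standing in for ambisonic azimuth), the frequency range, and the function name are all illustrative assumptions, not the paper's actual scheme:

```python
import numpy as np

def sonify_trajectory(positions, sr=44100, dur_per_point=0.05):
    """Parameter-mapping sonification of a 3D motion trajectory.

    Illustrative mapping (an assumption, not the paper's exact scheme):
      x -> pitch (220-880 Hz), y -> amplitude, z -> stereo pan
      (a crude stand-in for an ambisonic azimuth encoder).
    Returns a stereo signal of shape (n_points * n_samples_per_point, 2).
    """
    positions = np.asarray(positions, dtype=float)
    # Normalize each axis to [0, 1]; constant axes map to 0.
    lo, hi = positions.min(axis=0), positions.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    norm = (positions - lo) / span

    n = int(sr * dur_per_point)
    t = np.arange(n) / sr
    out = []
    phase = 0.0  # carry phase across segments to avoid clicks
    for x, y, z in norm:
        freq = 220.0 * 2.0 ** (2.0 * x)      # x -> pitch over two octaves
        amp = 0.1 + 0.9 * y                  # y -> loudness
        tone = amp * np.sin(phase + 2 * np.pi * freq * t)
        phase = (phase + 2 * np.pi * freq * n / sr) % (2 * np.pi)
        theta = z * np.pi / 2                # z -> equal-power pan angle
        out.append(np.column_stack([np.cos(theta) * tone,
                                    np.sin(theta) * tone]))
    return np.concatenate(out)
```

A full system in the spirit of the paper would replace the stereo pan with a higher-order ambisonic encoder and run the mapping interactively on live capture data rather than offline.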