The Development Of A Method For Designing Auditory Displays Based On Soundtrack Composition
Abstract
This paper details work toward the design of a method for creating
auditory displays for the human-computer interface, based
on soundtrack composition. We begin with the benefits of this approach before discussing methods for auditory display design and the need to unify different design techniques. We
then outline our on-going investigation into the tools and techniques
employed within the working practices of sound designers
and soundtrack composers. Following this we report our observations
of the main priorities that influence how composers create
soundtracks and propose ways in which our method may support
these. We argue that basing the first steps of the method on a
‘cue sheet’ could enable designers to identify actions, objects and
events within an HCI scenario whilst taking into account the user
and the context of use. This is followed by some initial observations
of a preliminary study into whether a participant can successfully
use this cue sheet methodology. We then identify elements of the methodology that need to change: further investigation and subsequent design are required into ways in which participants can successfully comprehend and systematically use the cue sheet to identify seen and unseen events, actions and objects within the human-computer interface. Additionally, we need to investigate how best to categorize and map these elements to sound. We conclude the paper with our plans for future work.