EmotionFace: Prototype facial expression display of emotion in music
EmotionFace is a software interface for visually displaying the self-reported emotion expressed by music. Viewed in reverse, it is a facial expression whose auditory counterpart is the associated, time-synchronized music. The present instantiation of the software uses a simple schematic face whose eyes and mouth move according to a parabolic model: the smiling or frowning of the mouth represents valence (happiness and sadness), and the degree of eye opening represents arousal. Continuous emotional responses to music collected in previous research were used to test and calibrate EmotionFace. The interface provides an alternative to presenting data on a two-dimensional emotion space, the same space used to collect emotional responses to music. These synthesized facial expressions make the emotion data expressed by music easier for the human observer to process and may provide a more natural interface between human and computer. Future research will include optimization of EmotionFace using more sophisticated algorithms and facial expression databases, and examination of the lag structure between facial expression and musical structure. Eventually, with more elaborate systems, automation, and greater knowledge of emotion and its associated musical structure, it may be possible to compose music meaningfully from synthesized and real facial expressions.
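The valence-to-mouth and arousal-to-eyes mapping described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the coordinate conventions, value ranges, and scaling constants are assumptions chosen only to show how a parabolic mouth and a variable eye aperture could be driven by a (valence, arousal) pair.

```python
def mouth_points(valence, n=11, width=1.0, curve_scale=0.5):
    """Return (x, y) points of a parabolic mouth.

    valence in [-1, 1]: positive bends the parabola into a smile
    (corners up), negative into a frown (corners down).
    """
    pts = []
    for i in range(n):
        x = -width / 2 + width * i / (n - 1)
        # Parabola y = c * x^2; the sign of c follows valence,
        # so the curve opens upward (smile) or downward (frown).
        y = valence * curve_scale * x * x
        pts.append((x, y))
    return pts


def eye_opening(arousal, min_open=0.1, max_open=1.0):
    """Map arousal in [-1, 1] to an eye aperture:
    wider eyes for higher arousal, narrower for lower."""
    t = (arousal + 1) / 2  # rescale to [0, 1]
    return min_open + t * (max_open - min_open)


# Example: a happy, excited expression from one point
# in the two-dimensional emotion space.
smile = mouth_points(valence=0.8)
aperture = eye_opening(arousal=0.7)
```

In a continuous-response setting, these two functions would be re-evaluated at each time step of the emotion time series, so the face animates in synchrony with the music.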