Interactive control of music using emotional body expressions

Duration: 4 mins 38 secs


About this item
Description: We infer a person's emotional state from body posture and gesture, and then use this to control the synthesis of music with an appropriate mood.
 
Created: 2010-02-18 15:09
Collection: Rainbow Graphics and Interaction Research Group
Publisher: University of Cambridge
Copyright: Professor Peter Robinson
Language: eng (English)
Keywords: affective; computing; emotion; recognition; full-body; interaction; machine; learning; music-mixing;
Credits:
Actor:  Daniel Bernhardt
Producer:  Peter Robinson
 
Abstract: This video presents a novel music mixing interface that allows users to blend between pieces of music by moving their whole body in different emotional styles. Although the interface itself would be most applicable to the performing arts and gaming, the principles concerning the use of emotion and body-motion analysis apply to many other areas concerned with the design of intelligent user interfaces. We report the results of a pilot user study, which suggest that such an interface could afford an emotionally immersive experience. However, individual differences in how users express emotions need to be accounted for.
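The video does not detail the mixing algorithm, but the idea of blending music tracks according to an inferred emotional state can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes an upstream classifier produces a non-negative score per emotional style, and that one aligned audio stream is available per style.

```python
# Hypothetical sketch of emotion-driven music mixing (assumed design,
# not taken from the video): emotion scores -> per-track weights -> blend.

def mix_weights(emotion_scores):
    """Normalize raw emotion scores into per-track mixing weights."""
    total = sum(emotion_scores.values())
    if total == 0:
        # No confident inference: fall back to an even blend.
        n = len(emotion_scores)
        return {e: 1.0 / n for e in emotion_scores}
    return {e: s / total for e, s in emotion_scores.items()}

def blend(samples_by_emotion, weights):
    """Weighted sum of aligned audio sample streams, one per emotion."""
    length = min(len(s) for s in samples_by_emotion.values())
    return [
        sum(weights[e] * samples_by_emotion[e][i] for e in weights)
        for i in range(length)
    ]

# Example: body motion read as mostly "happy" pulls the mix toward
# the happy-styled track.
weights = mix_weights({"happy": 3.0, "sad": 1.0})
mixed = blend({"happy": [1.0, 1.0], "sad": [0.0, 0.0]}, weights)
```

Smoothing the weights over time (e.g. an exponential moving average) would likely be needed in practice to avoid audible jumps as the classifier's output fluctuates.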
Available Formats
Format      Quality   Bitrate          Size
iPod Video  256x192   1.26 Mbits/sec   43.88 MB
Auto        (allows the browser to choose a format it supports)