Mind-reading machines

Duration: 5 mins 39 secs
About this item
Description: Facial expressions provide an important spontaneous channel for the communication of both emotional and social displays. This video shows how facial expression information can be used to make useful inferences about a user’s mental state in a natural computing environment.
 
Created: 2011-01-04 10:04
Collection: Rainbow Graphics and Interaction Research Group
Publisher: University of Cambridge
Copyright: Professor Peter Robinson
Language: eng (English)
Keywords: affective; computing; emotion; facial; expression;
Credits:
Actor:  Peter Robinson
Actor:  Rana el Kaliouby
Actor:  Neil Dodgson
 
Abstract: Can you read minds? The answer is most likely ‘yes’. You may not consider it mind reading but our ability to understand what people are thinking and feeling from their facial expressions and gestures is just that. People express their mental states all the time through facial expressions, vocal nuances and gestures. We have built this ability into computers to make them emotionally aware.

The ability to attribute mental states to others from their behaviour, and then to use that information to guide our own actions or predict those of others, is known as the ‘theory of mind’. Although research on this theory has been around since the 1970s, it has recently gained attention due to the growing number of people with autism spectrum conditions, who are thought to be ‘mind-blind’. That is, they have difficulty interpreting others’ emotions and feelings from facial expressions and other non-verbal cues.

Our computer system is based on the latest research in the theory of mind by Professor Simon Baron-Cohen, Director of the Autism Research Centre at Cambridge. His research provides a taxonomy of facial expressions and the emotions they represent. In 2004, his group published the Mind Reading DVD, an interactive computer-based guide to reading emotions from the face and voice. The DVD contains videos of people showing 412 different mental states. We have developed computer programs that can read facial expressions using machine vision, and then infer emotions using probabilistic machine learning trained by examples from the DVD.

Machine vision means getting machines to ‘see’: giving them the ability to extract, analyse and make sense of information from images or video, in this case footage of facial expressions. Probabilistic machine learning enables a machine to learn, from training examples, an association between features of an image (here, facial expressions) and other classes of information (here, emotions). The most likely interpretation of the facial expressions is then computed using probability theory.
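The idea of choosing the most likely emotion from observed facial features can be illustrated with a minimal sketch. This is not the Cambridge system itself (which uses richer video features and more sophisticated inference); it is a simple naive Bayes classifier over hypothetical discrete facial cues such as 'smile' or 'brow_furrow', trained from labelled examples, which picks the mental state with the highest posterior probability.

```python
# Illustrative sketch only: naive Bayes over hypothetical facial cues.
# The feature names and labels here are invented for the example.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (features, label) pairs; features is a tuple
    of discrete facial cues, label is a mental-state name."""
    label_counts = Counter()
    feature_counts = defaultdict(Counter)
    for features, label in examples:
        label_counts[label] += 1
        for f in features:
            feature_counts[label][f] += 1
    return label_counts, feature_counts

def most_likely_state(features, model, alpha=1.0):
    """Return the label maximising P(label) * prod P(feature | label),
    using add-one (Laplace) smoothing for unseen cues."""
    label_counts, feature_counts = model
    total = sum(label_counts.values())
    vocab = {f for counts in feature_counts.values() for f in counts}
    best_label, best_p = None, 0.0
    for label, n in label_counts.items():
        p = n / total  # prior P(label)
        denom = sum(feature_counts[label].values()) + alpha * len(vocab)
        for f in features:
            p *= (feature_counts[label][f] + alpha) / denom
        if p > best_p:
            best_label, best_p = label, p
    return best_label

# Toy training set standing in for labelled video examples.
model = train([
    (('smile', 'eye_crinkle'), 'happy'),
    (('smile',), 'happy'),
    (('brow_furrow', 'lip_press'), 'concentrating'),
    (('brow_furrow',), 'concentrating'),
])
print(most_likely_state(('smile', 'eye_crinkle'), model))  # -> happy
```

In a real system, the discrete cues would be replaced by features extracted automatically by the vision component, and the probabilistic model would be trained on many labelled videos, but the principle, combining feature likelihoods to find the most probable mental state, is the same.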
Available Formats
Format        Quality   Bitrate           Size
MPEG-4 Video  640x360   1.84 Mbits/sec    78.30 MB
iPod Video    480x270   505.83 kbits/sec  20.93 MB
MP3           44100 Hz  125.38 kbits/sec  4.99 MB