Carnival masks

When we get machines to detect human emotions, what are they ‘seeing’? Is it the emotion we feel inside, or the emotion we are expressing to the outside world? The two can be very different, and both can be difficult to ‘read’.

Emotion detection algorithms that use computer vision to analyse facial expressions are predominantly built using the Facial Action Coding System (FACS). FACS specifies which combinations of facial muscle movements, called action units, correspond to which emotional expressions. Hence ‘smiling without your eyes’ is assumed to be a false smile, because the muscles around your eyes are not in the position expected given the position of the muscles controlling your mouth.
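To make the idea concrete, here is a minimal sketch of FACS-style reasoning about a smile. The action unit numbers are real FACS codes (AU12 is the lip corner puller, AU6 the cheek raiser that crinkles the eyes), but the rule set and function name are hypothetical simplifications for illustration, not any vendor’s actual algorithm:

```python
# Toy FACS-style rule: a smile without eye involvement is judged 'false'.
# AU numbers are genuine FACS codes; the rules are an illustrative sketch.

def classify_smile(active_aus: set[int]) -> str:
    """Label a smile based on which facial action units are active."""
    LIP_CORNER_PULLER = 12  # AU12: the mouth shape of a smile
    CHEEK_RAISER = 6        # AU6: 'smiling with your eyes'

    if LIP_CORNER_PULLER not in active_aus:
        return "no smile"
    if CHEEK_RAISER in active_aus:
        return "Duchenne (felt) smile"
    return "non-Duchenne (social) smile"

print(classify_smile({6, 12}))  # Duchenne (felt) smile
print(classify_smile({12}))     # non-Duchenne (social) smile
```

Note how the rule encodes exactly the assumption the rest of this post questions: that a fixed configuration of muscles reliably signals a fixed inner state.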

But just how consistently do facial expressions match external judgements or inner feelings? The book ‘How Emotions Are Made: The Secret Life of the Brain’ by Lisa Feldman Barrett challenges the classical view that emotions have a universal facial signature, instead making a convincing argument that you cannot interpret a facially expressed emotion without knowing the context in which it was constructed. That could be a cause for concern when companies use emotion detection algorithms based on facial expressions…

‘Emotient’s software reads the expressions of individuals and crowds to gain insights that can be used by advertisers to assess viewer reaction or a medical practitioner to better understand signs of pain.’

Apple buys artificial intelligence startup Emotient, Reuters, 2016

To demonstrate how this might be a concern, have a look at the following faces:

Face A
Face B
Face C
Face D

Consider what emotion is being felt and/or expressed in each picture.

Try choosing a word that best matches the face from the following list:

1) happiness, 2) surprise, 3) sadness, 4) fear, 5) anger, 6) disgust

All images are publicly available online, issued with Creative Commons licenses for re-use (links to follow)

…take your time (a.k.a. padding out the page so hopefully the answers don’t appear before you’ve thought about it.)

When pushed to select from the limited range of emotion words provided (as is common in many psychological studies, including those on which FACS is based), which did you pick for each face?

You might think that the extreme emotions at least would be easy to spot – laughing and smiling representing happy and joyful states versus crying and anger representing sad and miserable states. Yet we can cry tears of joy, laugh like a scary clown rather than a funny one, or scream at the top of our lungs when achieving a hard-fought objective, an expression that, out of context, can appear extremely intimidating. And we can smile pretty convincingly, even with our eyes, whilst suppressing the darkest thoughts…

Why smile when we don’t feel happy? Well, one personal benefit is that the simple act of smiling can make you feel better in some circumstances. The more common reason is that we are conditioned throughout life to conform to social and cultural norms of behaviour. The more clearly we are able to express what people expect, the easier it is to communicate. We like using expressions so much in communication that we created emojis as an alternative to words in text-based messaging 🙂 😦 😮 :-/ 😉

The expressions we make whilst completing tasks where communication is not the priority are a very different matter… They aren’t clear, and they are full of variety.

The images above are expressions occurring during the following (click on the link to view the full image):

Perhaps the most common misinterpretation of facial emotion is the frown. It can represent sadness, anger, confusion, contemplation, tiredness, stress… or simply be the default ‘resting’ face when thinking about just about anything.

This was demonstrated beautifully in a celebrity interview: host Graham Norton was interviewing actor Jeremy Renner when they discussed the issue of Jeremy’s resting face. The clip below explains…

This is unlikely to be surprising to anyone. Many of us will have experienced somebody telling us to ‘stop frowning’, or ‘cheer up, it might never happen’, shaking us out of internal thoughts that have nothing to do with our facial expression… aside from the rare soul for whom the expression does match the feeling. For them, being told to stop or ‘cheer up’ is just about the least useful thing a person could say… perhaps topped only by ‘calm down’…

In short, we humans are pretty rubbish at detecting emotion from facial expressions most of the time. Without any context, we struggle to consistently label the emotions being expressed. And even with context, we can fail to appreciate inner feelings when applying overly simplistic social and cultural stereotypes about the faces people should or should not be expressing in a given situation. The science of emotion is an immature field being challenged both by new research and by criticism of past techniques. There is also the reality that expressing and feeling emotions can be two different human states occurring simultaneously. Which one is being taught and measured?

If humans can so easily fail to empathise, and the science is flawed, do we think the machines we build are going to do it better…?


Featured image: Carnival masks, source: iStockphoto (licensed for non-commercial use on this site, not for reuse).


Join the conversation! 6 Comments

  1. This is a wonderful discussion, Sharon. I always appreciate the way you challenge (or find people who challenge) trending wisdom. The problem I worry about is that AI toolkits are being built ahead of real understanding. We seem to be incorporating bias and misinformation into our tools. I wonder if we will be diligent in upgrading all the tools as we come to understand more.

  2. Thanks Dan, I so appreciate your comments and feedback. And yes, that’s exactly the concern – that tools are being built based on flawed foundations without acknowledging their weaknesses. We need better methods for reporting the confidence and variance in predictions made by AI.

    This topic is particularly tricky because humans are also pretty bad at reading emotions, including professionals who ought to know better than to make judgements without acknowledging or incorporating uncertainty.

  3. […] This matters because the original research has been embraced within the cognitive science community and is being used in algorithmic emotion detection. For more details, view related post: (Failing) to detect emotions. […]

  4. […] discovers, or is alerted to, a flaw in the algorithm? This is a particular concern of mine about affective computing and the flawed theories on which emotion detection algorithms are currently being built.  What […]

  5. […] Failing to detect emotions, May 2019 […]

  6. […] (Failing) To Detect Emotions, May 2019 […]

