
Emotion Recognition Technologies
“We Cannot Begin to Talk About Benefits”

Illustration: three heads with different facial expressions – misreading human emotions | © Lena Ziyal

What does it mean when computers recognize and decode emotions? An interview with Alexa Hagerty, a researcher on ethical issues in artificial intelligence at the University of Cambridge.

Ms Hagerty, together with a group of scientists you developed the website Emojify.info, which has received a lot of positive feedback. What are the goals of the website?

We developed “Emojify” to let the public experience the limits and implications of emotion recognition technologies (ERTs) through animation, serious games and question prompts. It demonstrates how computers scan facial expressions in an attempt to identify and decode emotions. Our research team sought to study public understanding of, and views towards, ERTs and to spark a more thoughtful public conversation about their potential societal impacts.

How does the website work?

It gives players hands-on experience of interacting with an ERT, framed around the question, “Is emotion recognition technology ‘emojifying’ you?” The first game, ‘Wink/Blink’, shows images of winks or blinks and asks players to guess which is which – demonstrating that an ERT model cannot do the same, because it lacks the contextual information that differentiates the two. ‘Fake Smile’, another feature, invites players to trick the model by rapidly moving through different facial expressions, showing that outward expressions are not necessarily an expression of an internal emotional state. After playing, users are asked questions to gauge their new understanding of ERTs.
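A concrete way to see the Wink/Blink point: the following minimal sketch (with invented features and thresholds – it is an illustration, not how Emojify or any commercial ERT actually works) shows why a classifier that sees only a single still image cannot separate a wink from a blink, while one that sees the frame sequence can.

```python
from typing import List, Tuple

# One video frame reduced to two invented features:
# (left_eye_openness, right_eye_openness), where 0.0 = fully closed.
Frame = Tuple[float, float]

def classify_single_frame(frame: Frame) -> str:
    """What a context-free ERT effectively does: label one still image."""
    left, right = frame
    if left < 0.2 and right < 0.2:
        return "both eyes closed"
    if left < 0.2 or right < 0.2:
        return "one eye closed"  # a wink? a blink caught out of sync? no way to tell
    return "eyes open"

def classify_sequence(frames: List[Frame]) -> str:
    """Temporal context disambiguates: a blink eventually closes both eyes."""
    both_closed = any(l < 0.2 and r < 0.2 for l, r in frames)
    return "blink" if both_closed else "wink"

# Eyes rarely close perfectly in sync, so a blink caught mid-motion can
# look exactly like a wink in a single still.
blink = [(1.0, 1.0), (0.9, 0.1), (0.0, 0.0), (0.6, 0.6), (1.0, 1.0)]
wink  = [(1.0, 1.0), (0.9, 0.1), (1.0, 0.0), (1.0, 0.5), (1.0, 1.0)]

print(classify_single_frame(blink[1]))  # -> "one eye closed"
print(classify_single_frame(wink[1]))   # -> "one eye closed" (identical)
print(classify_sequence(blink))         # -> "blink"
print(classify_sequence(wink))          # -> "wink"
```

The missing ingredient is exactly what the game highlights: contextual information – here simply the temporal context, in real life also the social context a machine does not have.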

What are the risks of ERT?

Concerning racial bias, a study has shown that systems consistently read black people’s faces as angrier than white people’s faces, regardless of the person’s expression. ERTs, like all forms of AI-based facial recognition, are prone to bias, discrimination and misidentification when dealing with people who differ from an assumed white, European, masculine norm of appearance.

Moreover, ERTs are premised on a theory – first articulated by psychologist Paul Ekman in the 1960s – that human emotions are biological universals rather than varying between cultures. According to this theory, there are six fundamental emotions – joy, sadness, fear, anger, surprise and disgust – which are not just universally experienced but also expressed through universally consistent facial expressions. This simplistic theory has been increasingly challenged (if not outright debunked) in recent years – but that hasn’t stopped the wave of ERTs from companies such as Microsoft, IBM and Amazon which claim to accurately identify those universal expressions.
“It’s a reductionist understanding of human emotions,” argues anthropologist Igor Rubinov, who led the website development. “Its real-world application should be managed incredibly carefully, if not paused indefinitely, while we have a conversation about this.”
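That six-category assumption is visible in how such systems are typically structured. As a hedged sketch – the scores below are invented, and real commercial APIs differ in detail – the common pattern is an output layer that distributes probability over exactly six fixed labels, so every analysed face is forced into one of Ekman’s boxes:

```python
import math

# The fixed label set the classical theory assumes; any analysed face
# must land in exactly one of these six boxes.
EKMAN_LABELS = ["joy", "sadness", "fear", "anger", "surprise", "disgust"]

def softmax(scores):
    """Turn raw model scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Invented raw scores standing in for a face-analysis model's output on
# one image; real systems differ, but the closed label set is typical.
raw_scores = [2.1, 0.3, -0.5, 1.7, 0.0, -1.2]

probs = softmax(raw_scores)
best = max(range(len(probs)), key=probs.__getitem__)
print(EKMAN_LABELS[best], round(probs[best], 2))  # -> joy 0.48
```

Built this way, the model must answer with one of the six “universal” emotions, however ambiguous, culturally specific or feigned the expression actually is – the reductionism Rubinov describes.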


What should happen to reduce the negative impact of ERT?

This technology must be scientifically reviewed, subject to robust human rights assessments, regulated by policy makers, and some uses (possibly all uses) should be banned because they are fundamentally incompatible with human rights.

Are there any benefits of ERT at all?

We cannot begin to talk about benefits when a technology is fundamentally flawed and discriminatory and is not yet properly regulated.

Where is ERT used?

Many companies use ERT to test customer reactions to their products. It can also be used in situations such as hiring, by airport security to flag faces as revealing deception or fear, in border control, in policing to identify “dangerous people”, or in education to monitor students’ engagement with their homework.

Which companies are the major players using ERT?

This is not public knowledge. For the most part we don’t know where and how the technology is being used. It is largely used without public knowledge or consent, often “rolled into” other forms of biometric technology like facial recognition systems. However, as part of our research we were able to discover some current uses, which can be found at emojify.info/resources.
 

Alexa Hagerty is one of the speakers at the festival “When Machines Dream the Future – A Hybrid Festival on Living With Artificial Intelligence” in Dresden.
Festival “When Machines Dream the Future” | © schech.net
12th–14th November 2021
Deutsches Hygiene-Museum, Dresden
and online
