The Grift of Emotion AI

How facial-scanning digital emotion detectors mistake human beings for book covers

Jonathan Cook
5 min read · Aug 23, 2019

This article is based upon parts of the transcript of The Business of Human Emotion, an episode of the podcast This Human Business.

Concerns about the economic impact of artificial intelligence are typically dismissed by its apologists as “luddite” dystopian fantasies. It’s silly to worry, they say, that AI systems will take away all our jobs. Yes, they admit, artificial intelligence actually is taking away a large number of jobs, but the priests of the temples of silicon tell us not to be concerned. AI only takes away the most menial, meaningless jobs, they say, leaving human beings to conduct the beautiful business that only human beings are capable of. Work that pursues artistic achievement, social connection, and emotional significance will always be there for us, they promise.

These reassurances have yet to reckon with Emotion AI.

It’s an industry that goes under many names. Affective computing is one, while some have tried out the innocuous term sentiment analysis. The label Emotion AI, in its brutal combination of mutually incompatible terms, feels the most honest in its ambitions. A layman might ask what on earth emotion and artificial intelligence have to do with each other, but for those who still have faith in the End Times of the Singularity, there isn’t anything that artificial intelligence cannot touch.

Emotion AI, it is said, can use digital cameras connected to machine learning algorithms to automatically read and interpret emotion in human beings, just by conducting geometric analyses of our facial expressions. So, the jobs of those people who seek to practice empathy with other human beings are not safe after all, it seems. Neither is the work of artists and humanists protected from the onslaught of artificial intelligence. The editorial choices of movie and TV show directors are now subject to review by artificial intelligence software that scans the faces of audience members, telling where the film should be cut, and how, to produce the maximum emotional effect. Caregivers for the sick and elderly will soon have to surrender their jobs as well, we are told, to robotic replacements who will scan the faces of the old and infirm, and provide emotionally appropriate formulaic responses.
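To make the leap concrete, here is a minimal sketch of the kind of "geometric analysis" being described. This is not any vendor's actual code; the landmark names, coordinates, and threshold are all invented for illustration. It reduces a face to a few landmark points, takes one crude measurement, and maps it straight to a basic-emotion label:

```python
# A hedged sketch of landmark-geometry emotion inference. Everything here
# (landmark names, coordinates, threshold) is a hypothetical illustration,
# not a real Emotion AI product's pipeline.

def mouth_curvature(landmarks):
    """Positive when the mouth corners sit above the mouth center (a 'smile').

    Image y-coordinates grow downward, so corners above the center
    yield a positive value.
    """
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    center = landmarks["mouth_center"]
    return center[1] - (left[1] + right[1]) / 2

def classify_emotion(landmarks, threshold=2.0):
    """Map geometry directly to a 'basic emotion' label.

    This is the contested leap: the shape of a face is taken as a
    readout of the feeling behind it.
    """
    c = mouth_curvature(landmarks)
    if c > threshold:
        return "happy"
    if c < -threshold:
        return "sad"
    return "neutral"

# A polite, performed smile and a genuinely happy one can produce
# identical geometry -- and therefore identical labels.
polite_smile = {
    "mouth_left": (30, 50),
    "mouth_right": (70, 50),
    "mouth_center": (50, 55),
}
print(classify_emotion(polite_smile))  # "happy", whatever the person actually feels
```

Real systems use convolutional networks rather than hand-written thresholds, but the underlying inference is the same: facial configuration in, emotion label out, with no access to context, culture, or intent.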

There’s Just One Problem

The domination of artificial intelligence seems almost complete. If robots equipped with Emotion AI can perform even the most emotionally sensitive jobs through the wonders of facial scanning software, what need is there for human beings at all anymore?

Before you surrender and wait meekly for your one thousand dollars in universal basic income to arrive, there’s one thing you ought to know: Emotion AI doesn’t actually work.

The sales teams for Emotion AI products will be eager to tell you how the software packages they represent have been designed using scientific research by genuine, grade-A psychologists. What they won’t be so eager to tell you is that the psychological theory that Emotion AI relies upon is as old as their grandparents. It was developed in the 1960s, back when psychologists believed that lobotomies were an effective form of treatment for many forms of mental illness.

The truth is that, for many years, the experimental studies that are used to justify Emotion AI products have failed the test of replication by other researchers. It seems that the theory of basic emotions, the psychological model that all Emotion AI relies upon, was constructed through a series of studies that were rife with methodological and conceptual error.

The academic justification for Emotion AI was always weak, but a few weeks ago, the final nail was driven into the coffin by a team of psychologists who conducted a thorough scientific review of the studies that have been conducted into the relationship between facial expressions and emotional frames of mind. What they discovered is that the theoretical basis for Emotion AI is not supported by the evidence. Facial expressions simply don’t provide an adequate foundation for the reliable detection and understanding of emotion. The reviewers explained:

“The facial configurations in question are not ‘fingerprints’ or diagnostic displays that reliably and specifically signal particular emotional states regardless of context, person, and culture. It is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts.”

The truth is, to any honest reviewer, the claims of Emotion AI enterprises like Affectiva, Kyon, Emotient, and Rekognition have always been suspect. The very idea that people’s internal states can be reliably described based on their facial expressions flies in the face of thousands of years of recorded struggles with liars and cheats. People don’t always show their feelings on their faces.

Lauren Rhue, Professor of Information Systems at the Robert H. Smith School of Business, explains the difficulty:

“There’s been a philosophical debate about the nature of emotion. Can you really know the emotional state of another human being? A lot of the claims I think that are being made are predicated on this assumption that yes, if you walk into a room and you see somebody else, you will be able to tell how they’re feeling and that’s not necessarily the case at all. Another person might walk in and they would be a great actor or actress, and you can tell what they’re trying to project but you can’t necessarily know how they actually feel. I think that what a lot of these companies are saying is that they might be able to tell the perception, and maybe people would agree with that, maybe not, but there’s a lot of culture and context that are embedded in any type of an emotional analysis. Emotions are by their very nature extremely subjective.”

The number of aphorisms warning us not to mistake superficial appearance for deep character is striking. We’re told that true beauty is more than skin deep, but Emotion AI disregards this ancient wisdom, evaluating our feelings only at face value. To the digital face scanners at Rekognition and Affectiva, people are book covers with nothing of significance inside.

Business needs to do better than Emotion AI. Emotion can be studied, but it’s a complex subject, full of subtleties and nuance, expressed in a variety of ways dependent upon both individual and cultural factors. Emotions, as Professor Rhue points out, are inherently subjective. It takes a subject to understand subjectivity.

No matter how intricate their algorithms, machine learning schemes can never grasp what it feels like to have even the most straightforward of human feelings. It’s time to call out the Emotion AI industry for the pseudoscientific grift that it is.

Let’s leave emotion research to the humans.

For more ideas on research of emotions in business, listen to the latest episode of the podcast This Human Business.


Jonathan Cook

Using immersive research to pursue a human vision of commerce, emotional motivation, symbolic analysis & ritual design