This week, I’m writing a series of articles about sentiment analysis, which is often referred to as Emotional AI. Engineers of this new brand of technology claim to be able to detect and analyze emotion using electronic sensors and machine learning. To date, media coverage of this emerging field has been rather credulous, accepting Silicon Valley’s assertions about Emotional AI at face value. In this series, I’m attempting to balance that fawning coverage with critical questions, building toward suggestions for ways in which sentiment analysis can be more meaningfully employed by businesses that sincerely wish to enhance their emotional connection with the human beings they serve.
This is the fourth article in the series, which also includes the following: Can AI Understand Your Emotions? / Emotional AI Is Not Ready For Prime Time / What Emotional AI Fails To Grasp About Emotion / Should AI Cure Humanity Of Its Emotions? / The Mythology of Emotional AI / The Missing Companion of Emotional AI. A new article in the series will be published every day for the rest of the week.
As we observe the proliferation of Emotional AI systems, it isn’t enough for people in business to consider whether the functional benefits of these technologies fit our needs. The change that’s being proposed, not just to commerce, but to society as a whole, is too consequential to regard as a matter of mere practicality.
We need to step back, take a moment, and look at the bigger picture. We need to ask fundamental questions whenever we see sentiment analysis being deployed: Why are these systems that subject emotion to machine learning being designed? What is the unmet need that they address?
The answers to these questions are not reassuring.
Beyond Verbal is an “emotions analytics” company that promises its clients that its voice analysis software can help mobile apps “interact with us on an emotional level, just as humans do.”
Inventing a machine that can interact with people emotionally just like a human being sounds great, but then, most sales pitches do. Take a look beneath the promise, and a darker reality emerges.
One of the case studies Beyond Verbal provides to potential clients involves the use of its sentiment analysis system by adult children to keep track of the emotional state of their elderly parents. “Mom always kept track of us. With an interconnect system of VPA’s we can now keep better track of her,” reads a promotional image from the company.
Beyond Verbal promises to “interact with us on an emotional level, just as humans do,” but when decent human beings want to know how their mothers are feeling, they show up in person, put their mobile phones away, hold their mothers’ hands, and patiently engage in sensitive conversation. When human beings interact with each other on an emotional level, they don’t share their emotions with statistical line graphs.
Our moms didn’t just track us. They cared for us. They carried us in their wombs, fed us from their breasts, held us, cleaned us, nurtured us patiently even when we threw tantrums. Our mothers talked with us, and watched us, and showed us how to be human by example, by spending years and years of time together with us. They didn’t raise us using digital codes. Their love was analog.
And now, when they need us, are we going to return these countless acts of devotion by giving them an app to communicate their emotions to us by clicking on emoticons? Are we really going to tell our mothers that if they want to tell us how they feel, they can do it through our virtual personal assistants?
We need to be honest about what’s motivating the development of sentiment analysis apps for situations like this. It’s exhausting and heartbreaking to take care of an elderly parent. It’s devastating to watch the people we have always depended on for love and guidance grow weaker, get sick, and die. Any adult child of a mother or father struggling with the humiliations of old age who says that they never feel the temptation to withdraw from their geriatric parent is lying.
Nonetheless, we do the right thing. We show up. We do what we can. We do it because we love them.
Emotional AI systems like Beyond Verbal give us an alternative to showing up to do the hard work of maintaining our emotional relationships. They offer to outsource our intimacy to an automatic, soulless algorithm.
The motivation for these systems isn’t to enhance emotional relationships. The motivation is to retreat... and to exploit.
In the field of artificial intelligence, there is no bigger player than IBM Watson. The supercomputer became famous for defeating human champions on the quiz show Jeopardy!. It’s an unsettling thing when the engineers of artificial intelligence believe that the best way to demonstrate its prowess is to defeat humans, rather than to help them.

Of course, a trivia contest is a game of facts and calculation. Emotional AI is a totally different game, and one that IBM Watson is eager to play.
The IBM Watson Tone Analyzer is a sentiment analysis tool that uses the power of the supercomputer to “enable your chatbot to detect customer tones so you can build dialog strategies to adjust the conversation accordingly.” The goal of the service (though this goal has yet to be met) is to create emotionally passable chatbot conversations that can supplant human customer service.
Tools such as Watson’s Tone Analyzer aren’t augmenting human conversations so much as replacing them. It’s become a struggle for consumers to work through the systems of robotic “assistants” put in their way to find a human voice capable of providing genuine empathy, even on a telephone customer service line.
The choice to deploy sentiment analysis is the end result of a line of thinking that concludes that genuine, human-to-human emotional interactions are too expensive to bother with — customer satisfaction just isn’t worth the trouble. Emotional AI systems are programmed to leverage customers’ feelings, while enabling the client company to maintain a cool distance.
It’s a shortcut, a cheat, a hack to create the impression of empathy. True empathy, though, isn’t a trick.
These sentiment analysis systems study people’s emotions in order to find opportunities for tactical advantage. They simulate emotion without actually feeling any emotion themselves.
That’s the technique of a psychopath.
Criminal psychologist Robert Hare describes psychopaths as “people who are so emotionally disconnected that they can function as if other people are objects to be manipulated and destroyed without any concern.” As psychologist Angela Book explains, “these individuals are successful because they’re able to look normal; that would include emotional mimicry.”
The manipulation of people through emotional mimicry by a cold, disconnected entity is an apt description of the manner in which most sentiment analysis programs are employed. The practice is becoming so commonplace that we’re beginning to take it for granted, but when you stop to think about it, the implications are chilling.
In her article 3 Ways AI Is Getting More Emotional, Sophie Kleber describes sentiment analysis apps that are supposed to help people dealing with emotional crises. She writes, “These applications aim to coach users through crises using techniques from behavioral therapy. Ellie helps treat soldiers with PTSD. Karim helps Syrian refugees overcome trauma. Digital assistants are even tasked with helping alleviate loneliness among the elderly.”
At first blush, these systems sound great, but when one stops to consider the social context in which these Emotional AI apps were developed, the story becomes much more disturbing.
Why are soldiers with post-traumatic stress disorder, refugees fleeing from countries destroyed by war, and lonely old people not getting the human attention they deserve?
These digital programs are certainly better than nothing, but better than nothing is a terrible standard of success when it comes to meeting human emotional needs. Emotional AI systems don’t work as well as real therapists, who are in short supply.
So why are we funding second-rate digital replacements for human therapists instead of training new therapists? Well, human therapists are more expensive than the technological alternative, and our society has apparently decided that combat veterans, traumatized refugees and old people aren’t worth the investment. So, these people get the minimally viable option — Emotional AI simulations of compassion. Instead of connecting to other human beings, they get electronic devices loaded with sentiment analysis apps.
The clear message that’s being sent by organizations that rely on sentiment analysis is that their customers aren’t important enough to talk to. It’s a rather unstable foundation to build a brand upon.
Emotional AI Can Do Better
A proponent of Emotional AI might argue that, as cold and calculating as Emotional AI systems often are, it’s naive to expect that large corporations would behave any differently. After all, businesses exist to make money. So, why wouldn’t, and why shouldn’t, businesses seek to leverage people’s emotions with the minimum investment of money, time, and trouble they can get away with?
The thing is, the psychopathic model of business isn’t the only one that works. Business can be beautiful. Business can be human. Businesses can succeed by fostering authentic and rich emotional encounters between people, rather than by encouraging people to keep as much distance from each other as possible.
I know it seems dark right now, at the end of this fourth article on Emotional AI, and we still have some shadows to travel through. I want to reassure readers that there is some light at the end of the tunnel.
Though the flaws in current Emotional AI systems are significant, the situation is not completely hopeless. Many people working in sentiment analysis genuinely mean well, even when their projects fail to match their visions of digital empathy. If we’re willing to be honest about the problems that currently beset efforts to make artificial intelligence emotionally sensitive, we can work through them, creating the potential for a future society that fosters more opportunities for emotionally satisfying interactions than we’ve ever experienced before.
Perhaps Emotional AI is acting like a psychopath now, but it doesn’t have to be incurable. If we care enough not to turn away from the emotional pain, we can make it better.
This article is the fourth part of an ongoing series exploring both the capabilities and flaws of sentiment analysis and Emotional AI. The goal of these articles is to suggest ways that the technology could be more effective in the future by reintroducing the human context of full emotional complexity, while calling attention to the exaggerations and misleading claims that currently threaten the field’s credibility.
The first article in the series, Can AI Understand Your Emotions?, introduced the concept of Emotional AI in its cultural context. The second article, Emotional AI Is Not Ready For Prime Time, dealt with the practical question of whether present-day Emotional AI systems can actually do the tasks their designers say they can do. The third article, What Emotional AI Fails To Grasp About Emotion, articulated a distinction between emotion, sentiment, and their physical manifestations. The fifth article addresses the question, Should AI Cure Humanity Of Its Emotions? If human emotion isn’t a design flaw, but a source of our unique abilities, how can Emotional AI shift from enforcing rationality to supporting effective affect? The sixth article, The Mythology of Emotional AI, retells some ancient legends with a digital twist. The series concludes with The Missing Companion of Emotional AI — Our Humanity.