Businesses of all kinds are investing in AI to improve operations and customer experience. However, as most people experience daily, interacting with machines can be frustrating when they’re incapable of understanding emotions.

For one thing, humans don’t always communicate clearly with machines, and vice versa. The inefficiencies caused by such miscommunications tend to frustrate end users. Even more unsettling in such scenarios is the failure of the system to recognize emotion and adapt.

To facilitate more effective human-to-machine interactions, artificial intelligence systems need to become more human-like, and to do that, they need the ability to understand emotional states and act accordingly.

A Spectrum of Artificial Emotional Intelligence

Merriam-Webster’s primary definition of empathy is:

“The action of understanding, being aware of, being sensitive to, and vicariously experiencing the feelings, thoughts, and experience of another of either the past or the present without having the feelings, thoughts, and experience fully communicated in an objectively explicit manner; also: the capacity for this.”

To achieve artificial empathy, according to this definition, a machine would have to be capable of experiencing emotion. Before machines can do that, they must first be able to recognize emotion and comprehend it.

Non-profit research institute SRI International and others have succeeded with the recognition aspect, but understanding emotion is more difficult. For one thing, individual humans tend to interpret and experience emotions differently.

“We don’t understand all that much about emotions to begin with, and we’re very far from having computers that really understand that. I think we’re even farther away from achieving artificial empathy,” said Bill Mark, president of Information and Computing Services at SRI International, whose AI team invented Siri. “Some people cry when they’re happy, a lot of people smile when they’re frustrated. So, very simplistic approaches, like thinking that if somebody is smiling they’re happy, are not going to work.”

Emotional recognition is an easier problem to solve than emotional empathy because, given a huge volume of labeled data, machine learning systems can learn to recognize patterns associated with a particular emotion. The patterns of various emotions can be gleaned from speech (specifically, word usage in context, voice inflection, etc.), as well as from body language, expressions and gestures, again with an emphasis on context. As with humans, the more sensory input a machine has, the more accurately it can interpret emotion.
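To make the pattern-matching idea concrete, here is a minimal sketch of a text-only emotion classifier built in the supervised style described above. The toy utterances, labels and model choice are illustrative assumptions, not drawn from SRI’s work; real systems train on large labeled corpora and fuse vocal and visual signals.

```python
# Minimal sketch: supervised emotion recognition from text alone.
# The examples and labels are toy data for illustration; production
# systems need large labeled corpora plus speech and visual channels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I can't believe it, this is wonderful!",    # joy
    "This is taking forever and nothing works",  # frustration
    "I'm so worried about tomorrow",             # anxiety
    "Everything is fine, thanks for asking",     # neutral
]
labels = ["joy", "frustration", "anxiety", "neutral"]

# TF-IDF features plus logistic regression learn word-usage patterns
# associated with each labeled emotion.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["why is this still not working"]))
```

With four training sentences the prediction is obviously unreliable; the point is only the shape of the approach: labeled examples in, emotion label out.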

Recognition is not the same as understanding, however. For example, computer vision systems can recognize cats or dogs based on labeled data, but they don’t understand the behavioral characteristics of cats and dogs, that the animals can be pets, or that people tend to love or hate them.

Similarly, understanding is not empathy. Among three people, one may be angry, and the other two may understand that anger without being empathetic: the second person is dispassionate about it, while the third finds it humorous.

Not long ago, Amazon Alexa startled some users by bursting into laughter for no apparent reason. It turned out the system had heard “Alexa, laugh” when the user had said no such thing. Now imagine a system laughing at a chronically ill, depressed, anxious or suicidal person who is using it as a therapeutic aid.

“Siri and systems like Siri are very good at single-shot interactions. You ask for something and it responds,” said Mark. “For banking, shopping or healthcare, you’re going to need an extended conversation, but you won’t be able to state everything you want in one utterance so you’re really in a joint problem-solving situation with the system. Some level of emotional recognition and the ability to act on that recognition is required for that kind of dialogue.”

Personalization versus Generalization

Understanding the emotions of a single individual is difficult enough because not everyone expresses or interprets emotions in the same way. However, like a human, a machine will best understand a person with whom it has extensive experience.

“If there has been continuous contact between a person and a virtual assistant, the virtual assistant can build a much better model,” said Mark. “Is it possible to generalize at all? I think the answer to that is, ‘yes,’ but it’s limited.”
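One way to picture that continuous-contact advantage: instead of a global rule like “smiling means happy,” an assistant can score each signal against a baseline learned for that specific person. The sketch below is hypothetical; the smile-intensity signal, the thresholds and the minimum-contact cutoff are all assumptions made for illustration.

```python
# Hypothetical sketch of per-user calibration: interpret a signal
# relative to a personal baseline built up over continuous contact,
# rather than against a one-size-fits-all global threshold.
from collections import defaultdict

class PersonalBaseline:
    """Running mean of one signal (e.g., smile intensity) for one user."""
    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, value: float) -> None:
        self.count += 1
        self.mean += (value - self.mean) / self.count

baselines = defaultdict(PersonalBaseline)

def interpret_smile(user_id: str, smile_intensity: float) -> str:
    """The same reading can mean different things for different users."""
    base = baselines[user_id]
    delta = smile_intensity - base.mean   # deviation from this user's norm
    base.update(smile_intensity)
    if base.count < 10:                   # too little contact to personalize
        return "unknown"
    return "happier than usual" if delta > 0.2 else "typical for this user"

# After ten or so interactions, readings are judged against the
# user's own history instead of a population average.
for reading in [0.3, 0.32, 0.28, 0.31, 0.29, 0.30, 0.33, 0.27, 0.31, 0.30]:
    interpret_smile("user-42", reading)
print(interpret_smile("user-42", 0.55))  # "happier than usual"
```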

Generalizing is more difficult, given the range of individual differences and everything that causes individuals to differ, from nature and nurture to culture.

Recognizing emotion and understanding emotion are both matters of pattern recognition, for humans and machines alike. According to Keith Strier, Advisory Global and Americas AI leader at professional services firm EY, proofs of concept are now underway in the retail industry to personalize in-store shopping experiences.

“We’re going to see this new layer of machine learning, computer vision and other tools applied to reading humans and their emotions,” said Strier. “[That information] will be used to calibrate interactions with them.”

In the entertainment industry, Strier foresees companies monitoring the emotional reactions of theater audiences so that directing and acting methods, as well as special effects and music, can deliver more impactful experiences that are scarier, funnier or more dramatic.

“To me it’s all the same thing: math,” said Strier. “It’s really about the specificity of math and what you do with it. You’re going to see a lot of research papers and POCs coming out in the next year.”

Personalization Will Get More Personal

Marketers have been trying to personalize experiences using demographics, personas and other means to improve customer loyalty as well as increase engagement and share of wallet. However, as more digital tools have become available, such as GPS and software usage analytics, marketers have been attempting to understand context so they can improve the economics and impact of campaigns.

“When you add [emotional intelligence], essentially you can personalize not just based on who I am and what my profile says, but my emotional state,” said Strier. “That’s really powerful because you might change the nature of the interaction, by changing what you say or by changing your offer completely based on how I feel right now.”
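In code terms, the detected emotional state becomes one more input to the personalization logic, and one that can override the profile-driven choice entirely. A hypothetical rule-based sketch follows; the segments, states and offers are invented for illustration.

```python
# Hypothetical sketch of emotion-aware personalization: the inferred
# emotional state can change the message, or replace the offer outright.
def choose_offer(profile_segment: str, emotional_state: str) -> str:
    if emotional_state == "frustrated":
        # De-escalate instead of upselling.
        return "Sorry for the trouble. Want to chat with a person?"
    if emotional_state == "happy" and profile_segment == "loyal":
        return "Thanks for sticking with us! Here's 20% off your next order."
    # Fall back to profile-only personalization when no emotion is detected.
    return "Check out what's new this week."

print(choose_offer("loyal", "frustrated"))  # de-escalation overrides the profile
```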

Artificial Emotional Intelligence Will Vary by Use Case

Neither AI nor emotions are one thing. Similarly, there is not just one use case for artificial emotional intelligence, be it emotional recognition, emotional understanding or artificial empathy.

“The actual use case matters,” said Strier. “Depending on the context, it’s going to be super powerful or maybe not good enough.”

A national bank is currently piloting a smart ATM that uses a digital avatar to read customers’ expressions. As the avatar interacts with customers, it adapts its responses.

“We can now read emotions in many contexts. We can interpret tone, we can triangulate body language and words and eye movements and all sorts of proxies for emotional state. And we can learn over time whether someone is feeling this or feeling that. So now the real question is what do we do with that?” said Strier. “Artificial empathy changes the art of the possible, but I don’t think the world quite knows what to do with it yet. I think the purpose question is probably going to be a big part of what’s going to occupy our time.”
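The “triangulation” Strier describes can be pictured as late fusion: each modality produces its own estimate of an emotional state, and the system combines them into a single score that drives the response. The modalities, weights and threshold below are illustrative assumptions, not details of the bank’s pilot.

```python
# Hypothetical sketch of late fusion across modalities: each channel's
# confidence that the customer is frustrated (0 to 1) is combined into
# one weighted score, which then drives how the avatar responds.
modality_scores = {
    "voice_tone": 0.7,
    "word_choice": 0.4,
    "facial_expression": 0.8,
    "eye_movement": 0.5,
}

# Weights reflect the assumed reliability of each channel and sum to 1.
weights = {"voice_tone": 0.30, "word_choice": 0.20,
           "facial_expression": 0.35, "eye_movement": 0.15}

fused = sum(weights[m] * score for m, score in modality_scores.items())
print(f"fused frustration score: {fused:.2f}")

if fused > 0.6:  # ~0.65 with these numbers
    print("Avatar response: slow down, simplify the menu, offer help")
```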

Bottom Line

Artificial emotional intelligence can improve the quality and outcomes of human-to-machine interactions, but it will take different forms over time, some of which will be more sophisticated and accurate than others.

Artificial empathy raises the question of whether machines are capable of experiencing emotions in the first place, which is itself a matter of debate. For now, it’s fair to say that artificial emotional intelligence is both important and necessary to the advancement of AI.