Emotion AI shows promise for IT leaders in the enterprise

Although the technology is still in its infancy, many experts believe artificial emotional intelligence will deliver real value. Here's why your organization should develop a strategy.

Emotion AI, or affective computing, may be the next big thing in AI. Enterprises are starting to explore emotion analytics technology to improve customer and employee experiences and offer innovative services. Although the field is still in its infancy, CIOs and other IT leaders should keep emotional tracking technology on their radar, as many experts believe it could play a valuable role in giving customers what they want.

Facial expression analysis is attractive because it can be done at a distance with nothing but a camera. The idea behind recording people's facial expressions is to infer how they feel about products such as movies and music, and to gauge service interactions without the need for extensive questionnaires.
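
As a rough illustration of how lightweight the capture side can be, the sketch below finds faces in a single camera frame; an expression model would then score each detected face crop. It is a minimal example assuming Python with the opencv-python package and a camera at device index 0, not any vendor's pipeline.

    import cv2

    # Haar cascade bundled with OpenCV for frontal face detection
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # default camera
    ok, frame = cap.read()     # grab one frame
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            face_crop = gray[y:y + h, x:x + w]  # region an expression model would score
            print(f"face at ({x}, {y}), size {w}x{h}")
    cap.release()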

"Unobtrusive recording lowers the burden on consumers and will make it possible for managers to better tailor products, services and marketing actions such as advertising and promotions to individual customer needs," said Michel Wedel, distinguished university professor and PepsiCo Chair in consumer science in the University of Maryland's Online Master of Science of Business Analytics program. This will ultimately increase consumer satisfaction and welfare and positively impact the economy.

However, new research evaluating hundreds of studies on monitoring emotions in facial expressions argues that there are widespread misconceptions about the power of this kind of technology. Experts agree there is still strong evidence that other emotional tracking tools based on voice, eye movements and text-based sentiment analysis can provide value. In the long run, these kinds of techniques could be combined with contextual data and facial expressions to usher in the use of emotional intelligence and emotion AI.
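
What such a combination might look like in code: the sketch below is a minimal late-fusion example, with entirely hypothetical modality weights and scores, in which each channel (face, voice, text) produces its own per-emotion probabilities and a weighted average yields the combined estimate.

    # All weights and scores below are hypothetical illustrations.
    MODALITY_WEIGHTS = {"face": 0.3, "voice": 0.4, "text": 0.3}

    def fuse(scores_by_modality):
        """scores_by_modality: {'face': {'joy': 0.7, ...}, 'voice': {...}, ...}"""
        fused = {}
        for modality, scores in scores_by_modality.items():
            w = MODALITY_WEIGHTS[modality]
            for emotion, p in scores.items():
                fused[emotion] = fused.get(emotion, 0.0) + w * p
        # Return the top label plus the full fused distribution
        return max(fused, key=fused.get), fused

    label, scores = fuse({
        "face":  {"joy": 0.2, "frustration": 0.6},
        "voice": {"joy": 0.1, "frustration": 0.7},
        "text":  {"joy": 0.4, "frustration": 0.3},
    })
    print(label, scores)  # frustration wins once voice and face agree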

CIOs and other IT executives may want to consider a variety of modalities in crafting an emotion AI strategy for customers and employees, including voice, behavior and context. It's also important to address ethical concerns to head off any backlash among customers and employees. In the end, many executives believe this technology could have a significant impact on traditional businesses.

How to track emotions

Researchers measure emotions in a variety of ways. "Questionnaire methods are certainly not perfect and are not suited to capture the dynamics of emotions well," Wedel said. "Facial tracking is useful but must be applied carefully. Skin conductance and EEG [electroencephalogram] measurement has provided some useful information in some cases. Eye tracking is not used to detect emotions, although pupil dilation has sometimes been used with some success."

Developing better tools will require continuous improvement and validation of measurement instruments and, where possible, a combination of algorithms, Wedel said. Big data and advances in computer science techniques such as deep learning will shape the development of face-tracking algorithms and their accuracy.
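
One common form that validation takes, sketched below with scikit-learn on entirely synthetic stand-in data: cross-validate the classifier against held-out human annotations whenever the model or its features change.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))    # stand-in facial-landmark features
    y = rng.integers(0, 4, size=200)  # stand-in human annotator labels

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)  # accuracy on held-out folds
    print(f"mean accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")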

Lisa Feldman Barrett, professor of psychology at Northeastern University, and her colleagues recently identified three key shortcomings in the science of tracking emotions on the face.

  • Limited reliability. Someone may be scowling from working too hard, for example, rather than feeling angry.
  • Lack of specificity. There is no unique mapping between a facial expression and a category of emotions.
  • Lack of generalizability. Various cultures express facial emotions differently.

Even if emotional-tracking algorithms are imperfect, they do predict downstream behaviors that enterprises are interested in. Wedel's team has used the technology to predict the box office success and streaming behavior of movies. "But one has to design the research carefully," he said.
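
One plausible shape for such a study, as a hypothetical sketch rather than Wedel's actual method: summarize each audience's expression time series into a few features per film, then regress a downstream outcome such as opening revenue on them.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    # Hypothetical per-film features: mean smile intensity, brow-furrow rate
    # and variability of expression intensity during a trailer screening.
    X = rng.uniform(size=(50, 3))
    # Synthetic outcome standing in for opening-weekend revenue
    revenue = 10 + 30 * X[:, 0] - 12 * X[:, 1] + rng.normal(scale=2.0, size=50)

    model = LinearRegression().fit(X, revenue)
    print(model.coef_)  # which expression features move the predicted outcome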

Emotion AI is still in its infancy

Emotion AI, which uses emotional data to improve user experience, is still considered a young industry. The first attempts to automatically detect emotions were based on the theory of universality of facial expressions of emotion introduced by American psychologist Paul Ekman, said George Pliev, founder and CEO of Neurodata Lab, an emotional tracking platform. The theory states that humans experience six basic emotions that we all express the same way.

"[The theory] became so widespread that it marked the beginning of an entire new industry," Pliev said.

However, this view has also been widely debated, starting with American psychologist James Russell. Emotional intelligence researchers are now considering more sophisticated approaches to making sense of emotions. The idea is that emotional categories are culturally dependent and learned and could be taught to machines. "[Emotions] are not as primitive as anticipated in earlier studies," Pliev said.

More categories required for emotion AI

In Ekman's early work, scientists hypothesized that six basic emotions could be tracked in the face: disgust, sadness, happiness, fear, anger and surprise. Researchers now believe more categories are needed.

"We do not like the naivete of the industry, which is fixated on the six basic emotions and a prototypic one-to-one mapping of facial expressions to emotional states," said Rana el Kaliouby, CEO and co-founder of Affectiva, an emotional analytics platform provider. For example, a facial expression may not signal just an emotion; it could be a social, cognitive or behavioral cue, or a physiological response.

El Kaliouby said a one-to-one correspondence of facial expressions to emotional states is overly simplistic. "Under this common simplistic view, an eyebrow raise would be an expression of surprise. Yet in the real world, it could also serve as a greeting, an invitation for social contact, a sign of thanks, an initiation of flirting and more," she said.

To disambiguate this, more information is required. What else is happening on someone's face? How does this expression unfold over time? Are there other physical signals such as vocal intonations or gestures? To get this information, Affectiva is investing in a multimodal approach tuned to specific use cases.
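
To make the ambiguity concrete, here is a toy rule-based sketch showing how the same eyebrow raise reads differently once co-occurring signals are considered. All signals, thresholds and labels are hypothetical; this is not Affectiva's model.

    def interpret_brow_raise(duration_s, smiling, speech_detected, gaze_at_person):
        # A quick "eyebrow flash" directed at another person with a smile
        # is a social greeting, not surprise.
        if duration_s < 0.3 and gaze_at_person and smiling:
            return "greeting"
        # Raises that punctuate speech tend to be emphasis, a cognitive cue.
        if speech_detected:
            return "emphasis"
        # A longer raise without a smile is only a candidate for surprise;
        # confirming it needs still more context.
        if duration_s > 0.5 and not smiling:
            return "surprise (candidate)"
        return "ambiguous"

    print(interpret_brow_raise(0.2, smiling=True, speech_detected=False,
                               gaze_at_person=True))  # -> greeting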

"Expressions like interest, confusion, frustration [and] empathy are difficult to qualify directly in relation to the basic emotions," said Bart Cooreman, who has a Ph.D. in cognitive neuroscience and is a product specialist at physiology tracking platform iMotions. However, it is usually those more nebulous types of emotional expressions that are of interest in real-life scenarios.

For example, a good TV commercial may not elicit full-blown joy, but a funny joke would usually produce a smile. A messy website may not make someone angry or sad, but it could result in a brief expression of frustration in the form of a brow furrow before a user clicks away from the website for good. In real-world scenarios, it is usually beneficial to work with more specific emotional expressions in the form of mouth, nose, eye or eyebrow behaviors rather than with basic emotions, Cooreman said.

Many emotional categories are subtle and difficult to recognize. Neurodata Lab's Pliev said the two main challenges are creating a comprehensive list of emotional expressions and being able to indicate if any of these emotions are present in a situation. Depending on the task, the list of emotions can grow. There is no objective reference as to a standard expression for an emotion or the number of emotions present. "People just won't agree on either the category, the intensity of expression or the number of emotions present," Pliev said.
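
That disagreement can at least be quantified. The sketch below applies scikit-learn's cohen_kappa_score to hypothetical labels from two annotators; values near 1.0 mean strong agreement, while values near 0 mean the raters agree no more often than chance.

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical emotion labels assigned to the same six clips by two raters
    rater_a = ["joy", "frustration", "surprise", "joy", "confusion", "joy"]
    rater_b = ["joy", "confusion", "surprise", "joy", "frustration", "neutral"]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"kappa = {kappa:.2f}")  # well below 1.0: the raters see different emotions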

Voice shows promise with emotion AI

Researchers believe they may be able to automatically detect emotions through the voice with more granularity. A team at the University of California, Berkeley is developing algorithms to detect 27 different emotions through voice analytics. Dacher Keltner, a professor at UC Berkeley who worked on the project, said emotions are communicated to different extents in different modalities such as facial expressions, gaze, voice, touch and body movement. "The voice may be the richest source of emotional communication, followed by the face," he said.

Some emotions are communicated in one modality but not others. For example, Keltner's research has found that gratitude can be consistently detected only via touch. "As the methods and statistics get better at capturing patterns of response, we believe that about 20 emotions will prove to have reliable signals across different kinds of measures," he said. These include:

  • negative signals such as anger, anxiety, contempt, disgust, embarrassment, fear, guilt, sadness, shame and terror; and
  • positive reactions such as amusement, awe, contentment, desire, ecstasy, interest, love, pride, sympathy and triumph.
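
The sketch below shows what the front end of such voice analytics might look like, using the librosa audio library; it is a minimal, hypothetical example, not the Berkeley team's algorithm. Spectral and energy features are extracted from a clip and summarized into a fixed-length vector that a classifier trained on labeled clips could map to an emotion.

    import numpy as np
    import librosa

    def voice_features(path):
        y, sr = librosa.load(path, sr=16000)                # mono audio at 16 kHz
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # spectral shape/timbre
        rms = librosa.feature.rms(y=y)                      # loudness over time
        zcr = librosa.feature.zero_crossing_rate(y)         # rough voicing proxy
        # Summarize each time series into one fixed-length feature vector
        return np.concatenate([mfcc.mean(axis=1), rms.mean(axis=1), zcr.mean(axis=1)])

    # features = voice_features("call_snippet.wav")  # hypothetical audio file
    # A classifier trained on labeled clips would map features to an emotion label.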

New training tools required

Emotional tracking also shows promise in creating entirely new kinds of emotional and communication training tools. Keltner believes that audio feedback, for instance, may be an important factor in people learning to feel their own emotions and connect with others.

"It's a great emphasis for training purposes because it is a modality people are aware of -- we hear our voice but don't see our face. It is continuously part of our social interaction, and I think it may be a modality we have a bit more control over in thinking about applying feedback to our emotional expression," he said.

Some enterprises are forgoing traditional ideas of emotions to focus on how voices are perceived. For example, VoiceVibes has developed a suite of training tools for call center workers and executives practicing their public speaking skills. The tools focus on how others perceive the quality of someone's voice, an approach the company calls "vibes."

"Vibes can be more appropriate for employee training and self-awareness," said Deb Cancro, CEO at VoiceVibes. "Despite our emotions, we often need to adopt a vibe that is most effective, even if it's not in line with how we feel." For example, someone may feel sad but need to use a personable tone when they answer the phone, or an executive may want to learn how to maintain a calm presence in a stressful situation.

Ethical and privacy concerns with emotion AI

There is a fine line between using more sophisticated emotional tracking technology to benefit consumers and employees and spooking them. "Secretly monitoring employees is a temptation that companies should resist," said Armen Berjikly, co-founder of Motive Software, a firm dedicated to using emotion insights to solve business problems. Some national security or financial services organizations might have to monitor employees, but the typical company risks violating employee trust in an attempt to improve the employee experience.

Companies should be concerned with ethical issues, as privacy concerns and breaches may backfire and pose a threat to the company itself, Wedel said. He recommends converting videos into facial expression data and stripping other personally identifiable information from the data.
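
A minimal sketch of that recommendation, assuming a hypothetical score_expressions() model: process each video frame in memory, keep only the numeric expression scores and never persist the imagery itself.

    import cv2

    def score_expressions(frame):
        """Hypothetical stand-in for an expression model; returns numbers only."""
        return {"smile": 0.0, "brow_furrow": 0.0}

    def video_to_expression_data(path):
        cap = cv2.VideoCapture(path)
        records = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            records.append(score_expressions(frame))  # numbers, not pixels
        cap.release()
        return records  # safe to store: expression scores, no identifiable imagery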

The key to personalized marketing is to avoid being "creepy," and this rule also applies to new applications of technology such as emotion recognition, said Brooke Niemiec, CMO of consulting firm Elicit. The trick is to ensure the technology is used only to improve the situation at hand. For example, facial emotion recognition cameras could measure an attendee's general satisfaction with an event. "However, if the same technology were to be used to single out unhappy individuals so someone could approach them directly during the event, I would say that crosses the 'don't be creepy' line," Niemiec said.

In addition, some people are very good at hiding their true feelings. Any perception of impropriety or a feeling of being violated could cause people to put up a shield, effectively preventing the technology from being used, Niemiec said.

But emotional tracking could provide a significant competitive differentiator in traditional industries like auto manufacturing. For example, Valeo, an automobile components manufacturer, has made a significant investment in tracking driver and passenger comfort. This could not only increase customer satisfaction but also lead to safer cars.

"The next evolution in artificial intelligence is emotional intelligence" said Guillaume Devauchelle, vice president of innovation at Valeo. "In the not so distant future, machines will become more empathic and in tune to our emotions so that they can better interact with us."
