Emotional AI (or emotion recognition AI) can be described as the automated detection of human emotions and feelings using software. Emotional AI is also known as affective computing, combining AI techniques such as machine learning with the field of human psychology. The field of affective computing is more than two decades old, and one of the original objectives (which still holds) is to improve the quality of human-machine interactions to make them more natural and engaging.

This article provides an overview of emotional AI and potential use cases/applications, as well as highlights key challenges and best practices to keep in mind as you consider implementing emotional AI technologies in your organization.

How Does Emotion Recognition Work?

Content/text analysis applications (such as sentiment detection of social media posts and online reviews) are quite common, but they rely on text alone. Human communication encompasses much more than the written word, including nonverbal cues such as gestures, facial expressions, and speech patterns. Emotion recognition systems make it possible to automatically detect the meaning contained in signals such as facial expressions, speech patterns, and, increasingly, pulse rate and other biometrics.
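To make the text-analysis case concrete, here is a minimal, purely illustrative sketch of sentiment detection. The lexicon and scoring rule are invented toys; real systems use trained models rather than hand-built word lists like this one.

```python
# Toy keyword-based sentiment scorer (illustrative only; the word lists
# below are invented, not from any real sentiment lexicon).
POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"terrible", "hate", "awful", "angry"}

def sentiment(text: str) -> str:
    # Lowercase, split on whitespace, and strip trailing punctuation.
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this product, it is excellent"))  # positive
```

Even this toy illustrates the limitation noted above: it sees only the words, not the tone of voice or facial expression behind them.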

Let me note here that machines may interpret and react in a prescribed manner to detected emotions but do not have feelings. Most of what we see in emotional AI systems is simulated understanding (i.e., they are capable of matching pre-existing patterns and making educated guesses).

Emotion recognition systems rely on natural language text, voice recordings, or facial images as inputs. For example, given an image of a person’s face, an emotion recognition system may categorize it as happy, angry, sad, confused, disgusted, or calm.
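The categorization step can be sketched as picking the highest-scoring label from a classifier’s output. The scores below are invented placeholders standing in for what a facial-expression model might return for one image; they are not output from any real product.

```python
def top_emotion(scores):
    """Return the highest-scoring emotion label from classifier output."""
    return max(scores, key=scores.get)

# Hypothetical per-label scores a facial-expression model might return.
scores = {"happy": 0.71, "angry": 0.04, "sad": 0.08,
          "confused": 0.05, "disgusted": 0.02, "calm": 0.10}
print(top_emotion(scores))  # happy
```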

What Are the Applications of Emotional AI?

The applications of emotional AI fall into two main categories:

  • Applications that detect the consumer’s emotional state and then, based on that, take appropriate action or trigger a certain response. The majority of the current applications fall into this category.
  • Applications that try to build emotion into AI applications (i.e., we attempt to imbue software programs with certain emotional qualities and empathy). Emotion recognition is often a first step in these applications as well. In patient care/elder care, such “artificial empathy” can improve the patient care experience, but these applications are still in the experimental phase.

Emotional AI is potentially applicable in a wide variety of scenarios in which it is beneficial to better understand emotional states or feelings at the individual or group level, such as:

  • Advertising and marketing: gauge reaction to campaigns, products, and services
  • Automotive: detect whether drivers are tired or stressed based on their facial expressions and provide alerts about unsafe driving
  • Content and media: identify which content resonates with users
  • Customer support: detect how angry/upset callers are based on speech patterns and accordingly trigger different workflows
  • Education: assess how students comprehend content based on facial expressions during lectures
  • Finance: use social media sentiment analysis for inputs into trading and investment strategies
  • Gaming: track satisfaction levels, including for augmented reality and virtual reality experiences
  • Governments: track social media to measure citizen sentiment toward policy proposals and announcements
  • Healthcare robots: caregiver robots that can display emotions, still in relatively early stages; another use is detecting a patient’s emotional state during counseling
  • IoT (Internet of Things)/smart devices: devices and appliances that respond based on user moods, detected via voice or facial expressions
  • Market research: very useful to gauge consumer reactions to new products
  • Recruitment: use candidate videos and analytics derived from them for initial screening. Some companies are already doing this.
  • Retail: gain insights about in-store shopper experience
  • Workplace: track workplace sentiment using internal social network/forum messages; improve physical workspace design and comfort
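The customer support use case above (triggering different workflows based on how angry a caller sounds) can be sketched as a simple routing function. The thresholds and workflow names here are invented for illustration; a real deployment would tune them against its own call data.

```python
def route_call(anger_score):
    """Pick a support workflow from a detected anger score in [0, 1].

    Thresholds and workflow names are illustrative assumptions,
    not taken from any real product.
    """
    if anger_score >= 0.8:
        return "escalate_to_senior_agent"
    if anger_score >= 0.5:
        return "priority_queue"
    return "standard_queue"

print(route_call(0.9))  # escalate_to_senior_agent
```

The detection model and the business response are deliberately separate here: the same anger score could just as easily drive a different workflow in another application.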

Large vendors such as Amazon, Google, IBM, and Microsoft provide emotion detection software (e.g., facial expression APIs, tone/sentiment detection APIs). There are also specialist vendors for voice emotion detection and startups such as Affectiva, Beyond Verbal, Kairos, NuraLogix, NVISO, and Sensum that offer emotion AI tools and applications. It is beyond the scope of this guide to go into the specifics of these offerings, but suffice it to say that they are mature enough to overlay or integrate into your business processes and enterprise applications.

What Are the Concerns About Emotional AI?

The public is still grappling with privacy concerns that come along with being tracked online. Now our emotional states are also machine-readable. Emotional AI brings its own opportunities and obligations. We’d be remiss to focus only on the upside and not on the potential pitfalls.

As you get ready to experiment with emotional AI applications in your enterprise, keep in mind that you’ll have to carefully consider two types of issues: those common to most enterprise AI applications (such as contextual integration, scaling, and production deployment) and those specific to emotional AI. As with traditional machine learning applications, a key challenge is providing context; devoid of context, emotion detection may not be of much value.

Let’s look closer at some concerns specific to emotional AI. Some experts contend that the spectrum of human emotions is vast and that the recognition capabilities offered by current software are too simplistic. The criticism is valid, but, warts and all, these capabilities may still be enough for your use cases. Other experts worry about the potentially exploitative nature of the applications; for instance, users can be manipulated based on their emotional state. Privacy-respecting data collection, informed user consent, and opt-in are best practices that should be adhered to in order to prevent user backlash and mistrust.

Conclusion

What comes to your mind when you think of AI? Thinking machines? Or logical reasoning systems making decisions based on data? Software getting better and better? But what about feelings or emotions? Maya Angelou memorably said that people will never forget how you made them feel. Emotions have such a central role in our everyday experiences, and emotional AI enables a more holistic understanding of user experience. It can make our interactions with technology more intuitive and make technology more responsive to our needs.