Hey, Siri, how am I feeling today?
According to Apple, “Siri” — perhaps the best-known voice-controlled virtual assistant — can check the weather, send texts, and, among other tasks, learn “how to be…more helpful.”
It can even recognize pop culture references.
User: Beam me up, Scotty.
Siri: Where have I heard this before?
Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana use natural language processing and other artificial intelligence (AI) models to receive, interpret, and answer commands and queries. They even adapt and respond to user vocalizations. This raises the question: can Siri also recognize tone, body language, and emotions?
To the disappointment of sci-fi fans everywhere, the answer is no. Siri cannot recognize human tone or body language, much less assign emotions to them. However, experts suggest that Siri will become capable of recognizing emotions in the near future — and the key to unlocking this ability is “emotion AI.”
Below, we define this term and discuss a few of its business applications. If this catches your interest, consider checking out our webinar, live July 29, 2020, at 4 p.m. ET, which examines this topic in greater depth.
What is Emotion AI?
Essentially, emotion AI describes machines or programs that interpret and/or respond to human affect. It analyzes factors like tone, facial expression, and body language to determine what a person or group of people feel. Then, it can react with an appropriate emotional response. Smile, and emotion AI smiles back.
Emotion AI is similar to sentiment analysis, which we discussed in a previous post. However, it’s more refined and offers additional applications in business. While sentiment analysis pertains only to text-based data, emotion AI can interpret abstract data such as vocal cadence. Also, while sentiment analysis can only distinguish between “positive” and “negative” sentiment, emotion AI can return a specific emotion.
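To make the distinction concrete, here is a deliberately simplified sketch. Real sentiment and emotion systems use trained models; the keyword lists below are hypothetical stand-ins, meant only to show the difference in output: a binary polarity versus a specific emotion label.

```python
# Toy illustration, NOT a production model: sentiment analysis returns a
# binary polarity, while an emotion classifier returns a specific emotion.
# The keyword sets are illustrative placeholders for learned models.

POSITIVE_WORDS = {"great", "love", "happy", "thanks"}
NEGATIVE_WORDS = {"terrible", "hate", "angry", "sad"}

EMOTION_KEYWORDS = {
    "joy": {"great", "love", "happy"},
    "anger": {"terrible", "hate", "angry"},
    "sadness": {"sad", "miss", "lonely"},
}

def sentiment(text: str) -> str:
    """Binary polarity, as in classic sentiment analysis."""
    words = set(text.lower().split())
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    return "positive" if score >= 0 else "negative"

def emotion(text: str) -> str:
    """A specific emotion label, as emotion AI aims to provide."""
    words = set(text.lower().split())
    best = max(EMOTION_KEYWORDS, key=lambda e: len(words & EMOTION_KEYWORDS[e]))
    return best if words & EMOTION_KEYWORDS[best] else "neutral"

print(sentiment("i hate waiting on hold"))  # negative
print(emotion("i hate waiting on hold"))    # anger
```

Both functions see the same text, but only the second tells you *which* negative emotion the customer feels — the extra signal that makes emotion AI more actionable.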
How Can Businesses Leverage This Technology?
Businesses can integrate emotion AI into customer-facing apps, platforms such as Zoom, and even internal systems. Its potential is vast, bound only by the ingenuity of AI software developers. That’s why Annette Zimmermann, research vice president at Gartner, believes that “by 2022, 10% of personal devices will have emotion AI capabilities.”
Here are some applications already embracing the future of AI:
- Cybersecurity. For one client, we built a machine vision model that captures emotions (e.g., happy, sad) and another that estimates stress from heart rate. Together, this system checks not only that users are who they say they are, but also that they are acting of their own volition.
- Mental health screening. Emotion AI can help diagnose mental illnesses like depression and BPD. It can also allow businesses and universities to screen employees or students for signs of excessive stress, enabling these institutions to provide proactive mental health support. In the past, we worked with Dicio to achieve exactly this: we developed an AI model that analyzes employees’ heart rates and blood flow to determine their stress level.
- Customer service. If you’ve ever interacted with a customer service chatbot, you know it’s frustrating to communicate with a robot that doesn’t understand how you feel. Emotion AI eliminates this issue by reading and responding to customers’ emotions — increasing their likelihood of a positive experience.
- Marketing. One way to gauge the efficacy of advertisements and marketing campaigns is to evaluate a test group’s emotional response. If a business wants to know how the public will react to a new ad, it can use emotion AI to analyze the ad’s effect on testers from its target demographic.
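The stress-screening applications above can be sketched at a high level. Production systems typically estimate heart rate from video before classifying stress; the function below assumes the beats-per-minute readings are already available, and its thresholds are illustrative assumptions, not clinical guidance or the method used in the projects described.

```python
# Hypothetical sketch of a heart-rate-based stress check.
# Thresholds and the resting baseline are illustrative assumptions only.

def stress_level(bpm_readings: list[float], resting_bpm: float = 65.0) -> str:
    """Bucket average heart rate by its elevation over a resting baseline."""
    avg = sum(bpm_readings) / len(bpm_readings)
    elevation = (avg - resting_bpm) / resting_bpm  # fractional increase
    if elevation < 0.15:
        return "low"
    if elevation < 0.40:
        return "moderate"
    return "high"

print(stress_level([66, 68, 70]))   # low
print(stress_level([95, 100, 98]))  # high
```

A real deployment would calibrate the baseline per person and combine this signal with others (facial expression, vocal cadence) rather than rely on heart rate alone.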
If you want to learn more about how AI can transform your business, reach out. Synaptiq can help you explore machine learning and artificial intelligence. Read about us here, or contact us to set up a meeting.