
Amazon’s Alexa to Become More Emotionally Intelligent

Mandee Thomas
Jul 8, 2019 11:09:14 AM

Amazon is looking for a new way to connect with its users by making its AI more emotionally intelligent. It's no surprise that the Amazon Alexa team is continually looking to improve the way users interact with their devices, but soon your Amazon Echo may be able to detect whether you're happy or sad and react accordingly.

Alexa’s Emotional Intelligence Project

The scientists behind Amazon’s voice assistant are working to make Alexa more emotionally in tune. Rohit Prasad, a chief scientist for Amazon Alexa’s AI division, is leading a team whose goal is to explore emotion recognition capabilities.

This project started back in 2008 with the objective of protecting veterans with PTSD. The idea was that the technology would be able to detect and evaluate a person’s mental state based on the sound of their voice, and in doing so identify signs of PTSD, depression, or risk of suicide.

“We were looking at speech, language, brain signals, and sensors to make sure that our soldiers—when they’re coming back home—we can pick [up] these signals much earlier to save them,” said Prasad in a recent interview.

Applications

The experiments in 2008 were just the beginning. Since then, Amazon has continued to look for ways that emotion recognition could be incorporated into their growing line of technology. Amazon’s Alexa AI team is currently looking to develop a wearable device for users that would be able to detect emotions like happiness, joy, anger, sadness, sorrow, fear, disgust, boredom, and even stress.

This device would connect to a person’s smartphone and could conceivably be used like a PERS (personal emergency response system) device: alerting predetermined contacts if the wearer is experiencing extreme levels of negative emotion that could warrant intervention.
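To make that idea concrete, here is a rough, purely hypothetical sketch of what such alerting logic could look like. The emotion labels, thresholds, and notification step below are assumptions for illustration, not anything Amazon has described.

```python
# Hypothetical PERS-style alerting rule on a stream of (emotion, confidence) readings.
# All thresholds and labels are assumptions, not Amazon's design.
NEGATIVE_EMOTIONS = {"anger", "sadness", "fear", "stress"}
ALERT_THRESHOLD = 0.9      # assumed confidence level that warrants intervention
SUSTAINED_READINGS = 5     # assumed number of consecutive high readings required

def should_alert(readings: list[tuple[str, float]]) -> bool:
    """Return True if the most recent readings are sustained, high-confidence negative emotions."""
    recent = readings[-SUSTAINED_READINGS:]
    return len(recent) == SUSTAINED_READINGS and all(
        emotion in NEGATIVE_EMOTIONS and score >= ALERT_THRESHOLD
        for emotion, score in recent
    )

# Example: five consecutive high-confidence "stress" readings would trigger an alert.
stream = [("calm", 0.7)] + [("stress", 0.95)] * 5
if should_alert(stream):
    print("Notify predetermined emergency contacts")  # stand-in for a real notification
```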

It isn’t hard to imagine how this kind of AI technology could be utilized in the security industry as well. Other companies in the security monitoring space are already looking into body language as a predictor of crime, so why not work emotion recognition through vocal cues into the equation too? Plus, Amazon is already working to gain a footing in the security space with Alexa Guard.

A Work in Progress

Amazon encourages their developers and researchers to experiment a lot, so many of these kinds of ideas don’t make it out of the trial phase. And even though advancements continue to be made on the emotional detection front, there is still a long way to go before the technology would be ready for market.

Amazon Alexa senior applied science manager Chao Wang has been working to extract and map emotions based on voice recordings in order to classify and predict vocal states. She recognizes that there are many obstacles in this line of research:

“There’s a lot of ambiguity in this space—the data and the interpretation—and this makes machine learning algorithms that achieve high accuracy really challenging,” she states.
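As a rough illustration of the general approach (and not Amazon’s actual system), a voice-based emotion classifier can be sketched with off-the-shelf tools: extract acoustic features such as MFCCs from labeled recordings, then train a standard classifier on them. The file paths and labels below are hypothetical.

```python
# A minimal sketch of emotion classification from voice recordings.
# This is an illustration with generic tools, NOT Amazon's method;
# the clip paths and labels are hypothetical.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(path: str) -> np.ndarray:
    """Summarize a clip as the mean of its MFCC frames (a common baseline feature)."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Hypothetical training data: (clip path, emotion label) pairs.
training_clips = [
    ("clips/happy_01.wav", "happy"),
    ("clips/sad_01.wav", "sad"),
    ("clips/angry_01.wav", "angry"),
]

X = np.vstack([extract_features(path) for path, _ in training_clips])
y = [label for _, label in training_clips]

# A simple classifier stands in for whatever model a production system would use.
model = make_pipeline(StandardScaler(), SVC())
model.fit(X, y)

# Predict the emotion of a new (hypothetical) recording.
print(model.predict([extract_features("clips/new_recording.wav")])[0])
```

In practice, the ambiguity Wang describes shows up here as noisy labels and overlapping feature distributions, which is why high-accuracy models remain difficult to build.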
