
An Era of Emotion AI

Let computers know how you feel

You are at the airport, late for a flight. You need to speak to an agent quickly, but everyone is busy and the lines stretch endlessly.

So you turn to a bot for help.

The bot assistant answers your questions in the course of a real back-and-forth conversation. And despite the noisy environment, it is able to register the stress in your voice, along with many other vocal emotional cues, and adjust its tone in response.

This scenario, described by Rana Gujral, CEO of Behavioral Signals, is still hypothetical, but it may become real sooner than you think.

“In the next five years, you will see some really amazing experiences,” he said.

Gujral is not in the robot or chatbot business, but he is in the business of emotion AI: AI that detects and analyzes human emotional signals. His company’s technology maps vocal information, such as pitch, tone, and rate of speech, in call center conversations to better match agents with callers.
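
Behavioral Signals has not published the details of its pipeline, but as a rough illustration of the kind of vocal features involved, here is a minimal sketch using the open-source librosa library (an assumption chosen for illustration, not the company’s actual stack) to estimate pitch, loudness, and a crude speaking-rate proxy from an audio clip; the file name is a placeholder.

```python
# Minimal sketch: basic vocal features (pitch, loudness, speaking-rate proxy)
# from a call recording, using the open-source librosa library.
# Illustrative only; this is NOT Behavioral Signals' actual technology.
import librosa
import numpy as np

def vocal_features(path: str, sr: int = 16000) -> dict:
    y, sr = librosa.load(path, sr=sr)  # mono waveform at a fixed sample rate

    # Fundamental frequency (pitch) via probabilistic YIN; NaN in unvoiced frames.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )

    # Frame-level loudness (root-mean-square energy).
    rms = librosa.feature.rms(y=y)[0]

    # Crude speaking-rate proxy: non-silent segments per second.
    segments = librosa.effects.split(y, top_db=30)
    duration = len(y) / sr

    return {
        "mean_pitch_hz": float(np.nanmean(f0)),
        "pitch_variability_hz": float(np.nanstd(f0)),
        "mean_loudness_rms": float(rms.mean()),
        "segments_per_second": len(segments) / duration,
    }

# Example (placeholder file name):
# features = vocal_features("caller_clip.wav")
```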

What is emotion AI?
The term emotion AI refers to artificial intelligence that detects and interprets human emotional signals. Sources can include text (natural language processing and sentiment analysis), audio (acoustic emotion AI), video (facial movement analysis, gait analysis, and physiological cues), or combinations thereof.
Emotion AI is not limited to voice, either. Sentiment analysis, a natural language processing technique, detects and quantifies the emotional content of text, whether single excerpts or high-volume samples. It has matured to the point that it is now a popular tool in industries ranging from marketing (product review analytics and recommendation personalization) to finance, where it can help predict stock movements.
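
As a concrete, if simplified, example of text-based sentiment analysis, the sketch below scores a few invented review snippets with NLTK’s off-the-shelf VADER analyzer; any real pipeline would be considerably more involved.

```python
# Minimal sentiment-analysis sketch using NLTK's lexicon-based VADER analyzer.
# The review snippets are invented examples, not real data.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

reviews = [
    "Absolutely love this headset, the battery lasts all week.",
    "Shipping was slow and the case arrived cracked.",
    "It works, I guess.",
]

for text in reviews:
    scores = sia.polarity_scores(text)  # neg/neu/pos plus compound in [-1, 1]
    if scores["compound"] > 0.05:
        label = "positive"
    elif scores["compound"] < -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(f"{label:8s} {scores['compound']:+.2f}  {text}")
```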

There are also video-based signals. These include facial expression analysis, but also things like gait analysis and the capture of certain physiological signals through video. (A person’s breathing and heart rate can be detected without contact using cameras under the right conditions.)
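
To make that parenthetical concrete: one common research approach to camera-based heart-rate estimation averages the green-channel intensity of a detected face region frame by frame, then looks for a dominant frequency in the plausible heart-rate band. The sketch below, using OpenCV and NumPy, is a bare-bones illustration of that idea (the video file name and thresholds are placeholders), not a production vital-signs system.

```python
# Bare-bones illustration of camera-based heart-rate estimation (rPPG):
# average the green channel over a detected face region per frame, then
# find the dominant frequency in the 0.7-4 Hz band (42-240 bpm).
# "subject.mp4" is a placeholder; this is not a clinical tool.
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("subject.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
signal = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue  # crude: skip frames with no detected face
    x, y, w, h = faces[0]
    roi = frame[y:y + h, x:x + w]
    signal.append(roi[:, :, 1].mean())  # mean green-channel intensity (BGR index 1)

cap.release()

# Pick the spectral peak inside the plausible heart-rate band.
sig = np.asarray(signal) - np.mean(signal)
freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
power = np.abs(np.fft.rfft(sig)) ** 2
band = (freqs >= 0.7) & (freqs <= 4.0)
bpm = 60.0 * freqs[band][np.argmax(power[band])]
print(f"Estimated heart rate: {bpm:.0f} bpm")
```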

At the same time, emotion is an elusive thing, and applying some of these techniques in high-stakes situations can be quite problematic. In fact, researchers at New York University’s AI Now Institute last year asked legislators to “prohibit the use of affect recognition in high-risk decision-making processes.” The best-known example is a hiring system that uses the facial expressions and voice patterns of job candidates to determine an “employability score.”

“The idea of using facial expressions to evaluate people in job interviews is not supported by science,” said Daniel McDuff, a Microsoft researcher who studies multimodal affective computing, which analyzes facial movements alongside other signals, such as physiological measures and body movement, for possible health applications.

That multimodal emphasis underscores a key point: our faces alone rarely, if ever, tell the whole story. The extent to which facial expressions reliably convey emotions remains a hotly debated topic.

“I don’t think many people doubt that facial expressions contain information, but assuming a direct mapping between what I express on my face and how I feel inside is often too simplistic,” added McDuff, who is also a veteran of the Affective Computing group at MIT Media Lab, a leader in the field.

With promise and pitfalls in equal measure, we asked four experts to share the state of emotion AI: how it works, how it is being applied today, how it accounts for differences across populations, and where hard lines should be drawn against its use.
