Artificial intelligence (AI) can already do things like highlight potential instances of cancer, answer people’s questions, and predict which of a store’s items will be most popular. So-called emotional AI is not yet part of the mainstream, but researchers are working to make it happen. Then, instead of just keeping track of people’s habits, AI could detect and monitor how they feel.
Helping service-oriented companies to be more proactive
Many of the primary arguments in support of emotional AI center on how it could help companies provide more immediate, personalized service. For example, if a diner gets annoyed at delays, AI could detect that frustration and signal the restaurant’s manager to intervene and offer the meal for free.
Emotion-detecting AI could even apply to online shopping since most computers have built-in, front-facing cameras. If someone has their camera activated, algorithms could see if they have a clenched jaw, furrowed brows, a frowning expression or anything else that could indicate they’re upset. Then, the website might display a pop-up window to help the shopper contact a customer service agent for help.
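To make the idea concrete, here is a minimal, rule-based sketch of the kind of logic described above: a computer-vision model would emit per-cue confidence scores for a video frame, and a simple rule decides whether to show the customer-service pop-up. The cue names, scores, and threshold are illustrative assumptions, not any vendor's actual API.

```python
# Illustrative, rule-based sketch: map hypothetical facial-cue scores
# (as a real computer-vision model might emit per frame) to a
# "frustrated" flag. Cue names and threshold are invented for this example.

FRUSTRATION_CUES = {"clenched_jaw", "furrowed_brows", "frown"}

def looks_frustrated(cue_scores: dict, threshold: float = 0.6) -> bool:
    """Return True if any known frustration cue exceeds the threshold."""
    return any(
        score >= threshold
        for cue, score in cue_scores.items()
        if cue in FRUSTRATION_CUES
    )

# Example: scores a detector might produce for one video frame.
frame = {"frown": 0.8, "smile": 0.1, "furrowed_brows": 0.3}
if looks_frustrated(frame):
    print("Offer a pop-up link to a customer service agent.")
```

A production system would replace the hand-set threshold with a trained classifier, but the decision point stays the same: detect a likely-upset shopper, then route them to a human.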
Using technology like this would let businesses respond before a customer’s temper reaches a boiling point, increasing the likelihood of soothing the person and preventing them from deciding they no longer want to associate with the brand.
Also interesting: What Is Emotional Intelligence?
Walmart filed a patent for technology that would analyze shoppers’ biometric data as they stand in checkout lines. According to the filing, the technology might gather an individual’s heart rate and blood pressure. Since both can rise when a person gets upset, Walmart managers could determine when and where to send team members to assist people before they start complaining.
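The patent does not disclose implementation details, but the kind of rule it implies can be sketched in a few lines: compare a shopper's vitals against limits and flag the lane for assistance. The specific thresholds and the lane-polling structure below are assumptions for illustration only.

```python
# Hypothetical sketch of the rule the patent suggests: compare a shopper's
# heart rate and systolic blood pressure against limits and flag the lane
# for staff assistance. All numbers here are invented, not from the filing.

def needs_assistance(heart_rate: int, systolic_bp: int,
                     hr_limit: int = 100, bp_limit: int = 140) -> bool:
    """Flag a shopper whose vitals suggest rising stress."""
    return heart_rate > hr_limit or systolic_bp > bp_limit

# A store system might poll per-lane sensor readings and dispatch staff:
lanes = {3: (88, 125), 7: (112, 150)}  # lane -> (heart rate, systolic BP)
flagged = [lane for lane, (hr, bp) in lanes.items() if needs_assistance(hr, bp)]
print(flagged)  # lanes where a team member should check in
```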
Finding the right music to set the mood
The German startup Groovecat has a tool that uses AI to detect the emotional qualities of songs. Songwriters receive synchronization royalties when their tunes are played in product commercials, movie trailers, TV shows, and films.
Using AI to detect the emotions within a song is substantially different from the use cases discussed in the previous section. However, it could be beneficial for songwriters, particularly those who are not yet well-known. AI that interprets the emotions in songs could also assist people who create mood-based playlists for streaming sites.
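One common approach in music emotion research is to place a song on a valence (positive/negative) and energy (calm/intense) plane and name the quadrant. Groovecat's actual method is not public; the labels and cutoffs in this sketch are assumptions chosen to show how such scores could drive a mood-based playlist.

```python
# Illustrative valence/energy quadrant mapping, a common model in music
# emotion research. Labels and the 0.5 cutoffs are assumptions; they do
# not describe Groovecat's proprietary analysis.

def song_mood(valence: float, energy: float) -> str:
    """Map valence and energy scores in [0, 1] to a coarse mood label."""
    if valence >= 0.5:
        return "happy" if energy >= 0.5 else "relaxed"
    return "angry" if energy >= 0.5 else "sad"

# Tag a small (made-up) catalog for a mood-based playlist:
catalog = {"Track A": (0.9, 0.8), "Track B": (0.2, 0.3)}
playlist = {title: song_mood(v, e) for title, (v, e) in catalog.items()}
print(playlist)  # {'Track A': 'happy', 'Track B': 'sad'}
```

In practice the valence and energy scores would themselves come from a model trained on audio features, but the quadrant idea is what makes the output usable for playlist curation.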
People are not entirely on board with emotional AI
Perhaps the biggest challenge for developers of emotional AI is poor public perception. Many people find such technology creepy and invasive. A 2018 Gartner survey found that 70 percent of respondents were OK with AI evaluating their vital signs or helping make transactions more secure. However, the majority did not want AI to analyze their emotions or to have an always-on function for understanding their needs better.
Some analysts also worry that if emotional AI gauges how people feel often enough or provides responses with simulated feelings, it could give humanity an excuse not to stay connected. For example, if an AI tool could check on a loved one and send a report that says everything’s fine, a user could decide that’s enough information and not bother confirming it’s true.
Also interesting: Risks and Warnings of AI Traced Back to 19th Century
What if a person has a disability that causes them to have trouble controlling their facial expressions, or perpetually grimace because they’re in pain? Those things don’t have anything to do with the kind of service received. If emotional AI makes the wrong judgment, it could bring unwanted attention to the individual and cause embarrassment.
AI could make individuals feel worse
It’s also possible that AI could pick up on a person’s emotions and do something that worsens how they feel. Many people have had at least a few instances where Facebook’s “On This Day” feature showed something they’d rather not recall. Some companies are developing AI that could respond to people’s angry or sad emotions to cheer them up or calm them down.
That could backfire, though. Take the example of emotional AI that notices a person feeling sad as they get into the driver’s seat. What if the AI then plays a song that reminds the person of a partner who recently passed away after a long battle with cancer, the very cause of their sadness?
Potentially worthwhile if used with caution
The scenarios mentioned here highlight why emotional AI could benefit companies, but only if they use it carefully. AI is getting smarter by the day, yet it may still make mistakes that worsen the very outcomes it is supposed to prevent.
The pros and cons, finally
Considering all of the above, the pros and cons of emotional AI can be summarized as follows.
- Pros:
- It can improve customer service by detecting and responding to the emotions of customers.
- It can enhance empathy and communication skills by teaching people how to recognize and express their own and others’ emotions.
- It can detect emotions that may otherwise be hidden or ignored, such as stress, depression, or anxiety, and provide support or intervention.
- Cons:
- It may invade privacy and violate ethical principles by collecting and analyzing personal data without consent or transparency.
- It may be inaccurate or biased by relying on facial expressions or voice tones that are not universal or consistent across cultures, contexts, or individuals.
- It may reduce emotional intelligence by making people rely on technology rather than their own judgment or intuition.
AI, including emotional AI, is a rapidly evolving field, and it is always wise to review the latest solutions before making decisions. New features appear almost weekly, and even if no current offering fits your problem today, one may tomorrow. Stay up to date rather than relying on an outdated assessment.
This article has been provided by a guest contributor. Caleb Danziger writes about science and technology on his blog, The Byte Beat. You are invited to visit the blog and read more of Caleb’s posts there.
Photo credit: The feature image is by Werner du Plessis. The photo “Symposium Cisco” is by Ecole Polytechnique. The “VR museum” shot was taken by Lucrezia Carnelos. The image “World Trade Center Observatory” is by Helena Lopes.
Source: Houwei Cao (CNN) / United States Patent and Trademark Office / Stuart Dredge (Music Ally) / Laurence Goasduff, Gloria Omale (Gartner) / Jonathan Cook (Medium) / Selena Larson (The Daily Dot)
Editorial notice: Update 11th of September 2023 – We included a pros and cons list overview to reflect the title as well.