
New AI Model Decodes Human Emotions in Real Time, Shapes Commerce Strategies

Finnish researchers have developed an AI model to interpret human emotions in real time.

“Humans naturally interpret and react to each other’s emotions, a capability that machines fundamentally lack,” Jussi Jokinen, a professor at the University of Jyväskylä who led the research, said in a news release. “This discrepancy can make interactions with computers frustrating, especially if the machine remains oblivious to the user’s emotional state.”

The model can currently identify six emotional states: happiness, boredom, irritation, rage, despair and anxiety. It’s based on a psychological theory that emotions are generated when human cognition evaluates events from different perspectives.

“Consider a computer error during a critical task. This event is assessed by the user’s cognition as being counterproductive,” Jokinen said, explaining how the model works. “An inexperienced user might react with anxiety and fear due to uncertainty on how to resolve the error, whereas an experienced user might feel irritation, annoyed at having to waste time resolving the issue. Our model predicts the user’s emotional response by simulating this cognitive evaluation process.”
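The appraisal process Jokinen describes can be illustrated with a minimal rule-based sketch. Everything below is hypothetical (function names, inputs, thresholds); the published model is a full computational simulation of cognition, not a simple lookup like this.

```python
# Illustrative sketch of appraisal-based emotion prediction.
# All names and rules here are invented for illustration; the actual
# research model simulates the cognitive evaluation process itself.

def predict_emotion(goal_blocked: bool, user_expertise: float) -> str:
    """Appraise an event (e.g., a computer error) against the user's goals.

    goal_blocked: whether the event is counterproductive to the current task.
    user_expertise: 0.0 (novice) to 1.0 (expert).
    """
    if not goal_blocked:
        return "happiness"
    # A novice is uncertain how to recover, so the appraisal yields anxiety;
    # an expert knows how to recover but resents the lost time: irritation.
    return "irritation" if user_expertise >= 0.5 else "anxiety"

print(predict_emotion(goal_blocked=True, user_expertise=0.1))  # anxiety
print(predict_emotion(goal_blocked=True, user_expertise=0.9))  # irritation
```

The point of the sketch is that the same event produces different emotions depending on how the user's situation is appraised, which mirrors the novice/expert contrast in Jokinen's example.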

Experts say this emotion-predicting AI model could impact commerce by enabling more personalized and responsive customer experiences. Retailers and service providers could use the technology to gauge customer reactions in real time, allowing them to tailor their offerings or adjust their approach on the fly. For example, in eCommerce, the AI could analyze a user’s emotional state while browsing and customize product recommendations or adjust the user interface accordingly. In physical stores, emotion-sensing kiosks or apps could provide targeted assistance or promotions based on a customer’s mood. 

“Voice AI technology now navigates emotionally charged conversations with remarkable sensitivity, rivaling human agents,” Nikola Mrkšić, CEO and Co-Founder at PolyAI, told PYMNTS. “In customer support settings dealing with delicate matters like bereavement, AI-powered voice assistants can modulate their tone and approach to convey empathy and understanding. Customers have told us that they find these AI interactions about sensitive topics less emotionally draining than speaking with human representatives, and therefore prefer automation over live agents.”

Mrkšić highlights that the evolving sophistication of AI systems is reshaping human-AI interactions. “Voice AI integration in contact centers will become crucial for enhancing customer experiences and managing service interactions efficiently at scale. Success in this domain will hinge on striking a balance between transparency, emotional acuity, and ethical considerations, allowing both businesses and customers to maximize the benefits of AI technology.”

The research is part of a growing industry around AI that understands emotions. For example, MorphCast Emotion AI claims that it “allows you to effectively capture the attention of online users, improve your audience engagement and boost your digital marketing experience results through personalized and highly impressive emotional interaction.”

As PYMNTS previously reported, Synthesia, backed by Nvidia, is developing avatars that respond to human emotions conveyed through text, potentially redefining the future of customer service with near-human interactions.

Potential Applications in eCommerce and Customer Service

“Emotion AI has the potential to provide measurable, repeatable data about customers’ experience continuously and in real-time, allowing companies to optimize their customer journeys and focus on the elements that drive engagement, satisfaction, and sales,” Michel Valstar, co-founder and chief scientific officer of Blueskeye AI, told PYMNTS. 

In eCommerce, “It could help a bride-to-be find the perfect dress. Virtual try-on apps could come complete with a ‘delight’ score capturing the bride’s emotional response to each dress and providing a ranking,” Valstar suggests. He adds that this could be achieved “with commonly available technology, the cameras in mobile phones or apps, in stores.”

For customer service, “Emotion AI will also empower chatbots and virtual assistants to better understand user prompts and user reactions to a chatbot’s suggestions, which in turn will lead to better customer experience,” Valstar notes. He provides a specific example: “This technology could identify and respond to customer’s emotions during an interaction with a chatbot. For example, identifying the frustration or confusion caused by a non-routine inquiry and escalating the customer to a human operator.”
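The escalation pattern Valstar describes could be wired up roughly as follows. This is a toy sketch under stated assumptions: `score_frustration` stands in for any emotion-AI classifier returning a 0-to-1 score, and the cue list and threshold are invented.

```python
# Hypothetical chatbot routing: escalate to a human when detected
# frustration crosses a threshold. A real system would replace
# score_frustration with an emotion-AI model, not keyword matching.

def score_frustration(message: str) -> float:
    """Toy stand-in: counts crude frustration cues, scaled to 0..1."""
    cues = ("not working", "useless", "still broken", "!!")
    return min(1.0, sum(cue in message.lower() for cue in cues) / 2)

def route_message(message: str, threshold: float = 0.5) -> str:
    """Return which handler should take the message."""
    if score_frustration(message) >= threshold:
        return "human_agent"
    return "chatbot"

print(route_message("How do I reset my password?"))           # chatbot
print(route_message("This is useless, it's STILL BROKEN!!"))  # human_agent
```

The design choice worth noting is that the emotion signal only gates routing; the chatbot's answer logic stays unchanged, which keeps the escalation rule easy to tune and audit.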

Workplace and Productivity Implications

In workplace settings, emotionally intelligent software could potentially improve project management by anticipating team stress levels and suggesting interventions. Communication platforms might provide feedback on the emotional impact of messages before they’re sent.

“With our model, a computer could preemptively predict user distress and attempt to mitigate negative emotions,” Jokinen adds. “This proactive approach could be utilized in various settings, from office environments to social media platforms, improving user experience by sensitively managing emotional dynamics.”

Valstar points to gaze tracking as another potential application: “Combined with gaze tracking, businesses can tell what parts of their products are leading to delight and what parts lead to confusion or annoyance.”

Ethical Considerations and Future Outlook

As with many technological advancements, the ability to detect and potentially influence emotions raises ethical questions about privacy and consent. Regulators and policymakers will need to address these issues as the technology develops.

Despite these challenges, both researchers see significant potential benefits. The future could see businesses “define an emotional target zone for your customer interaction and monitor, in real time, how close the customer’s experience is to that target zone as well as be able to train your virtual assistant or chatbot to take action to move the customer back to your target zone,” Valstar envisions.
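Valstar's "emotional target zone" idea can be sketched as a distance check on a simple valence/arousal plane. This is purely illustrative: the coordinate representation, target point, and radius are all assumptions, not anything Blueskeye AI has described.

```python
# Hypothetical "emotional target zone" monitor. Customer state is modeled
# as a (valence, arousal) point; the target zone and readings are invented.

from math import hypot

TARGET = (0.6, 0.4)   # desired (valence, arousal) for the interaction
RADIUS = 0.25         # acceptable distance from the target

def in_target_zone(valence: float, arousal: float) -> bool:
    """True if the customer's current state is close enough to the target."""
    return hypot(valence - TARGET[0], arousal - TARGET[1]) <= RADIUS

print(in_target_zone(0.55, 0.45))  # True: within the target zone
print(in_target_zone(-0.3, 0.9))   # False: time for the agent to intervene
```

In the scenario Valstar envisions, a virtual assistant would monitor this distance in real time and take corrective action whenever the customer drifts out of the zone.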

When asked about the impact on customer satisfaction and sales metrics, Valstar responds, “In a word, positively.”

“Our model can be integrated into AI systems, granting them the ability to psychologically understand emotions and thus better relate to their users,” Jokinen concludes.

As research continues and practical applications emerge, businesses across various sectors may need to consider how this technology could enhance their products and services, potentially offering a competitive advantage in an increasingly digitized marketplace.
