Paging Dr. Alexa: How are you feeling? It may not be long before virtual, voice-controlled assistants can tell and answer accordingly. Mattersight VP of Product Andy Traba says the tech’s all there, and it’s just a matter of time before our friendly home bots pick up the decidedly human skill of empathy.
Mattersight analyzes call center phone conversations to determine which language patterns correlate to which outcomes. It does this using technology pioneered by NASA in the 1970s to pair astronauts by personality before sending them into space together.
If a caller owes a company money, which words and phrases did he use that may indicate he’s likely to pay it back? If an agent is on the phone with him trying to make a sale, which key words in the conversation correlated with the customer making the buy – and can a company teach those to its other agents to engineer more positive emotions in customers via “winning behaviors”?
These are just a few of the thousands of points in each conversation that Mattersight endeavors to turn into quantifiable data points. The analysis also monitors things like the caller’s distress level, the agent’s level of engagement and even major life events like purchasing a house or getting married.
The company has analyzed more than a billion calls in the last decade and uses that data to build consumer profiles so that callers can be paired to the live agent best matched to their personality type – automatically, before the call is even answered.
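The matching step described above can be sketched in a few lines of code. This is a hypothetical illustration, not Mattersight’s actual system: the personality types, agents, and success scores are all invented, and the real product presumably derives its scores from the billion-call dataset rather than hand-entered numbers.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    # How well this agent historically performs with each caller
    # personality type (0.0-1.0); assumed to come from call analytics.
    success_by_type: dict

def route_call(caller_type: str, available_agents: list) -> Agent:
    """Pick the available agent with the best track record for this
    caller's personality type - before the call is even answered."""
    return max(available_agents,
               key=lambda a: a.success_by_type.get(caller_type, 0.0))

# Invented example roster.
agents = [
    Agent("Dana", {"organized": 0.82, "emotional": 0.45}),
    Agent("Sam",  {"organized": 0.40, "emotional": 0.78}),
]

best = route_call("emotional", agents)
```

In this toy version, a caller profiled as emotionally driven is routed to Sam, whose track record with that type is strongest – the “fastballs to fastball hitters” idea Traba describes next.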
Traba said that these profiles give companies a richer understanding of their customers and replace educated guesswork with data-driven activity. Plus, the system plays to employees’ strengths and requires no extra training on their part, because they are simply being placed in the position where they will have the most success.
“We throw more fastballs to fastball hitters and more curveballs to curveball hitters,” said Traba. “By doing that, we increase everybody’s batting average.”
So then why, posits Traba, couldn’t the same capabilities be taught to an artificial intelligence via machine learning in order to reap the same benefits in self-service settings?
“There’s been a lot of investment in giving personal assistants their own personality,” said Traba, “but the way they communicate to you is pretty much the same: you ask them a question and they answer it the same way, no matter how you deliver it. People want to be communicated to differently. It’s not what you say, but how you say it.”
For example, an empathetic Alexa might hear Traba ask what time the new James Bond movie is playing. She recognizes his unique voiceprint and taps into his profile. Because his profile shows that he is an organized thinker who values linear thought patterns, the smart assistant would give a very specific answer including show times, which local theaters are showing the movie and what time he should leave in order to arrive on time.
But if Traba’s wife asked the question, her unique voiceprint would trigger a profile for someone who is much more concerned with the overall experience of the interaction. Alexa might tell her the movie starts in two hours and to have a great time.
In this hypothetical, both parties got the same information, but the delivery was unique and personalized, said Traba, capitalizing on how each person prefers to communicate.
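The flow in this hypothetical – voiceprint lookup, profile, then style-matched delivery of the same underlying facts – could be sketched roughly as follows. Everything here is invented for illustration: the voiceprint IDs, the profile fields, and the two phrasing rules stand in for whatever a real assistant would learn from conversation data.

```python
# Invented profile store keyed by voiceprint.
PROFILES = {
    "voiceprint-andy": {"style": "linear"},        # organized thinker
    "voiceprint-spouse": {"style": "experiential"}, # big-picture communicator
}

def answer_showtime(voiceprint: str, movie: str, showtime: str,
                    theater: str, leave_by: str) -> str:
    """Return the same showtime facts, phrased per the user's profile."""
    style = PROFILES.get(voiceprint, {}).get("style", "linear")
    if style == "linear":
        # Specific, step-by-step answer for linear thinkers.
        return (f"{movie} plays at {showtime} at {theater}. "
                f"Leave by {leave_by} to arrive on time.")
    # Warm, big-picture answer for experience-focused users.
    return f"{movie} starts at {showtime} - have a great time!"

linear_reply = answer_showtime("voiceprint-andy", "No Time to Die",
                               "7:30 PM", "Main St. Cinema", "7:00 PM")
warm_reply = answer_showtime("voiceprint-spouse", "No Time to Die",
                             "7:30 PM", "Main St. Cinema", "7:00 PM")
```

Both replies carry the same showtime; only the delivery differs, which is the whole point of the example.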
Traba doesn’t think it would be that much of a stretch to take the capabilities NASA and Mattersight have already explored and apply them to this new situation. In either case, the underlying data source is the same: language in text, transcribed from recorded phone calls.
Traba believes the next step will be to analyze the benefits driven by such unique and personalized service. Does the local movie theater see more business thanks to empathetic Alexa? Does Alexa herself get more use after her owners have a positive interaction? Traba said that even a slight uptick would prove valuable.
Beyond that, if home speakers go the video route, then even more data can one day be added to the profile, Traba said. Face, hand gestures and posture communicate a lot about a person’s state of mind and can indicate whether a conversation is going well or poorly even when no audio is available.
All of this is possible within the next several years, Traba believes – but is it advisable? Could empathetic robots ever really replace human interaction, and is that something people want?
Traba said that Mattersight’s customers have mixed feelings about that. But they all agree on one thing: even as companies introduce more channels for interaction (email, chat, social media, and classic phone calls), call volume keeps rising – and so do callers’ distress levels, often because they attempted a self-service route first and were unable to get the information they needed.
“Self-service channels don’t seem to be working because the length of the phone calls keeps going up, and people are entering those conversations more emotionally charged than ever,” Traba said. “So the technology isn’t there now, but it could get there. One day, self-service channels will get to empathy and deliver a great user experience. It’s a race to get there.”