Faster payments schemes are gaining traction around the globe. Unfortunately, so are the fraudsters. As recent PYMNTS research on the topic shows, some financial institutions (FIs) find it challenging to safeguard their real-time and faster payments systems, leading many to examine their defenses. According to PYMNTS' January 2021 Real-Time Payments Tracker®, 80 percent of FIs surveyed in the Asia-Pacific region "reported rises in fraud losses after debuting real-time payments services. Tools that help banks quickly analyze consumers' behaviors for suspicious activities could help banks level up their defenses and ferret out bad actors before fraudsters pull off their scams."
Of course, one of the most valued use cases for faster payments is in peer-to-peer (P2P) payments. As Dena Hamilton, senior vice president, global product management at Featurespace, told PYMNTS in a recent interview, P2P fraud attacks spotlight the urgency of moving away from rule-based systems as fraud patterns shift. Understanding "genuine customer behavior" will be key to preventing attacks. Hamilton said account takeovers jumped by more than 40 percent in 2020 over 2019's levels.
The relative lack of recourse — due to shortened “windows” to identify and stop suspicious payments — and the greater volume, overall, of digital transactions have proved to be irresistible lures for fraudsters. One wrinkle within the deluge of fraudulent activities has been significant growth in “money mule” fraud, where someone moves stolen money to help launder proceeds through P2P payments.
Time, as always, is of the essence. Banks have traditionally had a seven-day recourse window in which to reverse or cancel payments. But within faster payments schemes, that window is truncated to hours. So a reactive approach, flagging suspicious behavior after it has happened, is all too often too little, too late.
Leveraging Advanced Technologies
The best tools and best practices in combatting the fraudsters, said Hamilton, are predictive and behavioral detection efforts by FIs, armed with machine learning and artificial intelligence.
"When you look at rule-based systems, generally," noted Hamilton, "it's from a 'human perspective' … and we look at one dimension, maybe two dimensions" of a transaction. There's no ability, with that approach, to form a "holistic" view of the customer. And in the financial services space at large, earlier attempts at behavioral detection have been tied to patterns of "known" bad behavior and what's happened in the past. Rules-based systems, she cautioned, incorporate the biases and prejudices of the individuals who write those rules.
But as Hamilton told PYMNTS, FIs need to deploy machine learning and artificial intelligence (AI) to understand the “good” behavior of the consumer instead of trying to predict every individual attack. “The ‘optimal combination’ of advanced anomaly detection, working with machine learning, allows the system to understand real time the individual behavior of a person. And then, you can spot those significant changes in behavior. This turns the traditional approach on its head,” she said.
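The core idea Hamilton describes, learning an individual's normal behavior and flagging sharp deviations from it rather than enumerating known attack patterns, can be illustrated with a toy sketch. This is a hypothetical, single-dimension example (transaction amount only); production systems like Featurespace's model many behavioral dimensions with machine learning, and the function, threshold, and data here are illustrative assumptions, not the vendor's method.

```python
from statistics import mean, stdev

def is_anomalous(history, new_amount, threshold=3.0):
    """Flag new_amount as anomalous if it sits more than `threshold`
    standard deviations from this customer's own historical mean.
    `history` is the customer's past transaction amounts."""
    if len(history) < 2:
        return False  # too little observed behavior to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # customer always sends the exact same amount; any change is notable
        return new_amount != mu
    return abs(new_amount - mu) / sigma > threshold

# A customer who usually sends small P2P payments suddenly sends a large one:
baseline = [20, 35, 25, 30, 40, 28, 33]
print(is_anomalous(baseline, 30))    # typical amount -> False
print(is_anomalous(baseline, 2000))  # sharp deviation -> True
```

Note that the baseline is per customer: the same $2,000 transfer that is anomalous here would be unremarkable for a customer who routinely moves thousands, which is the sense in which this approach "turns the traditional approach on its head."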
Featurespace, she added, has seen the challenges of faster payments-enabled fraud — and how machine learning and AI can stop criminals in their tracks — in the U.K. (an early proving ground for faster payments). Identifying and understanding the positives of how a consumer transacts allow FIs to spot the anomalies that a fraudster leaves as telltale signs when they try to “act” like a legitimate consumer.
As she explained to PYMNTS: No one — other than the consumer — knows exactly how they transact, which gives the FI ammunition in its battle against account takeovers. As always, there’s a delicate balancing act when it comes to crafting as seamless an experience as possible.
We may never see the advent of a truly frictionless interaction, said Hamilton. But a holistic view of the customer, gleaned from several data points, can ensure the least intrusion possible. Data privacy must also be top of mind, she said, and consumers want assurance that their card data is being kept secure. She noted that financial institutions would have to analyze the models’ output rather than rely on the model itself.
“At the end of the day, being a responsible steward of a customer’s data is paramount to maintaining trust in banks and financial institutions,” she said, adding that “behavioral data will undoubtedly play a key role in protecting the consumers as it does today.”