Treasury Report Finds ‘Significant’ Gaps in AI Fraud Prevention

Larger financial institutions have an edge over smaller ones in AI-related fraud prevention.

That’s according to a new report from the U.S. Treasury on managing artificial intelligence (AI)-related risks in the financial sector.

“As more firms deploy AI, a gap exists in the data available to financial institutions for training models,” the department said in a press release accompanying the report on Wednesday (March 27). “This gap is significant in the area of fraud prevention, where there is insufficient data sharing among firms.”

Because financial institutions (FIs) rely on internal data to build these models, bigger institutions have a major advantage, the report said. They tend to have far more historical data, while smaller FIs typically lack the internal data and expertise needed to develop their own anti-fraud AI models.

The report also found that FIs need to share more data with one another to better train anti-fraud AI and machine learning (ML) models.

“Unlike data on cybersecurity, there is reportedly little fraud information sharing across the financial sector, which limits the ability to aggregate fraud data for use by AI systems,” the report said. “Most financial institutions the Treasury spoke with expressed the need for better collaboration in this domain, particularly as fraudsters themselves have been using AI and ML technologies.”

Research by PYMNTS Intelligence has shown that FIs across the board use a range of fraud prevention tools, relying on a mixture of in-house systems, third-party resources and new technologies to safeguard their operations and customers.

Last September, when PYMNTS Intelligence compiled the 2023 report “The State of Fraud and Financial Crime in the U.S.,” 66% of banking executives said they were using AI and ML to fight fraud, up from 34% in 2022.

“But developing AI and ML tools can be costly, which explains why only 14% of FIs say they build their own fraud-fighting AI and ML technologies,” PYMNTS wrote. “Nearly 30% say they rely entirely on third-party vendors to provide these tools. Similarly, just 11% of FIs develop their own APIs in-house, while 22% rely entirely on third-party API solutions.”

Meanwhile, PYMNTS recently spoke with Robin Lee, general manager of APAC at Hawk AI, about the need to combine technology with critical thinking in fighting financial crime, an approach he described as “Robocop, not Terminator.”

“When the first Robocop movie came out, the tagline was: part man, part machine, and all cop,” he said. “That does a good job in summarizing the approach that we need to take, versus Terminator, which is 100% machine.”