The thrill of the hunt, and the chance to snag a good deal, is what drives millennials to local marketplaces. In this month’s Digital Fraud Tracker, Nathan Garnett, general counsel of online peer-to-peer (P2P) shopping marketplace OfferUp, explains how combining user ID verification badges, safety tips and machine learning-based fraud detection tools is making that hunt less fruitful for scammers and safer for deal hunters.
Smartphones and connected technologies are playing an ever-larger role in many consumers’ retail experiences. As these technologies offer new ways for consumers and retailers to connect, however, online merchants are facing new adversaries in the fight against fraud.
The recent PYMNTS How We Will Pay report found that consumers are becoming even more attached to their devices, with smartphone ownership rising to 90 percent in 2019 from 84 percent a year prior. Additionally, 76 percent of respondents indicated they made purchases during at least one everyday activity. Such heightened connectivity might make online shopping more convenient, but it also comes with risks. The Federal Trade Commission found that millennial consumers are some of fraudsters’ favorite targets, and that online commerce losses for this group alone reached $71 million over the past two years.
Online merchants must ensure that their platforms are trustworthy if they want users to feel safe. Digital P2P marketplace OfferUp does so with machine learning tools that monitor for suspicious activity and protect against bad actors, and, according to general counsel Nathan Garnett, it pairs those tools with human analysts to sharpen its anti-fraud responses. Garnett recently spoke with PYMNTS about how machine learning and human insight together keep eCommerce merchants and consumers secure, helping OfferUp respond to fraud immediately.
“I think the platforms that learn to attack these issues in real time are the ones that are going to survive,” Garnett said.
Building Trust Up Front
Buyers and sellers can use OfferUp to exchange products ranging from household items to used cars and trucks. Items can be paid for via credit card, debit card, Apple Pay, Google Pay or Samsung Pay, but those who meet up in person to exchange goods must pay with cash. The app also recommends in-person meeting spots, such as local police stations, to encourage safe exchanges.
Online transactors often communicate with strangers they might never meet in person, so Garnett pointed out that it is essential for both sides to trust the platform. OfferUp provides users with “verification badges” indicating that personal information, such as an email address or phone number, has been confirmed. Customers can also submit selfies, along with photos of their government-issued IDs, to further prove they are who they claim to be.
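A minimal sketch of how such verification state might be represented appears below; the field and badge names are illustrative assumptions, not OfferUp’s actual implementation:

```python
from dataclasses import dataclass

# Illustrative only: these fields and badge names are assumptions,
# not OfferUp's actual verification system.
@dataclass
class UserVerification:
    email_confirmed: bool = False
    phone_confirmed: bool = False
    id_selfie_confirmed: bool = False  # selfie matched to a government-issued ID

    def badges(self) -> list[str]:
        """Return the verification badges this user has earned."""
        earned = []
        if self.email_confirmed:
            earned.append("verified_email")
        if self.phone_confirmed:
            earned.append("verified_phone")
        if self.id_selfie_confirmed:
            earned.append("verified_identity")
        return earned
```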
The badges assure community members that OfferUp has taken steps to evaluate participants’ trustworthiness. Users are assigned ratings based on their transactions and interactions, providing others with the resources they need to make informed decisions.
“It’s helping people feel comfortable that the person they’re transacting with is who they say they are, and someone they actually want to do business with,” Garnett said. “Giving that toolset to the user to make educated decisions is really important.”
OfferUp uses artificial intelligence (AI) and machine learning to monitor in-app language, flag suspicious behaviors and encourage smarter, safer decisions among its users. The tools scan messages exchanged in the chat feature, such as those from users who encourage others to take conversations offline via email or phone instead.
“[Scammers may] try to lure people off our platform so they can engage in certain behaviors they know that we would catch if it happened in chat,” he said. “They’ll encourage people to leave the platform to communicate, which we discourage people from doing.”
The system does not prevent users from leaving, but will provide pop-up messages if it detects participants are trying to move the conversation offline at an unusually quick pace.
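As a rough illustration of the kind of signal such a system might watch for, the sketch below flags chat messages that contain an email address, a phone number or a common off-platform phrase, then responds with a safety nudge rather than a block. The patterns, phrases and pop-up copy are invented for this example; OfferUp has not disclosed its actual detection logic.

```python
import re

# Illustrative patterns only; a production system would use trained
# models and many more signals rather than two regexes and a word list.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{8,}\d")

OFF_PLATFORM_HINTS = ("text me", "call me", "email me", "whatsapp")

def flags_off_platform_attempt(message: str) -> bool:
    """Return True if a chat message looks like an attempt to move
    the conversation off the platform."""
    text = message.lower()
    if EMAIL_RE.search(text) or PHONE_RE.search(text):
        return True
    return any(hint in text for hint in OFF_PLATFORM_HINTS)

# Show a safety pop-up instead of blocking the message outright:
if flags_off_platform_attempt("Text me at 555-123-4567 and we'll skip the app"):
    print("Heads up: keeping conversations in-app helps us keep you safe.")
```

The nudge-rather-than-block design mirrors the behavior described above: users are warned, not stopped.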
Tapping Data To Understand User Behaviors
These tools help users make more informed decisions, and Garnett noted that OfferUp also runs back-end machine learning solutions to protect them from fraudsters in real time. He declined to offer specifics, but said the platform has built a large dataset from billions of transactions to improve its anti-fraud efforts.
“Over time, we’ve built a dataset that allows us to identify behaviors that are problematic or indicative of bad actors,” he said. “We use those to find people — ideally, ahead of time — and remove them from the platform, or take additional steps to make sure they are who they say they are and not up to something nefarious.”
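Because Garnett declined to offer specifics, the sketch below is only a generic illustration of the idea: train a classifier on historical account behavior labeled as legitimate or bad, then score accounts so that risky ones can be reviewed or removed ahead of time. The features, training data and threshold are all invented for this example.

```python
# Generic illustration of behavior-based risk scoring; OfferUp has not
# disclosed its actual features, model or thresholds.
from sklearn.linear_model import LogisticRegression

# Hypothetical features per account:
# [account_age_days, messages_sent, off_platform_flags, listings_posted]
X_train = [
    [400, 120, 0, 15],  # established, normal account
    [2,   45,  6, 1],   # new account pushing chats off-platform
    [900, 300, 1, 40],
    [1,   60,  9, 2],
]
y_train = [0, 1, 0, 1]  # 1 = labeled a bad actor in historical data

model = LogisticRegression().fit(X_train, y_train)

new_account = [[3, 50, 7, 1]]
risk = model.predict_proba(new_account)[0][1]
if risk > 0.9:  # threshold is an arbitrary example value
    print("Flag account for review or removal")
```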
AI and machine learning solutions are effective, but these tools alone are not enough to stop fraud. Supporting these solutions is a team of human analysts who review flagged reports, look for new trends and identify emerging fraud patterns.
“From time to time, they’ll say, ‘This is something we want to start monitoring,’” Garnett said.
Analysts might recommend, for example, that OfferUp add automatic triggers that remove users who exhibit suspicious behaviors, such as concerning actions coming from newly created accounts. Garnett noted that users who engage in these behaviors might be attempting to lure others to spots where they could be robbed, or to send them false money orders for purchases.
“[The analysts start] seeing those kinds of behaviors, analyzing them [and] mapping them to behaviors that we certainly don’t want to encourage,” Garnett said. “Sometimes, it is seeing where they occur across multiple accounts because, sometimes, these are coordinated.”
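A purely hypothetical sketch of such a trigger appears below, pairing a simple new-account rule with a check for flagged accounts that share a common signal, since coordinated rings often leave that kind of trace. Every field name and threshold here is invented for illustration.

```python
from collections import defaultdict

# Hypothetical trigger: the signals and limits below are invented
# for this example, not OfferUp policy.
def should_auto_remove(account_age_days: int,
                       off_platform_flags: int,
                       reports_received: int) -> bool:
    """Remove brand-new accounts that quickly rack up off-platform
    lures or user reports."""
    is_new = account_age_days <= 7
    return is_new and (off_platform_flags >= 3 or reports_received >= 2)

# Coordinated rings often share a signal such as a device or network.
# Grouping flagged accounts by that signal surfaces the pattern.
flagged = [("acct_101", "device_A"), ("acct_102", "device_B"),
           ("acct_103", "device_A"), ("acct_104", "device_A")]

by_device = defaultdict(list)
for account, device in flagged:
    by_device[device].append(account)

rings = {d: a for d, a in by_device.items() if len(a) >= 2}
print(rings)  # {'device_A': ['acct_101', 'acct_103', 'acct_104']}
```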
He added that the platform plans to double down on anti-fraud measures, having seen positive results from its efforts to thwart potential bad actors. The steps the company is taking will help it better understand how legitimate user actions differ from illegitimate ones.
“When you start to see the same patterns over and over again, you don’t see them with legitimate users,” he explained. “But, you see them with illegitimate users, and we start to flag this.”
These mechanisms still need to be supported by human analysts, who can refine and adjust algorithms to ensure the platform remains secure against fraud and detects emerging threats, Garnett added. AI and machine learning are impressive, but human beings are still required to make them effective.
“Maybe, at some point in the future, we’ll reach the place where [AI and machine learning solutions] work better on their own, but we’re certainly not there yet,” he said. “A big part of what we’re able to do so far has been informed by the judgment of professionals who deal with this every day.”
Technology is becoming much more involved in consumers’ online retail experiences, yet human beings are still necessary to support retailers’ AI and machine learning solutions. Their insights will be increasingly vital to teaching these technologies how to distinguish good actors from bad ones in today’s connected environment.