
Account onboarding gets an AI makeover

That’s the problem with fraud: you don’t want to be calculating the ROI of fraud prevention only after it’s too late, because the downside can be severe. Unfortunately, the prevailing mindset among banks and other businesses may be that they don’t have the resources to invest in anti-fraud measures until they’ve already been attacked.

Garrett Laird, head of product management at Amount, a SaaS digital lending and decisioning platform that supports deposit account opening and lending for individuals and small businesses, told PYMNTS that many financial institutions (FIs) don’t rethink their fraud prevention methods until it’s too late.

“You may not realize it yet,” Laird said, “but they’re going to attack you.” He added, “Fraudsters aren’t idiots, and they like to attack you on holidays and weekends, at two in the morning.”

The conversation was part of the “What’s Next in Payments” series, which focuses on how organizations protect their security perimeters against cyberattacks and hackers. The aim is to keep fraudsters out while letting loyal customers in and enabling them to make payments easily and quickly.

Working with banks and credit unions on digital lending, Laird said, means that decisioning, pricing, fraud and verification are all important considerations that must be addressed simultaneously, in real time. When a bank receives a new application, whether someone is opening a savings account or applying for a loan or credit card, the consequences can be serious if a fraudster manages to get through.

A single account can represent a “gap” or “loophole” that a larger group of criminals can exploit, he said. Fraudsters are known to probe for banks’ “weak spots,” so a single application can turn into a wave of hundreds more, all seeking an entry point for a scam or data theft.

“We were a direct lender ourselves,” he said of his platform’s origins, “and we developed technologies that we used ourselves. We’re confident that we can also bring those technologies to other financial institutions and help them launch new products.”

Tech-enabled onboarding

A tech-enabled onboarding experience powered by artificial intelligence (AI) and machine learning can not only increase security but also elicit a positive customer response, making legitimate relationships long-lasting, Laird said.

“This all leads to better conversions when you keep your customers happy,” Laird said, rather than losing that same potential customer to a financial institution that offers a better user experience.

He pointed out that several data sources can be used to glean insights from emails, passwords, linked bank accounts and uploaded documents as part of identity verification.

“There’s a waterfall that we can send applicants through,” he said. “Let’s say we just discovered a fraud ring that’s really good at falsifying documents and getting around some of [a financial institution’s] controls. We can put an extra layer of friction in their path,” he said, “by escalating to manual review queues so that anti-fraud teams can keep an eye on how that fraud ring is developing … and so it doesn’t come in the front door in the first place.” AI, he said, helps power third-party fraud models that detect fraudulent applications, representing another tool in the rules-driven anti-fraud toolbox.
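Conceptually, the waterfall Laird describes can be pictured as an ordered series of checks, where each check can pass an applicant along, add friction, or stop the process, and flagged applications land in a manual review queue. The sketch below is a minimal illustration of that idea in Python; the check names, risk thresholds and outcomes are assumptions made for illustration, not Amount’s actual decisioning logic.

```python
# Illustrative only: a simplified, rules-driven decisioning "waterfall" for new
# account applications. Check names, thresholds and outcomes are assumptions,
# not Amount's product logic.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Application:
    applicant_id: str
    email_risk: float          # 0.0 (clean) .. 1.0 (high risk)
    document_score: float      # 0.0 (likely forged) .. 1.0 (likely genuine)
    model_fraud_score: float   # output of an AI / third-party fraud model
    flags: List[str] = field(default_factory=list)

def check_email(app: Application) -> str:
    return "review" if app.email_risk > 0.8 else "pass"

def check_documents(app: Application) -> str:
    # Extra friction for a document-forging fraud ring: suspicious documents
    # go to a manual review queue rather than being auto-approved.
    return "review" if app.document_score < 0.5 else "pass"

def check_fraud_model(app: Application) -> str:
    if app.model_fraud_score > 0.9:
        return "decline"
    return "review" if app.model_fraud_score > 0.6 else "pass"

WATERFALL: List[Callable[[Application], str]] = [
    check_email,
    check_documents,
    check_fraud_model,
]

def decide(app: Application) -> str:
    """Run the applicant through each check in order.

    A "decline" stops immediately; any "review" escalates to a manual queue so
    a fraud team can keep an eye on the pattern; otherwise the applicant passes
    through with no added friction.
    """
    needs_review = False
    for check in WATERFALL:
        outcome = check(app)
        if outcome == "decline":
            return "declined"
        if outcome == "review":
            app.flags.append(check.__name__)
            needs_review = True
    return "manual_review" if needs_review else "approved"

if __name__ == "__main__":
    applicant = Application("app-123", email_risk=0.2,
                            document_score=0.4, model_fraud_score=0.3)
    print(decide(applicant), applicant.flags)  # -> manual_review ['check_documents']
```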

“We’ve tried to be proactive and provide the right data and processes to make smart decisions,” he said, adding: “It’s not just about keeping the ‘bad’ out, but also letting the ‘good’ in and making it as painless as possible for them.”
