How fraudsters are using AI to impersonate businesses

We have always treated the voice on the other end of the line as the ultimate verifier. Whether you are calling a client to confirm their source of funds, or fielding a call from a "bank" warning you about suspicious activity, hearing a human voice provides a sense of safety.

It feels personal, immediate, and hard to fake.

But in our recent webinar, Compliance Leaders vs Fraudster, we sat down with ex-fraudster Alex Wood to challenge that assumption. In an unfiltered conversation, he revealed why the phone line is no longer a secure channel. From criminals using AI to mimic your clients, to fraudsters posing as your own finance team, the technology you trust to protect your firm might actually be opening the door to them.

The three-minute clone

We often think of deepfakes as high-tech tools used by hackers and well-funded criminals. The reality is much simpler and much cheaper.

With just a three-minute voice sample, a criminal can now create a 99% accurate clone of anyone’s voice.

That sample is frighteningly easy to get. It could come from a podcast, a social media video, or even a previous phone call where the fraudster recorded your client speaking. Tools like ElevenLabs allow criminals to clone these voices without needing access to the dark web.

Once they have the clone, they don't just leave a voicemail. They can have a live conversation.

The "bored dude" act

The technology is only half the threat. The other half is social engineering. If you were asked to describe how a fraudster behaves, you might picture aggression or threats. The reality is far more mundane.

Alex revealed how the most effective criminals don't rely on intimidation; they rely on boredom.

A common tactic is to play the role of a "bored dude" from the impersonated business’s fraud team. By sounding unenthusiastic, reading from a script, and acting like they are just doing a job, a fraudster lowers the victim's guard.

Now, imagine that social engineering combined with a deepfake of your CEO or your biggest client.

"It's going to take a very, very brave person to then think, 'Hang on, I'm going to go upstairs and confront the CEO and see if actually that was him.'"

Alex Wood

Why "liveness" is the new standard

If you can't trust your ears, you have to trust your technology.

The phone call fails because it verifies the data (the voice sounds right), but it doesn't verify the source. You have no way of knowing if the person speaking is a human or an AI model.

This is where biometric verification changes the game.

Unlike a phone call, biometric checks rely on "liveness". This technology scans the user's face in real time to determine whether they are a real person present at that moment, or whether they are using a mask, a screen, or a deepfake.

It adds the necessary friction to the process. It forces the person on the other end to prove they are physically present, something an AI voice clone simply cannot do.
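To make the idea concrete, here is a minimal sketch, in Python, of how a firm might gate a high-risk action behind a liveness check. Everything here is illustrative: the LivenessResult type, the approve_payment_instruction function, and the 0.9 match threshold are hypothetical names and values we have invented for the example, not Thirdfort's actual API.

```python
from dataclasses import dataclass


@dataclass
class LivenessResult:
    """Outcome of a hypothetical active-liveness check."""
    is_live: bool            # a real person is present, not a replay, mask, or deepfake
    face_match_score: float  # similarity between the live face and the ID document photo (0.0-1.0)


def approve_payment_instruction(result: LivenessResult,
                                match_threshold: float = 0.9) -> bool:
    """Gate a high-risk action on both liveness and face match.

    A voice clone can pass a phone call, but it cannot pass this gate:
    the caller must present a live face that matches the verified ID.
    """
    if not result.is_live:
        return False  # replayed video, mask, or synthetic face detected
    return result.face_match_score >= match_threshold


# Example: a fraudster replaying a convincing deepfake fails the liveness
# gate even though the synthetic face closely resembles the client.
spoof_attempt = LivenessResult(is_live=False, face_match_score=0.97)
assert approve_payment_instruction(spoof_attempt) is False
```

The design point is the order of the checks: liveness first, similarity second. A well-made deepfake can score highly on face similarity while still failing to prove that a live person is actually present.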

Verify what you see, not what you hear

The landscape of fraud has shifted. The tools that kept us safe five years ago are now the very tools criminals exploit.

At Thirdfort, we believe in verifying the person behind the device. Our ID check uses biometric liveness detection to ensure you are dealing with a real human, not a synthetic clone or a pre-recorded video. Don't leave your firm's security up to a phone call.
