How manipulated images, videos and recordings are the new faces of real estate fraud
Not long ago, real estate scams typically involved forged deeds, stolen identities or shady shell companies. The scams were relatively crude and usually relied on old-school trickery. Now? The game has changed, and in ways few people could have predicted even five years ago.
In today's market, real estate fraud isn't just a matter of forged signatures or phony wire instructions. It's about impersonation on a scale — and with a realism — that was once the stuff of science fiction. We're talking about deepfakes: computer-generated images, videos and voices that mimic real people so well that even seasoned professionals can't always tell the difference.
Nowhere is the risk greater than in real estate. The stakes are high, the timelines are short and deals routinely involve six- or seven-figure wire transfers based on trust, speed and the assumption that the person on the other end of a Zoom call is who they say they are. Increasingly, that assumption no longer holds.
As Mark Rasch, a data privacy expert, attorney and adviser to financial institutions, put it: "The entire real estate industry is built on trust. Deepfakes are engineered to exploit that trust. They're designed to sound like you, look like you, act like you — and in some cases, fool even your colleagues or clients."
A new breed
A scammer no longer needs to break into an email account to pull off a fraud. These days, it's enough to sound convincing or look familiar on a screen.
With a few minutes of video footage and a small audio sample — say, from a podcast appearance, a Zoom panel or even a TikTok — criminals can now clone your voice and face with chilling accuracy. With a little technical know-how, they can animate that clone in real time.
Imagine a fake version of yourself video-chatting with your bank, your title company or your real estate agent. Imagine them confirming a wire transfer, approving a loan or signing closing documents. It sounds wild. But it is happening.
In one real case reported by CNN, a finance worker at a multinational firm was tricked into paying out $25 million to fraudsters after receiving what seemed to be a routine video call from his company's chief financial officer. The only problem? The CFO wasn't real. The voice and face were deepfakes, built from publicly available images and voice samples, and the scheme was sophisticated enough to fool everyone involved.
It didn't take long for similar schemes to make their way into real estate. After all, the mechanics are the same: large amounts of money, remote communication, time pressure, multiple participants in each transaction and a lot of trust.
How it works
In Florida, one title company came dangerously close to sending hundreds of thousands of dollars to a scammer who posed as both the buyer and seller in a transaction. The fraudster spoofed the buyer's email, then used an artificial intelligence-generated voice to confirm wire instructions over the phone.
The call seemed legitimate — same voice, same casual tone, even a reference to a recent conversation. The person on the other end sounded just like the buyer. But the buyer never made the call. The money nearly disappeared into an overseas account. CNN covered a similar incident involving voice cloning used to trick family members and coworkers alike.
Here's the chilling part: you don't have to be a tech wizard to pull this off. The tools are out there, many of them free or cheap. Platforms like DeepFaceLab let users overlay a person's face onto someone else's body in a video. ElevenLabs and Resemble.ai can replicate someone's voice from a few audio clips. Another tool, Avatarify, lets people animate still images to simulate live video. You can feed it a headshot, and it'll respond to your facial movements as if the person is speaking live on camera.
Now combine that with a spoofed phone number and a fake email account. You've got a convincing identity, ready to walk, talk and send instructions that look and sound authentic. And because so many parts of the real estate transaction happen remotely, especially post-COVID, there's often no in-person meeting to reveal the deception. It's like a Hollywood special effect, except it's being used to steal real money from real people.
Why real estate?
It's a perfect storm. Real estate transactions are high-dollar and often one-off, which means that clients and professionals alike are less likely to know each other well. There's no time to build trust through personal relationships. You have an escrow officer in one state, a buyer in another, a lender across the country, and everything coordinated through a patchwork of emails, phone calls and the occasional Zoom meeting.
That kind of distributed, high-stakes communication is exactly what deepfake scammers look for. The system is built on implicit trust and speed. Throw in a deadline — say, a wire that must go out by 4 p.m. or the deal dies — and you've got a ripe opportunity for manipulation.
What's more, the increasing use of digital notarization, e-signatures and online identity verification creates more attack surfaces. These tools are convenient, but they also lower the bar for authentication. In a world where someone can "show up" on camera holding their identification, nodding, smiling and speaking in a familiar voice, all of it fake, you have to ask: how do we know who we're really dealing with?
Legal blind spots
On May 19, 2025, President Trump signed a sweeping bill known as the TAKE IT DOWN Act into law. The new legislation bans the nonconsensual online publication of sexually explicit images and videos, whether authentic or computer-generated. But from a legal standpoint, deepfakes remain murky territory. Most other laws relating to fraud and identity theft were written before anyone imagined that someone could fake a live video call or create a real-time clone of your voice. Sure, laws like the Computer Fraud and Abuse Act and the federal wire fraud statute apply. But what about when a lender follows all of its usual procedures, and the fake still gets through?
Under UCC Article 4A, banks and financial institutions are generally protected if they follow "commercially reasonable" security protocols. But courts haven't yet decided what counts as "reasonable" when the fraud involves AI-generated identities.
Should a title company be liable if it relied on a video call that looked completely authentic? Should the lender have required additional verification? Those questions are still working their way through the courts, and the answers will likely reshape how real estate closings are conducted in the years to come.
Protect yourself
There is no silver bullet that blocks every deepfake. But there are a few practical, if sometimes old-fashioned, steps that can make a big difference.
First, always verify critical information using a known channel. If someone emails you wire instructions or calls to confirm them, call back using a number you already have on file—not the one in the email. Don't rely on caller ID. Don't rely on video. Rely on the people you know, and the systems you've already built.
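To make that rule concrete, here is a minimal sketch in Python of what a callback policy can look like in practice. The record and function names are hypothetical, not drawn from any particular platform; the point is simply that the confirmation number always comes from the file opened at the start of the transaction, never from the message requesting the wire.

```python
from dataclasses import dataclass

@dataclass
class PartyOfRecord:
    """Contact details captured at intake, before any wire request arrives."""
    name: str
    phone_on_file: str  # verified in person or at the start of the transaction
    email_on_file: str

@dataclass
class WireRequest:
    """An inbound request to send funds, however it arrived (email, call, video)."""
    claimed_sender: str
    callback_number_given: str  # the number the requester says to call
    account_number: str

def confirmation_number(party: PartyOfRecord, request: WireRequest) -> str:
    """Return the only number staff should dial: the one already on file.
    A mismatch with the number supplied in the request is a red flag to log,
    not a reason to 'update' the file."""
    if request.callback_number_given != party.phone_on_file:
        print(f"WARNING: callback number in request differs from the file for {party.name}")
    return party.phone_on_file

# Example: the requester supplies their own "confirmation" number
buyer = PartyOfRecord("Jane Buyer", "+1-555-0100", "jane@example.com")
request = WireRequest("Jane Buyer", "+1-555-0199", "987654321")
print("Dial:", confirmation_number(buyer, request))  # prints the warning, then +1-555-0100
```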
In a deepfake world, the only thing you can trust is what you've already verified. You can't trust your eyes. You can't trust your ears. You can only trust your process.
Second, invest in training. Staff need to know what these scams look like and how they work. They should be encouraged to slow down, ask questions, and double-check anything that feels off—even if it looks perfectly real.
Finally, technology can help. There are companies now offering "liveness detection" in video ID checks. Essentially, these are tests to see whether the face on screen is responding to natural light, depth and motion, or whether it's been generated or pre-recorded. These tools aren't foolproof, but they're better than nothing.
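For readers curious what a liveness check actually tests, below is a simplified, hypothetical sketch of the challenge-response idea: issue a random instruction the caller could not have pre-recorded, then verify the response. The `observed_motion` and `spoken_text` inputs stand in for whatever a vendor's pose-estimation and speech-to-text models would report; real products bundle that analysis, so treat this as an illustration of the concept rather than a working detector.

```python
import random
import secrets

CHALLENGES = [
    "turn your head slowly to the left",
    "turn your head slowly to the right",
    "read this code aloud",
]

def issue_challenge() -> dict:
    """Pick an instruction the caller cannot have pre-recorded.
    A one-time code defeats replayed or pre-rendered video."""
    instruction = random.choice(CHALLENGES)
    code = f"{secrets.randbelow(10**6):06d}" if "code" in instruction else None
    return {"instruction": instruction, "code": code}

def verify_response(challenge: dict, observed_motion: str, spoken_text: str) -> bool:
    """observed_motion and spoken_text are stand-ins for what pose-estimation
    and speech-to-text models would return from the live video feed."""
    if challenge["code"] is not None:
        return challenge["code"] in spoken_text.replace(" ", "")
    expected = "left" if "left" in challenge["instruction"] else "right"
    return expected in observed_motion

# Example: a read-aloud challenge with a one-time code
print("Next challenge:", issue_challenge())
c = {"instruction": "read this code aloud", "code": "042917"}
print("Pass:", verify_response(c, observed_motion="", spoken_text="0 4 2 9 1 7"))  # Pass: True
```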
The future of trust
As deepfakes get more sophisticated, the old ways of verifying identity—voice, video, even live appearances—are going to become less reliable. That's a hard truth. But it's also a call to action.
Real estate is a business built on trust. And trust, in this new era, will have to be earned differently. Not through appearance or familiarity, but through deliberate verification, structured protocols, and perhaps most importantly, a healthy dose of skepticism.
Because the next time you get a call from a client saying, "Go ahead and send the funds," you might want to ask yourself: is this really them? Or is it just a really good fake?
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.