AI Voice & Video Deepfake

The CFO Who Wasn't There

A video call with your CFO and three colleagues. Your CFO is at home, unaware.

Role: Executive Office · Attack type: AI Voice & Video Deepfake

Documented loss

$25,000,000

Based on this scenario and comparable documented cases

The story

James Okafor is a senior finance manager at a multinational with operations in twelve countries. He handles large inter-company transfers regularly. When he receives a confidential Teams meeting invitation from what appears to be the CFO's assistant, he joins without hesitation.

On the call are four people: the CFO, the Chief Legal Officer, a senior director he's worked with for three years, and an external legal advisor he doesn't recognize but who is introduced convincingly. The video quality is slightly compressed — typical of Teams. The audio is clear.

The CFO explains the situation with the kind of controlled urgency James has seen before during acquisition windows. A deal is in final stages. Regulatory timing means funds must be in escrow before Asian markets open. Standard process, just accelerated timeline. The CLO confirms the legal structure. The senior director nods in agreement. Everything is consistent.

James processes the transfer of $25 million to the provided escrow account. The meeting ends. Two hours later, he receives a Slack message from the actual CFO about an unrelated matter. He mentions the call. The CFO has no idea what he's talking about.

Every participant on that call — the CFO, the CLO, the director, and the external advisor — was an AI-generated deepfake. The faces were sourced from publicly available company videos and conference presentations. The voices were synthesized from the same material. The entire setup cost the attackers an estimated few hundred dollars in compute time.

This is not hypothetical. It closely mirrors the documented Hong Kong case from January 2024, reported by Reuters and confirmed by Hong Kong Police. The actual loss was the HKD equivalent of roughly $25 million USD. The employee initially had doubts but was reassured when he 'saw' familiar faces, which is precisely what the attackers relied on.

What happened

$25 million transferred to a fraudulent escrow account.

What stops it

Requiring a verification code before any large approval ties the transfer to the real CFO's enrolled device, something a deepfake cannot produce.

What this scenario teaches us

  • Video conferencing platforms authenticate session credentials, not the biometric identity of participants. A 'verified' Teams or Zoom call does not verify who is showing their face.

  • Deepfake quality is now sufficient to fool humans who know the person being impersonated. Familiarity with a face no longer provides security assurance.

  • Unusual urgency and confidentiality instructions ('don't loop in anyone else') are the two most reliable signals of a social engineering attack, even when the attack is visually convincing.

  • Any financial transaction above a defined threshold should require out-of-band verification before processing, regardless of how convincing the authorization appears.
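The last lesson can be expressed as a simple policy gate. The sketch below is illustrative, not a real payment-system integration: the threshold value, field names, and `approve` function are all assumptions, and the key point is that above the threshold the only input that matters is whether out-of-band verification succeeded, not how authoritative the requesting channel appeared.

```python
from dataclasses import dataclass

@dataclass
class TransferRequest:
    amount_usd: float
    channel: str        # how the instruction arrived: "video_call", "email", ...
    oob_verified: bool  # did an out-of-band check against an enrolled device succeed?

THRESHOLD_USD = 100_000  # assumed policy threshold; set per your risk appetite

def approve(req: TransferRequest) -> bool:
    """Policy gate: at or above the threshold, only out-of-band verification
    counts. The apparent authority of the requesting channel is ignored."""
    if req.amount_usd < THRESHOLD_USD:
        return True
    return req.oob_verified

# The scenario above: a convincing video call, but no out-of-band check.
print(approve(TransferRequest(25_000_000, "video_call", oob_verified=False)))  # False
```

Note what the gate deliberately does not inspect: `channel`. A deepfake attack works by making the channel look trustworthy, so a policy that weighs channel credibility is a policy the attacker controls.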

Prevention checklist

  • Require Real Authenticator code verification before any single transfer above your threshold

  • Establish a policy: confidentiality instructions never override verification requirements

  • Train staff that video call 'presence' is not identity verification

  • Designate a callback protocol for large transactions: call a known number, not one provided in the meeting

Frequently asked questions

Is this type of attack common?

Documented cases are growing rapidly. The Hong Kong case in 2024 was the largest confirmed instance, but security researchers and law enforcement agencies worldwide report increasing frequency. The barrier to entry is falling as AI synthesis tools become cheaper and more accessible.

Would a verbal code word help?

Shared verbal code words are better than nothing, but they have two weaknesses: they require advance setup, and they can be inadvertently disclosed. Real Authenticator's TOTP codes rotate every 30 seconds and are generated from a device-resident secret; an overheard code expires within seconds, and the secret itself never leaves the device.
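For readers curious how time-based codes work, here is a minimal sketch of the standard TOTP algorithm (RFC 6238) using only the Python standard library. This is a generic illustration of the scheme, not Real Authenticator's implementation; the secret below is the RFC's published test-vector key.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter,
    dynamically truncated to a short numeric code."""
    counter = int(at // step)                     # which 30-second window we are in
    msg = struct.pack(">Q", counter)              # counter as big-endian 64-bit int
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"  # RFC 6238 test-vector secret (SHA-1)
print(totp(secret, at=59))        # RFC 6238 test time T=59 -> "287082"
```

Because the code is derived from the current time window and a secret that only the enrolled device holds, knowing one code tells an attacker nothing about the next one.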

Sources & citations

Loss figures are based on documented cases or FBI IC3 reported averages. Individual scenario details are illustrative reconstructions based on documented attack patterns. Real Authenticator is not affiliated with any cited organization.

Your team can't verify.
AI already knows it.

Every week you don't have a verification layer is a week an attacker can impersonate your CFO, your legal counsel, or your vendor — and someone on your team will trust them. Close the gap.

Reply within one business day
30-day pilot, no contract required
Zero-knowledge — nothing to breach
Talk to Our Enterprise Team

Custom pricing · Volume discounts · Annual contracts available