Data Report · Updated 2026-03-07 · Sources cited

Latest AI Voice Cloning Scam Stats & Trends · Updated February 2026

Every stat on this page links to its primary source: FBI IC3, FTC Consumer Sentinel, AARP, McAfee, Sumsub, SlashNext, Verizon DBIR, Regula, and the Hong Kong Police. No estimates are presented as fact; every source is dated and cited.

Financial Losses

$12.5B · Total FBI-reported cybercrime losses, 2023 · FBI IC3 Annual Report 2023

$2.9B · BEC / CEO fraud losses alone, 2023 · FBI IC3 Annual Report 2023

$2.7B · Consumer impersonation scam losses, 2023 · FTC Consumer Sentinel 2024

$1.3B · Romance scam losses (often voice-enhanced), 2023 · FTC Consumer Sentinel 2024

$41M · Grandparent scam losses in 2022 (pre-AI-clone baseline) · AARP Fraud Network 2022

$25M · Single deepfake video conference attack, Hong Kong 2024 · Hong Kong Police 2024

Technology Capabilities

3 sec · Audio needed to produce a convincing voice clone · McAfee Artificial Impostor, 2023

<200 ms · Latency of real-time voice conversion in live calls · RVC/ElevenLabs benchmarks, 2024

70% · Adults unable to identify an AI voice clone by ear · McAfee consumer survey, 2023

~$5 · Monthly cost of commercial voice cloning API access · ElevenLabs pricing, 2024

$0 · Cost of leading open-source voice cloning tools (RVC) · GitHub releases, 2024

Scale & Growth

10× · Increase in deepfake fraud attempts, 2022 → 2023 · Sumsub Identity Fraud Report 2023

3,000% · Increase in AI-assisted phishing attacks since 2022 · SlashNext 2024 Phishing Report

77% · Companies reporting deepfake fraud attempts in 2024 · Regula Deepfake Report 2024

25% · Adults who have received an AI voice clone attempt · McAfee consumer survey, 2023

82% · Breaches involving a human element (not a purely technical failure) · Verizon DBIR 2024

Behind every stat is a real family that lost money.

The numbers keep growing because most families don't have a verification protocol. Real Authenticator gives you one. Free. Today.

Download Free

What These Numbers Mean for Your Family

The figures above represent reported losses only. The FTC estimates that less than 5% of fraud is reported to federal agencies. Actual losses across all AI voice scam categories are likely 10–20× higher than documented figures.
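To make the under-reporting math concrete, here is a minimal sketch of the scaling. Only the $2.7B reported figure and the "less than 5% reported" estimate come from this page; the reporting rates chosen below are illustrative assumptions, not FTC estimates.

```python
# If only a fraction `reporting_rate` of fraud is reported, actual losses
# scale by 1 / reporting_rate. A 5% rate implies ~20x the documented figure.

def implied_actual_losses(reported_usd: float, reporting_rate: float) -> float:
    """Estimate actual losses from reported losses and an assumed reporting rate."""
    if not 0 < reporting_rate <= 1:
        raise ValueError("reporting_rate must be in (0, 1]")
    return reported_usd / reporting_rate

reported = 2.7e9  # consumer impersonation scam losses reported in 2023
for rate in (0.05, 0.10):  # 5% per the FTC estimate; 10% as a conservative case
    actual = implied_actual_losses(reported, rate)
    print(f"reporting rate {rate:.0%}: implied actual losses ~ ${actual / 1e9:.0f}B")
```

At a 5% reporting rate, the $2.7B in reported impersonation losses implies roughly $54B in actual losses, which is where the 10–20× multiplier above comes from.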

The growth trajectory is unambiguous: every metric — technology capability, attack volume, losses, victim count — is increasing year-over-year. Voice cloning tools that required $10,000 and a research lab in 2020 now run on a $300 GPU in 2026.

The statistical reality: If you have elderly parents or grandparents, there is a material probability they will receive an AI voice scam attempt in 2026. The question is not whether the attack will occur. The question is whether your family has a protocol in place when it does.

Know who you're really talking to

In a world of deepfakes and impersonation, Real Authenticator gives you and your trusted contacts a private, unforgeable way to verify identity. Download today — it's free.

Download on App Store

Free to download · No credit card required · Privacy-first design