Set a Family Safe Word This Weekend. The Grandparent Scam Just Got an AI Upgrade.

Scam Alert · AI Voice Cloning · Family Safety · Phone Scams

TL;DR

  • McAfee’s research shows three seconds of clean audio is enough to clone a voice with 85% accuracy. The tools are free and require no expertise.
  • The FBI’s 2025 IC3 report logged $7.75 billion in senior fraud losses (a 59% jump over 2024), with at least $5 million tied directly to AI distress-call scams using cloned voices.
  • The fix is a family safe word: a four-to-five-word phrase that is not Googleable, never written down digitally, and shared only in person. It’s the FBI’s official recommendation.
  • If a panicked caller can’t say the safe word, hang up and call the family member directly on a number you already have saved.
  • Lock down voicemail greetings and audit older relatives’ public videos. That is where scammers harvest the voice samples.

If your mom called you at 2 a.m. crying, panicked, slurring through tears that she’d been in a car accident and needed bail money wired to a stranger right now, would you stop and verify her voice?

For most people, the answer is no. You’d act. You’d send the money.

That instinct is exactly what AI voice cloning scams are built to exploit. And they’re working at a scale that should make every family with an aging parent (or a kid old enough to have posted a TikTok) sit down for ten minutes this weekend and have a specific conversation.

This post is for the adult kid who keeps a closer eye on Mom or Dad’s phone than they used to. Or the parent who notices Grandma’s been getting stranger calls. The scams have gotten very good. The defense is simple, free, and takes about 30 seconds to set up.

What’s actually happening

The “grandparent scam,” where someone calls an older person pretending to be their grandkid in a panic, has been running for over a decade. Until recently, it was a clumsy con. The voice never really matched. The story usually fell apart on the second question.

That’s no longer true.

In December 2024, the FBI’s Internet Crime Complaint Center issued PSA I-120324-PSA warning that criminals are using generative AI to “increase the believability of their schemes,” including generating audio clips that mimic loved ones requesting financial help. McAfee’s security researchers tested how much voice it actually takes: three seconds of clean audio is enough to clone a voice with an 85% match, and a slightly longer sample pushes that to 95%. The tools to do it are free and don’t require any technical background.

Three seconds. That’s shorter than a voicemail greeting. Shorter than a single sentence in a TikTok video. Shorter than the audio your kid posted to their Instagram story this morning.

The scale is in the FBI’s 2025 Internet Crime Report. Americans 60 and older filed 201,266 complaints in 2025 and reported $7.75 billion in losses, a 59% jump over 2024. The average senior victim lost $38,500. More than 12,400 of them lost over $100,000 each. Of those reports, 3,100 explicitly referenced AI, with senior losses topping $352 million, and at least $5 million was tied directly to distress-call scams using cloned voices.

The real number is almost certainly higher. Most people who lose money to one of these calls are too embarrassed to file a report.

Why this scam keeps working

The mechanics are simpler than they look. The script is always one of three flavors: a car accident, an arrest, or a kidnapping. The “grandkid” or “child” is hurt, in trouble, can’t talk long, and someone else (a lawyer, a bondsman, a cop) is going to take over the call to walk you through getting the money out.

Three things make it so effective.

First, the voice is right. Your brain is built to recognize a loved one’s voice. Once your auditory system says “that’s my daughter,” your prefrontal cortex stops asking questions and starts solving the emergency. Scammers know this and lean on it hard with crying, urgency, and just enough audio distortion to explain any subtle imperfection (“I’m on a bad line, Mom, please listen”).

Second, the handoff to a second voice. Once the “victim” hands the phone to a “lawyer” or “officer,” the scam is no longer trying to be the family member. It just needs you to be in fight-or-flight mode while a calm, professional voice gives you instructions. By the time you’d normally start to suspect, you’re already at the gas station buying gift cards.

Third, the payment rails. Wire transfers, gift cards, and cash couriers are the asks. All three are designed to be irreversible by the time the family member who’s “in trouble” picks up their actual phone an hour later.

The FBI’s recommendation: a family safe word

Here is the entire defense in one sentence: AI can clone a voice, but it cannot clone knowledge.

A cloned voice can say your daughter’s name, your dog’s name, your neighborhood. Most of that is on her Instagram. What it can’t say is a phrase that was never published anywhere, that she and you agreed on in person, and that she’ll deliver on demand if she ever calls you in a real emergency.

The FBI’s PSA recommends exactly this: pick a secret word or phrase, share it only with people who’d plausibly call in a crisis, and never give money over the phone to anyone who can’t say it. Cybersecurity experts and major reporting on the topic, including CBS News, echo the same protocol.

A few rules for picking one that actually holds up:

A single word is not enough. Pick a four-to-five-word phrase. The longer it is, the harder it is to guess and the less likely it is to leak.

Don’t use anything Googleable. Not your street, your alma mater, your dog’s name, the name of the lake house. Anything in a public obit, an old yearbook, or a Facebook profile is already in a scammer’s data set.

Pick something weird and a little memorable. Something that’s hard to confuse, easy for the family member to remember under stress, and doesn’t sound like a normal sentence. “Purple thunder eats Tuesdays” works. “Be safe” does not.

Tell it to people in person, not over text or email. The whole point is that it’s not in any digital channel a scammer can read.

Have a fallback. If someone calls in a panic, you ask for the safe word, and they “can’t remember it because of the head injury,” that is the scam. The protocol is simple. Hang up, call the family member directly on the number you have saved, and verify that way.

Practice it once. Have your parent or kid actually say it back to you. The first time someone hears their adult kid scream “Mom, please” through a phone speaker, they’re not in a state to recall a phrase they’ve only ever read in a text message.

What to do if someone in your family was already targeted

A few things, in order, no matter how recent the call was.

If money moved within the last 24 hours, call the bank or wire service first. Wires can sometimes be recalled if the recipient bank hasn’t released the funds. Gift card numbers can occasionally be frozen by the issuer if you call the merchant fast. The window is short and getting shorter.

Then file two reports. The FBI’s IC3 portal at ic3.gov is where law enforcement tracks the patterns and where seniors specifically should file. The FTC’s ReportFraud.ftc.gov is where the consumer-facing data flows. Both take ten minutes.

One more thing, and this part matters most. Don’t pile on the person who got hit. The framing has to be that the technology has gotten very good, not that they were careless. Shame is what keeps people from telling their family or filing reports, which is exactly what scammers count on. The same mom who got fooled today will protect her grandkids for the next decade once she knows about the safe word.

If they didn’t lose money but the call rattled them, treat it as the tap on the shoulder it is. Set the safe word that night.

A few other things worth doing while you’re at it

Lock down voicemail greetings. A voicemail that says your name in your own voice is exactly the three seconds of audio a cloning tool needs. Use the generic carrier-default greeting on any phone where the owner is over 65 or has a public-facing job.

Audit social media for the older relatives in your family. Voice samples come from videos as much as anywhere. A 30-second clip of Grandpa giving a toast at a wedding, posted to a public Facebook page, is plenty. You don’t have to take it down. Just make sure the account is set to friends-only.

Cut the spam-call surface area. Most scammers buy lists of “live” numbers that have engaged with previous spam. Keeping aging parents off those lists matters. Apple’s “Silence Unknown Callers,” carrier filtering through AT&T ActiveArmor, T-Mobile Scam Shield, or Verizon Call Filter, and a third-party blocker on top all reduce how often the phone rings at all. Fewer rings, fewer chances for a panicked answer.

The safe word is the part that actually saves money, though. Everything else just narrows the funnel.

Set it tonight. Tell three people. Pick something silly enough that everyone will remember it. Then forget about it until the day you really need it.


Not Today is a free spam-blocking app for iPhone. We built it to catch spam before it reaches you, using keyword rules, a community database of 85,000+ reported numbers, and optional AI detection. No account required. Download on the App Store.