Scammers are using AI to mimic children’s voices and trick UK parents into sending money through WhatsApp messages

In a world where staying in touch often means sending a quick message over WhatsApp, a new scam is turning that very convenience into a dangerous trap.

It’s called the “Hi Mum” scam, and it’s already cost victims nearly half a million pounds in the first four months of 2025 alone.

But here’s the scary part — these scams are no longer just about fake texts.

Criminals are now using AI-generated voice messages to impersonate your loved ones, making the deception even harder to spot.


It Starts With a Message That Tugs at Your Heart

The scam usually kicks off with a simple text like “Hi Mum” or “Hi Dad.”

The sender claims they’ve lost their phone and can’t get into their bank account.

It sounds like something your child might actually say, right?

Then comes the request for money.

They’ll say it’s for rent or a replacement phone. And if you hesitate?

They might send a voice note — one that sounds eerily like your real child — asking for help.

That’s because scammers are now using AI voice cloning software to mimic voices pulled from social media videos or other recordings.


Scammers Do Their Homework First

According to cybersecurity expert Jake Moore, these scammers don’t send messages randomly.

They do some digging first.

Using your public social media posts, they gather information about your family — names, relationships, and even little details that make the message seem legit.

They might pretend to be your son, daughter, a close friend, or even your own parent.

Research by Santander shows that scams impersonating sons are both the most common and the most successful.


AI Voices Make It Harder to Tell What’s Real

In the past, you might’ve ignored a suspicious message.

But when you hear a familiar voice begging for help, it hits differently.

Mr. Moore even revealed that he fooled his own mum with a voice clone of himself.

That’s how convincing it can be.

These tools are easy to access.

With just a few seconds of audio, scammers can create a clone of someone’s voice that’s good enough to trick even close family.


Don’t Let Pressure Push You Into Action

One of the biggest red flags in these scams is urgency.

Scammers often say things like, “I need the money now!” or “Please don’t tell anyone!”

The goal is to rush you into sending money without thinking it through.

Chris Ainsley from Santander says the number of cases using AI voices is growing fast.

In just the first four months of 2025, 506 victims were tricked — with £490,606 ($651,230) stolen.

And in April alone, 135 people lost £127,417 ($169,133).


What to Do If You Get a Suspicious Message

If you receive a text that seems off, stop and take a moment before replying.

Here’s a quick checklist:

  • STOP: Don’t rush. Take five minutes to think.

  • THINK: Does it make sense? Is this how your child would normally speak or act?

  • CALL: Use your saved contacts to call them directly — even a voice note from them could help confirm it’s really them.

Better yet, Mr. Moore recommends creating a family “code word” that only your loved ones know — something that wouldn’t be obvious or found online.


WhatsApp’s Advice: Be Cautious With Unknown Numbers

A WhatsApp spokesperson said they’re working to keep the platform safe, using end-to-end encryption to protect messages.

But if someone has your number, they can still try to contact you.

If you get a message from someone not in your contacts, you’ll see a notification.

WhatsApp won’t let you open links from unknown numbers, and you can always report suspicious messages within the app.


Stay One Step Ahead With Strong Passwords

Beyond scams, protecting your passwords is just as crucial.

Here’s a quick tip list from cybersecurity experts:

DO:

  • Use a mix of letters, numbers, and symbols

  • Make it at least 8 characters long

  • Use abbreviations or phrases

  • Change passwords regularly

DON’T:

  • Use common words like “password” or “123456”

  • Include personal info like your birthday or pet’s name

  • Share your password with anyone

  • Save passwords in your browser without considering who else can access that device


The Bottom Line

Technology is moving fast — and so are scammers.

From fake texts to AI voice clones, it’s becoming harder to know who you’re really talking to.

But by staying alert, asking questions, and double-checking before sending money, you can protect yourself and your loved ones.