
Woman warns of ‘evil’ AI scam that made her believe her brother was dead
Featured Image Credit: TikTok/@babybushwhacked


A woman has warned people that scammers are using AI, and that there are no depths they won't sink to

A woman has warned people that scammers are using AI to pull off an 'evil' money-making scheme, one that left her thinking her brother was dead.

As technology moves on, so too do the scumbags who want to get their hands on other people's money and don't care what depths they have to sink to in order to steal it.

The development of AI has accelerated rapidly in the past year, and while it has opened up a whole new wave of possibilities, it's also a major cause for concern.

One of the dark sides of this technology is AI's ability to impersonate people, including creating copies of their voices so accurate that one lad was able to break into his own bank account using an AI-generated voice.

Now scammers are using AI to terrify families into thinking something terrible has happened to a loved one and that they need to send over money immediately to get them out of trouble.

Brooke Bush thought her brother was dead after scammers called her grandpa with an AI voice saying he was about to crash.
TikTok/@babybushwhacked

One woman took to TikTok to tell her tale as she was left fearing that her little brother might be dead following a call from the scammers.

"Somebody out there used an AI machine to trick my grandpa into thinking my little brother got in a wreck and died. He mimicked my brother's voice and said 'I'm about to get in a wreck' and then the phone went off." She explained, having said she'd been 'crying for the last two hours'.

"So my grandpa called me and told me this, I immediately thought my brother was dead. I started driving, trying to find his location, he wasn't picking up.

"I come to find out it was a scammer that was trying to get money from my grandpa by calling and saying he went to jail and that he killed someone and needed bail money."

"All for money, he acted like my little brother almost died. If you guys ever get called and it's someone asking you for money that you know, they're using a freaking AI machine to re-enact their voice."

"How evil. The only reason I'm sharing this is so you guys don't get scammed and your grandparents don't get scammed, but this is a real thing now that people are just re-enacting voices and getting money from their relatives."

The scammers posing as her brother told her grandpa he was in jail and needed bail money.
TikTok/@babybushwhacked

People were horrified to hear about the scam, with some pointing out that this isn't the first time scammers have used AI voice-replicating technology in a bid to steal money.

Last month a woman said scammers imitated her daughter's voice to stage a kidnapping scam, demanding a ransom of $1 million while she could hear her daughter crying for help.

The mum was able to confirm that her daughter was safe and sound and that it was all a scam, but it was an utterly terrifying experience.

People hearing about this have recommended either agreeing on a 'safe word' to signal that a call really is coming from the person it sounds like, or having a set question to ask which would hopefully flush out an AI imitator.

The fact that people might have to come up with a safe word or special question to use while on the phone because anyone they're talking to could be an AI in disguise is insane, but apparently this is the world we're hurtling towards at worrying speed.

Topics: Artificial Intelligence, Technology, Crime