Scammers are using AI to clone voices from social media clips, impersonating individuals to trick their loved ones into sending them money.
A quarter of over-18s in the UK say they have fallen victim to a voice-cloning scam in the past year, according to new data from Starling Bank.
Criminals have worked out how to use AI to convincingly replicate the voice of a friend or family member from as little as three seconds of audio taken from clips uploaded to social media.
After identifying the target's family members, they use the cloned voice in a phone call, voice message or voicemail asking for money to be sent urgently. The impersonation scam is a sinister twist on existing fraud, with criminals using technology to trick people out of their money.
Starling Bank surveyed 3,000 people and found that nearly one in ten said they would send whatever money was requested, even if they thought the call seemed strange – potentially putting millions at risk of fraud.
The company has now launched a “Safe Phrases” campaign, encouraging the public to agree a phrase with close family and friends that no one else knows, so they can tell a genuine caller from an AI-cloned voice.
To launch the campaign, actor James Nesbitt’s voice was cloned using AI, demonstrating just how easily anyone could be impersonated.
The actor told The Metro: “You hear a lot about AI, but this experience has really opened my eyes to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands.”
The UK’s cybersecurity agency said in January that AI was making it increasingly difficult to identify phishing messages, where users are tricked into handing over passwords or personal details.
Lord Hanson, Home Office minister with responsibility for fraud, said: “AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.”
If you have been scammed and need help recovering your money, fill out our Fraud Reclaim Form or call us for advice on 0333 9998791.