A survey by a UK bank suggests that AI-generated voice cloning scams are on the rise, with 28% of respondents claiming to have been targeted. It’s recommended that people agree a secret code phrase to guard against the possibility of being taken in …
AI-generated voice cloning scams
A voice cloning scam is when a criminal uses AI to generate a fake version of the voice of a friend or family member, claiming to be in trouble and needing money urgently.
While these scams have been around for years in text form, the use of AI voice tech gives attackers the ability to fool many more people.
The Metro reports that today’s AI tech can generate a convincing-sounding imitation of someone’s voice using as little as three seconds of source material – and it’s not hard to find social media videos with a sentence or two.
A survey of over 3,000 people by Starling Bank found that voice cloning scams [are] now a widespread problem […]
In the survey, nearly 1 in 10 (8%) say they would send whatever was needed in this situation, even if they thought the call seemed strange – potentially putting millions at risk.
Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.
A code phrase is recommended
The bank recommends that people agree code phrases they will use if they ever actually do need to contact a close friend or family member for urgent assistance.
A Safe Phrase is a previously agreed phrase that you and your inner circle can use to verify that you’re truly speaking to one another.
It can be anything, as long as it’s:
- Simple yet random
- Easy to remember
- Different from your other passwords
- Shared with friends and family in person
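As a rough illustration of the “simple yet random” criterion, a phrase can be built by picking a couple of unrelated everyday words at random. This is just a sketch – the word list, function name, and approach are hypothetical, not anything Starling Bank prescribes:

```python
import secrets

# Illustrative sample word list -- any set of common, easy-to-remember
# words would do; this is not a standard list.
WORDS = [
    "lantern", "pickle", "walrus", "comet", "violin", "cactus",
    "marble", "tundra", "biscuit", "harbor", "falcon", "pepper",
]

def suggest_safe_phrase(n_words: int = 2) -> str:
    """Suggest a candidate Safe Phrase: random, simple, memorable."""
    # secrets.choice draws from the OS's cryptographic randomness,
    # so the result isn't predictable the way a self-chosen phrase might be.
    return " ".join(secrets.choice(WORDS) for _ in range(n_words))

print(suggest_safe_phrase())
```

The point is only that randomness should come from a generator rather than from personal details (pet names, birthdays) that a scammer could scrape from social media – and, per the bank’s advice, the phrase should then be shared in person, not over text or email.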