More than 60 per cent of Australian respondents to a global McAfee survey of more than 7,000 adults said they can’t tell whether a voice on a phone call is real or a synthetic imposter made using an artificial intelligence (AI) text-to-speech engine.
Instagram influencers, Facebook vloggers and TikTok enthusiasts are creating security nightmares by sharing their voices online, the security firm has warned, with AI voice cloning tools now able to impersonate someone from just 3 seconds of source material.
That sets them up for manipulation by scammers, who have rapidly gained access to tools that can help them “create a believable clone that can be manipulated to suit their needs,” McAfee Labs researchers warned after reviewing a range of such tools.
“Access and ease of use of AI tools [are] helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, chief technology officer with McAfee – whose survey found that 7 per cent of Australian adults reported having been a victim of an AI voice scam, with an additional 11 per cent saying they knew someone who had.
One survey respondent said a caller who sounded like her mother’s grandson “called… said he had been in a wreck, he was hurt, and he needed money. Sounded just like him, she said. She got scared and called him and he was okay.”
For all the advice not to fall for such scams, staying sceptical is hard when the voice on the other end of the phone sounds like a loved one in trouble – and this is where McAfee’s analysis raised red flags for anybody sharing their voice online.
By posting videos of themselves while travelling overseas, for example, young people provide scammers with both an ample supply of their voice to analyse and the knowledge that the young travellers have likely left concerned parents back home in Australia – creating ideal conditions for exploitation.
The scam might, for example, involve a call from a panicked voice that sounds just like someone’s daughter, with added static to feign a bad mobile connection and a plea for emergency funds to be transferred to a specific bank or other account.
Indeed, 37 per cent of Australian respondents said they would likely respond and share money if they received a voicemail from a family member or friend who was vacationing abroad and needed help – with even more likely to help a caller claiming they needed funds after a car crash or breakdown, or that they had been the victim of theft.
“Fraudsters are counting on their target’s desire to want to help a loved one,” the analysts said, noting that “humans make mental shortcuts daily for problem-solving and to reduce cognitive overload…. Because of this, a near-perfect match may not even be required, as our brain will automatically make the [link].”
You’re the voice… and so is the AI
McAfee researchers found today’s AI tools frighteningly effective – with one free tool needing just 3 to 4 seconds of sample audio to produce a convincing voice clone that was an 85 per cent match with the real thing.
For just $0.0006 per second of audio produced, the report said, the researchers were able to record 100 prompts to produce a higher-quality clone – and by upgrading their account, they could add emotion and inflection, “making the voice clone almost indistinguishable from the real thing.”
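To put that price in perspective, a full hour of generated audio works out to roughly $2.16 (3,600 seconds × $0.0006).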
Voice samples need not even come from social videos: scammers randomly calling victims’ phones could capture enough sample audio simply by recording the call and keeping the target on the line for a few seconds.
Australians are forking over millions of dollars to voice-based scammers, an “incredibly concerned” ACCC deputy chair Catriona Lowe recently warned, after reports that fake phone calls – with voices claiming to be from banks and alleging problems with funds transfers – had tricked more than 90 Australians into wiring amounts ranging from $40,000 to $800,000.