A deep fake audio trick has scammed a British CEO out of more than $350,000, with the viral technology now being put to more nefarious use.
The CEO of an unidentified UK energy company was tricked into wiring $US243,000 to a scammer’s bank account after believing he was on the phone with the chief executive of the company’s German parent, the Wall Street Journal reported.
The scammer had used artificial intelligence technology to replicate the German executive’s voice on the phone, asking for the money to be urgently transferred to a Hungarian supplier.
The UK CEO became suspicious when the scammer called for a third time, requesting another transfer, and noticed the call was coming from Austria.
He didn’t make any further transfers, but the money already sent to the Hungarian account was in the hacker’s hands and had since been moved on to Mexico.
The targeted energy company’s insurer, Euler Hermes Group, publicised the incident, saying it believes the scammers used commercially available artificial intelligence voice-generating software to carry out the fraud.
It’s not the first time the emerging technology has been put to malicious use, with Symantec reporting earlier this year that it had seen three similar cases.
In these cases, AI had also been used to spoof a CEO’s voice to trick individuals into transferring millions of dollars to the scammers.
CEOs are a popular target because recordings of them speaking are easy to find: earnings calls, YouTube videos and conference presentations give scammers a wealth of audio to feed into the AI software.
It’s a new-age version of a classic scam estimated to have cost US businesses $US1.3 billion last year alone, in which the email account of a senior company figure is compromised or spoofed and used to ask the financial controller to urgently transfer funds to an account controlled by the hacker.
Insurance firm AIG has reported that these types of scams accounted for nearly a quarter of all cyber-insurance claims it received last year.
There are also concerns the same tactic could be used to trick individuals as well as companies, with voice recordings of ordinary people often easy to find online.
“Deep fake audio can be weaponised by criminals and gangs to defraud victims and sabotage business activities through telemarketing,” CereProc chief scientific officer Matthew Aylett told The Sun.
“It’s a very real threat and there’s no solution at this time. By using deep fake audio to replicate a loved one’s voice, victims could be duped into sharing their bank account details or transferring money to a third party.
“This is especially concerning for the elderly, who aren’t as tech savvy as younger generations.”
The new scam is the audio equivalent of the deep fake videos that have proliferated recently, many of them going viral.
This week a Chinese app went viral that lets users seamlessly superimpose their face onto a range of famous actors in scenes from popular movies and TV shows.
The app, Zao, quickly raced to the top of the App Store and, with just one photo, lets users swap their face onto Leonardo DiCaprio, among other Hollywood stars.
The app has raised concerns about privacy and identity theft, and fears it could be used to fool facial recognition technology.
Zao’s initial terms and conditions gave its developer the “global right to permanently use any image created on the app for free”, but the clause was deleted after widespread backlash.