Have you ever kept a scam caller on the phone just to waste their time?
A new Australian-made AI chatbot has been trained to do exactly that in the hopes of destroying their business model.
Named Apate, after the Greek goddess of deceit, the bots are currently deployed out of honeypot phone numbers that the team at Macquarie University’s Cyber Security Hub is surreptitiously encouraging scammers to call.
“We’ve put these ‘dirty’ numbers all around the internet, getting them into some spam apps, or publishing them on webpages and so on, to make them more likely to receive scam calls,” said research project lead Professor Dali Kaafar.
When a scammer comes across one of these honeypot numbers on their list of potential victims, the intention is for them to be drawn into a conversation so convincingly real that they stay on the phone, wasting their time and sparing actual humans from being scammed.
First, the scammer’s audio is transcribed into text; a conversational AI then generates a response, which is converted into human-like speech before being sent back to the scammer.
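The article does not publish Apate’s code, but the loop it describes can be sketched in a few lines. Everything below is illustrative: the function names, types and structure are assumptions standing in for whatever speech-to-text, conversational-AI and text-to-speech components the Macquarie team actually uses.

```python
# Minimal sketch of the call-handling loop described above. This is NOT
# Apate's actual code; every name here is an illustrative placeholder.

def transcribe(audio_chunk: bytes) -> str:
    """Speech-to-text step: convert the scammer's audio into text.
    A real system would call a streaming ASR engine here."""
    ...

def generate_reply(transcript: str, history: list[str]) -> str:
    """Conversational-AI step: produce a plausible 'victim' response,
    conditioned on the conversation so far."""
    ...

def synthesize(text: str) -> bytes:
    """Text-to-speech step: render the reply as human-like audio."""
    ...

def handle_call(audio_stream):
    """Keep the caller engaged: transcribe, respond, speak, repeat."""
    history: list[str] = []
    for audio_chunk in audio_stream:
        transcript = transcribe(audio_chunk)
        history.append(transcript)
        reply = generate_reply(transcript, history)
        history.append(reply)
        yield synthesize(reply)  # audio sent back down the phone line
```

The design goal, as the researchers describe it, is simply to maximise how long the loop keeps running before the scammer hangs up.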
The AI has been developed specifically for the purpose of keeping scammers engaged, building on the Macquarie University team’s research into the anatomy of a scam.
Professor Kaafar said he came up with the idea during a family lunch one afternoon when a scammer called. He kept the scammer on the line for 40 minutes, keeping his children entertained all the while.
“I realised that, while I had wasted the scammer’s time so they couldn’t get to vulnerable people – which was the point – that was also 40 minutes of my own life I wouldn’t get back,” Professor Kaafar said.
“Then I started thinking about how we could automate the whole process and use natural language processing to develop a computerised chatbot that could have a believable conversation with the scammer.”
Scam baiting, as this is commonly known, is a form of entertainment that streamers like Kitboga have made a living from.
Kitboga uses voice modulation software and bespoke virtual machines for scammers to remote into as a way of tricking them into thinking he’s a legitimate victim.
The streamer’s YouTube channel is full of angry scammers kept on the phone for hours by the promise of a big score that, like a cartoon $100 note on a string, keeps getting yanked out of their reach.
But because the scammer business model relies on getting big profits from a handful of victims, the odd wasted hour ultimately won’t deter them.
A lot of wasted hours talking to bots, on the other hand, might reduce the profitability of this criminal activity that is estimated to have cost Australians over $129 million in 2022 alone.
“Financially, it's a high-gain, low-cost ratio for scammers,” Professor Kaafar said.
“The practice is very lucrative and a relatively low-risk criminal activity, and it's pretty hard for victims to recover this money.
“We are excited about the potential for this new technology to actively break the scam-calling business model and make it unprofitable.”
On average, the current bots keep scammers on the phone for about five minutes, but they are constantly learning and being improved.
“The bots are continually learning how to drag the calls out to meet their primary objective: keeping scammers on the line longer.”
A recent study found it is hard to tell whether a voice is human or synthetic, a weakness that has given rise to AI voice-generated scams that trick people into thinking their loved ones are on the phone.