The days of spotting dodgy emails by their poor spelling and grammar are long gone as cyber criminals move toward generative AI tools built explicitly to scam people and deliver malware.
WormGPT is one such tool that has been growing in popularity on cyber crime forums. Unlike mainstream AI chatbots, it has no qualms about writing scam emails – in fact, it has none of the ethical limitations built into the likes of ChatGPT, said Daniel Kelly, a researcher with cyber security firm SlashNext.
Kelly got his hands on WormGPT to test its capabilities and was surprised by what he found.
“In one experiment, we instructed WormGPT to generate an email intended to pressure an unsuspecting account manager into paying a fraudulent invoice,” he said.
“The results were unsettling. WormGPT produced an email that was not only remarkably persuasive but also strategically cunning, showcasing its potential for sophisticated phishing and BEC [business email compromise] attacks.”
Business email compromise scams commonly see fraudsters impersonate people within a company (such as a CEO or finance officer) and request money transfers or bulk gift card purchases.
Alternatively, the scammers might intercept an invoice from a legitimate supplier and ask for funds to be sent to their account instead.
Since its arrival late last year, people have been trying to force ChatGPT to answer questions it is designed to refuse, using a technique known as jailbreaking.
Users might prompt the chatbot to pretend it is a grandmother telling a bedtime story, or turn to “do anything now” (DAN), a family of long prompts designed to free ChatGPT from its constraints.
Jailbreaking has been a particularly common topic among cyber criminals who want to get the AI to help them create malicious code or write believable English-language emails tricking administrators into sending thousands of dollars to an offshore account.
WormGPT is an evolution of this trend and further shows the disruptive potential of AI in a security context. But it isn’t cheap, with access costing €60 per month or around €550 per year.
Early experiments suggest that AI tools boost the productivity of lower-skilled workers the most, and SlashNext warns that WormGPT and its ilk will lower the barrier to entry on the darker side of cyber space.
“The use of generative AI democratises the execution of sophisticated BEC attacks,” Kelly said.
“Even attackers with limited skills can use this technology, making it an accessible tool for a broader spectrum of cyber criminals.”
Kelly recommends that companies regularly update their training programs to cover BEC attacks, and suggests email verification measures that alert people when a message has come from outside the company or flag emails containing keywords such as “urgent”, “wire transfer” or “sensitive”.
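As a rough illustration of that second suggestion, the sketch below flags mail that arrives from outside a company domain or contains high-risk keywords. The domain, keyword list and function name here are hypothetical examples for illustration, not part of SlashNext’s guidance.

```python
# Illustrative sketch only: flag messages from outside the organisation
# or containing high-risk keywords. Domain and keywords are hypothetical.
from email.message import EmailMessage
from email.utils import parseaddr

COMPANY_DOMAIN = "example.com"  # hypothetical internal domain
RISK_KEYWORDS = ("urgent", "wire transfer", "sensitive")

def flag_suspicious(msg: EmailMessage) -> list[str]:
    """Return a list of warning banners to show the recipient, if any."""
    warnings = []

    # Warn when the sender address is not on the company domain.
    _, sender = parseaddr(msg.get("From", ""))
    if not sender.lower().endswith("@" + COMPANY_DOMAIN):
        warnings.append("This email came from outside the organisation.")

    # Warn when the subject or body contains high-risk keywords.
    subject = msg.get("Subject", "").lower()
    body_part = msg.get_body(preferencelist=("plain",))
    body = body_part.get_content().lower() if body_part else ""
    hits = [kw for kw in RISK_KEYWORDS if kw in subject or kw in body]
    if hits:
        warnings.append("Message contains high-risk keywords: " + ", ".join(hits))

    return warnings
```

In practice this kind of check would sit in a mail gateway or plugin and prepend the warnings as a banner on the delivered message, rather than block it outright.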
The potential for large language models to be used for nefarious ends was why OpenAI originally chose not to release GPT-2 in full, a decision it ultimately reversed.