Has Taylor Swift ever sent you a message on Facebook? Or has Hugh Jackman perhaps slid into the comments of your Instagram post?

We hate to break it to you, but you may be the target of a ‘celeb-bait’ scam.

Social media and tech giant Meta has teamed up with Australia’s peak industry body for combatting financial cybercrime to tackle the droves of AI-generated celebrity scams spreading across Facebook and Instagram.

These scams, Meta explained, see cybercriminals make contact via email, direct message or comments on social media posts, all while pretending to be the victim’s “favourite celebrity”.

In most cases the scammer will offer to meet up with the victim or provide them with autographed goods, on the condition that the victim hands over sensitive details or makes a payment.

“It’s not just the good guys that use the internet,” Meta warned.

By partnering with the Australian Financial Crimes Exchange (AFCX), Meta said it was able to remove over 9,000 spam pages and more than 8,000 AI-generated celeb-bait scams across Facebook and Instagram.

While Meta did not provide any direct examples of these scams, this year has seen countless incidents where a celebrity’s likeness and voice are replicated by generative artificial intelligence.

Morgan Freeman, for example, took to social media in June to warn his fans of the “unauthorised use of an AI voice” imitating his likeness.

“There is a fake Morgan Freeman trying to tell me to private message him after a comment I made on a post you made,” one user wrote in the comments.

“I got at least 40 people pretending to be Morgan Freeman on Messenger and I keep blocking them.

“I reported them, and they are still coming and Facebook doesn't do anything,” wrote another.

Another commenter said they’d received messages claiming to be Brad Pitt, while celebrities such as Hugh Jackman have recently had their likeness abused in ‘deepfake’ videos which promote online investment platforms via social media ads.

By teaming up with the AFCX, Meta has set up a direct scam-reporting channel with Australian banks and financial institutions, including the Commonwealth Bank of Australia (CBA) and ANZ.

Dubbed the Fraud Intelligence Reciprocal Exchange (FIRE), the channel lets banks share information about known scams with Meta, while Meta can hand back aggregated information about scam trends and the specific scam content it has pulled from its platforms, Facebook and Instagram.

“Scammers target many apps and sectors, meaning that each company may only be able to see and counter a narrow piece of the broader scams campaign,” wrote Meta.

“FIRE helps banks and Meta put the puzzle pieces together to better protect people using their services.”

People were taken in by a scam ad showing singer Taylor Swift flogging Le Creuset cookware.

Much-needed action on Aussie scams

Established as a pilot in April, FIRE marks a long-awaited collaboration between Australia’s financial sector and Meta.

James Roberts, general manager of group fraud at CBA, welcomed the initiative and encouraged all industry players to “accelerate” their participation.

“Big tech, telcos and banks all play a part in helping to protect Australians,” said Roberts.

“By collaborating and sharing intelligence through the Intel Loop, together we can help make Australia less attractive for scammers.”

Celebrity-impersonation scams, though greatly enhanced by generative AI, are far from new.

Australian mining tycoon Andrew Forrest first noticed his likeness being used in fake Facebook ads back in 2014, and since 2019 he has publicly hounded Meta to crack down on crypto scam ads which use deepfakes of him and other local celebrities such as David Koch, Deborah Knight and Eddie McGuire.

In early 2022, Forrest sued Meta, arguing the company should face charges of criminal recklessness and lambasting it on behalf of the “everyday Australians” who “work all their lives to gather their savings and ensure those savings aren’t swindled away by scammers”.

While this case was discontinued in April, Meta recently lost an appeal in a concurrent US lawsuit, meaning Forrest can continue his legal battle over the scam Facebook ads using his likeness.

Meta said it invested $7.3 billion (US$5 billion) in global safety and security on its platforms last year alone, adding it has “conducted numerous sweeps and enforcement actions” to remove “hundreds of thousands of violating ads” in the Australian market.

Senior citizens at particular risk

Recent data from Scamwatch, the scam-reporting service run by Australia’s National Anti-Scam Centre, shows people over the age of 65 are more likely to fall victim to scammers, particularly to investment scams such as those conducted via celebrity likenesses.

In 2023, the over-65s were the only age group to record an increase in reported financial losses, and between January and August 2024 they reported some $40.3 million in losses, compared with some $72.3 million collectively for Australians under 65.

After observing the uptick in senior citizens falling for investment scams, advocacy organisation National Seniors Australia warned that “deepfake endorsements featuring well-known Aussies are stealing the life savings of seniors”.

The organisation said deepfakes of popular television hosts and Australia’s richest people are responsible for “stealing the life savings of hundreds of Australians”, with “many” of these Australians being elderly citizens.

Meta’s advice on avoiding celeb-bait scams is to “take the time to check if it’s the celebrity’s original profile” and “do your research before engaging further with any request”.

Ultimately – whether it’s Tom Hanks messaging you with a special offer on a dental plan or Taylor Swift giving you the insider deal on some expensive kitchenware – National Seniors Australia says if it sounds too good to be true, it probably is.