Some 544,052 Australians have been blocked from Meta properties Instagram, Facebook and Threads in the past month, the company has revealed in its first update on compliance with Australia’s social media minimum age (SMMA) laws that, it argues, aren’t working at all.

The tech giant began warning under-16s to save their data weeks before the controversial new law came into effect on December 10, and since then has identified 330,639 Instagram users and 173,497 Facebook users it “understands to be” young enough to be banned.

It has also blocked 39,916 allegedly young people from Threads, its competitor to the likes of X and Bluesky, the company said in the update – in which it noted that “ongoing compliance with the law will be a multi-layered process that we will continue to refine.”

Meta “is committed to meeting its compliance obligations and is taking the necessary steps to remain compliant with the law,” the company said, weeks after it was found to be monetising the activities of illegal scammers rather than banning them.

Other targeted platforms have yet to report SMMA figures: the X Transparency Center is a year out of date; TikTok’s transparency blog is silent and its figures 15 months old; and Snap has yet to update estimates that the ban would affect 440,000 users aged between 13 and 15.

Grudgingly compliant, but hating it

The figures are the first indication of the blast radius of the new laws, which aim to counter social media’s deleterious effects on a demographic estimated to comprise 16 per cent of Australia’s 24.3 million Facebook users and 4 per cent of its 16.2 million Instagram users.

Many teens report feeling “free” after disconnecting but the ban, which some fear will disenfranchise young people when they need peer support the most, has driven a High Court challenge and separate action by Reddit as well as ongoing protests from tech giants.

Despite its proactivity in ensuring compliance with the SMMA rules, Meta remains a staunch critic of the new law – whose “initial impacts”, it argues in the update, “suggest it is not meeting its objectives of increasing the safety and well-being of young Australians.”

This hypothesis, Meta argues, is supported by reports of a surge in downloads of alternative social media apps and reports that parents are systematically helping their children bypass the restrictions.

Meta also rubbished “the premise of the law” – that the SMMA laws are sparing young Australians from the “algorithmic experience” that amplifies harmful and addictive content – as “false”.

Many social media platforms allow any user to access their content without a login, it argues, and “still use algorithms to determine content the user may be interested in – albeit in a less personalised way that can be appropriately tailored to a person’s age.”

Pushing for a 'better way'

Meta isn’t the only firm grudgingly complying with the SMMA laws: Snapchat has been outspoken in its criticism of the “disappointing” laws, which it told users reflected “an evolving legal and regulatory landscape that unfortunately impacts your experience.”

And X – which is under UK government investigation and could be banned from that country after users were found to be abusing its Grok AI tool to create thousands of illegal nude deepfakes – previously said its compliance with the Australian laws is “not our choice”.

Some argue that age verification technologies are still far from perfect, with Meta citing “concerns about determining age online without an industry standard” and spruiking the new OpenAge Initiative – whose interoperable AgeKeys will be built into Meta apps this year.

Some alternative platforms have taken different approaches, with Media.com building new social media communities in which the age and identity of all users are verified before they are given access.

Tech giants and many policymakers have also argued that age verification should happen at the app store level, a contentious move that would shift the onus for age verification onto Apple and Google – which prefer leaving app developers to manage age verification.

Yet Meta, for one, isn’t buying it – noting that “teens and parents must still verify age app by app” because many harmful platforms “may not prioritise safety, adopt new age assurance methods, or be in scope of Australia’s social media ban.”

“We still believe there is a better way forward,” it argues, “which is age verification and parental approval at the app store level… This is the only way to guarantee consistent, industry-wide protections for young people, no matter which apps they use.”

The eSafety Commissioner, which released regulatory guidance for the online industry in September, is monitoring compliance with the new laws and is expected to provide an update shortly.