The government wants to give the Australian Communications and Media Authority (ACMA) a more direct role in combatting mis- and disinformation online by enforcing a pre-existing industry code of practice already agreed to by the likes of Facebook, TikTok, Google, and Twitter.

Communications Minister Michelle Rowland said mis- and disinformation “poses a threat to the safety and wellbeing of Australians, as well as to our democracy, society and economy”.

“A new and graduated set of powers will enable the ACMA to monitor efforts and compel digital platforms to do more, placing Australia at the forefront in tackling harmful online misinformation and disinformation,” she said.

Currently the industry is self-regulated via a voluntary code of practice for mitigating misinformation that Adobe, Apple, Facebook, Google, Microsoft, TikTok, Twitter, and RedBubble have adopted.

Misinformation, as defined by the code, is any digital content shared by a platform’s users which is “verifiably false or misleading or deceptive” and which is “reasonably likely” to cause harm.

Disinformation has largely the same definition except that it is shared “via inauthentic behaviours”.

‘Harm’ in this context refers to a “credible and serious threat” to democracy or to public goods such as health.

Misinformation spread widely during the COVID-19 pandemic, and greater expectations were placed on the tech giants to stop it, with YouTube going so far as to temporarily ban Sky News Australia’s channel.

According to Rowland’s office, the new laws – a draft of which is yet to be circulated – will give ACMA more authority over the code of practice through “information-gathering and record-keeping powers” as well as the ability to register an enforceable industry code if self-regulation proves to be ineffective.

James Clark, Executive Director of Digital Rights Watch, said he’s “happy to see better oversight and accountability for big tech and social media platforms”.

“Self-regulation is almost never effective and government regulators have a role to play,” Clark said, adding that there should be a review of the code’s effectiveness prior to making it mandatory.

“It’s hard to say how effective the code has been without proper review and this is probably a good opportunity to investigate the effectiveness of the code before giving ACMA more powers,” he told Information Age.

“We should also be investigating ways to rein in the power these companies have and protect local media and creative industries.”

The code of practice is managed by the Digital Industry Group Inc (DIGI), an association founded by tech giants with the stated aim of advocating for policies “that enable a growing Australian technology sector”, for example by allowing the industry to self-regulate.

DIGI’s self-regulatory code was met with scepticism from digital rights groups, who claimed it was a way for big tech companies to skirt around the fact that disinformation was being algorithmically spread on their platforms in order to maximise audience engagement.

Managing director of the industry group, Sunita Bose, said DIGI “broadly welcomes” the forthcoming ACMA powers.

“DIGI is committed to driving improvements in the management of mis- and disinformation in Australia, demonstrated through our track record of work with signatory companies to develop and strengthen the industry code,” she said.

“We welcome that this announcement aims to reinforce DIGI’s efforts, and that it formalises our long-term working relationship with the ACMA in relation to combatting misinformation online.”