Digital platforms are once again in the spotlight over the spread of disinformation and misinformation on their services as the local arms of tech giants continue their attempts at self-regulation.
On Monday the Digital Industry Group Inc (DIGI) announced new governance and public complaints features for its voluntary Code of Practice on Disinformation and Misinformation, along with an independent oversight board.
Signatories to the code include Facebook, Google, Twitter, and TikTok, which “must commit to safeguards to protect against online disinformation and misinformation” such as providing ways for users to report policy violations.
The new public complaints feature – which can be completed through a Google form – is limited to complaints specifically about digital platforms’ conduct in relation to the Disinformation Code itself; complaints about content must be made through the platforms themselves.
Reset Australia’s director of tech policy, Dhakshayini Sooriyakumaran, called the announcement “laughable” and a “PR stunt”.
“The DIGI code is voluntary and opt-in, with no enforcement and no penalties. Clearly, self-regulation does not work,” Sooriyakumaran said.
“If DIGI are serious about cracking down on the serious harms posed by misinformation and polarisation then it should join Reset Australia and other civil society groups globally in calling for proper regulation.”
Sooriyakumaran called for proper regulation in the wake of recent revelations from Facebook whistleblower Frances Haugen who provided previously unseen internal documents from the company showing it is well-aware of the harms its platforms can cause.
“We need answers to questions like, how do Facebook’s algorithms rank content?
“Why are Facebook’s AI based content moderation systems so ineffective?
“The proposed reforms to the code do not provide this,” Sooriyakumaran said.
DIGI Managing Director Sunita Bose defended the code, saying the pandemic has proven how important it is to curb the spread of misinformation.
“The Australian Code of Practice on Disinformation and Misinformation provides a strong framework for continued technology industry action and transparency on these complex challenges, and we wanted to further strengthen it with independent oversight from experts, and public accountability,” she said.
Digital platforms have long trod a fine line between free speech and active censorship. In late 2019, Mark Zuckerberg warned against “cracking down too much” on free speech.
“While I certainly worry about an erosion of truth, I don’t think most people want to live in a world where you can only post things that tech companies judged to be 100 percent true,” he said.
Zuckerberg has come under fire for the role his company’s platforms have played in the spread of misinformation about COVID-19 and the Capitol Riots in January.
It has also been alleged that Zuckerberg struck a deal with former US President Donald Trump in which he agreed not to fact check Trump while he was in office – and in return, Trump would avoid strict social media regulation.
YouTube meanwhile only decided to ban all anti-vaccine content in late September, signalling a change in its AI policy that cannot be properly scrutinised so long as the proprietary algorithms behind YouTube remain hidden.
Nerida O’Loughlin, Chair of the Australian Communications and Media Authority (ACMA), said “now is the time [for digital platforms] to make good on their word and address community concerns”.
“A responsive complaints-handling system with robust governance is a critical component of an effective self-regulatory regime, particularly in the online environment,” she said.
“However, we do have concerns that complaints about non-compliance with opt-in commitments will be treated differently to those about mandatory commitments.
“We will be watching how this works in practice and whether expanding the committee’s remit will be necessary.”
Publisher or platform
The government has also signalled its intention to act aggressively toward the tech companies that are crucial to the spread of information in Australia, with Prime Minister Scott Morrison suggesting a wholesale change in the way social media platforms are legally defined.
Responding to online rumours that former NSW Deputy Premier John Barilaro had been having an affair with the daughter of Deputy Prime Minister Barnaby Joyce, Morrison said social media “has become a coward’s palace”.
“People can just go on there, not say who they are, destroy people’s lives, and say the most foul and offensive things to people and do so with impunity,” he said.
“Now that’s not a free country where that happens, that’s not right. They should have to identify who they are.
“The companies, if they’re not going to say who they are, well they’re not a platform anymore, they’re a publisher.”
Digital platforms are not currently treated as publishers of content and thus avoid responsibility for what users say and post.
It’s a necessary distinction for them to function as they currently do, and something which was explored in detail in a landmark High Court ruling last month.
But even as Morrison loudly condemned social media platforms, his own Defence Minister, Peter Dutton, was in the midst of a defamation trial against refugee activist Shane Bazzi over a tweet in which Bazzi called Dutton a “rape apologist”.
Classing digital platforms as publishers could give more scope for litigious politicians and other public figures to go after the platforms themselves, as Barilaro has done in his dual defamation cases against both YouTuber Jordan Shanks and Google.
The government is not yet walking in lockstep on this issue, however, with Communications Minister Paul Fletcher floundering on the topic when he appeared on the ABC’s Insiders program on Sunday.
Fletcher dodged repeated questions about whether platforms should be treated as publishers.
“Well that is certainly one of the options that is before us in this process,” he said, referring to ongoing work by state and federal attorneys-general.
“I agree with the Prime Minister, yes – we expect a stronger position from the platforms. For a long time they’ve been getting away with not taking any responsibility in relation to content posted on their sites.”