Binding transparency requirements helped ensure that Australia’s 2022 federal election had “much lower levels” of disinformation than in other countries, according to reports showing eight tech giants are “incrementally” improving their fight against misinformation.
Transparency reports covering the tech giants’ content enforcement activities during 2022 – which must be filed annually under the Australian Code of Practice on Disinformation and Misinformation (ACPDM) introduced by DIGI in February 2021 – highlighted a range of proactive work by Adobe, Apple, Google, Meta, Microsoft, Redbubble, TikTok, and Twitter, ranging from removing harmful content to disrupting disinformation campaigns.
Facebook parent company Meta alone flagged more than 9 million distinct pieces of content in Australia that independent third-party fact checkers found to be false, the company reported, while Microsoft’s Bing search engine applied “defensive search interventions” to block more than 45,000 queries that attracted over 1 million impressions last year alone.
Twitter outlined its work, following Elon Musk’s takeover, to improve community consultation about inaccurate content, while TikTok substantially improved the percentage of misinformation content removed before viewers could see it – from 37.6 per cent in the first quarter of 2022 to 69.8 per cent in the fourth quarter.
And Google reported that it terminated over 100,000 accounts and disrupted more than 50,000 instances of the Chinese ‘Dragonbridge’ influence campaign, which was designed to flood its services with disinformation on topics such as China’s COVID-19 outbreak and the Russia-Ukraine war.
The reports “represent summations of ongoing activity that is massively complex,” media consultant Hal Crawford said in releasing the documents, which he reviewed against the Code’s guidelines to evaluate the tech giants’ progress.
“Collating the information is a substantial task, and presenting it in a way that balances accuracy and readability is not easy,” he said, noting that on the whole the reports “continue to incrementally improve on the provision and presentation of information on the efforts of the Signatories to counter mis/disinformation on their platforms.”
Voice lessons
The functioning of the ACPDM dovetailed with proactive work conducted by the Australian Electoral Commission (AEC) during last year’s federal election, with the AEC formally engaging with Meta, Twitter, Google, Microsoft, and TikTok to streamline the referral of harmful content and misinformation for review and takedown.
By increasing scrutiny of misinformation and facilitating its removal through “extensive electoral integrity work”, DIGI managing director Sunita Bose said, the ACPDM had “positively contributed to the AEC’s determination that [the 2022] election saw much lower levels of electoral mis- and disinformation than in other like-minded democratic elections across the globe.”
Those learnings – which are detailed in the individual company reports and summarised in DIGI’s newly released ACPDM Annual Report – will now be applied to minimise or eliminate disinformation from the opposing campaigns around the Aboriginal and Torres Strait Islander Voice to Parliament, debate over which is intensifying as a formal referendum approaches before the end of this year.
In line with previous efforts around the 2022 Federal Election and 2023 Aston By-election, the AEC recently launched a new Disinformation Register to track disinformation around the Voice campaign.
Ensuring the proverbial fair fight will be a crucial goal of the code throughout the year, with firms like Apple highlighting their use of “careful and deliberate” human curation and vetting of publishers “to reduce the risk of harm that may arise from the propagation of disinformation and misinformation.”
An Apple partnership with transparency and credibility ratings firm NewsGuard, which expanded into the ANZ market in March, has provided “a more nuanced perspective on Apple News’s presence within the greater Australia news environment and the misinformation narratives spreading there.”
As one of just two industry misinformation codes in the world, Bose said, the ACPDM had shown that industry transparency could be a powerful force – with a range of “bespoke structures” and new initiatives designed to further strengthen its protections and broaden its adoption by smaller content platforms.
Yet education remains as important as enforcement, she said – noting that just 14 complaints had been received through the dedicated complaints portal since its launch in October 2021.
“Content removal is often the focus when it comes to conversations about how we tackle misinformation,” she said, “but it is only one lever in how we effectively respond to the challenge holistically… Research and education measures are essential to a whole of society response.”