Australia’s eSafety Commissioner has issued X (formerly Twitter) a fine of more than $600,000 for failing to answer questions about the prevention of child abuse material on its platform, the first time these powers have been used.
Earlier this year the Commissioner issued legal notices to X, Google, TikTok, Twitch and Discord ordering them to answer questions about the measures they are taking to deal with child sexual exploitation material, sexual extortion and the livestreaming of abuse on their respective platforms.
Under the Online Safety Act, the eSafety Commissioner is able to issue these reporting notices to tech firms, requiring them to report on their compliance with elements of the Basic Online Safety Expectations.
eSafety Commissioner Julie Inman Grant said that both X and Google failed to properly comply with the notices under this law.
While Google was issued a formal warning, X’s offending was found to be more serious, and the Elon Musk-owned company has been handed an infringement notice of $610,500.
X now has 28 days to request that this fine be withdrawn, and the Office of the eSafety Commissioner is able to pursue other avenues if the company chooses not to pay the fine.
Of the questions put to X regarding child safety, the company failed to adequately answer at least 14, leaving some entirely blank and providing incomplete or inaccurate responses to others, the Commissioner said.
“Twitter / X has stated publicly that tackling child sexual exploitation is the number one priority for the company, but it can’t just be empty talk, we need to see words backed up with tangible action,” Inman Grant said.
“If Twitter / X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation, they either don’t want to answer for how it might be perceived publicly, or they need better systems to scrutinise their own operations.
“Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”
There have been ongoing concerns about safety and rule enforcement on X since Musk took over the company in October last year.
In the first three months of Musk’s tenure, the proactive detection of child sexual exploitation material fell from 90 per cent to 75 per cent, although the company now claims that this has since improved.
The answers provided to the eSafety Commissioner by the other tech firms revealed serious shortfalls in the detection, removal and prevention of this content, as well as inconsistencies across platforms, Inman Grant said.
“We really can’t hope to have any accountability from the online industry in tackling this issue without meaningful transparency, which is what these notices are designed to surface,” she said.
“This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion – we need them all to do better.”
Among the findings was that Discord is not taking steps to detect child sexual exploitation in livestreams on its platform, on the grounds that doing so would be “prohibitively expensive”, and that it is not blocking links to known child sexual exploitation material, despite databases of such material being available.
The report also revealed that X, Discord and some Google services are not using technology to detect grooming, while YouTube, TikTok and Twitch are, and that Google is not using its own technology to detect known child sexual exploitation videos on Gmail, Chat and Messages.