Warning: This story contains references to child abuse.

Some of the biggest tech companies in the world will face fines of nearly $800,000 per day if they do not provide the Australian government with twice-yearly reports on their efforts to tackle child abuse material on their platforms.

The eSafety Commissioner on Wednesday issued further legal notices to tech firms including Apple, Google, Meta, and Microsoft under the Online Safety Act, ordering them to report every six months on the measures they have in place to tackle this issue.

These notices have also been issued to Discord, Snap, Skype, and WhatsApp.

The reports must detail how these tech companies are tackling child abuse material posted on their platforms, livestreamed abuse, online grooming, sexual extortion, and the production of AI-generated deepfake abuse material.

Companies that do not provide these reports on time to the government will face a fine of $782,500 for every day that they are late, with the eSafety Commissioner threatening to go to court to have these financial penalties applied.

“We’re stepping up the pressure on these companies to lift their game,” eSafety Commissioner Julie Inman Grant said.

“They’ll be required to report to us every six months and show us they are making improvements.”

The eSafety Commissioner issued similar notices to the same tech companies early last year and said that the responses it received were highly concerning and led to the ramped-up enforcement action.

“In our subsequent conversations with these companies, we still haven’t seen meaningful changes or improvements to these identified safety shortcomings,” Inman Grant said.

‘Wilful blindness’

Among the concerning responses to previous legal notices were Apple and Microsoft saying in 2022 that they were not proactively detecting child abuse material stored on their cloud storage services, despite it being “well-known” that they “serve as a haven for child sexual abuse”, Inman Grant said.

It was also uncovered that Microsoft was taking, on average, two days to respond to user reports of child sexual exploitation and abuse on its platforms, and sometimes as long as 19 days when reports required a re-review.

Meta has also previously said it made 27 million reports of child sexual abuse to authorities, while Apple said it had made only 267.

“The reason is they’re taking an element of wilful blindness,” Inman Grant told ABC RN Breakfast.

“They’re not detecting it, they’re not looking under the hood to see what might be hosted on their platforms and they’re not allowing people to report this content as they come across it.”

The eSafety Commissioner will publish summaries of the reports provided by these tech firms.

The new reporting requirements have been welcomed by the International Justice Mission, an international non-government organisation focusing on human rights and law enforcement.

“To date, these companies have failed to provide a safe online environment for children,” International Justice Mission Australia country director David Braga said.

“We are hopeful the transparency will hasten big tech companies to review not only the content dissemination on their platforms but also the systems which allow for this content distribution.”

Reporting with teeth

The tech firms now have until 15 February next year to provide the first of these reports to the eSafety Commissioner. Compliance is mandatory, with the daily fine of $782,500 applying to late submissions.

“These transparency powers will work with our interlocking mandatory codes and standards,” Inman Grant said.

“For those who don’t follow the law or fail to pay their infringement notices like X Corp in this area, we’re going to court.”

The eSafety Commissioner issued the first fine under this scheme to X late last year, after the social media giant failed to answer previous questions about child abuse material on its platform.

X has refused to pay the fine of more than $600,000 and is now engaged in a legal battle with the eSafety Commissioner over it.

Similar legal notices were also issued to various tech firms earlier this year, requiring information on the efforts they are taking to address terrorist content being spread on their platforms.

If you need someone to talk to, you can call: Lifeline on 13 11 14, 1800RESPECT on 1800 737 732, Kids Helpline on 1800 551 800, Beyond Blue on 1300 22 46 36, Headspace on 1800 650 890, MensLine Australia on 1300 789 978, QLife (for LGBTIQ+ people) on 1800 184 527, or the Suicide Call Back Service on 1300 659 467.