Social media giants could face hefty fines if they refuse to remove “worst of the worst” content or hand over the details of abusive users, under the terms of a new Online Safety Bill designed to give the eSafety Commissioner more teeth to tackle serious online abuse.

Introduced into Parliament this week, the new legislation includes a range of measures, such as a cyber-abuse scheme to help victims of “seriously harmful online abuse” get the material removed when online platforms fail to act.

Offending material includes cyberbullying, “toxic online abuse”, harmful content and image-based abuse, defined as “the non-consensual sharing of intimate images”. The legislation establishes “consistent take-down requirements”, with online service providers required to remove such material within 24 hours of receiving a formal notice from the eSafety Commissioner.

“When people interact in person, they take for granted that the rule of law applies,” Paul Fletcher, Minister for Communications, Urban Infrastructure, Cities and the Arts, said in introducing the new legislation.

“People should be able to expect the same when they interact online.”

The proposed powers extend to app stores, mandating the takedown of apps that provide gateways to the most harmful online content.

The bill also empowers the eSafety Commissioner to order the blocking of material depicting or promoting terrorist and “other abhorrent conduct” during “online crisis events” – a nod to the live-streaming of the 2019 Christchurch terror attacks.

“Social media services, electronic services and designated Internet services” will face civil penalties of up to $111,000 (500 penalty units) if they fail to comply.

The eSafety Commissioner could effectively make offending sites vanish, with new ‘link deletion notices’ requiring search engines to delete links to sites that have not complied with takedown notices.

The Act will also lay down a set of guidelines, known as the Basic Online Safety Expectations, describing “what the Australian government – on behalf of all Australians – expects of business in the digital sector,” Fletcher added. These include controls to prevent children from accessing Category 2 content (hardcore pornography), and mandatory reporting requirements to ensure transparency around online platforms’ compliance.

Additionally, an updated Online Content Scheme would force industry to adopt stricter codes of conduct and help the eSafety Commissioner target “worst of the worst” online content regardless of where it is being hosted.

A long time coming

The 192-page Online Safety Bill 2021 is the culmination of a push lasting more than a year, which saw Fletcher launch a formal discussion paper in late 2019 and tell a National Press Club audience that the government would seek to bolster its powers to take down or block “seriously harmful material”.

Consultation on an exposure draft of the legislation began days before Christmas, with the final proposed legislation introduced the day after the government and Facebook resolved the impasse over Facebook’s blanket Australian ban on news content.

The introduction of the new bill, set amidst ongoing tension between the government and social-media giants, reflects the growing expectation that Big Tech firms become active partners in efforts to fight content that violates community standards.

Despite scepticism about its likely efficacy, the amended news media bargaining code passed the Senate on the same day the Online Safety Bill was introduced – another small step in the government’s efforts to rein in the digital giants.

This week also saw the introduction of a DIGI-led Disinformation Code, which major firms endorsed as a call to action against fake news spreading on their networks.

“Industry must work harder to prevent online harms occurring in the first place,” Fletcher said, noting that the new act “introduces important new protections for Australians when things do go wrong.”

Keeping Australians safe online has been a recurring theme for the Morrison government, which went to the 2019 election with a policy promising many of the reforms now set to be enshrined in law.

“To have a safer online environment, we need to change the law,” Fletcher said in his second-reading speech in Parliament.

“But that in itself is not enough. The digital sector must also step up.”