The Online Safety Bill 2021 will likely get an easy ride into law after a senate environment and communications committee gave it the nod of approval last week.
Under the government’s proposed laws, the eSafety Commissioner will be given expanded censorship powers to direct social media platforms and other internet services to take down material, and remove links to content, that the Commissioner deems offensive or abusive.
During the brief consultation period, tech giants and digital rights groups levelled wide-ranging criticism at the bill for its broad scope, disproportionate powers, and an apparent lack of understanding of modern digital infrastructure.
The eSafety Commissioner will have powers to issue removal notices for certain types of digital content such as: cyber-bullying material aimed at a child; cyber-abuse material aimed at an adult; the non-consensual sharing of intimate images (commonly known as ‘revenge porn’); abhorrent violent material (such as murders, rapes, or acts of terrorism); and content that has been classified X18+ or has been refused classification.
In each of these contexts, the eSafety Commissioner could issue notices to social media platforms, websites, hosting services, and other related internet companies – along with the end-user who posts the content – compelling them to remove the content within 24 hours.
Ben Au, director of policy and government affairs with the Interactive Games and Entertainment Association, noted the potential for disproportionate action over what may be little more than a short internet message.
“The bill, as it currently stands, provides for a range of remedies, including obviously removal of the content, naming and shaming, end-user notices, Federal Court orders, social link removals, app store delisting and a requirement for platforms to provide often sensitive end-user information from both Australians and non-Australians,” Au told the senate committee earlier this month.
“So certainly an unintended consequence would be an incident leading to several multi-faceted regulatory interventions that may not be proportional to the harm that occurred.”
Board member of Electronic Frontiers Australia, Justin Warren, told the committee the use of the classification scheme for censoring digital content was “inappropriate”.
“The classification code was originally designed to regulate broadcast-style communications, where individual choice of whether or not to be exposed to specific content is challenging, such as TV programs or magazine covers in a shop,” he said.
“It was not designed to regulate consensual communication between adults over private channels.”
Tech giants understandably objected to the legislation, given the strain that handling high volumes of takedown notices would place on their resources.
Facebook was critical of the inclusion of instant messaging services under the scope of the bill, saying many messaging apps already give users the ability to block people and delete messages they don’t want to see.
“Private messaging could involve interactions that are highly nuanced and context-dependent and could be misinterpreted as bullying, like a group of friends sharing an in-joke, or an argument between adults currently or formerly in a romantic relationship,” Facebook said in its submission.
“It does not seem clear that government regulation of these types of conversations are warranted, given there are already measures to protect against when these conversations become abusive.”
Twitter complained about a lack of oversight for the eSafety Commissioner’s powers, saying the office will be given “quasi-judicial and law enforcement powers … without any accompanying guidelines or guardrails”.
“In a law enforcement setting, such investigative powers are typically accompanied through a well-established legal process, like obtaining a warrant,” it said.
“Any obligation on any social media service provider, relevant electronic service, or designated internet service provider to either disclose user information as well as remove content, must be balanced with procedural fairness.”
And Google argued that notices given to hosting services – which may include cloud service providers – seemed to misunderstand the role and function of cloud infrastructure.
“The cloud provider typically does not have visibility into its customers’ content to meet the privacy, security, and regulatory demands of its customers (and of their end customers), and to comply with existing laws and regulations governing cloud-based services,” Google said.
“Even if something was flagged by an external observer, it is often impossible for a cloud provider to remove individual pieces of content.”
Still, the legislation, which went from its first reading in parliament to senate committee approval in just over two weeks, is likely to become law.
Julie Inman Grant, the current eSafety Commissioner, said it was all about keeping Australians safe online.
“Overall, our goal is just to make the internet a safer and more civil place for Australians to enjoy, and we believe that this reform bill is an important next step in achieving this ambitious and worthy goal,” she said.
“We're privileged to be leading the globe as the world's online safety regulator, and we welcome the opportunity to provide even greater levels of online protection to our citizens.”