The eSafety Commissioner will be able to tell search engines like Google to delete links pointing to offending content under legislation that comes into effect early next year.
Under the reconfigured Online Content Scheme, the eSafety Commissioner can direct internet companies such as social media platforms, ISPs, app stores, and even search engines to remove or restrict access to material based on the National Classification Scheme, which is commonly used for film, TV, and video game ratings.
Failure to remove content could see penalties of up to $555,000 per offence for corporations.
Communications Minister Paul Fletcher said the updated scheme gives the eSafety Commissioner global reach.
“While the Online Content Scheme has been in operation for over 20 years and has been very successful within Australia, the Online Safety Act extends the reach of the scheme overseas in recognition of the global nature of the internet,” he said.
“Online communities don’t have borders, and yet many Australians access materials online.
“This new scheme will hold industry accountable for keeping users safe and empower the eSafety Commissioner to remove the worst of the worst online content no matter where it is.”
The Online Content Scheme contains provisions for the Commissioner to issue different notices to internet services including notices for search engines to delete links and for app stores to remove apps if they are deemed inappropriate.
These aren’t indiscriminate powers, however, as the Commissioner’s scope is currently limited to material that has been refused classification or is otherwise rated at least R18+.
But it could have consequences for content such as controversial video games, which may pass censors in other jurisdictions only to be banned in Australia.
This year award-winning RPG Disco Elysium – the Final Cut was initially refused classification because of its depiction of drug use, although that decision was eventually overturned.
Adult material will also be more heavily restricted under the Online Content Scheme, which lets the eSafety Commissioner issue remedial notices to services that host R18+ content.
Restricted Access System
Included in the remedial notice is a provision to put R18+ content behind a ‘restricted access system’ that prevents access by people under the age of 18.
Exactly what qualifies as a ‘restricted access system’ is at the discretion of the eSafety Commissioner, which sought public consultation on these age verification systems this year.
A draft declaration says a ‘restricted access system’ for R18+ material “must incorporate reasonable steps” to confirm a user’s age, which needn’t include providing government identification but could involve linking unrelated services.
In its submission to the eSafety Commissioner, advocacy group Digital Rights Watch urged caution when implementing rules about restricted access systems, saying it should aim to take “the least privacy-invasive approach”.
“Rather than implementing an invasive regime for all adults in Australia, we would prefer to see an approach that prioritises making websites ensure that their content is more easily filterable by parental control software, as well as leveraging the ability for ISPs to filter content on specific devices to be used by children,” Digital Rights Watch said.
It also warned against allowing government agencies or private companies the ability “to track or link an individual’s identity with their online pornography viewing habits, or any other ‘age inappropriate material’ they may access”.
The eSafety Commissioner is due to register its ‘restricted access system’ declaration when the Online Safety Act comes into effect on 23 January 2022.