The federal government is drawing fire from critics after deciding not to use “still new and evolving” age verification technology to restrict access to online pornography due to privacy, security, and technological concerns.
The decision – made months after the eSafety Commissioner submitted a proposed Age Verification Roadmap (AVR) to the government – comes two years after eSafety issued a call for evidence to inform the creation of a policy for addressing children’s access to online pornography, including techniques for online age verification.
Two rounds of consultations were held last year, with findings published in May and October, including a cross-sector workshop and a thematic analysis of the technology that concluded “a one-size-fits-all technological solution would not be effective.”
Existing age assurance technologies are “immature, but developing”, the government noted in its newly released response to the AVR, which mandated viable solutions must “work reliably without circumvention”; be “comprehensively implemented” including applicability to pornography hosted outside of Australia’s jurisdiction; and “balance privacy and security, without introducing risks to the personal information of adults who choose to access legal pornography.”
Noting that many age-verification technologies require personal information such as official government identity documents – and that many tools estimate users’ age from their photos, browsing or social-media behaviour – the response notes that “age assurance technologies cannot yet meet all these requirements,” adding that the AVR “makes it clear that a decision to mandate age assurance is not ready to be taken.”
Government cracking down in other ways
The government’s AVR response comes amidst a flurry of child-protection activity – including a new AFP call for public assistance in identifying the settings depicted in child-abuse material, and the recent takedown of a significant child abuse network.
Despite backing away from mandating age verification techniques – a move the Opposition called “inexplicable” and “impossible to understand” – the government has championed industry codes that will force the online industry to take “reasonable steps to make sure technological or other measures are in effect to prevent children accessing pornography”.
Those codes – five of which were approved in June, with three still under development – will apply to eight key sections of the online industry: social media services, messaging, file storage, search engines, app distribution, hosting, internet carriage services, and manufacturers and suppliers of any equipment that connects to the Internet.
From 16 December, Phase 1 of the codes will see service providers required to advise end users about content filtering products, and to proactively remove child sexual exploitation material and pro-terror material within 24 hours.
The subsequent Phase 2 will address ‘class 2’ content, including pornography – after which, the government said, it will reconsider any potential trial of age assurance technologies.
Setting technological expectations
Age verification technology is just one of many approaches on the table: emboldened by a government move to quadruple funding, eSafety Commissioner Julie Inman Grant earlier this year mooted the forced scanning of photos and emails, branding Twitter a “bin fire” after calling for an update on social media giants’ efforts to help fight child exploitation.
Even as Meta toed the line with an image removal tool designed to help teenagers, eSafety’s AVR advised that technological tools “should not be prescribed” while noting that “any online service provider that poses a risk of exposing children to pornography should have measures to prevent children gaining access.”
Such tools should “meet strict safety and privacy standards, be certified, and independently audited [with] the role of filtering and parental controls [to] also be considered,” eSafety advised, noting that “a one-size-fits-all technological solution would not be effective.”
“Technological requirements should be proportionate and based on risk”, it advised, with the online industry directed to “design technologies so they are easy for children and parents to understand – including information on how the technologies work and how they use, store and protect data.”
The AVR, Minister for Communications Michelle Rowland said in releasing the government’s response, “underscores the important role the regulator plays in registering industry codes or developing standards to keep children safe.”
“The Government supports this approach, and will work with the regulator to ensure the full and successful implementation of the Online Safety Act,” Rowland continued.
“While the Government awaits the outcome of this process, the digital industry is on notice that we will not hesitate to take further action should it fail to keep children safe.”