Age assurance could become mandatory for most online services, based on a new exposure draft of the Privacy (Children’s Online Privacy) Code that obligates tech firms to “take reasonable steps” to verify users’ age before collecting data about them.

The obligation is one of several measures in the new exposure draft, which lays down new rules for companies collecting data about people under 18 years of age – requiring them to “establish privacy protections by default in the design of their online services.”

The Children’s Online Privacy (COP) Code is an Australian Privacy Principles (APP) code – meaning that any company bound by the existing APP framework will also be obligated by the COP Code, and breaches will be considered “interference with the individual’s privacy” and “subject to investigation.”

Unlike the recent social media minimum age (SMMA) restrictions – which apply to a prescribed list of services – COP applies to any firm offering non-health services likely to be accessed by children or “primarily concerned with the activities of children”.

This includes applications that track early childhood development, family photo sharing tools, online school management systems and online baby monitors, meaning the COP Code will apply to a broad range of hardware and software providers.

Firms must destroy “sensitive information” like ID documents after they are used for age verification – aiming, the explanatory notes say, to “reduce risks that may be associated with the collection of sensitive information that may arise from ascertaining age.”

Entities that collected personal information before the COP’s commencement will need to take “reasonable” steps to ascertain end-users’ age before collecting any further information about them, implying a broad age verification requirement for online sites.

Acting early to avoid exploitation of children online

The draft’s release comes days after the OAIC revealed the results of the latest Global Privacy Enforcement Network (GPEN) sweep, which saw 27 data protection and privacy authorities examine nearly 900 websites and apps that are used by children.

Fully 59 per cent of the websites and mobile apps required an email address to access the platforms’ full functionality, while half required usernames and 46 per cent demanded geolocation.

Some 71 per cent provided no information about their protective controls and privacy practices for children, while just 35 per cent of those with “high risk data processing and design features for children” used popups to prompt users for parental permission.

And 72 per cent of websites and mobile apps failed to stop sweep participants from circumventing age assurance, with many still using self-declaration practices that have been deprecated by the SMMA’s requirement for stricter controls.

“By the time a child turns 13, around 72 million pieces of data will have been collected about them,” privacy commissioner Carly Kind said, “making them vulnerable to harms from data breaches, discrimination, algorithmic bias and targeted advertising.”

“The code will not restrict or limit children and young people’s participation in online spaces,” she added, saying that it “raises the standard for privacy protections in Australia and puts the onus on online services to do better when handling children’s personal information online.”

Empowering children to control their data

A key goal of the new code is to provide agency for children, including a requirement that children’s accounts be set to ‘high privacy by default’ and that children be empowered to control and block the collection and use of their data.

More than 65 consultations with academics, child welfare bodies and industry – along with input from 337 children, young people, parents and carers who completed “age-appropriate” worksheets – found that they “place a high value on online privacy, including a desire for greater transparency”.

They also wanted “clearer, more accessible and transparent information designed for children” that would give them “greater control of their personal information” including limits on direct marketing, location tracking and ‘nudge’ techniques.

“A child should be able to turn on or off elements such as personalised content, targeted advertising or recommendations that rely on the handling of a child’s personal information,” the OAIC explains, with data destroyed when options are disabled.

That means, for example, that a child signing up for a messaging service could be required to provide “limited personal information” to send direct messages, but the provider could not collect data for personalised recommendations or targeted advertising.

Furthermore, tech firms will have to “consider the best interests of children” before collecting, using or disclosing their personal data – a requirement tied to Article 3 of the UN Convention on the Rights of the Child (UNCRC).

That includes “proactively assessing the risk of harm” such as child exploitation, damage to children’s physical, psychological, emotional, social and cognitive development, or harm to their ability to develop and express their views and identities.