In late November, Australia’s Federal Parliament passed landmark legislation banning under-16s from accessing social media.

Details remain vague: we don’t have a complete list of which platforms will fall under the legislation, or how the ban will look in practice.

However, the government has signalled that trials of age assurance technologies will be central to its enforcement approach.

Video games and online gaming platforms are not currently covered by Australia’s social media ban.

But we can anticipate how enforcing an online ban might (not) work by looking at China’s large-scale use of age verification technologies to restrict young people’s video game consumption.

In China, strict regulations limit children under 18 to just one hour of online gaming on specified days.

This approach highlights significant challenges in scaling and enforcing such rules, from ensuring compliance to safeguarding privacy.

‘Spiritual opium’: Video games in China

China is home to a large video game industry.

Its tech giants, like Tencent, are increasingly shaping the global gaming landscape.

However, the question of young people’s consumption of video games is a much thornier issue in China.

The country has a deep cultural and social history of associating video games with addiction and harm, often referring to them as “spiritual opium”.

This narrative frames gaming as a potential threat to the physical, mental, and social wellbeing of young people.

For many Chinese parents, this perception shapes how they view their children’s play.

They often see video games as a disruptive force that undermines academic success and social development.

Parental anxiety like this has paved the way for China to implement strict regulations on children’s online gaming.

This approach has received widespread parental support.

In 2019, China introduced a law limiting online gaming for minors under 18 to 90 minutes per day on weekdays and three hours per day on weekends.

A “curfew” prohibited gameplay from 10pm to 8am.

A 2021 amendment further restricted playtime to a single hour, from 8pm to 9pm, on Fridays, Saturdays, Sundays and public holidays.

In 2023, China expanded this regulatory framework beyond online gaming to include livestreaming platforms, video-sharing sites and social media.

It requires these platforms to establish “systems for preventing addiction”.


Genshin Impact, developed by Chinese company MiHoYo, is one of the highest-grossing mobile games of all time. Photo: Shutterstock

How is it enforced?

Leading game companies in China have implemented various mechanisms to comply with these regulations.

Some games have incorporated age-verification systems, requiring players to provide their real name and ID to confirm their age.

Some have even introduced facial recognition to verify minors’ compliance, an approach that has sparked privacy concerns.

In parallel, mobile device manufacturers, app stores and app developers have introduced “minor modes”.

These are features on mobile games and apps that block access once a designated time limit is reached (with exceptions for apps pre-approved by parents).

A November 2022 report by the China Game Industry Research Institute — a state-affiliated organisation — declared success.

Over 75 per cent of minors reportedly spent fewer than three hours a week gaming, and officials claimed to have curbed “internet addiction”.

Yet these policies still face significant enforcement challenges, and highlight a wider set of ethical issues.

Does it work?

Despite China’s strict rules, many young players find ways around them.

A recent study revealed more than 77 per cent of the minors surveyed evaded real-name verification by registering accounts under the names of older relatives or friends.

Additionally, a growing black market for game accounts has emerged on Chinese e-commerce platforms.

These marketplaces allow minors to rent or buy accounts to sidestep the restrictions.

Reports of minors successfully outsmarting facial recognition mechanisms — such as by using photos of older individuals — underscore the limits of tech-based enforcement.

The regulation has also introduced unintended risks for minors, including falling victim to scams involving game account sellers.

In one reported case, nearly 3,000 minors were collectively scammed out of more than 86,000 yuan (approximately $18,500) while attempting to bypass the restrictions.

What can Australia learn from China?

The Chinese context shows that a failure to engage meaningfully with young people’s motivations to consume media can end up driving them to circumvent restrictions.

A similar dynamic could easily emerge in Australia, which would undermine the impact of the government’s social media ban.

In the lead-up to the law being introduced, we and many colleagues argued that outright bans enforced through technological measures of questionable efficacy risk being both invasive and ineffective.

They may also increase online risks for young people.

Instead, Australian researchers and policymakers should work with platforms to build safer online environments.

This can be done by using tools such as age-appropriate content filters, parental controls, and screen time management features, alongside broader safety-by-design approaches.

These measures empower families while enabling young people to maintain digital social connections and engage in play.

These activities are increasingly recognised as vital to children’s development.

Crucially, a more nuanced approach fosters healthier online habits without compromising young people’s privacy or freedom.

Tianyi Zhangshao is a PhD candidate at the University of Sydney.

Ben Egliston is a lecturer in digital cultures at the University of Sydney.

Marcus Carter is a professor in human-computer interaction at the University of Sydney.

This article is republished from The Conversation under a Creative Commons license.