The federal government will move to legislate a digital duty of care for social media companies, putting the onus on these tech giants to proactively keep their users safe.
Communications Minister Michelle Rowland announced the new legislation in a speech to the Sydney Institute on Wednesday night, saying it will be based on similar existing provisions in the United Kingdom and European Union.
A legislated digital duty of care is a key recommendation from the independent statutory review of the Online Safety Act, which was completed by former ACCC deputy chair Delia Rickard and is yet to be publicly released.
It will mean that social media companies will be required to take “reasonable steps” to prevent foreseeable harm occurring on their platforms, including through conducting risk assessments and risk mitigations and implementing safety-by-design principles.
There will also be “strong penalty arrangements” in place if a tech firm “seriously and systematically” breaches these new obligations.
“Online services operating in Australia can do more to proactively protect their users – which is why our government is developing a digital duty of care,” Rowland said.
“The duty of care will put the onus on industry to prevent online harms at a systemic level, instead of individuals having to ‘enter at their own risk’.
“Australia’s online safety laws are world-leading, but they aren’t set and forget.
“Legislating a duty of care is a key step in delivering on the review and ensuring greater protections for all Australians online.”
Mitigating harms
Rowland said that the amendments to the Online Safety Act will futureproof the regime and place a legal responsibility on social media companies to keep Australians safe while using their platforms.
As part of the new duty, these companies will have to continually identify and mitigate potential harms, including harms to young people and to mental wellbeing, the instruction or promotion of harmful practices, and other illegal content.
The review into the Online Safety Act found that it does not “in a fundamental sense, incentivise the design of a safer, healthier, digital platforms ecosystem”, Rowland said.
“What’s required is a shift away from reacting to harms by relying on content regulation alone, and moving towards systems-based prevention, accompanied by a broadening of our perspective of what online harms are,” she said.
The legislation will be aligned with similar laws in the UK and EU, Rowland said.
“This, as part of a growing global effort, will deliver a more systemic and preventative approach to making online services safer and healthier,” she said.
“A duty of care is a common law concept and statutory obligation that places a legal obligation on a party to take reasonable steps to protect others from harm.
“It is a proven, workable and flexible model.”
War on big tech
The new legislated duty of care comes just a week after the Labor government announced it would move to legislate a minimum age of 16 for Australians to use a range of social media services.
The regulation of big tech and efforts to keep Australians, especially children, safe online are now shaping up to be a key election issue next year.
The minimum social media age was ticked off by the state and territory governments late last week, and also has the backing of the federal Opposition.
The age ban will likely include platforms such as Facebook, Instagram, TikTok, X and possibly YouTube.
At a Senate Estimates hearing, a representative from the Department of Communications admitted that such a scheme will involve all Australians having to verify their age with a social media platform, rather than just children.