Leaders of the world’s biggest economies pledged to combat online extremism when they met at the G20 summit in Osaka last weekend.

Prime Minister Scott Morrison proposed the G20 Osaka Leaders’ Statement on Preventing Exploitation of the Internet for Terrorism and Violent Extremism Conducive to Terrorism.

Agreed upon by 20 of the world’s largest economies, the statement outlined global expectations for social media giants like Facebook and YouTube to “do their part” to combat terrorism.

“The impetus of this is to say to the companies, ‘You have the technology, you have the innovation, you have the resources, and now you have the clear communicated will of the world's leaders that you need to apply that to get this right,’” Morrison told journalists in Osaka.

“This was about simply trying to ensure that we all were agreed that the internet should not become a weapon of terrorists.”

The statement aims to hold social media companies accountable for the way their services are used.

“We urge online platforms to meet our citizens' expectations that they must not allow use of their platforms to facilitate terrorism,” the Leaders’ Statement says.

Digitalisation and innovation were a key focus of the G20 summit. Emerging technologies, responsible use of AI, the free flow of data, and cyber security were also agreed on as priority areas in the transition to what Japan is calling ‘Society 5.0’.

“For us all to reap the rewards of digitalisation, we are committed to realising an open, free and secure internet,” the Leaders’ Statement says.

“The internet must not be a safe haven for terrorists to recruit, incite or prepare terrorist acts. To this end, we urge online platforms to adhere to the core principle … that the rule of law applies online as it does offline.”

Australian Taskforce

The G20 Leaders’ Statement came a day before the government released the report of the Australian Taskforce to Combat Terrorist and Extreme Violent Material Online.

Among recommendations for greater transparency and reporting from digital platforms, the taskforce also called for the eSafety Commissioner to develop a protocol that would allow it to block “websites hosting offending content” during crisis events.

The eSafety Commissioner currently has the power to “give written directions” to carriers and service providers under the Telecommunications Act.

A ‘testing event’ has also been recommended to take place within the next 12 months.

This would be a simulated scenario “which will allow all parties to gauge whether industry tools, and Government processes, are working as intended”.

Social media platforms have been under increased scrutiny following the Christchurch massacre, which was streamed live on Facebook.

Facebook reportedly removed more than a million videos of the attack that killed 51 people, but thousands more went undetected by its AI-powered filters.

Federal parliament quickly passed new laws that would hold the executives of social media platforms accountable if they failed to “remove abhorrent violent material expeditiously”.