The world’s three largest pornography sites have just weeks to get serious about removing child sexual abuse material (CSAM), after the European Commission (EC) found that their high traffic warrants their regulation as ‘very large online platforms’ (VLOPs), alongside the likes of Google and Meta.

Investigations into the operations of Pornhub, Stripchat, and Xvideos led the EC to issue the order because each has more than 45 million average monthly users in the EU. That is the threshold set by the EC’s Digital Services Act (DSA), which requires online platforms and search engines to publish user numbers every six months and, based on those figures, qualifies them as VLOPs or very large online search engines (VLOSEs).

Last April, the EC designated 17 VLOPs – including Amazon, the Apple App Store, Facebook, Google Maps, TikTok, Twitter, and Wikipedia, among others – as well as two VLOSEs, Bing and Google Search.

Those designations gave the sites four months to introduce a range of risk management, reporting, and content moderation tools – including easier user reporting of illegal content; restrictions on the profiling of users; the right to opt out of profile-based recommendation systems; a ban on targeted advertising towards children; and system redesigns to improve minors’ safety, security, and privacy.

Regulated platforms must also work to address the dissemination of illegal content and what the EC called “negative effects on freedom of expression and information”, as well as improving the transparency of complaints handling processes and prioritising the handling of notifications from ‘trusted flaggers’.

Large platforms must also ensure that risk assessments and DSA compliance are externally audited, as well as regularly providing publicly available data to researchers, publishing regular transparency reports, and publishing repositories of all ads served through their interface.

National Digital Services Coordinators (DSCs) – which have legal powers to request access to data, order inspections, and fine service providers for infringements as well as certifying ‘trusted flaggers’ and bodies for out-of-court dispute settlement – have been established in each member state and will begin enforcing the supervision of VLOPs and VLOSEs on 17 February.

At that point, all 22 designated sites will need to comply with general DSA obligations – a change that “will allow for higher scrutiny and accountability of their algorithms and processes,” said Margrethe Vestager, executive vice-president for a Europe Fit for the Digital Age, who called the DSA “an essential tool to ensure that technology respects the fundamental rights of European citizens.”

A new sheriff for the web’s Wild West

The designation of the three adult platforms marks a significant shift for an industry that has long operated with relative impunity, even as regulators worldwide increasingly scrutinised mainstream tech giants.

Yet in recent months, that scrutiny has seen governments around the world taking increasingly concrete action to rein in Big Tech – whose concentrated and considerable power has raised concerns around transparency, competition, scams and security, consumer lock-in, and more.

Many recent changes in Australia dovetail with the DSA’s priorities, with increasingly data-hungry digital giants recently warned to stop ignoring customer complaints, put on notice over ‘dark patterns’ that ensnare users, told to improve the safety of dating apps, ordered to fight the spread of AI-generated illegal deepfakes, and forced to actively scan for CSAM and other illegal content under new industry-specific codes of conduct that came into effect on 16 December.

The threat of real financial penalties for breaches has recently gained teeth, with the Federal Court fining Airbnb $30 million for deceptive pricing, and the ACMA given the power to fine social media companies that fail to fight misinformation.

Days before Christmas, eSafety filed civil penalty proceedings against X Corp (previously known as Twitter) – which was recently kicked out of DIGI’s misinformation code after a “serious breach” – after the social-media giant ignored a September infringement notice fining it $610,500 for failing to meet transparency requirements under the Online Safety Act.

Australian authorities continue to fight to improve online safety, yet whether they will follow the DSA’s lead by extending regulation to adult sites remains to be seen: the Australian government recently shelved plans to mandate age verification for pornography sites, a core tenet of the EU’s approach.

The EC “will continue to designate platforms that meet the thresholds and make sure that they comply with their obligations under the DSA,” EC commissioner for internal market Thierry Breton recently promised, noting that “creating a safe online environment for our children is an enforcement priority under the DSA.”

Similarly minded, eSafety Commissioner Julie Inman Grant believes Australia is moving in the right direction: calling the new industry codes “a tremendously important online safety milestone” as they came into effect, she said that “what we have achieved… is very significant, world-leading, and will have a real impact when it comes to the online safety of all Australians.”