Social networking and other digital platforms would be forced to verify the identities of advertisers, app developers and merchants under the ACCC’s proposed suite of service-specific reforms, which also includes “user-friendly” methods for challenging fake reviews.

“Widespread, entrenched, and systemic” consumer harms were persisting across a range of digital platform services, ACCC chair Gina Cass-Gottlieb said in launching the fifth report in the organisation’s five-year Digital Platform Services Inquiry (DPSI), which kicked off in 2020 with a remit to closely examine competition and consumer harms in the market.

“The expansion of digital platforms in Australia has brought many benefits to Australian consumers and businesses,” Cass-Gottlieb said, but “this expansion… has also created risks and harms that our current consumer and competition laws are not always able to address.”

The new report focuses on possible regulatory reform to rein in digital giants whose importance, widespread use, and lack of competition “creates opportunities and incentives for these platforms, and unscrupulous parties using their services, to engage in conduct that harms consumers, competition, and the economy.”

As ‘gatekeepers’ between businesses and consumers, Cass-Gottlieb said, digital platforms like Google, Meta, and Apple had built “broad influence across the economy” that enabled anti-consumer outcomes – such as their demonstrated ability to preference their own services or impose “tying arrangements” in which users of one service are pushed to use another service.

Provision of free services increased customers’ exposure to advertising and the harvesting of personal data, the report noted, in a nod to the recent haemorrhaging of such data after breaches at Medibank, Optus, and other organisations.

“These poor competition outcomes harm consumers and business users,” Cass-Gottlieb said in calling for “service-specific codes of conduct” that would force designated digital platforms to actively protect consumers’ interests.

Those codes could, for example, prevent self-preferencing, tying, and exclusive pre-installation arrangements; address advantages created by the use of data; ensure “fair treatment” of business users; and improve switching, interoperability, and transparency.

Recommendations for codes of conduct complement the report’s recommendation that new laws be passed to force digital platforms to give consumers five key protections.

These include providing “user-friendly processes” to help consumers report scams, harmful apps, and fake reviews – and requiring digital giants to respond to such reports; verifying advertisers, app developers, and merchants to fight scams; and publishing review verification processes so consumers can better evaluate reviews’ reliability.

The laws would also require digital platforms to report on scams, harmful apps, and fake reviews on their services, and to publicise their efforts to fight such content.

Empowering consumers and small businesses

In a climate where malicious actors are leveraging digital platforms to drive soaring scam losses, the report recommends creating a new digital platform ombudsman to give consumers and small businesses access to “appropriate dispute resolution”, countering platforms’ inaction when challenged to moderate problematic online content.

Problems with scams and fake reviews “have been made worse by a lack of avenues for dispute resolution for consumers and small businesses,” Cass-Gottlieb said, “who often simply give up on seeking redress because they cannot get the digital platforms to properly consider the problem.”

The recommendations of the report – which attracted over 90 submissions after publication of a discussion paper earlier this year – echo the sentiments of consumer advocacy organisations like Choice, which argued for “urgent regulatory reform” including “principles-based obligations”, strengthening Australian Consumer Law protections, and addressing “well documented” harmful practices that are “oppressive, exploitative or contrary to consumer expectations of fairness.”

Industry group Free TV Australia – which has long argued for tighter regulation of digital giants – welcomed the proposal for targeted mandatory codes of conduct, with chief executive Bridget Fair saying that such codes “put Australia in step with international developments, such as in the EU.”

“The longer we take to get the necessary legislation in place,” she said, “the harder reform will become as the tentacles of dominant platforms continue to expand across the economy.”

Shadow Minister for Communications Sarah Henderson was quick to confirm that the crackdown has bipartisan support, pressing the Albanese Government to “urgently act on” the report’s recommendations.

“We urge the government to get on with the critical job of reining in the anti-competitive and harmful conduct of global tech giants such as Meta, Apple and Google,” she said.

The sixth DPSI report, due to be published by the end of March 2023, will explore competition amongst social media apps – including whether TikTok’s rapid rise has broken Facebook’s long-held dominance of the sector.