Social media managers must be vigilant for defamatory content after a NSW Supreme Court justice found that media companies can be held liable for comments that their readers post online.
The case arose after major media organisations shared links to their stories about Dylan Voller – whose mistreatment as a teenager in NT youth detention prompted a 2016 Four Corners investigation – on their public Facebook pages.
User comments included negative allegations about Voller’s behaviour and past, leading him to sue News Corp, Fairfax Media and Sky News for defamation.
He prevailed after Justice Stephen Rothman determined that the companies were legally the publishers of the comments because they had failed to detect and take them down.
“Unlike the publication of a mass circulation daily, or the usual website or electronic version of a newspaper, the purpose of the public Facebook page is not to disseminate news,” Rothman wrote in Voller v Nationwide News Pty Ltd & Ors.
“Rather... the purpose of a public Facebook page is to excite comments and interest from and by the public.”
When has a comment been published?
The case has attracted worldwide attention – and anger from media outlets that already spend many hours curating social media content every day.
Facebook is a major conduit for the publishers: the 20 to 30 stories posted daily generate around 39 percent of visits to The Australian’s website alone.
Fairfax Media social media administrators publish around 50 stories every day, each typically attracting between 100 and several thousand comments over many days.
That made it “physically impossible”, social media editor Sophia Han Thuy Phan testified, for her and her assistant to monitor and act upon every comment.
The media outlets pleaded the defence of “innocent dissemination” under s32 of the Defamation Act 2005 (NSW), which protects a party that “published the matter merely in the capacity, or as an employee or agent, of a subordinate distributor.”
A ‘subordinate distributor’ is an organisation that was not the “first or primary distributor” of the matter; was not the author or originator of the matter; and had no “capacity to exercise editorial control over the content of the matter (or over the publication of the matter) before it was first published.”
Operators of social media pages have generally evaded liability unless someone has complained about a specific comment – and even then removal is not always guaranteed.
In the 2017 appeal in Google Inc v Duffy, for instance, Google was held liable for defamatory search results excerpted from a third-party website only because it had failed to remove them after being notified.
The Voller decision, however, holds companies liable for defamatory comments – even when the comments have not been vetted or complained about.
A new standard for liability
The decision settles just one point of law in a larger defamation case, but its implications have sent legal and social media experts into a frenzy.
“If you operate, host or administer a Facebook page, whether it be large or small, which encourages and elicits contributions from others… you are potentially exposing yourself to liability for the posting handiwork of others,” Nick Stagg and Jasmine Sims – a partner and associate, respectively, of WA law firm Lavan – warned in a blog.
Even comments older than the limitation period for defamation actions – 12 months in WA – could create exposure, they warned, because “each time an online content is ‘downloaded’ by a person and read by them there is a fresh ‘publication’.”
Ugur Nedim, principal of Sydney Criminal Lawyers, wrote in his analysis that the decision “means administrators of social media pages will need to be vigilant to ensure that potentially defamatory comments are deleted in a timely manner.”
Minimising your exposure
The decision is a wake-up call for any brand that uses blogs or social media platforms.
By effectively requiring staff to monitor social media comments continuously, it raises legal, moral, and logistical issues.
The defendants in Voller argued they had no way to vet comments before publication because Facebook does not provide one – but this was shot down after an expert witness testified that loading a list of very common words into Facebook’s keyword filter would hide virtually every new comment until it could be reviewed.
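To make the mechanics concrete, here is a minimal Python sketch of that kind of blunt pre-moderation filter: hold back any comment containing a “filtered” word, and seed the filter with words so common that virtually every comment is held for review. The word list and the is_held_for_review helper are hypothetical illustrations of the technique described in testimony, not Facebook’s actual filtering API.

```python
# A minimal sketch of the workaround described in testimony: hide any
# comment containing a "filtered" word until a moderator approves it.
# Seeding the filter with the most common English words means almost
# every comment matches, so effectively all comments are pre-moderated.
# (Hypothetical illustration only; not Facebook's filtering API.)

FILTER_WORDS = {"the", "a", "i", "to", "and", "of", "you", "it", "is", "in"}

def is_held_for_review(comment_text: str) -> bool:
    """Return True if the comment should be hidden pending moderation."""
    words = {w.strip(".,!?\"'").lower() for w in comment_text.split()}
    return bool(words & FILTER_WORDS)

if __name__ == "__main__":
    samples = [
        "I think the story is unfair",  # matches common words -> held
        "Disgraceful!",                 # no match -> published immediately
    ]
    for text in samples:
        status = "held for review" if is_held_for_review(text) else "published"
        print(f"{status}: {text}")
```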
Even where filters are being used, Nedim warned, “administrators could potentially become liable as soon as just one person sees the defamatory remark.”
That’s a problem because, the court heard, even comments blocked from public view by Facebook’s rudimentary filters remain visible to the poster and everyone in their Facebook network.
The judgement was no surprise to Larah Kennedy, team lead and community consultant with social media advisory firm Quiip.
Writing for marketing journal B&T, Kennedy opined that brands must take control of the conversations around their products – and that “they are also more widely responsible for ensuring the online spaces they create are safe and welcoming.”
“We wouldn’t step into an organisation’s bricks and mortar establishment and expect to be verbally attacked or discriminated against by another patron,” she wrote. “And if it did happen, we would certainly expect the business to intervene.”
“For an organisation to claim that they are not aware of the comments or discussions happening in an online space they have created is lazy at best and downright [negligent] at worst.”