Meta has announced a significant backdown on its moderation policies just weeks before Donald Trump begins his second term as US President, with third-party fact-checkers axed and content restriction efforts scaled back.
Meta CEO Mark Zuckerberg revealed the overhaul in a video message this week where he said that moderation on the company’s platforms including Facebook and Instagram has gone too far, leading to “too many mistakes and too much censorship”.
Zuckerberg said that Trump’s election win in November last year represented a “cultural tipping point” towards an emphasis on freedom of speech, and that a number of changes will be made by Meta to “get back to our roots around free expression”.
Most significantly, Meta will be axing all of its third-party fact-checkers in the US and moving towards an X-style Community Notes system, lifting restrictions on discussions around certain topics such as immigration and gender identity, and promoting more political content into the feeds of users.
“What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas and it’s gone too far,” Zuckerberg said.
It’s a significant reversal in Meta’s stance on moderation and misinformation and comes as Zuckerberg has attempted to mend his fractured relationship with Trump, who has brought X CEO Elon Musk into his inner circle.
The end of fact-checkers
Starting in the US, Meta will be scrapping its third-party fact-checking program, under which contracted organisations review content posted on its platforms and mark posts as untrue.
Zuckerberg said these fact-checkers had become unreliable and biased.
“The fact-checkers have been too politically biased and have destroyed more trust than they’ve created,” Zuckerberg said in the video.
The changes will initially apply only to fact-checkers in the US, and Australian fact-checkers, including AAP FactCheck, are understood to be continuing their work this year.
This scheme will be replaced by a community notes program, where user consensus will decide what is misleading and what needs more context.
This is similar to the policy adopted by X following the takeover by Musk.
Meta will also be refocusing its use of automation to scan content for policy violations to only illegal and high-severity violations, such as terrorism, child sexual exploitation and drugs.
It will rely on users reporting posts containing less severe policy violations before any action is taken.
The company will be moving its trust and safety teams out of California and into Texas.
Meta’s new chief global affairs officer Joel Kaplan said that Meta is currently “over-enforcing” these moderation rules.
He said in December last year that the company removed millions of pieces of content each day, and that an estimated one or two out of every 10 of these posts may have been removed in error.
“We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement,” he said.
Restrictions on posts about topics such as immigration and gender identity will also be lifted as part of the massive changes.
“It’s not right that things can be said on TV on the floor of Congress, but not on our platforms,” Kaplan said.
Zuckerberg admitted that these changes will mean that the tech giant is “going to catch less bad stuff”.
Meta will also reverse a move made in 2021 to reduce the amount of political content served up to its users, with this type of content to now be treated the same as other posts.
Zuckerberg added that he will now aim to work with the new Trump administration to push back against international efforts to censor social media platforms.
“We’re going to work with President Trump to push back on governments around the world that are going after American companies and pushing to censor more,” he said.
Misinformation ‘free-for-all’
In Australia, Greens Senator Sarah Hanson-Young slammed the changes, saying that “it’s going to mean a free-for-all on misinformation, disinformation, abuse and trolling”.
“This is a very, very dangerous move at a time when members of the community, parents, young people – women in particular – are increasingly concerned about the unsafe environment on these big platforms,” Hanson-Young said.
Dr Alexia Maddox, director of digital education and senior lecturer in pedagogy at Melbourne’s La Trobe University, said the timing of Meta’s announcement could be particularly significant for Australia.
“With our federal election approaching, these changes to Meta's platforms could substantially impact how Australians access and verify political information,” Dr Maddox said.
"In Australia, specifically, we currently have three professional fact-checking organisations working with Meta (RMIT FactLab, AFP, and AAP FactCheck).
“The shift to a community-based system raises questions about maintaining verification standards in our market, where we already have limited fact-checking resources compared to larger markets like the US.”