On Monday the tech news landscape was suddenly awash with stories about Facebook after a group of 17 news outlets were given direct access to internal documents from whistleblower Frances Haugen.
Time and again, the articles published about Facebook’s inner workings show a company that, unsurprisingly, has prioritised profits over the safety and wellbeing of its users, and that has struggled to handle the scale of its global operations and the immense complexity that entails.
The Verge explained how Facebook uses a tier system to allocate content moderation resources between countries based on their risk of social instability, especially during elections.
The US, Brazil, and India were given the highest priority and the most active monitoring. Next came Germany, Indonesia, Iran, Israel, and Italy, which received fewer resources outside of election periods.
Another 22 countries were placed in the next tier, which went without the company’s “enhanced operations centres”, while the rest of the world was lumped together in the lowest tier.
The Verge noted significant disparities between how Facebook treated its moderation efforts in different countries, including lax misinformation classification in Myanmar and Ethiopia at times when they were experiencing severe violence and political crises.
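Based on The Verge’s description, the tiers can be pictured as a simple priority lookup. The sketch below is purely illustrative – the country lists come from the reporting, but the data structure and the moderation_tier helper are assumptions for exposition, not Facebook’s actual system:

```python
# Purely illustrative sketch of the tier system The Verge described.
# The country lists come from the reporting; everything else here is
# an assumption for exposition, not Facebook's actual implementation.

TIER_0 = {"US", "Brazil", "India"}  # highest priority, most active monitoring
TIER_1 = {"Germany", "Indonesia", "Iran", "Israel", "Italy"}  # extra resources mainly around elections

def moderation_tier(country: str) -> int:
    """Return the priority tier used to allocate moderation resources."""
    if country in TIER_0:
        return 0
    if country in TIER_1:
        return 1
    # Another 22 countries sat in tier 2, without "enhanced operations
    # centres"; they are not named in the reporting, so everything else
    # defaults to the lowest tier in this sketch.
    return 3

print(moderation_tier("India"))    # 0
print(moderation_tier("Germany"))  # 1
print(moderation_tier("Fiji"))     # 3
```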
Facebook has also struggled with specific instances of abuse on its platforms – although it apparently ramps up its efforts when its bottom line is threatened.
The Associated Press reported how in 2019 Apple threatened to ban Facebook and Instagram from its iOS App Store because the platforms were being used to buy and sell workers – maids in particular – in the Middle East.
Facebook had been aware of the poor conditions faced by cheap foreign workers in places like Saudi Arabia and Egypt, which included “trafficking people for the purpose of working inside private homes through the use of force, fraud, coercion or deception”.
It had known its platforms were being used to facilitate this human trade through programmatic ads on localised versions of its apps, but it was struggling to keep up with the scale of the problem as only a fraction of the advertisements were being reported.
Apple’s removal threat spurred Facebook into action, as the company recognised that being booted from the App Store “would have potentially severe consequences to the business”.
So when Facebook removed over 1,000 accounts and told Apple how it was working on the problem, Apple agreed to keep the apps alive – but, as the Associated Press reported, the problem of its platforms being used for human trafficking persists.
Maximising engagement
Then, of course, there are the issues with how Facebook tweaks its algorithms to heighten user engagement and maximise advertising revenue, often at the cost of amplifying problematic and divisive content.
As the Washington Post discovered, Facebook tends to give new features extra weight in its news feed rankings to encourage people to use them.
When the company added emoji reactions to posts five years ago, it gave them five times the weighting of a standard ‘like’ because reactions were a stronger indicator of user engagement.
One result was that controversial posts which garnered lots of ‘angry’ reactions were pushed higher in news feeds, helping foster a toxic environment where misinformation spread.
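To see why that weighting matters, here is a minimal sketch of the mechanism in Python – not Facebook’s actual ranking code, and the five-to-one ratio is simply the launch-era figure the Washington Post reported:

```python
# Illustrative sketch only -- not Facebook's ranking code. It shows how
# weighting emoji reactions more heavily than likes can push divisive
# posts above merely popular ones.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # reactions reportedly counted five times a 'like' at launch

def engagement_score(likes: int, reactions: int) -> int:
    """Weighted engagement signal used to rank posts in this sketch."""
    return LIKE_WEIGHT * likes + REACTION_WEIGHT * reactions

# A calm post with 1,000 likes scores 1,000, while a divisive post with
# 100 likes and 300 'angry' reactions scores 100 + 1,500 = 1,600 --
# so the angrier post ranks higher despite far less overall approval.
print(engagement_score(likes=1000, reactions=0))   # 1000
print(engagement_score(likes=100, reactions=300))  # 1600
```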
Internally, Facebook staff – including the research teams that Facebook cites as a reason for its trustworthiness – pointed out some of these flaws.
One snippet of an internal document shared by the New York Times concludes that hate speech and misinformation are, in many ways, intrinsically tied to Facebook’s operations.
“We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform,” the document read.
Facebook has repeatedly denied any allegation of wrongdoing or mismanagement throughout the month-long saga of leaked documents and public hearings with Haugen.
On the back foot
During the company’s earnings call on Monday morning, CEO Mark Zuckerberg said news outlets were trying to spin a negative narrative about Facebook.
“My view is that what we're seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company,” Zuckerberg said.
“The reality is that we have an open culture where we encourage discussion and research about our work so we can make progress on many complex issues that are not specific to just us.”
Certainly the effort was coordinated in the sense that news outlets agreed to hold off on posting their stories until the same time on Monday.
But the outlets have also found different angles in their reporting on Facebook, and in each case the social network comes off looking like Frankenstein’s monster: out of its creator’s control.
This is partly because of the nature of its business (maximising advertising revenue), but also because its global scale means Facebook affects the political, economic, and social realities of different countries and cultures around the world.
In response to the Facebook Papers this week, Zuckerberg said his company has to “balance social equities” when it makes decisions.
He offered the following set of dichotomies: between free speech and stopping harmful speech; between privacy and helping law enforcement; and between allowing interoperability and locking down personal data.
“It makes a good soundbite to say that we don't solve these impossible tradeoffs because we're just focused on making money, but the reality is these questions are not primarily about our business, but about balancing different difficult social values,” Zuckerberg told shareholders.
“And I've repeatedly called for regulation to provide clarity because I don't think companies should be making so many of these decisions ourselves.”