The internal rules and guidelines used by Facebook to decide what content to remove have been revealed, detailing how the tech giant is struggling to keep up with its growing role as a publisher.

More than 100 Facebook training manuals, spreadsheets and flowcharts were leaked to The Guardian, showing how the company tells its moderators what to delete from its platform.

The documents reveal how the company approaches violent content, hate speech, terrorism, pornography, racism and self-harm, and why so much of this content is allowed to remain online.

Facebook is currently under extreme pressure, especially in Europe and the US, over some of the content that it hosts, with many politicians arguing that it should be treated like a media publisher and take on more responsibility for what is posted on its platform. Facebook recently lost a court case in Austria, where the court ruled that offensive material had to be removed from the platform globally, not just in the relevant jurisdiction.

The training manuals show how Facebook views many violent threats as not credible and allows them to remain on the site. According to the documents, a post saying “someone shoot Trump” will be deleted because heads of state are protected, but posts such as “f*** off and die” or “to snap a b****’s neck, make sure to apply all your pressure to the middle of throat” are allowed because they are “not regarded as credible threats”.

Innovation and tech consultant Kate Raynes-Goldie says this reflects a general societal culture of ignoring misogynistic hate speech.

“They crack down on hate speech with respect to race and ethnicity, and rightly so, but when it comes to gender they don’t,” Raynes-Goldie tells Information Age. “It still seems that speech like that is somehow deemed okay, and that’s reflected in these rules.”

Frustration vs credible threats

The leaked documents also show the surprising amount of offensive or disturbing content that Facebook allows to remain on its platform for a number of reasons.

The company says that videos of violent deaths are sometimes allowed because they may create awareness of issues like mental illness, photos of animal abuse are permitted, and certain photos showing the non-sexual abuse and bullying of children are allowed unless they are accompanied by sadistic or celebratory captions.

Facebook has been met with growing complaints over how it deals with threats and abuse, especially directed towards women. The internal documents reveal how the tech firm distinguishes between “frustration” and credible threats.

“People use violent language to express frustration online [and] feel safe to do so [on the site]. They feel that the issue won’t come back to them and they feel indifferent towards the person they are making the threats about because of the lack of empathy created by communication via devices as opposed to face to face,” the documents say.

On this basis, Facebook treats comments such as “I’m going to kill you” as an “expression of frustration” and does not remove them.

This fails to address the fact that women bear the brunt of violent threats on social media, Raynes-Goldie says.

“It’s definitely something disproportionately faced by women, and this doesn’t seem to protect women as much as it should,” she says.

The pressure on moderators

Facebook currently has 4500 ‘content moderators’ who sift through reports and determine whether content should be taken down. The leaked documents are used during these moderators’ two weeks of training. The moderators are subcontractors based around the world, with Facebook planning to hire a further 3000 to tackle its new challenges.

A lot of pressure is placed on these moderators, who sometimes only have 10 seconds to make a decision, a source tells The Guardian.

“Facebook cannot keep control of its content. It has grown too big, too quickly,” the source says.

Raynes-Goldie says these moderators shouldn’t be blamed for the current issues.

“I’m really glad this isn’t my job,” she says. “These people have to look at this stuff all day long every day, and that must be hugely traumatic for them. I don’t know what the answer is but I don’t think this is working.”

Facebook’s head of global policy management, Monika Bickert, wrote a lengthy reply following the revelations, saying that the process of reviewing content is “complex, challenging and essential”.

“The articles and training materials published alongside this show just how hard it can be to identify what is harmful - and what is necessary to allow people the ability to share freely,” Bickert says.

“We face criticism from people who want more censorship and people who want less. We see that as a useful signal that we are not leaning too far in any one direction. The alternative - doing nothing and allowing anything to be posted - is not what our community wants.

“We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any.”