Facebook’s leaked internal rulebook reveals ambiguous guidelines for content removal

Livestreams of suicides and self-harm are allowed but can be marked as disturbing.

In the Netflix series Black Mirror, we are confronted with familiar and chilling fears about living in a world saturated with technology. 

What is eerie is how close we have come to finding ourselves in a technology dystopia. And in this dystopia, Facebook is increasingly taking centre stage, with the social media platform unwittingly enabling the public broadcast of suicides, murders and other acts of violence. 

But until now, we had very little actual information on how Facebook dealt with such content, and how much responsibility it should take for it. The one major accusation against the social media giant was that it was highly secretive about the way it monitored and removed inappropriate content. 

Now, leaked documents reveal how Facebook does it, and what they unveil is an ambiguous process of deciding what can and cannot be removed. 

The task of filtering obscenities, abuse and violence is entrusted to people now widely known as content moderators, whose job is to go through objectionable content and manually remove it. For that, Facebook has to lay down guidelines on how to interpret the content at hand. 

The leak, reported by the Guardian, reveals the ambiguities in the guidelines laid down by Facebook for its content moderators. 

The Guardian says that the documents lay out Facebook’s ‘secret rules and guidelines for deciding what its 2 billion users can post on the site’. 

It says it has seen more than 100 internal training manuals, spreadsheets and flowcharts, which offer an insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.

These guidelines give us a glimpse of the difficulties faced by content moderators who have to respond to new challenges thrown at them every time something inappropriate is uploaded. 

For instance, here is what some of the guidelines state: 

On suicide and self-harm: Livestreams of suicides and self-harm are allowed. While such content can be marked as disturbing, it does not always have to be deleted, because it can help create awareness of issues such as mental illness. Facebook says that users livestreaming or posting videos of self-harm are “crying out” for help, and therefore shouldn’t be censored. However, this content would be deleted once there was “no longer an opportunity to help the person”.

On violence and death: The guidelines make a distinction between content that should be removed immediately and content that can be marked ‘disturbing’. For instance, videos of mutilations are removed no matter what, whereas photos are marked as ‘disturbing’.

On child and animal abuse: Non-sexual child abuse is allowed, as long as it doesn’t have a ‘celebratory’ overtone that glorifies the abuse. Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as ‘disturbing’. Abuse shared with celebratory or sadistic intent will be removed. 

On sex and nudity: Facebook’s community standards generally prohibit most nudity. Hand-drawn depictions of sex and nudity are allowed, but digitally rendered artwork is not. The guidelines lay down parameters to identify and remove revenge porn. Videos and photos of abortions are allowed, as long as they don’t contain nudity.

On threats: For example, “I hope someone kills you” would not be removed by Facebook, since “people use violent language to express frustration online”. However, statements like “someone shoot Trump” would be removed, since he’s a head of state and is therefore in a ‘protected category’. ‘Generic’ threats like “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” will not be removed. 

According to CNN, these guidelines highlight the company's struggle to censor harmful content without being accused of trampling on freedom of expression.

The leaked documents are important because it is the first time that Facebook’s 2 billion users have a chance to assess the rules the company adheres to in monitoring content on its platform.
