Facebook does not delete videos of death, abuse and self-harm, leaked guidelines reveal
The social media giant's 4,500 moderators are struggling to police two billion users.
Facebook will not delete videos of death and self-harm because it does not want to censor its users, leaked guidelines from the social media giant have revealed.
Dubbed the Facebook Files, the guidelines published by the Guardian detail how the company's 4,500 moderators struggle to police what its two billion users post.
While videos of violent deaths are marked as disturbing, the rules state they do not always have to be deleted because they can "help create awareness of issues such as mental illness."
Facebook also said it did not "action photos of child abuse. We mark as disturbing videos of child abuse. We remove imagery of child abuse if shared with sadism and celebration".
However, imagery of non-sexual child abuse can be shared so "the child [can] be identified and rescued, but we add protections to shield the audience".
Facebook will also allow people to livestream attempts to self-harm because it "doesn't want to censor or punish people in distress". Videos of abortions are also permitted, as long as there is no nudity.
The guidelines state that any comments about US President Donald Trump suggesting his life is in danger, such as "Someone shoot Trump", should be deleted because, as a head of state, he is in a protected category.
The use of expletives is also deemed permissible as Facebook acknowledges "people use violent language to express frustration online" and feel "safe to do so" on the site.
Facebook conceded that "not all disagreeable or disturbing content violates our community standards".
However, the reports suggest that moderators are ill-equipped to contend with new challenges such as "revenge porn" and are overwhelmed by the volume of work, leaving them with little more than 10 seconds to decide whether content is suitable.
"Facebook cannot keep control of its content," said one source. "It has grown too big, too quickly," it said.
'We feel responsible to our community to keep them safe'
Monika Bickert, Facebook's head of global policy management, said that while the company has a responsibility to keep its users safe, moderating the content of two billion people is a near-impossible feat.
"We have a really diverse global community and people are going to have very different ideas about what is OK to share. No matter where you draw the line there are always going to be some grey areas. For instance, the line between satire and humour and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not," she said.
"We feel responsible to our community to keep them safe and we feel very accountable. It's absolutely our responsibility to keep on top of it. It's a company commitment. We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards."
While critics in the US and Europe have called for the company to be regulated in the same way as mainstream broadcasters and publishers, Bickert says that Facebook's remit is distinct from that of broadcasters.
"It's not a traditional technology company. It's not a traditional media company. We build technology, and we feel responsible for how it's used. We don't write the news that people read on the platform."