Leaked Facebook documents reveal problematic content removal standards: report
The leaked documents reveal detailed guidelines on how Facebook moderates content related to sex, terrorism, death threats, self-harm, suicide and more, The Guardian reported. A series of presentation slides details how Facebook handles reports about posts, comments and videos that violate the site’s rules. The company gives considerable leeway to certain types of violent content, such as self-harm and threats, while relying on “newsworthiness” to decide whether videos and live streams involving suicide and terrorism should be removed. “Not all unpleasant or disturbing content violates our community standards,” Facebook told The Guardian (a statement that also appears on its community standards page).

Facebook’s approach to reported content

Here is how the slides describe Facebook’s policy for different types of “unpleasant or disturbing” content.

1) Abuse of children and animals: Sexual abuse of children is never allowed. Other imagery of child abuse may remain, provided it lacks a “celebratory” element that glorifies the abuse. Animal abuse is allowed for the most part, but particularly graphic or upsetting imagery should be marked as “disturbing”; content marked “disturbing” can only be seen by users (over 18 years of age) who specifically choose to view it. As with child abuse, animal abuse shared with celebratory or sadistic intent will be removed.

2) Suicide and self-harm: Live streams of suicide attempts and self-harm are allowed. On one of the slides, Facebook says that people who live-stream or display self-harm are making a “cry” for help online and therefore should not be censored. One of the documents says Facebook acted on advice from the Samaritans and Lifeline, both non-profit anti-suicide help lines operating in the United Kingdom and the United States. Regarding suicide, “experts have told us that what is best for these people’s safety is to leave the stream live as long as they are engaging with viewers,” one of the documents says. However, the content would be deleted once there was “no longer an opportunity to help the person.”
“Sometimes we see particular moments or public events that are part of a broader public conversation that warrants leaving that content on our platform,” Monika Bickert, Facebook’s director of public policy, told The Guardian. She cited the example of a video of an Egyptian taxi driver who set himself on fire to protest the government and high prices, which Facebook decided not to remove.

3) Violence and death: The slides describing how to handle graphic violence and death also draw a distinction between removing content and labeling it “disturbing.” For example, videos of mutilation are deleted outright, while photos are marked as “disturbing.” There are exceptions for content that “documents atrocities,” but it must also be marked as disturbing.

4) Threats: In a slide titled “Credible Violence Examples,” Facebook listed “credible” threats that justify removal, as well as “generic” threats that do not. For example, “I hope someone kills you” will not be deleted by Facebook, because “people use violent language to express frustration online” and this is an example of people doing just that. However, a statement like “Someone shoot Trump” will be erased, since as a head of state he falls into a “protected category.” Another example of a “generic” threat that Facebook would not remove: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat.”
