"The Tech Portal" : Leaked documents detail how Facebook moderates graphic, controversial content

We have today gained access to Facebook's internal content moderation documents. They provide a first-hand look at the blueprints and algorithms being employed to moderate content related to violence, hate speech, terrorism, pornography, racism, and self-harm. The guidelines define what users can post on the website and how content moderators need to act upon reported content; due to the volume of work, moderators often have only about 10 seconds to make a decision. One of the leaked documents also sheds light on how Facebook handles content related to child abuse and pornography. Meanwhile, several reports circulating online suggest that Facebook could face an instant backlash over its moderation policies for threats of violence and other graphic content.



Facebook leaked documents show types of content it allows: Guardian

http://bit.ly/2q7dThG

(Reuters) - Leaked Facebook Inc documents show how the social media company moderates issues such as hate speech, terrorism, pornography and self-harm on its platform, the Guardian reported, citing internal guidelines seen by the newspaper. Many of the company's content moderators have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing, the Guardian said. The newspaper gave the example of a Facebook policy that allowed people to live-stream attempts to self-harm because the company "doesn't want to censor or punish people in distress." "Keeping people on Facebook safe is the most important thing we do," Facebook said.

Leaked Facebook documents reveal problematic content removal standards: report

Leaked documents have revealed the exact guidelines for how Facebook removes content related to sex, terrorism, death threats, self-harm, suicide, and more, the Guardian reported. The leaked slides describe Facebook's policy for different types of "disagreeable or disturbing" content. "Not all disagreeable or disturbing content violates our community standards," the Guardian quoted Facebook as saying (the statement is actually part of Facebook's community standards page). To that end, the company hired 3,000 moderators to review reports from users about disturbing content, and these leaked documents show the standards those moderators will probably be using to make decisions on reported posts and videos.



collected by: Andro Alex
