Facebook Reveals The Lengthy Rulebook It Uses To Remove Posts

Wednesday, 25 Apr, 2018

The community standards also do not address false information - Facebook does not prohibit it but it does try to reduce its distribution - or other contentious issues such as use of personal data. Indeed, there are thousands of words split across the multiple sections of the community standards guidance; expecting each individual team member to reach the same conclusion for every incident is impossible.

"You should, when you come to Facebook, understand where we draw these lines and what's OK and what's not OK", Bickert told reporters in a briefing at Facebook's headquarters. Facebook will also hold a series of summits where the general public will be able to give feedback on these policies.

The new 27-page guidelines detail the policies and give examples of how content moderators will be expected to decide whether to take down content. "If we've made a mistake, we will notify you, and your post, photo or video will be restored".

Facebook promises that appeals will be reviewed within 24 hours by its Community Operations team.

Billoo herself was censored by Facebook two weeks after Donald Trump's election, when she posted an image of a handwritten letter mailed to a San Jose mosque and quoted from it: "He's going to do to you Muslims what Hitler did to the Jews".

It's all part of Facebook's attempt to better control - and do so more transparently - what's going on across the site, particularly in the wake of controversies around its involvement in the 2016 US presidential election. "We make mistakes because our processes involve people and people are not infallible", Ms Bickert said in a blog post.

Facebook was also forced to temporarily shut down its services in Sri Lanka last month after inflammatory messages posted to the service incited mob violence against the country's Muslim minority. It also removed some of the accounts that attacked Oluo.

Note: Facebook has not changed the actual rules - it has just made them public. "This whole process assumes there is - or perhaps creates - some global, universal standard of decency and appropriateness". Both sides had called for greater transparency. Conservatives say there's too much influence from the left-leaning Silicon Valley, with its liberal leadership and employees. After all, Facebook has been accused of various forms of censorship in the past.

Even if a person agreed to be taped or photographed, for example, they may not have agreed to have their naked image posted on social media.

Now we know the company defines a hate organization as one with "three or more people that is organized under a name, sign, or symbol and that has an ideology, statements, or physical actions that attack individuals based on characteristics, including race, religious affiliation, nationality, ethnicity, gender, sex, sexual orientation, serious disease or disability".

Rules on Facebook are different from laws in the United States and elsewhere. That means Facebook deletes calls for violence or slurs that may be protected free speech in the US under the First Amendment. They are required to list the organizations outside Facebook with which they consulted. "We are trying to strike the line between safety and giving people the ability to really express themselves".

The release of Facebook's rules draws attention to the hard work of the 7,500 content moderators - many of them subcontractors - who are tasked with applying the rules to the billions of pieces of content uploaded to the platform. Facebook says it receives millions of reports every week and uses automation to route those reports to the right content reviewer.

Hopefully, Facebook will absorb what is useful, and reject what is useless, and establish itself as a flawed - but constantly improving - place for people to share their views.

"It's very important that we use a combination of technology, human reviewers and the flagging of problem content in order to remove posts that violate our community standards," she said.

"We're developing AI tools that can identify certain classes of bad activity proactively and flag it for our team at Facebook", he said.