Facebook Reveals Its Internal Rules For Removing Controversial Content


Mark Zuckerberg’s congressional testimony notwithstanding, his company, Facebook, continues to face public pressure to come clean about its handling of personal data in light of the ongoing Cambridge Analytica scandal. That pressure has only grown now that whistleblowers have revealed the data firm possesses far more data than it has officially let on, prompting demands for further action from Facebook. So on Tuesday, the company decided to reveal the internal community standards and policies it uses to determine which posted content is acceptable and which should be removed.

According to CNN, the newly released document is what the company’s “7,500 content moderators use when deciding what is and isn’t acceptable content, including hate speech, nudity, gun sales and bullying.” A shorter version was previously made available to the public, but the latest release includes far more detail than before. Monika Bickert, Facebook’s vice president of product policy and counter-terrorism, briefed reporters on the community standards’ contents and purpose. “You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” she said.

“Our enforcement isn’t perfect,” she added. “We make mistakes because our processes involve people, and people are not infallible.” Per Reuters, the company will now “allow people to appeal a decision to take down an individual piece of content,” rather than limiting appeals to “the removal of accounts, Groups and Pages.” Facebook is also giving individuals and content managers specific reasons why certain posts were taken down under the internal community standards. Now that those standards are publicly available, users will hopefully have a better grasp of what the social media giant is doing, and why. Hopefully.

(Via CNN and Reuters)
