Facebook Discloses Secret Rules On What Posts Get Blocked — and Why

Ever had a post on Facebook blocked — by Facebook? For the first time, the company is explaining why.

The social media giant has now disclosed the once-secret rules on how it polices itself, explaining what it allows, or doesn’t, on topics such as drug use, sex work, bullying, hate speech, and inciting violence.

For years, Facebook has published “community standards” describing what people can post. Until now, though, only a generalized version of those rules was publicly available.

Yet Facebook’s moderators have long followed a far more detailed internal document to decide whether individual posts or accounts should be removed.

The company released the expanded rules in an effort to be more transparent about how it operates.

“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Monika Bickert, Facebook’s vice president of product policy and counter-terrorism, told reporters in a briefing at Facebook headquarters.

Facebook has come under heavy criticism from governments and human rights groups in many countries, which say the company has failed to curb hate speech and to prevent the promotion of terrorism.

Critics also charge that Facebook has done too little to stop the fomenting of sectarian violence on its platform, and they fault the company for essentially broadcasting murders and suicides.

At the same time, some assert, the social media firm has done the bidding of repressive regimes by aggressively removing content that angers those governments.

New policy

Facebook, the world’s largest social network, has become an influential source of information in many countries around the world.

It uses both automated software and a team of moderators that now numbers 7,500 to take down text, pictures and videos that break its rules.

Under pressure from several governments, the company has been strengthening its moderator ranks since last year.

The company considers changes to its content policy every two weeks at a meeting called the “Content Standards Forum,” led by Bickert. Attendees at the meetings include people who specialize in public policy, legal matters, product development, communication and other areas.

A new policy will, for the first time, allow people to appeal a decision to take down an individual piece of content.

Facebook also is planning a series of public forums in May and June in various countries to get more feedback on the rules.