Here Are Facebook's Internal Standards For Removing Controversial Content

Facebook's privacy policies, and its mode of operation as a whole, have come under intense scrutiny following the Cambridge Analytica debacle. While testifying before Congress, Facebook CEO Mark Zuckerberg fielded a variety of questions as lawmakers sought to understand how the social network operates and whether government regulation is needed. The damage control is ongoing for Facebook, and as part of that effort, its VP of global policy management, Monika Bickert, has shared some details on the company's internal enforcement guidelines for what content is and is not allowed.

"We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines—and the decisions we make—over time," Bickert said.

Bickert is a former criminal prosecutor who has worked on everything from child safety to counterterrorism. She notes that other team members include a rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher. To keep Facebook from falling behind the times, she explains, the social network's safety team seeks input from outside experts and organizations every week.

She also notes that Facebook's enforcement of its policies is not perfect.

"One challenge is identifying potential violations of our standards so that we can review them. Technology can help here. We use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards. These reports are reviewed by our Community Operations team, who work 24/7 in over 40 languages," Bickert added.

Facebook Post Removal

As of this writing, Facebook employs more than 7,500 content reviewers, up more than 40 percent from this time last year. Even so, that is not enough eyes on the flood of content, and Facebook is prone to making mistakes. To combat that, Bickert says Facebook plans to launch an appeals process for posts that were removed for nudity/sexual activity, hate speech, or graphic violence. The image above is an example of a post that could have been improperly removed.
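The announced appeals flow amounts to a simple state machine: a post removed under one of the three eligible categories can be appealed, which routes it back to a human reviewer who either restores it or upholds the removal. The sketch below is a hypothetical model of that loop; the state names, category labels, and methods are assumptions for illustration, not Facebook's code.

```python
from enum import Enum

# Illustrative model of the appeals flow described above. A post removed
# for one of the three eligible categories can be appealed, sending it
# back to a human reviewer who either restores it or upholds the removal.

class PostState(Enum):
    VISIBLE = "visible"
    REMOVED = "removed"
    UNDER_APPEAL = "under_appeal"

# The three categories Facebook says will be appealable (labels assumed).
APPEALABLE_CATEGORIES = {"nudity_sexual_activity", "hate_speech", "graphic_violence"}

class ModeratedPost:
    def __init__(self, post_id: str):
        self.post_id = post_id
        self.state = PostState.VISIBLE
        self.removal_category: str | None = None

    def remove(self, category: str):
        self.state = PostState.REMOVED
        self.removal_category = category

    def request_appeal(self) -> bool:
        # Only removals in the eligible categories can be appealed.
        if self.state is PostState.REMOVED and self.removal_category in APPEALABLE_CATEGORIES:
            self.state = PostState.UNDER_APPEAL
            return True
        return False

    def resolve_appeal(self, reviewer_says_violation: bool):
        # A second human look either upholds the removal or restores the post.
        if self.state is PostState.UNDER_APPEAL:
            self.state = PostState.REMOVED if reviewer_says_violation else PostState.VISIBLE

post = ModeratedPost("post-789")
post.remove("hate_speech")
if post.request_appeal():
    post.resolve_appeal(reviewer_says_violation=False)
print(post.state)  # -> PostState.VISIBLE (removal overturned on appeal)
```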

It all sounds good in theory, but it also feels like a lot of posturing while the spotlight is on Facebook and how it operates. One point of criticism that Zuckerberg largely sidestepped during his recent testimony is that he and his company have a history of apologizing for mishaps without facing any real repercussions. In Facebook's defense, it is the largest social networking service on the planet, and mistakes along the way are inevitable. At least as it pertains to reported posts, Facebook is hoping to mitigate those mistakes with its upcoming appeals process.