Boobs And Beheadings: Facebook Revisits Community Standards Enforcement

The problem is a simple one: there are images out there that should not be on Facebook, where they are viewable by anyone aged 13 and up. However, Facebook faces an extremely difficult job in deciding which types of images fall into the “banned” category, and the issue recently came into focus over a beheading video that drew so much attention that British PM David Cameron weighed in on it.

Facebook has now pulled the video, and although the social network isn’t actually changing any of its standards, it is revisiting how it enforces its guidelines. "When we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence," reads an emailed statement Facebook sent to AFP.


It’s easy to have a knee-jerk reaction to the fact that Facebook would allow such deeply disturbing and graphic violence, especially since the company previously banned images of breastfeeding if a nursing mother’s nipple was exposed (a policy that appears to have been reversed). But it’s important to take a more nuanced view of what’s going on here.

Yes, it’s a bit prudish to panic over the possibility that a youngster might see bare breasts on Facebook (although no one is suggesting that pornography should be allowed) while allowing videos of horrific violence. But there’s a big difference between glorifying violence and condemning it, a distinction that Facebook is careful to make. The company states in its community standards:

Facebook has long been a place where people turn to share their experiences and raise awareness about issues important to them. Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, it is to condemn it.

The company also told AFP, “If [graphic imagery] is being shared for sadistic pleasure or to celebrate violence, Facebook removes it.”


Without taking a stance on whether Facebook should have left the beheading video up (the company did apparently add warnings to the video so viewers would know it was disturbing), the larger issue here is what, exactly, Facebook is being asked to censor. While many use Facebook to connect with pals, waste time, or stalk exes, some use the platform to spread awareness of serious issues, and sometimes violence and other disturbing imagery is part of that.

Is it actually beneficial to rid Facebook of any reference to human rights abuses because the associated imagery is “yucky”? Or is that precisely why Facebook should allow it to persist? David Cameron is upset that a violent and disturbing video was viewable on Facebook, and rightly so, but should he have been more concerned with why a woman was beheaded by a masked man in the first place?