Facebook’s nudity policy a muddled mess

This Nov. 14, 2014, photo shows a board with the Facebook logo inside the new Facebook data centers in Altoona, Iowa. (AP file)

When it comes to nudity and what gets posted, Facebook has a problem. It’s a problem that will only get worse, especially now that the company is being sued by an Irish teenager who claims the social media giant failed to act when naked pictures of her were repeatedly posted in an act of what’s now being called revenge pornography.

These so-called shame pages are obviously wrong and need to be taken down. But Facebook’s nudity policy is broad and has resulted in decisions that have infuriated users.

And that’s the problem: There is no defined standard. Facebook needs to find a way to distinguish the truly harmful from the benign.

Facebook recently found itself internationally derided for deleting a famous photograph of a terrified 9-year-old girl fleeing a napalm attack in 1972 during the Vietnam War, and suspending the account of Tom Egeland, the Norwegian writer who had posted it as part of a piece on iconic war images. Facebook then deleted a critical post by Norwegian Prime Minister Erna Solberg that tried to distinguish between the necessary prevention of abuse and plain old censorship.

That distinction is what’s at stake for the teenager, who clearly needs Facebook’s cooperation, but it’s also an issue for those who see Facebook’s policy as woefully overbearing.

Another user whose account was suspended recently was Australian Aboriginal activist Celeste Liddle. She posted a trailer for a television show that featured some indigenous women topless during an Aboriginal ceremony. In a scathing critique of Facebook, Liddle pointed out that the ban coincided with racy images of Kim Kardashian all over Facebook.

Since then, Facebook has struggled to adopt a policy that its users endorse, but also that has any real coherence. In 2012, the company began to ban nude images that were reported by users. In 2015, after altering its community standards, Facebook classified nudity alongside hate speech, self-harm, bullying and violence. That is an interesting combination. Why should the naked body be classified as disgusting, hateful and harmful?

If you’re the victim of a shame page, then the answer is clear. But does the current policy that Facebook has adopted serve either the needs of such victims or those of other users who see legitimacy in posting images that include nakedness but which serve other purposes? After all, if art galleries refused to exhibit nude paintings, the selection from which they would be able to choose would be radically diminished.

Facebook should start by listening more carefully to what users have to say. The persistent suspension of users such as Liddle and Egeland serves no purpose and is mindless censorship. Without question, Facebook must continually be alert to the dangers of abuse, but it’s not that difficult to distinguish between a Pulitzer Prize-winning photograph that discredits aggressive wartime policies that hurt the young, and the new and troubling trend of hurtful revenge pornography. The violence in the napalm attack photograph is not the naked child but the atrocities she’s trying to outrun. By acceding to the idea that nakedness is unacceptable, Facebook reinforces the harm of the shaming game.

There is a world of difference between a photograph or a piece of art posted as social commentary, empathy or even self-promotion, and one designed to humiliate. It’s not the nude image that’s the problem, but the intent behind it, and that’s what Facebook needs to address.

Facebook uses a mix of human and algorithmic review to determine the suitability of material that gets posted, so it could easily put into place a policy that routed all complaints about nudity to Facebook employees with clear guidelines about application. Facebook needs to nuance its guidelines and stop treating its users so disdainfully. It needs to ask itself why a naked Vietnamese child or an elderly Aboriginal woman is not OK to post, but a Kim Kardashian shot makes the cut.

Facebook needs to lead, not follow.

Philippa Levine, Ph.D., is a professor at the University of Texas at Austin. She wrote this for The Dallas Morning News.