Facebook’s recent censorship of the iconic AP photograph of nine-year-old Kim Phúc fleeing naked from a napalm bombing has once again brought the issue of commercial content moderation to the fore. Although Facebook has since apologized for taking the photo down from the page of the Norwegian newspaper Aftenposten, the social media giant continues to defend the policy that allowed the takedown to happen in the first place.

The policy in question is a near-blanket ban on nudity. Although the company has carved out some exceptions to the policy—for example, for “photographs of paintings, sculptures, and other art that depicts nude figures”—and admits that its policies can “sometimes be more blunt than we would like and restrict content shared for legitimate purposes,” in practice the ban on nudity has a widespread effect on the ability of its users to exercise their freedom of expression on the platform.

In a statement, Reporters Without Borders called on Facebook to “add respect for the journalistic values of photos to these rules.” But it’s not just journalists who are affected by Facebook’s nudity ban. While it may seem particularly egregious when the policy is applied to journalistic content, its effect on ordinary users—from Aboriginal rights activists to breastfeeding moms to Danish parliamentarians who like to photograph mermaid statues—is no less damaging to the principles of free expression. If we argue that Facebook should make exceptions for journalism, then we are ultimately placing Facebook in the troubling position of deciding who is or isn’t a legitimate journalist, across the entire world.

Reporters Without Borders also called on the company to “ensure that their rules are never more severe than national legislations.” Indeed, while it is now largely accepted that social media companies take down content in response to requests from governments, the idea that these companies should temper their rules to be more in line with the liberal policies of other governments—to keep up nudity that violates no local regulation, and is inoffensive by the societal standards of many countries outside the United States—has not yet entered the public discussion.

Despite its recent statements and carve-outs, Facebook clearly does not see nude imagery as a component of freedom of expression. In a letter to the Norwegian prime minister apologizing for the recent gaffe, the company’s COO, Sheryl Sandberg, wrote that “sometimes … the global and historical importance of a photo like ‘Terror of War’ outweighs the importance of keeping nudity off Facebook.” What Facebook hasn’t explained, however, is why it’s so important to keep nudity off the platform.

The company’s Community Standards state that the display of nudity is restricted “because some audiences within our global community may be sensitive to this type of content - particularly because of their cultural background or age.” Facebook’s concern for this unnamed set of users rings hollow; the fear of being blocked by conservative, authoritarian governments is more likely the real impetus behind the policy.

Nothing obliges Facebook, as a private company, to adhere to the principles of freedom of expression. It has the right to host, or remove, whatever content it chooses. But a near-blanket ban on nudity certainly contradicts the company’s stated mission of making the world more open and connected.

So what should Facebook do? Short of getting rid of the policy altogether, there are several simple changes the company could make that would place it more in line with both its own mission and the spirit of free expression.

First, Facebook could stop conflating nudity with sexuality, and sexuality with pornography, by making changes to its user reporting mechanism. Currently, when users attempt to report such content, the first option reads: “This is nudity or pornography,” with “sexual arousal,” “sexual acts” and “people soliciting sex” listed below as examples. This blurs the line between non-sexual nudity (which is legal and uncontroversial in a number of jurisdictions in which the company operates) and sexual content.

[Image: Facebook's reporting mechanism conflates mere nudity with sexuality.]

Another option would be to apply content warnings. Facebook already employs such warnings for graphic violence (a subject that provokes greater concern in much of northern Europe than nude imagery does) and could easily extend them to nudity as well. The company could also institute different guidelines for public and private content, allowing nudity on friends-only feeds, for instance.

Facebook could also consider whether its ban on female nipples, but not male ones, is a just policy. A number of countries and regions throughout the world have equalized their policies toward toplessness, but Facebook’s remains regressive and discriminatory. The policy also often affects transgender users, an already vulnerable population.

Finally, Facebook could reconsider the punitive bans it places on users who violate the policy. Currently, users who violate the policy have their content taken down on a first offense, while a second violation typically results in a 24-hour ban, the same length of time meted out for seemingly more egregious policy violations.

All of these changes would help mitigate the confusion, concern, and accusations of censorship that incidents like the Kim Phúc takedown provoke. But if Facebook wants to avoid being seen as the world’s arbitrary and prudish censor, the company should perhaps spend more time thinking about, and articulating, why a ban on nudity is so important in the first place.

Has your content been taken down, or your account suspended, on a social media platform? Report your experience now on Onlinecensorship.org, a project of EFF and Visualizing Impact which aims to find out how social media companies’ policies affect global expression.