The free and open Internet has enabled disparate communities to come together across miles and borders, and empowered marginalized communities to share stories, art, and information with one another and the broader public—but restrictive and often secretive or poorly messaged policies by corporate gatekeepers threaten to change that.

Content policies restricting certain types of expression—such as nudity, sexually explicit content, and pornography—have long been in place on most social networks. But in recent years, a number of companies have changed how those policies are enforced, introducing measures such as demonetizing content or hiding it behind an age-based interstitial; using machine learning to flag content; blocking keywords in search; and disabling thumbnail previews for video content.

While there are some benefits to more subtle enforcement mechanisms—age restrictions, for example, allow content that would otherwise be removed entirely to remain available to some users—they can also be confusing for users. And when applied mistakenly, they are difficult—if not impossible—to appeal.

In particular, policy restrictions on “adult” content have an outsized impact on LGBTQ+ and other marginalized communities. Typically aimed at keeping sites “family friendly,” these policies are often unevenly enforced, classifying LGBTQ+ content as “adult” when similar heterosexual content isn’t. Similarly, as we noted last year, policies are sometimes applied more harshly to women’s content than to similar content by men.

Watts the Safeword is a YouTube channel that seeks to provide “kink-friendly sexual education.” In March, one of the channel’s creators, Amp, noticed that the thumbnails that the channel’s team had carefully selected to represent the videos were not showing correctly in search. Amp reached out to YouTube and, in a back-and-forth email exchange, was repeatedly told by several employees that it was a technical error. Finally, after six days, Amp was told that “YouTube may disable custom thumbnails for certain search results when they’re considered inappropriate for viewers.” To determine inappropriateness, the YouTube employee wrote, the company considers “among other signals, audience retention, likes and dislikes and viewer feedback.”

When I spoke to Amp earlier this month, he told me that he has observed a number of cases in which sexual education that contains a kink angle is demonetized or otherwise downgraded on YouTube where other sexual education content isn’t. Amp expressed frustration with companies’ lack of transparency about their own enforcement practices: “They’re treating us like children that they don’t want to teach.”

Caitlyn Moldenhauer produces a podcast entitled “Welcome to My Vagina” that reports on gender, sex, and reproductive rights. “The intention of our content,” she told me, “is to educate, not entertain.”

The podcast, which uses a cartoon drawing of a vulva puppet as its logo, was rejected from TuneIn, an audio streaming service that hosts hundreds of podcasts. When Caitlyn wrote to the company to inquire about the rejection, she received a message that read: “Thank you for wanting to add your podcast to our directory unfortunately we will not be adding please refer to our terms and conditions for more information.”

A close reading of the terms of service and acceptable use policy failed to provide clarity. Although the latter refers to “objectionable content,” the definition doesn’t include sexuality or nudity—only “obscene,” “offensive,” or “vulgar” content, leading Caitlyn to believe that her podcast was classified as such. A cursory search of anatomical and sexual terms, however, demonstrates that the site contains dozens of podcasts about sex, sexual health, and pornography—including one that Caitlyn produces—raising questions as to why “Welcome to My Vagina” was targeted.

These are not isolated incidents. Over the past few months, we’ve observed a number of other cases in which LGBTQ+ or sexual health content has been wrongfully targeted:

  • The YouTube channel of Recon—a “fetish-focused dating app”—was suspended, and later reinstated only after receiving press coverage. It was the second such suspension.
  • The Facebook page of Naked Boys Reading was removed after being flagged for policy violations. After the organizers accused Facebook of “queer erasure,” the page was restored.
  • In 2017, Twitter began collapsing “low-quality” and “abusive” tweets behind a click-through interstitial—but users have reported that tweets merely containing the words “queer” and “vagina” are affected.
  • Chase Ross, a long-time YouTuber who creates educational videos about transgender issues and about his personal experience as a trans person, has reported that videos containing the word “trans” in their title are regularly demonetized.

Many creators have suggested the uptick in enforcement is the result of the passage of FOSTA, a law that purports to target sex trafficking but is already broadly chilling online speech and silencing marginalized voices (and which we are challenging in court). Although it’s difficult to attribute specific policy changes to FOSTA, a number of companies have amended their sexual content policies since the bill passed.

Others, such as the creator of Transthetics, have suggested that the crackdown—particularly on transgender content—is the result of public pressure. The company, which produces “innovative prosthetics for trans men et al,” has had its YouTube channel suspended twice, and reinstated both times only after its creator demonstrated that YouTube allowed similar heterosexual content. In a video discussing the most recent suspension, Transthetics’ creator said: “My worry is that for every Alex Jones channel that they ban, they feel that a bit of a tit for tat needs to happen and in their view, the polar opposite of that is the LGBT community.”

Both Amp and Caitlyn expressed a desire for companies to be clear about their policies and how they enforce them, and we agree. When users don’t understand why their content is removed—or otherwise downgraded—it can take days for them to make the changes necessary to comply with the rules, or to get their content reinstated. Says Amp: “It’s not just an inconvenience but in most cases requires us to monitor our social media and scrutinize every inch of our content to find if we’ve been removed, restricted or deleted. YouTube specifically says it will notify you when something is affected, but it rarely happens.”

Similarly, Caitlyn told me: “The issue isn’t really that we get flagged, as much as when we reach out over and over again to try and solve the issue or defend ourselves against their content policies, that we’re met with radio silence ... I wish sometimes that I could argue with a guidelines representative and hear a valid reason why we are denied access to their platform, but we’re not even given a chance to try. It’s either change or disappear.”

Companies must be transparent with their users about their rules and enforcement mechanisms, and any restrictions on content must be clearly messaged to users. Furthermore, it’s imperative that companies implement robust systems of appeal so that users whose content is wrongly removed can have it quickly reinstated.

But with so many examples related to sex and sexuality, we also think it’s time for companies to consider whether their overly restrictive or blunt policies are harming some of their most vulnerable users.