The Senate Commerce Committee just approved a slightly modified version of SESTA, the Stop Enabling Sex Traffickers Act (S. 1693).

SESTA was and continues to be a deeply flawed bill. It would weaken 47 U.S.C. § 230 (commonly known as “CDA 230” or simply “Section 230”), one of the most important laws protecting free expression online. Section 230 says that, for purposes of enforcing certain laws affecting speech online, an intermediary cannot be held legally responsible for content created by others.

SESTA would create an exception to Section 230 for laws related to sex trafficking, thus exposing online platforms to an immense risk of civil and criminal litigation. What that really means is that online platforms would be forced to take drastic measures to censor their users.

Some SESTA supporters imagine that compliance with SESTA would be easy—that online platforms would simply need to use automated filters to pinpoint and remove all messages in support of sex trafficking and leave everything else untouched. But such filters do not and cannot exist: computers aren’t good at recognizing subtlety and context, and with severe penalties at stake, no rational company would trust them to make those judgments.

Online platforms would have no choice but to program their filters to err on the side of removal, silencing a lot of innocent voices in the process. And remember, the first people silenced are likely to be trafficking victims themselves: it would be a huge technical challenge to build a filter that removes sex trafficking advertisements but doesn’t also censor a victim of trafficking telling her story or trying to find help.
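To make the over-removal problem concrete, here is a minimal, purely hypothetical sketch of the kind of keyword filter a platform might fall back on under this sort of liability pressure. The term list, the example posts, and the should_remove function are all invented for illustration; no real platform’s moderation system is being described here.

    # Hypothetical keyword filter, for illustration only.
    FLAGGED_TERMS = {"trafficking", "escort"}

    def should_remove(post: str) -> bool:
        """Flag any post containing a listed term, with no sense of context or intent."""
        text = post.lower()
        return any(term in text for term in FLAGGED_TERMS)

    example_posts = [
        "Escort services available tonight, call now",                        # an ad the law targets
        "I escaped a trafficking situation and I need help finding shelter",  # a victim asking for help
        "New study on how platforms can fight trafficking online",            # research and journalism
    ]

    for post in example_posts:
        print(should_remove(post), "-", post)

    # All three posts are flagged. With severe penalties at stake, the filter
    # errs on the side of removal, and the victim's plea and the researcher's
    # post are taken down along with the ad.

Telling those three posts apart requires understanding intent, which is exactly what automated tools are bad at—and exactly the judgment call no platform will risk getting wrong when the penalty for guessing wrong is a lawsuit or prosecution.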

Along with the Center for Democracy and Technology, Access Now, Engine, and many other organizations, EFF signed a letter yesterday urging the Commerce Committee to change course. We explained the silencing effect that SESTA would have on online speech:

Pressures on intermediaries to prevent trafficking-related material from appearing on their sites would also likely drive more intermediaries to rely on automated content filtering tools, in an effort to conduct comprehensive content moderation at scale. These tools have a notorious tendency to enact overbroad censorship, particularly when used without (expensive, time-consuming) human oversight. Speakers from marginalized groups and underrepresented populations are often the hardest hit by such automated filtering.

It’s ironic that supporters of SESTA insist that computerized filters can serve as a substitute for human moderation: the improvements in filtering technology over the past two decades would not have happened without the legal certainty provided by a strong Section 230, which protects platforms when they take down, edit, or otherwise moderate their users’ content, in addition to shielding them from liability for unlawful user-generated content.

We find it disappointing, but not necessarily surprising, that the Internet Association has endorsed this deeply flawed bill. Its member companies—many of the largest tech companies in the world—will not feel the brunt of SESTA in the same way as their smaller competitors. Small Internet startups don’t have the resources to police every posting on their platforms, so the pressure to preemptively censor users will fall hardest on them; that’s particularly true for nonprofit and noncommercial platforms like the Internet Archive and Wikipedia. It’s not surprising when a trade association endorses a bill that would give its own members a massive competitive advantage.

If you rely on online communities in your day-to-day life; if you believe that your right to speak matters just as much on the web as on the street; if you hate seeing sex trafficking victims used as props to advance an agenda of censorship; please take a moment to write your members of Congress and tell them to oppose SESTA.

Take Action

Tell Congress: Stop SESTA
