As Congress considers undercutting a key law that protects online free speech and innovation, sponsors of the bills don’t seem to understand how Section 230 (47 U.S.C. § 230) works.

EFF opposes the Senate’s Stop Enabling Sex Traffickers Act (S. 1693) (“SESTA”) and its House counterpart, the Allow States and Victims to Fight Online Sex Trafficking Act (H.R. 1865). These bills would roll back Section 230, one of the most important laws protecting online free speech and innovation.

Section 230 generally immunizes Internet intermediaries from legal liability for hosting user-generated content. Many websites and services that we rely on host third-party content in some way: social media sites, photo and video-sharing apps, newspaper comment sections, and even community mailing lists. Some of that content may be unlawful, as when users make defamatory statements about others. With Section 230, the actual speaker is at risk of liability, but not the website or blog that hosts the speech. Without Section 230, these intermediaries would have an incentive to review every bit of content a user wanted to publish to make sure it was not illegal and would not create a risk of legal liability, or to stop hosting user content altogether.

But according to one of SESTA’s sponsors, Sen. Rob Portman (R-Ohio), EFF and other groups’ concerns are overblown. In a recent floor speech, Sen. Portman said:

They have suggested that this bipartisan bill could impact mainstream websites and service providers—the good actors out there. That is false. Our bill does not amend, and thus preserves, the Communications Decency Act’s Good Samaritan provision. This provision protects good actors who proactively block and screen for offensive material and thus shields them from any frivolous lawsuits.

Sen. Portman is simply wrong that the bill would not impact “good” platforms. He’s also wrong that Section 230’s “Good Samaritan” provision would continue to protect online platforms if SESTA became law: that provision is irrelevant to the massive new criminal and civil liability the bill would create for online platforms, including the good actors.

Section 230 Has Two Immunity Provisions: One Related to User-Generated Content and One Called the “Good Samaritan” Immunity

We want to be very clear here, because even courts get confused occasionally. Section 230 contains two separate immunities for online platforms.

The first immunity (Section 230(c)(1)) protects online platforms from liability for hosting user-generated content that others claim is unlawful. If Alice has a blog on WordPress, and Bob accuses Clyde of having said something terrible in the blog’s comments, Section 230(c)(1) ensures that neither Alice nor WordPress is liable for Bob’s statements about Clyde.

The second immunity (Section 230(c)(2)) protects online platforms from legal challenges brought by their own users when the platforms filter, remove, or otherwise edit those users’ content. In the context of the above example, Bob can’t sue Alice if she unilaterally takes down Bob’s comment about Clyde. This provision explicitly states that the immunity is premised on actions the platforms take in “good faith” to remove offensive content, even if that content may be protected by the First Amendment. This second provision is what Sen. Portman called the “Good Samaritan” provision. (Law Professor Eric Goldman has a good explainer about Section 230(c)(2).)

When EFF and others talk about the importance of Section 230, we’re talking about the first immunity, Section 230(c)(1), which protects platforms in their role as hosts of user-generated content. As described above, Section 230(c)(1) generally prevents people who are legally wronged by user-generated content hosted on a platform (for example, defamed by a tweet) from suing the platform. Importantly, Section 230(c)(1) contains no “good faith” or “Good Samaritan” requirement.

Rather, Section 230(c)(1) provides platforms with immunity based solely on how they function: if providers offer services that enable their users to post content, they are generally shielded from liability that may result from that content. Full stop. Platforms’ motives in creating or running their services are thus irrelevant to whether they receive Section 230(c)(1) immunity for user-generated content.

Section 230’s “Good Samaritan” Immunity Wouldn’t Protect Platforms From the Liability for User-Generated Content That SESTA Would Create

Sen. Portman’s comments suggest that the current proposals to amend Section 230 would not touch the law’s “Good Samaritan” provision, found in Section 230(c)(2). That is debatable, but it is also beside the point, and it needlessly obscures the impact SESTA and its House counterpart would have on online free speech and innovation.

Sen. Portman’s comments are beside the point because SESTA would blow a hole in Section 230(c)(1)’s immunity, exposing online platforms to increased liability for user-generated content in two ways: 1) platforms would no longer be shielded from prosecutions under state criminal laws related to sex trafficking; and 2) platforms would no longer be shielded from claims brought by private plaintiffs under both federal and state civil laws related to sex trafficking.

Section 230(c)(2)’s “Good Samaritan” immunity simply doesn’t apply to lawsuits claiming that user-generated content is illegal or harmful; it only shields platforms’ own decisions to filter or remove content. It is therefore irrelevant for platforms seeking to defend themselves against the new claims based on user-generated content that SESTA would permit.

In those newly possible criminal and civil cases, platforms would be unable to invoke Section 230(c)(1) as a defense, and Section 230(c)(2) would not apply. Sen. Portman’s comments thus betray a lack of understanding regarding how Section 230 protects online platforms.

Additionally, Sen. Portman implies that if SESTA becomes law, platforms that operate in “good faith” will not be liable when content related to sex trafficking appears on their sites. This is a dangerous misstatement of SESTA’s impact. Rather than recognizing that SESTA creates massive liability for all online platforms, Sen. Portman incorrectly implies that Section 230 currently separates good platforms from bad when it comes to liability for user-generated content.

Sen. Portman’s comments do little to alleviate the damaging consequences SESTA would have for online platforms. Worse, they appear designed to mask some of the bill’s inherent flaws.

For these reasons and others, visit our STOP SESTA campaign page and tell Congress to reject S. 1693 and H.R. 1865!
