The Supreme Court is about to hear a case that could dramatically affect users’ speech rights online. EFF has filed a brief explaining what’s at stake, and urging the court to preserve the key law protecting user expression, 47 U.S.C. § 230 (Section 230).

In Gonzalez v. Google, the petitioning plaintiffs make a radical argument about Section 230. They have asked the Supreme Court to rule that Section 230 doesn’t protect recommendations we get online, or how certain content gets arranged and displayed. According to the plaintiffs, U.S. law allows website and app owners to be sued if they make the wrong recommendation. 

In our brief, EFF explains that online recommendations and editorial arrangements are the digital version of what print newspapers have done for centuries: direct readers’ attention to whatever might be most interesting to them. Newspapers do this with article placement, font size, and use of photographs. Deciding where to direct readers is part of editorial discretion, which has long been protected under the First Amendment. 

If Courts Narrow Section 230, We’ll See A Censored Internet 

If the plaintiffs’ arguments are accepted, and Section 230 is narrowed, the internet as we know it could change dramatically. 

First, online platforms would engage in severe censorship. As of April 2022, there were more than 5 billion people online, including 4.7 billion using social media platforms. Last year, YouTube users uploaded 500 hours of video each minute. Requiring pre-publication human review is not feasible for platforms of even moderate size. Automated tools, meanwhile, often result in censorship of legal and valuable content created by journalists, human rights activists, and artists. Many smaller platforms, unable to even access these flawed automated tools, would shut down. 

The Gonzalez case deals with accusations that Google recommended content related to terrorism. If websites and apps can face severe punishments for recommending such content, they’re very likely to limit all speech related to terrorism, including anti-terrorism counter-speech and critical analysis by journalists and intelligence analysts. The automated tools used to flag content can’t tell whether the subject is being discussed, commented on, critiqued, or promoted. That censorship could also make it more difficult for people to access basic information about real-world events, including terrorist attacks.

Second, online intermediaries are likely to stop offering recommendations of new content. To avoid liability for recommendations that others later claim resulted in harm, services are likely to return to presenting content in blunt chronological order, a system that is far less helpful for navigating the vast seas of online information (notably, a newspaper or magazine would never use such a system). 

Third, the plaintiffs want to create a legal distinction under Section 230 related to URLs (the internet’s Uniform Resource Locators, the addresses that begin with “http://” or “https://”). They argue that Section 230 protects a service from liability for hosting user-generated content, but should not protect the service when it provides a URL so that others can access that content. The Supreme Court should reject the idea that URLs can be exempted from Section 230 protection. The argument is wrong as both a legal and technical matter. Users direct the creation of URLs when they upload content to a service. Further, Section 230 does not contain any language indicating that Congress wanted to create such a hair-splitting distinction. To rule as the plaintiffs argue would cripple online services up and down the internet “stack,” not just social media companies. The primary means by which everyone accesses content online—the URL—would become a legal liability if the link led to objectionable content.
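To see why the plaintiffs’ distinction is technically artificial, it helps to remember that a URL is nothing more than a structured address a service derives for content once it is uploaded. A minimal sketch using Python’s standard library (the site and video identifier below are invented for illustration) shows how a URL decomposes into the protocol, the host, and a pointer to the user’s upload:

```python
from urllib.parse import urlparse

# A hypothetical URL of the kind a video service might generate
# automatically when a user uploads content. The host and the
# "v=abc123" identifier are invented for this example.
url = "https://example-video-site.com/watch?v=abc123"

parts = urlparse(url)
print(parts.scheme)   # the protocol: "https"
print(parts.netloc)   # the host serving the content: "example-video-site.com"
print(parts.query)    # the identifier pointing at the upload: "v=abc123"
```

Nothing in the URL is independent editorial content created by the service; it is an automatically generated address that follows directly from the user’s act of uploading, which is why treating the hosting and the address as legally separate makes little technical sense.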

Section 230 Has Allowed Online Culture To Flourish

At the beginning of the digital age, Congress saw that the internet would be a powerful tool for creating and finding diverse communities. It was right. Cultural and educational institutions like Wikipedia, the Internet Archive, and the Library of Congress’ oral history projects enrich our lives, and all benefit from the protections of Section 230. Every message board, email service, social media site, and online marketplace flourishes because of Section 230. The law holds users accountable for their own speech, while allowing more specialized moderation for niche sites and interests.

The court is scheduled to hear oral arguments in this case on February 21, 2023. EFF’s brief was joined by the American Library Association, the Association of Research Libraries, the Freedom to Read Foundation, and the Internet Archive. You can read the entire brief here. We also filed a brief in a related case being heard by the Supreme Court next month, Taamneh v. Twitter.

As the internet has grown, its problems have grown, too. But there are ways to address those problems without weakening a law that protects everyone’s digital speech. EFF has endorsed several paths in that regard, including comprehensive privacy legislation and renewed antitrust action. Removing protections for online speech, and online moderation, would be a foolish and damaging approach. The Supreme Court should use the Gonzalez case as an opportunity to ensure that Section 230 continues to offer broad protection of internet users’ rights. 

  • Brief of Amici Curiae Electronic Frontier Foundation et al. in support of Respondent