One of the most important principles underpinning the Internet is that if you say something illegal, you should be held responsible for it—not the owners of the site or service where you said it. That principle has seen many threats this year—not just in federal legislation, but also in a string of civil lawsuits intended to pin liability on online platforms for allegedly providing material support to terrorists.
Several federal trial courts dismissed such suits this year, but some of those cases are on appeal and plaintiffs have filed several new ones. If these suits succeed, they could be detrimental to the Internet: platforms would have little choice but to become far more restrictive in the speech they allow.
Without definitive rulings that these cases cannot stand under existing law, they continue to threaten the availability of open online forums and Internet users’ ability to access information. That’s why EFF filed legal briefs in 2018 asking two different federal appellate courts to dismiss material support cases against social media platforms.
The good news: So far, courts have been quick to toss out these material support lawsuits. That includes the U.S. Court of Appeals for the Ninth Circuit, the first federal appellate court to hear one. Although the facts and claims vary, the majority of the cases seek to hold platforms such as Twitter, YouTube, and Facebook liable under the federal Anti-Terrorism Act.
The lawsuits usually claim that by allowing alleged terrorists to use their publishing or messaging services, online platforms provided material support to terrorists or aided and abetted their terrorist activities. A key allegation in many of these lawsuits is that pro-terrorism content posted by particular groups radicalized or inspired the actual perpetrators of the attacks, and that the platforms should therefore be liable for the harm suffered by the victims.
The facts underlying all of these cases are tragic. Most are brought by victims or family members of people who were killed in attacks such as the 2016 Pulse nightclub shooting in Orlando.
As well-intentioned as these cases are, they pose a threat to the online communities we all rely on. In seeking to hold online platforms liable for what terrorists and their supporters post online—and the violence they ultimately perpetrate—such lawsuits threaten Internet users’ and the platforms’ First Amendment rights. They also jeopardize one of the Internet’s most important laws, Section 230 (47 U.S.C. § 230).
Section 230 protects online platforms, in part, from civil lawsuits based on content created or posted by their users. The law is largely responsible for the creation and continued availability of a plethora of online forums and services that host a diverse array of user speech, ensuring that all views—even controversial ones—can be shared and heard. Section 230 lets anyone—regardless of resources, technical expertise, or geography—communicate with others around the world.
Section 230 generally bars civil claims against platforms for hosting user-generated content. If the lawsuits brought under the Anti-Terrorism Act nevertheless succeed in imposing liability on the social media companies, they would open up a huge exception to Section 230 and undermine its legal protections for all online platforms.
That would have dire repercussions: if online platforms no longer have Section 230 immunity for hosting content even remotely related to terrorism, those forums and services will take aggressive action to screen their users, review and censor content, and potentially prohibit anonymous speech. The end result would be sanitized online platforms that would not permit discussion and research about terrorism, a prominent and vexing political and social issue.
Although federal trial courts and one appellate court have largely avoided undermining Section 230 and Internet users’ First Amendment rights, they have not entirely shut the door on these lawsuits. The U.S. Court of Appeals for the Ninth Circuit, for example, missed a clear opportunity to rule that Section 230 bars such suits.
That’s why EFF this year filed two friend-of-the-court briefs in cases before the U.S. Courts of Appeals for the Second and Sixth Circuits. We argued that Section 230 and the First Amendment prevent Anti-Terrorism Act lawsuits that seek to hold online platforms liable for content posted by their users—even if some of those users are pro-terrorism or terrorists themselves.
We hope that the courts vindicate Section 230 and the First Amendment in these material support cases. We will continue to monitor these cases and stand up for Internet users’ rights.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2018.