The U.S. Court of Appeals for the Second Circuit last week became the first federal appellate court to rule that Section 230 bars civil terrorism claims against a social media company. The plaintiffs, who were victims of Hamas terrorist attacks in Israel, argued that Facebook should be liable for hosting content posted by Hamas members that allegedly inspired the attackers who ultimately harmed them.

EFF filed an amicus brief in the case, Force v. Facebook, arguing that both Section 230 and the First Amendment prevent lawsuits under the Anti-Terrorism Act that seek to hold online platforms liable for content posted by their users—even if some of those users are pro-terrorism or terrorists themselves. We’ve been concerned that without definitive rulings that these types of cases cannot stand under existing law, they would continue to threaten the availability of open online forums and Internet users’ ability to access information.

The Second Circuit’s decision contrasts with those of the Ninth Circuit in Fields v. Twitter and the Sixth Circuit in Crosby v. Twitter, in which both courts held only that the plaintiffs in those cases—victims of an ISIS attack in Jordan and the Pulse nightclub shooting in Florida, respectively—could not show a sufficient causal link between the social media companies and the harm suffered by the plaintiffs. The Ninth and Sixth Circuit rulings are thus concerning because they tacitly suggest that better-pleaded complaints against social media companies for hosting pro-terrorism content might survive judicial scrutiny in the future.

The facts underlying all of these cases are tragic, and we have the utmost sympathy for the plight of the victims and their families. The law appropriately allows victims to seek compensation from the perpetrators of terrorism themselves. But holding online platforms liable for what terrorists and their supporters post online—and the violence they ultimately perpetrate—would have dire repercussions: if online platforms no longer have Section 230 immunity in this context, those forums and services will take aggressive action to screen their users, review and censor content, and potentially prohibit anonymous speech. The end result would be sanitized online platforms that would not permit discussion and research about terrorism, a prominent and vexing political and social issue. As we have chronicled, existing efforts by companies to filter extremist online speech have already inflicted collateral damage by silencing human rights defenders.

Several cases have been filed in federal courts seeking to hold social media companies such as Twitter, Facebook, and YouTube civilly liable for providing material support to terrorists, or for aiding and abetting terrorists, by allowing terrorist content on their platforms. We hope that the Second Circuit’s decision will lead other courts to rule similarly and ensure that all Internet users can continue to discuss and access information about controversial topics.