Tech platforms, especially the largest ones, have a problem—there’s a lot of offensive junk online. Many lawmakers on Capitol Hill keep coming back to the same solution: blaming Section 230.

What lawmakers don’t notice is that a lot of the people posting that offensive junk get stopped, again and again, thanks to Section 230. During a March hearing in the House Committee on Energy and Commerce, lawmakers expressed concern over some of the worst content that’s online, including extremist content, falsehoods about COVID-19, and election disinformation.

But it is often the very people spreading this type of content who file lawsuits trying to force their posts back online. These unsuccessful lawsuits show that Section 230 has repeatedly stopped disinformation peddlers from spreading their harmful content.

Section 230 stands for the simple idea that you’re responsible for your own speech online—not the speech of others. It also makes clear that online operators, from the biggest platforms to the smallest niche websites, have the right to curate the speech that appears on their site.

Users dedicated to spreading lies or hateful content are a tiny minority, but weakening Section 230 will make their job easier. When content moderation doesn’t go their way—and it usually doesn’t—they’re willing to sue. As the cases below show, Section 230 is rightfully used to quickly dismiss their lawsuits. If lawmakers weaken Section 230, these meritless suits will linger in court, costing online services more and making them leery of moderating the speech of known litigious users. That could make it easier for these users to spread lies online.

Section 230 Protects Moderators Who Remove Hateful Content

James Domen describes himself as a “former homosexual” who now identifies as heterosexual. He created videos that describe being LGBTQ as a harmful choice, and shared them on Vimeo, a video-sharing website. In one video, he described the “homosexual lifestyle” this way: “It’ll ruin your life. It’s devastating. It’ll destroy your life.”

In at least five videos, Domen also condemned a California bill that would have expanded a ban on “sexual orientation change efforts,” or SOCE. Medical and professional groups have for decades widely recognized that efforts to change sexual orientation in various ways, sometimes called “conversion therapy,” are harmful.

Vimeo removed Domen’s videos. In a letter to Domen’s attorney, Vimeo explained that SOCE-related videos “disseminate irrational and stereotypical messages that may be harmful to people in the LGBT community,” because SOCE treats homosexuality as “a mental disease or disorder” that “can and should be treated.” Vimeo bans “hateful and discriminatory” content, and company officials told Domen directly that, in their view, his videos fell into that category.

Domen sued, claiming that his civil rights were violated. Because of Section 230, Domen’s lawsuit was quickly thrown out. He appealed, but in March, the federal appeals court also ruled against him.

Forcing a website to publish Domen’s anti-LGBTQ content might serve Domen’s interests, but only at the expense of many other users of the platform. No website should have to face a lengthy and expensive lawsuit over such claims. Because of Section 230, they don’t.

Some lawmakers have proposed carving civil rights claims out of Section 230. But that could have the unintended side effect of allowing lawsuits like Domen’s to continue—making tech companies more skittish about removing anti-LGBTQ content.

Section 230 Protects Moderators Who Remove COVID-19 Falsehoods

Marshall Daniels hosts a YouTube channel in which he has stated that Judaism is “a complete lie” which was “made up for political gain.” Daniels, who broadcasts as “Young Pharaoh,” has also called Black Lives Matter “an undercover LGBTQ Marxism psyop that is funded by George Soros.”

In April 2020, Daniels live-streamed a video claiming that vaccines contain “rat brains,” that HIV is a “biologically engineered, terroristic weapon,” and that Anthony Fauci “has been murdering motherfuckers and causing medical illnesses since the 1980s.”

In May 2020, Daniels live-streamed a video called “George Floyd, Riots & Anonymous Exposed as Deep State Psyop for NOW.” In that video, he claimed that nationwide protests over George Floyd’s murder were “the result of an operation to cause civil unrest, unleash chaos, and turn the public against [President Trump].” According to YouTube, he also stated the COVID-19 pandemic and Floyd’s murder “were covert operations orchestrated by the Freemasons,” and accused Hillary Clinton and her aide John Podesta of torturing children. Near the video’s end, Daniels stated: “If I catch you talking shit about Trump, I might whoop your ass fast.”

YouTube removed both videos, saying that they violated its policy on harassment and bullying.  

Daniels sued YouTube, demanding account reinstatement and damages. He claimed that YouTube amounted to a state actor, and had thus violated his First Amendment rights. (The idea that courts should treat social media companies as the government has no basis in the law, as the 9th Circuit reaffirmed last year.)

In March, a court dismissed most of Daniels’ claims under Section 230. That law protects online services—both large and small—from getting sued for refusing to publish content they don’t want to publish.

Again, Section 230 protected internet freedom. No web host should be forced to carry false and threatening content, or QAnon-based conspiracy theories like those created by Daniels. Section 230 protects moderators who kick out such content.

Section 230 Protects Moderators Who Remove Election Disinformation

The Federal Agency of News LLC, or FAN, is a Russian corporation that purports to be a news service. FAN was founded in the same building as Russia’s Internet Research Agency, or IRA; the IRA became the subject of a criminal indictment in February 2018 for its efforts to meddle in the 2016 U.S. election.

The founder and first General Director of FAN was Aleksandra Yurievna Krylova, who is wanted by the FBI for conspiracy to defraud the U.S. Later in 2018, the FBI unsealed a criminal complaint against FAN’s chief accountant, Elena Khusyaynova. In that complaint, the FBI said that Federal Agency of News was not so different from the IRA. Both were allegedly part of “Project Lakhta,” a Russian operation to interfere with political and electoral systems both in Russia “and other countries, including the United States.”

Facebook shut down more than 270 Russian-language accounts and pages in April 2018, including FAN’s account. Company CEO Mark Zuckerberg said the pages “were controlled by the IRA,” which had “repeatedly acted deceptively and tried to manipulate people in the U.S., Europe, and Russia.” The IRA used a “network of hundreds of fake accounts to spread divisive content and interfere in the U.S. presidential election.” Facebook’s Chief Security Officer stated that the IRA had spent about $100,000 on Facebook ads in the United States.

At this point, one might think that anyone with alleged connections to the Internet Research Agency, including FAN, would lie low. But that’s not what happened. Instead, FAN’s new owner, Evgeniy Zubarev, hired U.S. lawyers and filed a lawsuit against Facebook, claiming that the company’s civil rights had been violated. He demanded that FAN’s account be reinstated, and that FAN be paid damages.

A court threw the FAN lawsuit out on Section 230 grounds. The plaintiffs filed a new complaint, which the court again threw out.

Small Companies And Users Can’t Afford These Bogus Lawsuits 

Weakening Section 230 will give frivolous lawsuits like the ones above a major boost. Small companies, with no margin for extra legal costs, will be under more pressure to capitulate to bogus demands over their content moderation.

Section 230 protects basic principles, whether you run a blog with a comment section, an email list with 100 users, or a platform serving millions. You have the right to moderate. You have the right to speak your own mind, and serve other users, without following the dictates of a government commission—and without fear of a bankrupting lawsuit. 

Innovation, experimentation and real competition are the best paths forward to a better internet. More lawsuits over everyday content moderation won’t get us there.
