Censorship doesn’t always look like a black line across a document or a clear order to remove a piece of content. Websites can feel pressured to host certain speakers or carry certain content even when the government never issues an explicit directive. The First Amendment recognizes that speech can be “chilled” in other ways, for example, by a burdensome governmental investigation. In an amicus brief filed yesterday, EFF, the Center for Democracy and Technology, and R Street urged the Ninth Circuit Court of Appeals to rehear the case “en banc” and protect Twitter from a retaliatory investigation by Texas Attorney General Ken Paxton.
Paxton’s civil investigative demand (CID) subjected Twitter’s internal discussions about its content moderation rules and decisions to discovery and to second-guessing by AG Paxton. This put Twitter in a difficult position, pressuring it to minimize its legal, reputational, and financial risks by self-censoring along the lines Paxton indicated. Twitter sued Paxton, claiming that he was “abusing his authority as the highest law-enforcement officer of the State of Texas to intimidate, harass, and target Twitter in retaliation for Twitter’s exercise of its First Amendment rights.”
Last week, a panel of judges on the Ninth Circuit wrongly ruled that Twitter cannot sue Paxton unless he brings an enforcement action at the conclusion of his investigation, or until a court enforces the CID. But as our brief to the Ninth Circuit says, “even pre-enforcement, threatened punishment of speech has a chilling effect.” Because the panel got this wrong, a more comprehensive “en banc” rehearing is needed. From the moment it was issued, the CID chilled Twitter from exercising its First Amendment-protected right to engage in content moderation. Requiring the company to endure even more retaliation by Paxton before it can sue harms Twitter’s First Amendment rights.
From the brief:
An investigation and CID from a state attorney general for documents about a host’s content moderation practices—particularly when coupled with the attorney general’s critical public statements about the host’s content moderation decisions—send a strong message of disapproval and threat of legal consequences for the host if it continues its “disfavored” content moderation actions. Hosts targeted by a CID as part of a state attorney general’s retaliatory investigation will fear harsh legal consequences if they continue content moderation practices like those that sparked the investigation. In the face of such retaliation, a host may believe that the state attorney general will treat it more leniently or drop an investigation entirely if it ceases the content moderation practices with which the attorney general disagrees.
Paxton’s investigation is part of a trend of government officials in the United States using investigations to pressure or punish hosts for making content moderation decisions with which they disagree. This is bad for everyone: access to online platforms with different rules and environments generally benefits users. The Ninth Circuit’s decision risks encouraging this unconstitutional trend. There are certainly content moderation problems on online platforms, which have rightfully been criticized for removing benign posts, censoring human rights activists and journalists, and engaging in other bad content moderation practices, as we noted in our brief. But a chilling government investigation is not the right way to resolve those problems.