We’ve taken Internet service companies and platforms like Facebook, Twitter, and YouTube to task for bad content moderation practices that remove speech and silence voices that deserve to be heard. We’ve catalogued their awful decisions. We’ve written about their ambiguous policies, inconsistent enforcement, and failure to appreciate the human rights implications of their actions. We’re part of an effort to devise a human rights framework for removing or downgrading content and accounts, and we are urging all platforms to adopt it as part of their voluntary internal governance. Just last week, we joined more than 80 international human rights groups in demanding that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all users with a fair and timely method to appeal removals and get their content back up.

These efforts have thus far been directed at urging the platforms to adopt voluntary practices rather than calling for them to be imposed by governments through law. Given the long history of governments using their power to regulate speech to promote their own propaganda, manipulate the public discourse, and censor disfavored speech, we are very reluctant to hand the U.S. government a role in controlling the speech that appears on the Internet via private platforms. This is already a problem in other countries.

We recently filed an amicus brief in a case currently before the United States Court of Appeals for the Ninth Circuit addressing the proper role, if any, U.S. courts and other branches of the U.S. government should play in content moderation decisions. In the case, Prager University claimed that YouTube violated its First Amendment rights by excluding its channel from Restricted Mode, thereby making it inaccessible to the “small subset of users, such as libraries, schools, and public institutions, who choose to have a more limited viewing experience on YouTube.” Because YouTube is a private company, and because the First Amendment only restricts government action, Prager University needs to establish that YouTube is a “state actor,” that is, in this context, a private entity functioning as the state.

First Amendment law is clear that a private entity does not become a state actor simply by opening its own platform to other speakers. Indeed, the ability of private entities to curate and edit their own platforms is itself an important First Amendment right. The First Amendment prevents the government from dictating content moderation rules and controlling what platforms can and can’t publish. It is the legal bulwark against the government compelling the press to publish articles against their will or, more broadly, compelling a whole variety of speakers to transmit messages they do not want to transmit.

We do not shy away from challenging existing law when appropriate—indeed, that’s our job as impact litigators.

But in this case, we believe that existing law serves Internet users and human rights best. Existing law—under which platforms are not constitutionally compelled to publish any user’s speech—allows for both unmoderated and moderated platforms. As our brief explains in detail, if platforms were deemed state actors, even the moderation that users want and welcome would become difficult and burdensome to carry out.

As we told the court, YouTube’s handling of the videos, like many of its content moderation decisions, was flawed in several respects, and Prager University is rightfully concerned about how the platform enforced its content rules against it. And YouTube is not alone: Facebook, Twitter, and others have made, and will continue to make, wrong decisions to take down content, and we will continue to call them out for it.

But the answer to bad content moderation isn’t to empower the government to enforce moderation practices. Rather, the answer, as we told the court, is for platforms to adopt moderation frameworks that are consistent with human rights, with clear takedown rules, fair and transparent removal processes, and mechanisms for users to appeal takedown decisions. Our brief thus concludes with a discussion of the Santa Clara Principles, a set of minimum standards we helped craft for content moderation practices that provide meaningful due process to affected speakers and better ensure that the enforcement of content guidelines is fair, unbiased, proportional, and respectful of users’ rights.