Washington, D.C.—The Electronic Frontier Foundation (EFF) called on Facebook, Google, and other social media companies today to publicly report how many user posts they take down, provide users with detailed explanations about takedowns, and implement appeals policies to boost accountability.

EFF, ACLU of Northern California, Center for Democracy & Technology, New America’s Open Technology Institute, and a group of academic experts and free expression advocates today released the Santa Clara Principles, a set of minimum standards for tech companies to augment and strengthen their content moderation policies. The plain language, detailed guidelines call for disclosing not just how and why platforms are removing content, but how much speech is being censored. The principles are being released in conjunction with the second edition of the Content Moderation and Removal at Scale conference. Work on the principles began during the first conference, held in Santa Clara, California, in February.

“Our goal is to ensure that enforcement of content guidelines is fair, transparent, proportional, and respectful of users’ rights,” said EFF Senior Staff Attorney Nate Cardozo.

In the aftermath of violent protests in Charlottesville and elsewhere, social media platforms have faced increased calls to police content, shut down more accounts and delete more posts. But in their quest to remove perceived hate speech, they have all too often wrongly removed perfectly legal and valuable speech. Paradoxically, marginalized groups have been especially hard hit by this increased policing, hurting their ability to use social media to publicize violence and oppression in their communities. And the processes used by tech companies are tremendously opaque. When speech is being censored by secret algorithms, without meaningful explanation, due process, or disclosure, no one wins.

“Users deserve more transparency and greater accountability from platforms that play an outsized role—in Myanmar, Australia, Europe, and China, as well as in marginalized communities in the U.S. and elsewhere—in deciding what can be said on the Internet,” said Jillian C. York, EFF Director for International Freedom of Expression. “Users need to know why some language is allowed and the same language in a different post isn’t. They also deserve to know how their posts were flagged—did a government flag it, or was it flagged by the company itself? And we all deserve a chance to appeal decisions to block speech.”

“The Santa Clara Principles are the product of years of effort by privacy advocates to push tech companies to provide users with more disclosure and a better understanding of how content policing works,” said Cardozo. “Facebook and Google have taken some steps recently to improve transparency, and we applaud that. But it’s not enough. We hope to see the companies embrace the Santa Clara Principles and move the bar on transparency and accountability even higher.”

The three principles urge companies to:

  • publish the number of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;
  • provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and
  • provide human review of content removal by someone not involved in the initial decision, and enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.

The Santa Clara Principles continue EFF’s work advocating for free expression online and greater transparency about content moderation. Since 2015 EFF has been collecting reports of online takedowns through its Onlinecensorship.org project, which shines a light on what content is taken down, why companies make certain decisions about content, and how content takedowns are affecting communities of users around the world.

EFF’s annual Who Has Your Back report, which started in 2010, has revealed which companies are the best and worst at disclosing when they give users’ private information to the government. This year’s Who Has Your Back report will focus exclusively on private censorship issues. Future projects will examine transparency about content policing policies, with the Santa Clara Principles used as a benchmark for the minimum standards companies should have in place.

“Content takedown and account deactivation practices can have a profound effect on the lives and work of individuals in different parts of the world,” said York, cofounder of Onlinecensorship.org. “The companies removing online speech should be up front about their content policing policies. Users are being kept in the dark, voices that should be heard are being silenced forever by automation, and that must change.”

Santa Clara Principle participants:
ACLU Foundation of Northern California
Center for Democracy & Technology
Electronic Frontier Foundation
New America’s Open Technology Institute
Irina Raicu (Markkula Center for Applied Ethics, Santa Clara University)
Nicolas Suzor (Queensland University of Technology)
Sarah T. Roberts (Department of Information Studies, School of Education & Information Studies, UCLA)
Sarah Myers West (USC Annenberg School for Communication and Journalism)

For the text of the principles:

For more on content moderation: