San Francisco—The Electronic Frontier Foundation (EFF) and more than 70 human and digital rights groups called on Mark Zuckerberg today to add real transparency and accountability to Facebook’s content removal process. Specifically, the groups demand that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all users with a fair and timely method to appeal removals and get their content back up.

While Facebook is under enormous—and still mounting—pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire, silencing the very people who should have their voices heard on the platform.

Politicians, museums, celebrities, and other high-profile groups and individuals whose improperly removed content can garner media attention seem to have little trouble reaching Facebook to have content restored—they sometimes even receive an apology. But the average user? Not so much. Facebook only allows people to appeal content decisions in a limited set of circumstances, and in many cases, users have no option to appeal at all. Onlinecensorship.org, an EFF project that collects users’ reports of content takedowns, has documented hundreds of unjustified takedown incidents where appeals were unavailable. For most users, content that Facebook removes is rarely restored, and some are banned from the platform for no good reason.

EFF, Article 19, the Center for Democracy and Technology, and Ranking Digital Rights wrote directly to Mark Zuckerberg today demanding that Facebook implement common sense standards so that average users can easily appeal content moderation decisions, receive prompt replies and timely review by a human or humans, and have the opportunity to present evidence during the review process. The letter was co-signed by more than 70 human rights, digital rights, and civil liberties organizations from South America, Europe, the Middle East, Asia, Africa, and the U.S.

“You shouldn’t have to be famous or make headlines to get Facebook to respond to bad content moderation decisions, but that’s exactly what’s happening,” said EFF Director for International Freedom of Expression Jillian York. “Mark Zuckerberg created a company that’s the world’s premier communications platform. He has a responsibility to all users, not just those who can make the most noise and potentially make the company look bad.”

In addition to implementing a meaningful appeals process, EFF and partners called on Mr. Zuckerberg to issue transparency reports on community standards enforcement that include a breakdown of the type of content that has been restricted, data on how the content moderation actions were initiated, and the number of decisions that were appealed and found to have been made in error.

“Facebook is way behind other platforms when it comes to transparency and accountability in content censorship decisions,” said EFF Senior Information Security Counsel Nate Cardozo. “We’re asking Mr. Zuckerberg to implement the Santa Clara Principles, and release actual numbers detailing how often Facebook removes content—and how often it does so incorrectly.”

“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with,” said Cardozo. “Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook.”

For the letter:
https://santaclaraprinciples.org/open-letter/

For the Santa Clara Principles:
https://santaclaraprinciples.org/

For more information on private censorship:
https://www.eff.org/deeplinks/2018/09/platform-censorship-lessons-copyright-wars
https://www.eff.org/deeplinks/2018/04/smarter-privacy-rules-what-look-what-avoid

Contact: