San Francisco – The Electronic Frontier Foundation (EFF) and Visualizing Impact launched Onlinecensorship.org today, a new platform to document the who, what, and why of content takedowns on social media sites. The project, made possible by a 2014 Knight News Challenge award, will address how social media sites moderate user-generated content and how free expression is affected across the globe.

Controversies over content takedowns seem to bubble up every few weeks, with users complaining about censorship of political speech, nudity, LGBT content, and many other subjects. The passionate debate about these takedowns reveals a larger issue: social media sites have an enormous impact on the public sphere, but are ultimately privately owned companies. Each corporation has its own rules and systems of governance that control users’ content, while providing little transparency about how these decisions are made.

At Onlinecensorship.org, users themselves can report on content takedowns from Facebook, Google+, Twitter, Instagram, Flickr, and YouTube. By cataloging and analyzing aggregated cases of social media censorship, Onlinecensorship.org seeks to unveil trends in content removals, provide insight into the types of content being taken down, and learn how these takedowns impact different communities of users.

“We want to know how social media companies enforce their terms of service. The data we collect will allow us to raise public awareness about the ways these companies are regulating speech,” said EFF Director for International Freedom of Expression and co-founder of Onlinecensorship.org Jillian C. York. “We hope that companies will respond to the data by improving their regulations and reporting mechanisms and processes—we need to hold Internet companies accountable for the ways in which they exercise power over people’s digital lives.”

York and Onlinecensorship.org co-founder Ramzi Jaber were inspired to action after a Facebook post in support of OneWorld’s “Freedom for Palestine” project disappeared from the band Coldplay’s page even though it had received nearly 7,000 largely supportive comments. It later became clear that Facebook took down the post after it was reported as “abusive” by several users.

“By collecting these reports, we’re not just looking for trends. We’re also looking for context, and to build an understanding of how the removal of content affects users’ lives. It’s important companies understand that, more often than not, the individuals and communities most impacted by online censorship are also the most vulnerable,” said Jaber. “Both a company’s terms of service and their enforcement mechanisms should take into account power imbalances that place already-marginalized communities at greater risk online.”

Onlinecensorship.org also offers other tools for social media users, including a guide to the often-complex appeals processes for fighting a content takedown. It will also host a collection of news reports on content moderation practices.

For Onlinecensorship.org:
https://onlinecensorship.org
