Many in the U.S. have spent 2020 debating the problems of content moderation on social media platforms, misinformation and disinformation, and the perceived censorship of political views. But globally, these issues have been in the spotlight for a decade.

This year is the tenth anniversary of what became known as the "Arab Spring", in which activists and citizens across the Middle East and North Africa (MENA) used social media to document the conditions in which they lived, to push for political change and social justice, and to draw the world's attention to their movement. For many, it was the first time they had seen the Internet play a role in the push for human rights around the world. Emerging social media platforms like Facebook, Twitter, and YouTube all basked in the reflected glory of press coverage that centered their part in the protests, often to the exclusion of those who were actually on the streets.

The years after the uprisings failed to live up to the optimism of the time. Offline, the authoritarian backlash against the democratic protests has meant that many of those who fought for justice a decade ago are still fighting now. And rather than help that fight, the platform policies and content moderation procedures of the tech giants now too often lead to the silencing and erasure of critical voices from across the region. Arbitrary and non-transparent account suspensions and removals of political and dissenting speech have become so frequent and systematic in the region that they cannot be dismissed as isolated incidents or the result of transitory errors in automated decision-making.

Along with dozens of other organizations, EFF today signed an open letter to Facebook, Twitter, and YouTube demanding that the companies stop silencing critical voices in the MENA region. The letter asks for several concrete measures to ensure that users across the region are treated fairly and are able to express themselves freely:

  • Do not engage in arbitrary or unfair discrimination.
  • Invest in the regional expertise to develop and implement context-based content moderation decisions aligned with human rights frameworks.
  • Pay special attention to cases arising from war and conflict zones.
  • Preserve restricted content related to cases arising from war and conflict zones.
  • Go beyond public apologies for technical failures, and provide greater transparency, notice, and offer meaningful and timely appeals for users by implementing the Santa Clara Principles on Transparency and Accountability in Content Moderation.

Content moderation policies are not only critical to ensuring robust political debate; they are key to expanding and protecting human rights. Ten years out from those powerful protests, it's clear that authoritarian and repressive regimes will do everything in their power to stop free and open expression. Platforms have an obligation to recognize and act on the effects their content moderation has on oppressed communities, in MENA and elsewhere.

In 2012, Mark Zuckerberg, CEO and founder of Facebook, wrote:

By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few.

Instead, governments around the world have chosen authoritarianism, and platforms have contributed to the repression. It's time for that to end.

Read the full letter demanding that Facebook, Twitter, and YouTube stop silencing critical voices from the Middle East and North Africa, reproduced below.

17 December 2020

Open Letter to Facebook, Twitter, and YouTube: Stop silencing critical voices from the Middle East and North Africa

Ten years ago today, 26-year-old Tunisian street vendor Mohamed Bouazizi set himself on fire in protest over injustice and state marginalization, igniting mass uprisings in Tunisia, Egypt, and other countries across the Middle East and North Africa.

As we mark the 10th anniversary of the Arab Spring, we, the undersigned activists, journalists, and human rights organizations, have come together to voice our frustration and dismay at how platform policies and content moderation procedures all too often lead to the silencing and erasure of critical voices from marginalized and oppressed communities across the Middle East and North Africa.

The Arab Spring is historic for many reasons, and one of its outstanding legacies is how activists and citizens have used social media to push for political change and social justice, cementing the internet as an essential enabler of human rights in the digital age.   

Social media companies boast of the role they play in connecting people. As Mark Zuckerberg famously wrote in his 2012 Founder’s Letter:

“By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few.”

Zuckerberg’s prediction was wrong. Instead, more governments around the world have chosen authoritarianism, and platforms have contributed to their repression by making deals with oppressive heads of state; opening doors to dictators; and censoring key activists, journalists, and other changemakers throughout the Middle East and North Africa, sometimes at the behest of other governments:

  • Tunisia: In June 2020, Facebook permanently disabled more than 60 accounts of Tunisian activists, journalists, and musicians on scant evidence. While many were reinstated, thanks to the quick reaction from civil society groups, the accounts of Tunisian artists and musicians still have not been restored. We sent a coalition letter to Facebook on the matter but received no public response.
  • Syria: In early 2020, Syrian activists launched a campaign to denounce Facebook’s decision to take down or disable thousands of anti-Assad accounts and pages that had documented war crimes since 2011, under the pretext of removing terrorist content. Despite the appeal, a number of those accounts remain suspended. Similarly, Syrians have documented how YouTube is literally erasing their history.
  • Palestine: Palestinian activists and social media users have been campaigning since 2016 to raise awareness around social media companies’ censorial practices. In May 2020, at least 52 Facebook accounts of Palestinian activists and journalists were suspended, and more have since been restricted. Twitter suspended the account of a verified media agency, Quds News Network, reportedly on suspicion that the agency was linked to terrorist groups. Requests to Twitter to look into the matter have gone unanswered. Palestinian social media users have also expressed concern numerous times about discriminatory platform policies.
  • Egypt: In early October 2019, Twitter suspended en masse the accounts of Egyptian dissidents living in Egypt and across the diaspora, directly following the eruption of anti-Sisi protests in Egypt. Twitter suspended the account of one activist with over 350,000 followers in December 2017, and it has yet to be restored. The same activist’s Facebook account was also suspended in November 2017 and restored only after international intervention. YouTube had removed his account earlier, in 2007.

Examples such as these are far too numerous, and they contribute to the widely shared perception among activists and users in MENA and the Global South that these platforms do not care about them, and often fail to protect human rights defenders when concerns are raised.  

Arbitrary and non-transparent account suspensions and removals of political and dissenting speech have become so frequent and systematic that they cannot be dismissed as isolated incidents or the result of transitory errors in automated decision-making.

While Facebook and Twitter can be swift in responding to public outcry from activists or private advocacy by human rights organizations (particularly in the United States and Europe), in most cases responses to advocates in the MENA region leave much to be desired. End-users are frequently not informed of which rule they violated, and are not provided a means to appeal to a human moderator. 

Remedy and redress should not be a privilege reserved for those who have access to power or can make their voices heard. The status quo cannot continue. 

The MENA region has one of the world’s worst records on freedom of expression, and social media remains critical for helping people connect, organize, and document human rights violations and abuses. 

We urge you to not be complicit in censorship and erasure of oppressed communities’ narratives and histories, and we ask you to implement the following measures to ensure that users across the region are treated fairly and are able to express themselves freely:

  • Do not engage in arbitrary or unfair discrimination. Actively engage with local users, activists, human rights experts, academics, and civil society from the MENA region to review grievances. Regional political, social, cultural context(s) and nuances must be factored in when implementing, developing, and revising policies, products and services. 
  • Invest in the necessary local and regional expertise to develop and implement context-based content moderation decisions aligned with human rights frameworks in the MENA region.  A bare minimum would be to hire content moderators who understand the various and diverse dialects and spoken Arabic in the twenty-two Arab states. Those moderators should be provided with the support they need to do their job safely, healthily, and in consultation with their peers, including senior management.
  • Pay special attention to cases arising from war and conflict zones to ensure content moderation decisions do not unfairly target marginalized communities. For example, documentation of human rights abuses and violations is a legitimate activity distinct from disseminating or glorifying terrorist or extremist content. As noted in a recent letter to the Global Internet Forum to Counter Terrorism, more transparency is needed regarding definitions and moderation of terrorist and violent extremist content (TVEC).
  • Preserve restricted content related to cases arising from war and conflict zones that Facebook makes unavailable, as it could serve as evidence for victims and organizations seeking to hold perpetrators accountable. Ensure that such content is made available to international and national judicial authorities without undue delay.
  • Public apologies for technical errors are not sufficient when erroneous content moderation decisions are not changed. Companies must provide greater transparency, notice, and offer meaningful and timely appeals for users. The Santa Clara Principles on Transparency and Accountability in Content Moderation, which Facebook, Twitter, and YouTube endorsed in 2019, offer a baseline set of guidelines that must be immediately implemented. 

Signed,

Access Now
Arabic Network for Human Rights Information — ANHRI
Article 19
Association for Progressive Communications — APC
Association Tunisienne de Prévention Positive
Avaaz
Cairo Institute for Human Rights Studies (CIHRS)
The Computational Propaganda Project
Daaarb — News — website
Egyptian Initiative for Personal Rights
Electronic Frontier Foundation
Euro-Mediterranean Human Rights Monitor
Global Voices
Gulf Centre for Human Rights (GCHR)
Hossam el-Hamalawy, journalist and member of the Egyptian Revolutionary Socialists Organization
Humena for Human Rights and Civic Engagement
IFEX
Ilam- Media Center For Arab Palestinians In Israel
ImpACT International for Human Rights Policies
Initiative Mawjoudin pour l’égalité
Iraqi Network for Social Media - INSMnetwork
I WATCH Organisation (Transparency International — Tunisia)
Khaled Elbalshy - Daaarb website - Editor in Chief
Mahmoud Ghazayel, Independent
Marlena Wisniak, European Center for Not-for-Profit Law
Masaar — Technology and Law Community
Michael Karanicolas, Wikimedia/Yale Law School Initiative on Intermediaries and Information
Mohamed Suliman, Internet activist
My.Kali magazine — Middle East and North Africa
Palestine Digital Rights Coalition (PDRC)
The Palestine Institute for Public Diplomacy
Pen Iraq
Quds News Network
Ranking Digital Rights
Rima Sghaier, Independent
Sada Social Center
Skyline International for Human Rights
SMEX
Syrian Center for Media and Freedom of Expression (SCM)
The Tahrir Institute for Middle East Policy (TIMEP)
Taraaz
Temi Lasade-Anderson, Digital Action
WITNESS
Vigilance Association for Democracy and the Civic State — Tunisia
7amleh – The Arab Center for the Advancement of Social Media