Social media platforms routinely make arbitrary and contradictory decisions about what speech to block or penalize. No one is happy with the status quo: not people who want more censorship, nor people who want less censorship, nor people who simply want platforms to make different choices so that already-marginalized groups won't bear the brunt of their censorship policies. So many are looking for a better way forward. EFF offered a few thoughts on this last week, but we've also been looking at another persistent and intriguing idea, spearheaded largely by our friends at Article 19: the creation of social media councils (SMCs) to review content moderation decisions. Ever since Facebook announced a plan to create its own version, there’s been a surge of interest in this approach. Can it work?

At root, the concept is relatively simple: we can’t trust the platforms to do moderation well, so maybe we need an independent council to advise them on ways to do it better, and call them out when they blow it. A council might also provide an independent appeal mechanism for content removal decisions.

There are many different models for these councils. An appeals court is one. Another is the international arbitration structure that handles domain name disputes. Or we might look to European press councils, which administer codes of practice for journalists, investigate complaints about editorial content, and defend press freedom. They are funded by the media themselves but aim to be independent.

We’re all in favor of finding ways to build more due process into platform censorship. That said, we have a lot of questions. Who determines council membership, and on what terms? What happens when members disagree? How can we ensure the council’s independence from the companies it’s intended to check? Who will pay the bills, keeping in mind that significant funding will be needed to ensure that the council is not staffed only by the few organizations that can afford to participate? What standard will the councils follow to determine whether a given decision is appropriate? How will they decide which of the millions of moderation decisions get reviewed? How can they gain the cultural fluency to understand the practices and vocabulary of every online community? Will their decisions be binding on the companies that participate, and if so, how will they be enforced? A host of additional questions are raised in a recent document from the Internet and Jurisdiction Policy Network.

But our biggest concern is that social media councils will end up either legitimating a profoundly broken system (while doing too little to fix it) or becoming a kind of global speech police, setting standards for what is and is not allowed online, whether or not that content is legal. We are hard-pressed to decide which is worse.

To help avoid either outcome, here are some guideposts, taken in part from our work on Shadow Regulation and the Santa Clara Principles:

  • Independence: SMCs should not be subject to or influenced by the platforms they are meant to advise. As a practical matter, that means that they should not depend directly on such platforms for funding (though platforms might contribute to an independently administered trust), and the platforms should not select council members. In addition, councils must be shielded from government pressures.
  • Roles: SMCs should not be regulators, but advisors. For example, an SMC might be charged with interpreting whether a platform’s decisions accurately reflect the platform’s own policies, and whether those policies themselves conform to international human rights standards. SMCs should not seek to play a legal role, such as interpreting local laws. Any such role could lead to SMCs becoming a de facto speech police, exacerbating an existing trend in which extrajudicial decision-makers effectively control huge swaths of online expression. In addition, SMC members may not have the training needed to interpret local laws (hopefully they won’t all be lawyers), or may interpret them in a biased way. To avoid these pitfalls, SMCs should focus instead on interpreting specific community standards and determining whether the company is adhering to its own rules.
  • Subject matter: Different platforms should be able to make different moderation choices—and users should be able to as well. So rather than being arbiters of “legitimate” speech, determining a global policy to which all platforms must adhere, SMCs should focus on whether an individual platform is being faithful to its own policies. SMCs can also review those policies to advise on whether they are sufficiently clear, transparent, and subject to non-discriminatory enforcement, and identify areas where they are less protective of speech than applicable law or violate international human rights norms.
  • Jurisdiction: Some have suggested that SMCs should be international, reflecting the actual reach of many social media platforms, and should seek to develop and implement a coherent international standard. We agree with Article 19 that a national approach would be better because a national council will be better placed to review company practice in light of varying local norms and expectations. A regional approach might also work if the region's law and customs are sufficiently consistent.
  • Personnel: SMCs should be staffed with a combination of experts in local, national, and international laws and norms. To promote legitimacy, members should also represent diverse views and communities and ideally should be selected by the people who will be affected by their decisions. And there must be sufficient funding to ensure that members who cannot afford to travel to meetings or donate their time are adequately compensated.
  • Transparency: The review process should be transparent and the SMC’s final opinions should be public, with appropriate anonymization to protect user privacy. In addition, SMCs should produce annual reports on their work, including data points such as how many cases they reviewed, the type of content targeted, and how they responded. New America has a good set of specific transparency suggestions (focusing on Facebook).

Facebook says it’s planning to launch its “Oversight Board” by the end of the year. We urge Facebook and others toying with the idea of social media councils to look to these guidelines. And we urge everyone else—platforms, civil society, governments, and users—to fight to ensure that efforts to promote due process in content moderation aren’t perverted to support the creation of an international, unaccountable, extrajudicial speech police force.
