We’ve been skeptical of Facebook’s Oversight Board from day one. We’ll follow it closely and keep open minds, because we appreciate that it is a first attempt at some semblance of much-needed governance and external review. But no amount of “oversight” can fix the underlying problem: Content moderation is extremely difficult to get right, particularly at Facebook scale. And, given the tiny percentage of disputes the Board will address, we doubt that it will make much of a dent in the universe of content moderation failures.
Now that the Board’s first members have been announced, we have new concerns. Content moderation errors disproportionately impact vulnerable communities—many of which are located outside of the United States. And yet, twenty-five percent of the initial Board is composed of Americans, with still others located or working in the United States. Many members seem to have general experience with law or institution-building, which may be helpful, but we’re not seeing much specific experience with international human rights frameworks.
We’re also concerned about who’s not on the Board. As Kara Swisher remarked in the New York Times, “[T]here are no loudmouths, no cranky people and, most important, no one truly affected by the dangerous side of Facebook.”
We see too few individuals from certain regions, such as the Middle East and North Africa, and Southeast Asia. We don’t see any LGBTQ advocates, transgender Board members, or disability advocates. And we don’t see much representation of the Internet users who create the content at issue. Although the Oversight Board is designed to identify and decide the most globally significant disputes, the Board in its current composition seems more directed at addressing parochial U.S. concerns, such as the alleged moderation of conservative speakers, than at issues that are far more globally relevant—such as the frequent removal of documentation of human rights violations or the inconsistent enforcement of the rules.
Part of the difficulty here is that Facebook has tried to replicate the concept of a court but removed a crucial aspect of many court systems: the checks and balances built into the appointment of judges. In well-functioning democracies, judges may be elected directly by voters—and dismissed by them—or appointed by one branch of government and then confirmed by another. While not perfect, these processes create a system of accountability.
Finally, Board members need experience with content moderation and its discontents. Content moderation is an incredibly complex practice, and one that is difficult to fully understand, particularly in the face of corporate opacity. We are concerned that a Board lacking content moderation experts will rely on Facebook—a company that has been notoriously opaque about its internal operations—to provide its knowledge of the practice. The Board should look instead to the many outside people and organizations, including EFF, that have been working in this area, some for decades, for an in-depth understanding of the challenges it faces. One good feature of the Board is its ability to call on outside experts—and it should do so, early and often.