Today’s decision from the Facebook Oversight Board regarding the suspension of President Trump’s account — to extend the suspension for six months and require Facebook to reevaluate in light of the platform’s stated policies — may be frustrating to those who had hoped for a definitive ruling. But it is also a careful and needed indictment of Facebook’s opaque and inconsistent approach to content moderation, and it offers several recommendations to help Facebook do better, focused especially on consistency and transparency. Those two qualities should be the hallmarks of every content decision; too often, neither is met. Perhaps most importantly, the Board affirms that it cannot and should not allow Facebook to avoid its responsibilities to its users. We agree.

The decision is long, detailed, and worth careful review. In the meantime, here’s our top-level breakdown:

First, while the Oversight Board rightly refused to make special rules for politicians, rules we have previously opposed, it did endorse special rules and procedures for “influential users” and newsworthy posts. These rules recognize that some users can cause greater harm than others. On a practical level, every decision to remove a post or suspend an account is highly contextual and often requires highly specific cultural competency. But we agree that special rules for influential users or highly newsworthy content require even greater transparency and the investment of substantial resources.

Specifically, the Oversight Board explains that Facebook needs to document all of these special decisions well; clearly explain how any newsworthiness allowance applies to influential accounts; and clearly explain how it cross-checks such decisions, including its rationale, standards, and processes of review, and the criteria for determining which pages to include. Facebook should also report error rates and the thematic consistency of these determinations as compared with its ordinary enforcement procedures.

More broadly, the Oversight Board also correctly notes that Facebook's penalty system is unclear and that it must better explain its strikes and penalties process, and inform users of strikes and penalties levied against them.

We wholeheartedly agree, as the Oversight Board emphasized, that “restrictions on speech are often imposed by or at the behest of powerful state actors against dissenting voices and members of political oppositions” and that “Facebook must resist pressure from governments to silence their political opposition.” The Oversight Board urged Facebook to treat such requests with special care. We would also have required that all such requests be publicly reported.

The Oversight Board also correctly noted the need for Facebook to collect and preserve removed posts. Such posts are important for preserving the historical record as well as for human rights reporting, investigations, and accountability.

While today’s decision reflects a notable effort to apply an international human rights framework, we continue to be concerned that an Oversight Board that is US-focused in its composition is not best positioned to help Facebook do better. But the Oversight Board did recognize the international dimension of the issues it confronts, endorsing the Rabat Plan of Action, from the United Nations Office of the High Commissioner for Human Rights, as a framework for assessing the removal of posts that may incite hostility or violence. Notably, it did not apply the First Amendment, even though the events leading to the decision were centered in the US.

Overall, these are good recommendations and we will be watching to see if Facebook takes them seriously. And we appreciate the Oversight Board’s refusal to make Facebook’s tough decisions for it. If anything, though, today’s decision affirms, once again, that no amount of “oversight” can fix the underlying problem: Content moderation is extremely difficult to get right, particularly at Facebook scale.