Correction—August 7, 2018: Although Facebook found connections between accounts linked to Russia's Internet Research Agency (IRA) and the accounts connected to the canceled event, a post by Chief Security Officer Alex Stamos states that Facebook is not attributing the "coordinated inauthentic behavior" of these accounts to a specific group or country. We regret the error.

Facebook stumbled this week—again—in its effort to police "misinformation": it deleted an event page for the anti-fascist protest "No Unite the Right 2 - DC."

Facebook justified the deletion by claiming that the event was initially created by an "inauthentic" organization with possible foreign connections. In fact, a number of legitimate local organizations and activists had become involved in administering and planning the event. Facebook gave these activists no opportunity to explain or present evidence that they were involved in what had become a very real protest, and no chance to dispute claims that the original organizers had Russian connections.

So what makes a protest “real”? Is it who organizes it, or who attends? And what happens when a bad actor creates an event with the intent to sow discord, but prospective attendees take it seriously and make it their own? These are all questions that Facebook is going to have to grapple with as it cracks down on misinformation ahead of US midterm elections.

But first, the company should ask itself how it can reform its content removal policies so that users have a chance to challenge removals before they happen. The event page for "No Unite the Right 2 - DC" may have been created by Resisters, a group suspected by Facebook of being tied to Russia’s Internet Research Agency, but to the organizations that were involved in planning the protest, and the more than two thousand users who had registered to attend, the event was very real. Many of those groups and individuals are now, rightfully, angry that Facebook chose to remove their page without giving them an opportunity to explain or document their involvement.

In a press release, Facebook admits that it doesn’t “have all the facts,” and says that the legitimate groups' pages “unwittingly helped build interest in ‘No Unite Right 2 – DC’ and posted information about transportation, materials, and locations so people could get to the protests.” Facebook doesn’t seem to consider that, to the participants, there was nothing unwitting about their involvement in an anti-Unite the Right protest, or what effect the removal of the group pages will have on them.

The decision is reminiscent of another one the company made nearly eight years ago. Just a few months prior to the uprising in Egypt that would eventually topple long-time dictator Hosni Mubarak, Facebook removed a page called “We Are All Khaled Said”—the same page that later called for the January 25 street protests. The decision to remove the page stemmed from the fact that its administrator was using an “inauthentic” name—but after being contacted by NGOs, the company allowed the page to remain up so long as another administrator stepped in. The legitimate administrators of “No Unite Right 2 – DC” weren’t given the same option.

A lot has happened between 2010 and today, but one thing remains the same: Facebook’s executives continue to make bad—and potentially influential—decisions about what is “authentic.” In this case, we believe that the legitimate organizers of the event—which reportedly include 18 different local groups—should have had a say in how their event page was handled, and how prospective attendees were contacted. The Santa Clara Principles, a set of minimum standards for content moderation created by EFF and other free expression advocates, expressly calls on social media platforms to provide human review of content removal and give users a meaningful and timely opportunity to present additional information as part of that review.

If all it takes going forward to get an event canceled is one bad actor’s involvement in it, then Facebook is likely going to be dealing with this sort of situation again. Therefore, it’s imperative that the company devise a consistent and fair strategy that allows legitimate participants in a group or event to have a stake in how their page is governed.