While 2019 saw the EU ram through a disastrous Internet copyright rule that continues to reverberate through legal and policy circles, 2020 was a very different story: the EU introduced the Digital Services Act (DSA), the most significant reform of Europe’s platform legislation in twenty years. It is an unparalleled opportunity to formulate a bold, evidence-based vision to address today’s most pressing challenges.

One area we’re especially excited by is the EU’s cautious enthusiasm for interoperability, an anti-monopoly remedy that is highly specific to tech and deeply embedded in the history of technology. Early rumblings about future enforcement hint at interop's centrality to the EU's coming strategy, with some specific interoperability mandates under discussion.

In our policy advocacy surrounding the DSA, we will focus on four key areas: platform liability, interoperability mandates, procedural justice, and user control. In introducing the principles that will guide our policy work, our message to the EU has been clear: Preserve what works. Fix what is broken. And put users back in control.

Limited Liability and No Monitoring: Preserve What Works

The DSA is an important chance to update the legal responsibilities of platforms and enshrine users’ rights vis-à-vis the powerful gatekeeper platforms that control much of our online environment. But there is also a risk that the Digital Services Act will follow in the footsteps of recent regulatory developments in Germany, France, and Austria. The German NetzDG, the French Avia bill (which we helped bring down in court), and the Austrian law against hate speech (which we advised the Commission to push back against) show a worrying trend in the EU to force platforms to police users’ content without considering what matters most: giving a voice to the users affected by content takedowns.

In our detailed feedback to the EU on this question, we stress fair and just notice-and-action procedures: strong safeguards that protect users’ rights when their content is taken down or made inaccessible.

  1. Reporting Mechanisms: Intermediaries should not be held liable for choosing not to remove content simply because they received a private notification from a user. With narrow exceptions, the EU should adopt the principle that intermediaries obtain actual knowledge of illegality only when they are presented with a court order.
  2. A Standard for Transparency and Justice in Notice and Action: Platforms should provide a user-friendly, visible, and swift appeals process that allows for the meaningful resolution of content moderation disputes. Appeals mechanisms must be accessible, easy to use, follow a clearly communicated timeline, and include human review.
  3. Open the Black Box that is Automated Decision Making: In light of automated content moderation’s fundamental flaws, platforms should provide as much transparency as possible about how they use algorithmic tools.
  4. Reinstatement of Wrongfully Removed Content: Because erroneous content moderation decisions are so common and have such negative effects, it is crucial that platforms reinstate users’ content when the removal cannot be justified by a sensible interpretation of the platform’s rules or was simply made in error.
  5. Coordinated and Effective Regulatory Oversight: Coordination between independent national authorities should be strengthened to enable EU-wide enforcement, and platforms should be incentivized to comply with their due diligence duties through, for example, meaningful sanctions harmonized across the European Union.

Facing the most significant reform of Internet law in two decades, the EU should choose to protect the Internet rather than coerce online platforms into policing their users. EFF intends to fight for users’ rights, transparency, anonymity, and limited liability for online platforms every step of the way.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2020.