Update: On 16 June, following approval by the Council’s Permanent Representatives Committee, the EU Parliament’s Internal Market Committee overwhelmingly endorsed the deal on the EU’s Digital Services Act. EFF helped ensure that the final language steered clear of intrusive filter obligations. The DSA is expected to be formally adopted by EU lawmakers in July.

The European Union reached another milestone late last week in its journey to pass the Digital Services Act (DSA) and revamp regulation of digital platforms to address a myriad of problems users face—from overbroad content takedown rules to weak personal data privacy safeguards. There’s a lot to like in the new DSA agreement EU lawmakers reached, and a lot to fear.

Based on what we have learned so far, the deal avoids transforming social networks and search engines into censorship tools, which is great news. Far too many proposals launched since work on the DSA began in 2020 posed real risks to free expression by making platforms the arbiters of what can be said online. The new agreement rejects takedown deadlines that would have squelched legitimate speech. It also remains sensitive to the international nature of online platforms, which will have to consider regional and linguistic aspects when conducting risk assessments.

What’s more, the agreement retains important e-Commerce Directive principles that helped make the internet free, such as liability exemptions for intermediaries and limits on user monitoring. And it imposes higher standards for transparency around content moderation and gives users more control over algorithmically curated recommendations.

But the agreement isn’t all good news. Although it takes crucial steps to limit pervasive online behavioral surveillance practices and rejects the concerning parliamentary proposal to mandate cell phone registration for pornographic content creators, it fails to grant users explicit rights to encrypt their communications and to use digital services anonymously, so they can speak freely and protect their private conversations. In light of an upcoming regulation that, in the worst case, could make government scanning of user messages mandatory throughout the EU, the DSA is a missed opportunity to reject any measure that leads to spying on people’s private communications. In addition, new due diligence obligations could, in certain situations, incentivize platforms to over-remove content to avoid being held liable for it.

We’re also worried about the flawed “crisis response mechanism” proposal, introduced by the Commission in closed-door trilogue negotiations, which gives the European Commission too much power to control speech on large platforms whenever it decides there’s a crisis. But we were glad to see it tempered by an extra step requiring the Commission to first get a green light from national independent platform regulators. The Commission will also have to take due regard of the gravity of the crisis and consider how any measures taken will impact fundamental rights.

Finally, the agreement retains provisions allowing government agencies to order a broad range of providers to remove allegedly illegal content, and giving governments alarming powers to uncover data about anonymous speakers and everyone else. Without specific and comprehensive limitations, these provisions add up to enforcement overreach that will interfere with the right to privacy and threaten the foundations of a democratic society. Unfortunately, European lawmakers didn’t introduce the necessary human rights-focused checks and balances into the agreement to safeguard users against abuse of these powers.

The agreement is not the end of the process: the text is still subject to technical revisions and discussions, and may not be released in its final form for days or weeks. We will be analyzing the details as we get them, so stay tuned. Once the DSA text is finalized, it still needs to be voted into law before taking effect.