The European Commission is set to release today a draft of the Digital Services Act, the most significant reform of European Internet regulations in two decades. The proposal, which will modernize the backbone of the EU’s Internet legislation—the e-Commerce Directive—sets out new responsibilities and rules for how Facebook, Amazon, and other companies that host content handle and make decisions about billions of users’ posts, comments, messages, photos, and videos.

This is a great opportunity for the EU to reinvigorate principles like transparency, openness, and informational self-determination. Many users feel locked into a few powerful platforms and at the mercy of algorithmic decision systems they don’t understand. It’s time to change this.

We obtained a copy of the 85-page draft and, while we are still reviewing all the sections, we zeroed in on several provisions pertaining to liability for illegal content, content moderation, and interoperability, three of the most important issues that affect users’ fundamental rights to free speech and expression on the Internet.

What we found is a mixed bag with some promising proposals. The Commission got it right by setting limits on content removal and allowing users to challenge censorship decisions. We are also glad to see that general monitoring of users is not a policy option and that liability for speech rests with the speaker, not with the platforms that host what users post or share online. But the proposal neither addresses user control over data nor requires the mega platforms to work towards interoperability. There is thus room for improvement, and we will work with the EU Parliament and the Council, which must agree on a text before it can become law, to make sure that the EU fixes what is broken and puts users back in control.

Content liability and monitoring

The new EU Internet bill preserves the key pillars of the current Internet rules embodied in the EU’s e-Commerce Directive. The Commission followed our recommendation to refrain from forcing platforms to monitor and censor what users say or upload online. It seems to have learned a lesson from recent disastrous Internet rules like Article 17 of the EU Copyright Directive, which makes platforms police users’ speech.

The draft allows intermediaries to continue to benefit from comprehensive liability exemptions, so, as a principle, they will not be held liable for user content. Thanks to a European-style “Good Samaritan” clause, this holds even where platforms voluntarily act against illegal content. However, the devil is in the details, and we need to make sure that platforms are not nudged into deploying “voluntary” upload filters.

New due-diligence obligations

The DSA sets out new due-diligence obligations for flagging illegal content that apply to all providers of intermediary services, and establishes special type- and size-oriented obligations for online platforms, including the very largest ones.

We said from the start that a one-size-fits-all approach to Internet regulation for social media networks does not work for an Internet that is monopolized by a few powerful platforms. We can therefore only support new due-diligence obligations that are matched to the type and size of the platform. The Commission rightly recognizes that the silencing of speech is a systemic risk on very large platforms and that transparency about content moderation can improve the status quo. However, we will carefully analyze other, potentially problematic provisions, such as requiring platforms to report certain types of illegal content to law enforcement authorities. Rules on supervision, investigation, and enforcement deserve in-depth scrutiny from the European Parliament and the Council.

Takedown notices and complaint handling

Here, the Commission has taken a welcome first step towards more procedural justice. Significantly, the Commission acknowledges that platforms frequently make mistakes when moderating content. Recognizing that users deserve more transparency about platforms’ decisions to remove content or close accounts, the draft regulation calls for online platforms to provide a user-friendly complaint-handling system and to restore content or accounts that were wrongly removed.

However, we have concerns that platforms, rather than courts, are increasingly becoming the arbiters of what speech can or cannot be posted online. A harmonized notification system for all sorts of content will increase the risk that a platform becomes aware of the illegality of content and can thus be held liable for it.

Interoperability measures are missing

The Commission missed the mark on giving users more freedom and control over their Internet experience, as rules on interoperability are absent from the proposal. That may be addressed in the Digital Markets Act draft proposal. If the EU wants to break the power of platforms that monopolize the Internet, it needs regulations that enable users to communicate with friends across platform boundaries or to follow their favorite content across different platforms without having to create several accounts.

Court/administrative content takedown orders

The Internet is global, and takedown orders of global reach are immensely unjust and impair users’ freedom. The draft rules address the perils of worldwide takedown orders by requiring that such orders take users’ rights into account and that their territorial scope be limited to what is necessary.

Sanctions

Under the proposed regulations, the largest platforms can be fined up to six percent of their annual revenue for violating rules about hate speech and the sale of illegal goods. Proper enforcement actions and dissuasive sanctions are important tools for changing a digital space currently monopolized by very large platforms. That being said, high fines are only as good as the substance of the regulations behind them, which we will study in great detail in the next few weeks.

Non-EU platforms

Non-EU platform providers will face compliance duties if their services have a substantial connection to the EU. The proposed rules take particular aim at companies outside the Union, such as those in the U.S., that offer services to EU users. But the criteria for imposing these duties are not clear, and we’re concerned that, if non-EU platforms are obligated to have legal representation in the EU, some will decide against offering services there.
