The EU's Digital Services Act is a chance to preserve what works and to fix what is broken. EFF and other civil society groups have advocated for new rules that protect fundamental rights online, while formulating a bold vision to address today's most pressing challenges. However, while the initial proposal by the EU Commission got several things right, the EU Parliament is toying with the idea of introducing a new filternet, made in Europe. Some politicians believe that any active platform should potentially be held liable for the communications of its users, and they trust that algorithmic filters can swiftly remove illegal content.

In an opinion piece published on "heise online" on 8 November 2021 under a CC BY 4.0 license, Julia Reda, head of the control © project at the civil rights NGO Gesellschaft für Freiheitsrechte (GFF) and former Member of the European Parliament, has analyzed the current proposals and explained what is at stake for internet users. We have translated this text below.

Edit Policy: Digital Services Act derailed in the European Parliament

It's a familiar pattern in digital policy: the EU Commission makes a proposal that threatens fundamental digital rights. Civil society then mobilizes protests and relies on the directly elected European Parliament to prevent the worst. In the case of the EU's most important legislative project for regulating online platforms, however – the Digital Services Act – the most dangerous proposals are now coming from the European Parliament itself, after the EU Commission's draft law had turned out to be surprisingly friendly to fundamental rights.

Apparently, the European Parliament has learned nothing from the debacle surrounding Article 17 of the Copyright Directive. It is now threatening a dystopian set of rules that would promote the widespread use of error-prone upload filters, allow the entertainment industry to block content at the push of a button, and encourage disinformation by tabloid media on social networks.

Vote on Digital Services Act postponed

The Committee on the Internal Market and Consumer Protection was scheduled to vote this Monday on its position on the Digital Services Act, in order to be able to start negotiations with the Commission and the Council. Instead, a hearing of Facebook whistleblower Frances Haugen was on yesterday's agenda (8 November). The postponement of the vote is due to a disagreement among MEPs about the basic principles of platform regulation. Support for a complete departure from the tried-and-tested system of limited liability for Internet services is growing, directly threatening our freedom of communication on the Net.

The Digital Services Act is the mother of all platform laws. Unlike Article 17, the draft law is intended to regulate not only liability for copyright infringement on selected commercial platforms, but liability for all illegal activities of users on all types of hosting providers, from Facebook to non-commercial hobby discussion forums. Even if platforms block content on the basis of their general terms and conditions, the Digital Services Act is intended to define basic rules for this in order to strengthen users' rights against arbitrary decisions. In view of the balanced draft by the EU Commission, it is all the more astonishing what drastic restrictions on fundamental rights are now gaining acceptance in the European Parliament. The following three proposals are among the most dangerous.

Entertainment industry wants blocking within 30 minutes

Until now, platforms have not been liable for illegal uploads by their users, as long as they remove them expeditiously after becoming aware of an infringement. How quickly content must be removed depends on the individual case – for example, on whether an infringement can be clearly determined following an alert. Courts often deliberate for years on whether a particular statement constitutes an unlawful insult. In such borderline cases, no company can be expected to decide on blocking within the shortest possible time. For this reason, European legislators have so far refrained from imposing strict deletion deadlines.

This is set to change: the European Parliament's rapporteur for the Digital Services Act, Denmark's Christel Schaldemose, is demanding that platforms block illegal content within 24 hours if the content poses a threat to public order. It is so unclear when exactly an upload on a social network poses a threat to public order that platforms will have little choice but to block content on demand within 24 hours.

The European Parliament's co-advisory Legal Affairs Committee, which has already adopted its position on the Digital Services Act, goes even further and wants to give the entertainment industry in particular a free pass to block uploads. Livestreams of sports or entertainment events are to be blocked within 30 minutes; sports associations had already lobbied for similar special rules during the copyright reform. Such short deletion periods can only be met with automated filters – it is hardly possible for humans to check within such a short time whether a blocking request is justified at all.

More dangerous than the Network Enforcement Act

Strict removal deadlines for illegal content are already familiar from the German Network Enforcement Act (NetzDG). Nevertheless, the proposals under discussion at the EU level are more dangerous in many respects. First, the obligation under the NetzDG to block reported content within 24 hours is limited to obviously illegal content and to a few large platforms. The European Parliament rapporteur's proposal contains no such restrictions.

Second, the consequences of violating the deletion deadlines differ significantly between the NetzDG and the Digital Services Act. The NetzDG provides for fines if a platform systematically violates the requirements of the law. In plain language, this means that exceeding the 24-hour deadline once does not automatically lead to a penalty.

Under the planned Digital Services Act, on the other hand, the deletion deadlines would become a prerequisite for the platforms' limited liability: for content that a platform has not blocked within 24 hours of being notified, the platform operator would be liable as if it had committed the illegal act itself. In the case of copyright, for example, platforms would face horrendous damages claims for every single piece of content affected. The incentive to simply block all reported content without review is much greater here than under the NetzDG.

Elsewhere, the European Parliament rapporteur also wants to punish platforms for misconduct by making them directly liable for their users' infringements – for example, if the platforms violate transparency obligations. As important as transparency is, this approach carries great dangers. Infringements by platforms can always occur; the appropriate response is strict market supervision and the imposition of fines, which may well be high.

But if any violation of the rules by platforms immediately threatens the loss of the liability safe harbor, the legislator creates an incentive for platforms to monitor the behavior of their users as closely as possible using artificial intelligence. Such systems have high error rates and also block scores of completely innocuous, legal content – as Facebook whistleblower Frances Haugen recently underlined once again.

Article 17 does not yet go far enough for the Legal Affairs Committee

The Legal Affairs Committee envisages that organizations from the entertainment industry can be recognized as so-called "trusted flaggers", able to obtain the immediate blocking of content on platforms on their own authority and only required to account once a year for which content was affected. This regulation opens the door to abuse. Even platforms that are not yet forced to use upload filters under the copyright reform would then automatically implement the blocking requests of the "trusted flaggers," who in turn would almost certainly resort to error-prone filtering systems to track down alleged copyright infringements.

On top of that, the Legal Affairs Committee's position on redefining the liability exemption is absurd. Hosting providers should only be able to benefit from the exemption if they behave completely neutrally towards the uploaded content, i.e. do not even intervene in its presentation through search functions or recommendation algorithms. If this position prevails, only pure web hosters would be covered by the liability safe harbor. All modern platforms would be directly liable for infringements by their users – including Wikipedia, GitHub and Dropbox, which were exempted from Article 17 during the copyright reform after loud protests from the Internet community. The Legal Affairs Committee's proposal would simply make it impossible to operate online platforms in the EU.

Disinformation for the sake of the ancillary copyright

Moreover, the ancillary copyright for press publishers, the prime example of patronage politics in copyright law, is once again playing a role in the debate about the Digital Services Act. The Legal Affairs Committee has been receptive to the press publishers' demand for special treatment of press content on social media. The committee demands that major platforms like Facebook no longer be allowed to block the content of press publishers in the future – even if it contains obvious disinformation or violates terms and conditions.

The purpose of this regulation is clear – even if the Legal Affairs Committee claims that it serves to protect freedom of expression and media pluralism, this is yet another attempt to enforce the ancillary copyright. If the use of press articles by platforms is subject to a fee under the ancillary copyright, but at the same time the platforms are prohibited by the Digital Services Act from blocking press articles, then they have no choice but to display the articles and pay for them.

For publishers, this is a license to print money. In their crusade for the ancillary copyright, however, the publishers are ensuring that platforms can no longer counter disinformation as long as it appears in a press publication. The danger of such a regulation is revealed not only by the disclosures about fake news sites used for propaganda by autocratic regimes. A glance at the tabloid press is enough to understand that publication of an article by a press publisher is no guarantee of quality, truth, or even compliance with basic rules of interpersonal conduct.

The European Parliament still has time to live up to its reputation as a guarantor of fundamental rights. But unless the negotiations on the Digital Services Act take another turn, this law threatens to exacerbate the problems with online platforms instead of helping to solve them.

Original text at "heise online": Edit Policy: Digital Services Act entgleist im Europaparlament