Today EFF and 56 other civil society organizations have sent an open letter [PDF] to European lawmakers outlining our grave concerns with Article 13 of the proposed new Directive on Copyright in the Digital Single Market, which would impose a new responsibility on Internet platforms to filter content that their users upload. The letter explains:

Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business. ...

Article 13 would force these companies to actively monitor their users' content, which contradicts the "no general obligation to monitor" rules in the Electronic Commerce Directive. The requirement to install a system for filtering electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of expression, such as to receive or impart information, on the other.

European Commission Foreshadows More Of the Same To Come

Article 13 is bad enough as a copyright filtering mandate. But what makes the proposal even more alarming is that it won't stop there. If we lose the battle against the use of upload filters for copyright, we'll soon see a push for a similar mandate on platforms to filter other types of content, beginning with ill-defined "hate speech" and terrorist content, and ending who knows where. Evidence for this comes in the form of a Communication on Tackling Illegal Content Online, released last month. The Communication states:

Online platforms should do their utmost to proactively detect, identify and remove illegal content online. The Commission strongly encourages online platforms to use voluntary, proactive measures aimed at the detection and removal of illegal content and to step up cooperation and investment in, and use of, automatic detection technologies.

The Communication also talks up the possibility of "so-called 'trusted flaggers', as specialised entities with specific expertise in identifying illegal content," being given special privileges to initiate content removal. However, we already have bodies that have expertise in identifying illegal content. They're called courts. As analyses of the Communication by European Digital Rights (EDRi), the Center for Democracy and Technology (CDT), and Intellectual Property Watch point out, shifting the burden of ruling on the legality of content from courts onto private platforms and their "trusted flaggers" will inevitably result in over-removal by those platforms of content that a court would have found to be lawful speech.

The Communication clearly foreshadows future legislative measures as soon as 2018 if no significant progress is made by the platforms in rolling out automated filtering and trusted flagging procedures on a "voluntary" basis. This means that the Communication, although expressed to be non-binding, is not really "voluntary" at all, but rather a form of undemocratic Shadow Regulation by the unelected European Commission. And the passage of the upload filtering mandate in the Digital Single Market Directive would be all the encouragement needed for the Commission to press forward with its broader legislative agenda.

The Link Tax Paid To Publishers ... That Publishers Don't Want

The upload filtering mandate in Article 13 isn't the only provision of the proposed Directive that concerns us. Another provision of concern, Article 11, would impose a new "link tax" payable to news publishers on websites that publish small snippets of news articles to contextualize links to those articles. Since we last wrote about this, an interesting new report has come out providing evidence that European publishers—who are the supposed beneficiaries of the link tax—actually oppose it. The report also states:

[T]here is little evidence that the decline in newspaper revenues has anything to do with the activities of news aggregators or search engines (that appear as the primary targets of the new right). In fact, it is widely recognised that there are two reasons for the decline in newspaper revenues: changes in advertising practice associated with the Internet (but not especially related to digital use of news material on the Internet); and the decline in subscriptions, which may be in part related to the decision of press publishers to make their products available on the Internet. These are simply changes in the newspaper market that have little, if anything, to do with the supposed "unethical" free riding of other internet operators.

The European Parliament's Civil Liberties (LIBE) Committee is due to vote on its opinion on the Digital Single Market proposals this Thursday, 19 October. Although it's not the final vote on these measures, it could be the most decisive one, since a recommendation for deletion of Article 11 and Article 13 at the LIBE committee would be influential in convincing the lead committee (the Legal Affairs or JURI Committee) to follow suit.

Digital rights group OpenMedia has provided a click-to-call tool that you can use, available in English, French, German, Spanish, and Polish, to express your opposition to the upload filtering mandate and the link tax. If you are European or have European friends or colleagues, please do take this opportunity to speak out and oppose these proposals, which could change the Internet as we know it in harmful and unnecessary ways.