Today, the Electronic Frontier Foundation sent the note below to every member of the EU bodies negotiating the final draft of the new Copyright Directive in the "trilogue" meetings.

The note details our grave misgivings about the structural inadequacies and potential for abuse in the late-added and highly controversial Articles 11 and 13, which require paid licenses for links to news sites (Article 11) and the censorship of public communications that match entries in a crowdsourced database of copyrighted works (Article 13).

#

I write today on behalf of the Electronic Frontier Foundation, to raise urgent issues related to Articles 11 and 13 of the upcoming Copyright in the Digital Single Market Directive, currently under discussion in the Trilogues.

The Electronic Frontier Foundation is the leading nonprofit organization defending civil liberties in the digital world. Founded in 1990, EFF champions user privacy, free expression, and innovation through impact litigation, policy analysis, grassroots activism, and technology development. We work to ensure that rights and freedoms are enhanced and protected as our use of technology grows. We are supported by over 37,000 donating members around the world, including around three thousand within the European Union.

We believe that Articles 11 and 13 are ill-considered and should not be EU law, but even stipulating that systems like the ones contemplated by Articles 11 and 13 are desirable, the proposed language in both the Parliament and Council texts contains significant deficiencies that will subvert the articles' stated purpose while endangering the fundamental human rights of Europeans to free expression, due process, and privacy.

It is our hope that the detailed enumeration of these flaws, below, will cause you to reconsider Articles 11 and 13's inclusion in the Directive altogether, but even in the unfortunate event that Articles 11 and 13 appear in the final language presented to the Plenary, we hope that you will take steps to mitigate these risks, which will substantially affect the transposition of the Directive in member states, and its resilience to challenges in the European courts.

#

Article 13: False copyright claims proliferate in the absence of clear evidentiary standards or consequences for inaccurate claims.

Based on EFF's decades-long experience with notice-and-takedown regimes in the United States, and with private copyright filters such as YouTube's ContentID, we know that the low evidentiary standards required for copyright complaints, coupled with the lack of consequences for false copyright claims, create a moral hazard that results in illegitimate acts of censorship arising from both knowing and inadvertent false copyright claims.

For example, rightsholders with access to YouTube's ContentID system systematically overclaim copyrights that they do not own. The workflow of news broadcasters often includes the automatic upload of each night's newscast to copyright filters without any human oversight, despite the fact that newscasts often include audiovisual materials whose copyrights do not belong to the broadcaster: public domain footage, material used under a limitation or exception to copyright, or material licensed from third parties. This carelessness has predictable consequences: others, including bona fide rightsholders, who are entitled to upload the materials claimed by the newscasters are blocked by YouTube, have a copyright strike recorded against them by the system, and can face removal of all of their materials. To pick one example, NASA's own Mars lander footage was broadcast by newscasters who carelessly claimed copyright on the video by dint of having included NASA's livestream in their newscasts, which were then added to the ContentID database of copyrighted works. When NASA itself subsequently tried to upload its footage, YouTube blocked the upload and recorded a strike against NASA.
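The dynamic is simple enough to sketch in code. The following is a deliberately simplified illustration (in Python; every name and data structure is hypothetical and bears no relation to ContentID's actual implementation) of how a filter that ingests bulk claims without any verification ends up blocking a legitimate rightsholder's own upload:

```python
# Hypothetical illustration of an unverified bulk-claim filter.
# This does not reflect any real system's implementation; it only
# models the failure mode described above.

claims = {}   # fingerprint -> claimant who asserted ownership
strikes = {}  # uploader -> number of copyright strikes recorded

def ingest_bulk_claims(claimant, fingerprints):
    """Add every fingerprint in a bulk upload to the claims database.
    Note: there is no check that the claimant actually owns these works."""
    for fp in fingerprints:
        claims[fp] = claimant

def handle_upload(uploader, fingerprint):
    """Block any upload whose fingerprint matches an existing claim,
    and record a strike against the uploader."""
    if fingerprint in claims and claims[fingerprint] != uploader:
        strikes[uploader] = strikes.get(uploader, 0) + 1
        return f"BLOCKED: claimed by {claims[fingerprint]}; strike recorded"
    return "PUBLISHED"

# A broadcaster's nightly newscast includes NASA's public-domain livestream;
# the whole programme is ingested as a claim with no human review.
ingest_bulk_claims("broadcaster", ["newscast-segment", "nasa-mars-livestream"])

# NASA then uploads its own footage and is blocked by the earlier overclaim.
print(handle_upload("NASA", "nasa-mars-livestream"))
# -> BLOCKED: claimed by broadcaster; strike recorded
```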

In other instances, rightsholders neglect the limitations and exceptions to copyright when seeking to remove content. For example, Universal Music Group insisted on removing a video uploaded by one of our clients, Stephanie Lenz, which featured incidental audio of a Prince song in the background. Even during the YouTube appeals process, UMG refused to acknowledge that Ms. Lenz’s incidental inclusion of the music was fair use – though this analysis was eventually confirmed by a US federal judge. Lenz's case took more than ten years to adjudicate, largely due to Universal's intransigence, and elements of the case still linger in the courts.

Finally, the low evidentiary standards for takedown and the lack of penalties for abuse have given rise to utterly predictable abuses. False copyright claims have been used to suppress whistleblower memos detailing flaws in election security, evidence of police brutality, and disputes over scientific publication.

Article 13 contemplates that platforms will create systems to allow for thousands of copyright claims at once, by all comers, without penalty for errors or false claims. This is a recipe for mischief and must be addressed.

#

Article 13 Recommendations

To limit abuse, Article 13 must, at a minimum, require strong proof of identity from those who seek to add works to an online service provider's database of claimed copyrighted works and make ongoing access to Article 13's liability regime contingent on maintaining a clean record regarding false copyright claims.

Rightsholders who wish to make copyright claims to online service providers should have to meet a high identification bar that establishes who they are and where they or their agent for service can be reached. This information should be available to people whose works are removed so that they can seek legal redress if they believe they have been wronged.

In the event that rightsholders repeatedly make false copyright claims, online service providers should be permitted to strike them off their list of trusted claimants, such that these rightsholders must fall back on seeking court orders – with their higher evidentiary standard – to effect removal of materials.

This would require that online service providers be immunised from Article 13's liability regime for claims from struck-off claimants. A rightsholder who abuses the system should not expect to be able to invoke it later to have their rights policed. This striking-off should pierce the veil of third parties deputised to effect takedowns on behalf of rightsholders ("rights enforcement companies"), with both the third party and the rightsholder on whose behalf they act being excluded from Article 13's privileges in the event that they are found to repeatedly abuse the system. Otherwise, bad actors ("copyright trolls") could hop from one rights enforcement company to another, using them as shields for repeated acts of bad-faith censorship.

Online service providers should be able to pre-emptively strike off a rightsholder who has been found by another provider to have abused Article 13.

Statistics about Article 13 takedowns should be a matter of public record: who claimed which copyrights, who was found to have falsely claimed copyright, and how many times each copyright claim was used to remove a work.
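Purely as an illustration of how a provider might combine these requirements (verified identity, strike-off after repeated false claims, piercing through rights-enforcement companies, shared strike-off lists, and a public takedown record), here is a minimal Python sketch. Every class name, field, and threshold is our own assumption rather than anything drawn from the Directive's text:

```python
# Hypothetical sketch of a claimant registry implementing the
# recommendations above at a single online service provider.

FALSE_CLAIM_LIMIT = 3  # assumed threshold; the letter proposes no number

class ClaimantRegistry:
    """Tracks who may use the Article 13 claim system at one provider."""

    def __init__(self, shared_strike_offs=()):
        # Claimants already struck off by other providers can be
        # excluded pre-emptively.
        self.struck_off = set(shared_strike_offs)
        self.false_claims = {}   # party -> count of adjudicated false claims
        self.claimants = {}      # claimant -> agent for service
        self.public_log = []     # public record: (claimant, work, was_false)

    def register(self, claimant, verified_identity, agent_for_service):
        """Admit a claimant only with a verified identity and a reachable
        agent for service, so wronged uploaders can seek redress."""
        if not (verified_identity and agent_for_service):
            raise ValueError("identity and agent for service are required")
        self.claimants[claimant] = agent_for_service

    def may_claim(self, claimant, principal=None):
        """Refuse claims from struck-off parties, including enforcement
        companies acting on behalf of a struck-off rightsholder."""
        return (claimant in self.claimants
                and claimant not in self.struck_off
                and principal not in self.struck_off)

    def record_takedown(self, claimant, work, was_false, principal=None):
        """Log every takedown publicly; strike off repeat false claimants
        and the rightsholders on whose behalf they acted."""
        self.public_log.append((claimant, work, was_false))
        if was_false:
            for party in {claimant, principal} - {None}:
                self.false_claims[party] = self.false_claims.get(party, 0) + 1
                if self.false_claims[party] >= FALSE_CLAIM_LIMIT:
                    self.struck_off.add(party)
```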

#

Article 11: Links are not defined with sufficient granularity, and the Article should contain harmonised limitations and exceptions.

The existing Article 11 language does not define when quotation amounts to a use that must be licensed, though proponents have argued that quoting more than a single word requires a license.

The final text must resolve that ambiguity by carving out a clear safe harbor for users, and must establish a consistent, Europe-wide set of exceptions and limitations to news media's new pseudo-copyright, to ensure that rightsholders do not overreach with this new power.

Additionally, the text should safeguard against dominant players (Google, Facebook, the news giants) creating licensing agreements that exclude everyone else.

News sites should be permitted to opt out of requiring a license for inbound links (so that other services could confidently link to them without fear of being sued), but these opt-outs must be all-or-nothing, applying to all services, so that the law doesn’t add to Google or Facebook's market power by allowing them to negotiate an exclusive exemption from the link tax, while smaller competitors are saddled with license fees.

As part of the current negotiations, the text must be clarified to establish a clear definition of "noncommercial, personal linking," clarifying whether making links in a personal capacity from a for-profit blogging or social media platform requires a license, and establishing that (for example) a personal blog with ads or affiliate links to recoup hosting costs is "noncommercial."

In closing, we would like to reiterate that the flaws enumerated above are merely those elements of Articles 11 and 13 that are incoherent or not fit for purpose. At root, however, Articles 11 and 13 are bad ideas that have no place in the Directive. Instead of effecting piecemeal fixes to the most glaring problems in these Articles, the Trilogue should take a simpler approach and cut them from the Directive altogether.

Thank you,

Cory Doctorow
Special Consultant to the Electronic Frontier Foundation
