After the defeat of SOPA/PIPA, Big Content has mostly focused on quiet, backroom deals for copyright legislation, like the unconstitutional CASE Act, which was so unpopular it had to be slipped into a must-pass bill in the dead of winter. But now, almost exactly a decade later, they’ve come screaming out of the gate with a proposal almost as bad as SOPA/PIPA. Let’s hope it doesn’t take an Internet Blackout to kill it this time.

The new proposal, cynically titled the SMART Copyright Act, gives the Library of Congress, in “consultation” with other government agencies, the authority to designate “technical measures” that internet services must use to address copyright infringement. In other words, it gives the Copyright Office the power to set the rules for internet technology and services, with precious little opportunity for appeal.

First, a little background: One of the conditions of the safe harbors from copyright liability included in the Digital Millennium Copyright Act—safe harbors that are essential to the survival of all kinds of intermediaries and platforms, from a knitting website to your ISP—is that the provider must accommodate any “standard technical measures” for policing online infringement. Congress sensibly required that any such measures be “developed pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process.” As a practical matter, no such broad consensus has ever emerged, nor even a “multi-industry standards process” to develop it. There are many reasons why; one of the biggest is that the number and variety of both service providers and copyright owners has exploded since 1998. These industries and owners have wildly varying structures, technologies, and interests. What has emerged instead are privately developed automated filters, usually deployed at the platform level. And some influential copyright owners want to see those technologies become a legal requirement at all levels.

This legislation seeks to accomplish that by setting up a new process that jettisons the whole notion of consensus and fair process. Instead, it puts the Librarian of Congress in charge of designating technical measures—and requires virtually every service provider to comply with them.

This bill cannot be fixed. Let us count the ways:

Tech Mandates Will Inevitably Stifle Lawful Expression

For decades, Big Tech has struggled to appease Big Content by implementing a technical measure they love: filters. The best-known example, YouTube’s Content ID system, works by having copyright holders upload their content into a database maintained by YouTube. New uploads are compared against what’s in the database, and when the algorithm detects a match, the system applies the default rule chosen by the copyright holder, such as taking the upload down or monetizing it (with the revenue flowing to the copyright holder). Claimants can also, after being informed of a match, send a DMCA notice, putting the creator in danger of losing their account.
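To make the mechanics concrete, here is a minimal sketch of the general fingerprint-and-match pattern such filters follow. This is a toy illustration, not YouTube’s actual algorithm: the chunk size, hashing scheme, threshold, and all names (`ReferenceDatabase`, `check_upload`, etc.) are hypothetical. The point it demonstrates is structural: the claimant’s pre-selected default action is applied automatically on a match, with no human review in the loop.

```python
import hashlib

CHUNK = 4  # bytes per chunk in this toy example


def fingerprint(data: bytes) -> set:
    """Hash fixed-size chunks of the content into a set of digests."""
    return {
        hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        for i in range(0, len(data) - CHUNK + 1, CHUNK)
    }


class ReferenceDatabase:
    """Claimed works, each stored with the claimant's default action."""

    def __init__(self):
        self.claims = []  # list of (fingerprint, default_action)

    def register(self, content: bytes, default_action: str):
        self.claims.append((fingerprint(content), default_action))

    def check_upload(self, upload: bytes, threshold: float = 0.5):
        """Return the claimant's default action on a match, else None.

        Whatever action was pre-selected ("block", "monetize", ...) is
        applied automatically -- no human ever looks at the upload.
        """
        fp = fingerprint(upload)
        for ref_fp, action in self.claims:
            if ref_fp and len(fp & ref_fp) / len(ref_fp) >= threshold:
                return action
        return None


db = ReferenceDatabase()
db.register(b"some claimed audio content here!", "monetize")

# An upload containing the claimed content triggers the default action;
# unrelated material passes through.
hit = db.check_upload(b"some claimed audio content here! plus commentary")
miss = db.check_upload(b"totally original material")
```

Note that nothing in `check_upload` asks whether the matching use is licensed, a quotation, or fair use: the filter sees only digest overlap, which is exactly the limitation the examples below illustrate.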

Despite more than a decade of tinkering, and more than $100 million in sunk costs, the system fails regularly. In 2015, for example, Sebastien Tomczak uploaded a ten-hour video of white noise. A few years later, as a result of YouTube’s Content ID system, a series of copyright claims was made against Tomczak’s video. Five different claims were filed on sound that Tomczak created himself. Although the claimants didn’t force Tomczak’s video to be taken down, they all opted to monetize it. In other words, ads were put on the video without Tomczak’s consent, and the ten-hour video would then generate revenue for those claiming copyright in the static. In 2020, CBS found that its own Comic-Con panel had been blocked. YouTube creators report avoiding any use of music in their videos, no matter how clearly lawful, for fear of copyright flags.

Things are no better on Facebook. For example, because filters cannot tell the difference between two different performances of the same public domain work, a copyright holder’s claim to a particular version of a work can block many other performances. As a result, as one headline put it, “Copyright bots and classical musicians are fighting online. The bots are winning.”

Third-party tools can be even more flawed. For example, a “content protection service” called Topple Track sent a slew of abusive takedown notices to have sites wrongly removed from Google search results. Topple Track boasted that it was “one of the leading Google Trusted Copyright Program members.” In practice, Topple Track’s algorithms were so out of control that it sent improper notices targeting an EFF case page, the authorized music stores of both Beyonce and Bruno Mars, and a New Yorker article about patriotic songs. Topple Track even sent an improper notice targeting an article by a member of the European Parliament that was about improper automated copyright notices.

The core problem is this: distinguishing lawful from unlawful uses usually requires context. For example, the “amount and substantiality” factor in fair use analysis depends on the purpose of the use: a fair use may be a few seconds, as in music criticism, or the whole piece, as in a music parody. Humans can flag these differences; automated systems cannot.
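A toy sketch makes the point concrete. Assume, hypothetically, a filter whose only signal is how much of the reference work appears in the upload (the `match_ratio` function and both sample strings below are invented for illustration). A short critical quotation and a full-length parody are both plausibly fair uses, yet the parody scores higher on the filter’s only metric, because the fact that makes it lawful, its transformative purpose, is context the filter never sees.

```python
def match_ratio(upload: str, reference: str) -> float:
    """Fraction of the reference's words that appear verbatim in the upload.

    This is the only signal a context-blind filter has: it measures
    quantity of overlap, never purpose.
    """
    upload_words = upload.split()
    matched = sum(1 for word in reference.split() if word in upload_words)
    return matched / len(reference.split())


reference = "happy birthday to you happy birthday to you"

# A few words quoted for commentary -- plausibly fair use.
criticism = "a few words: happy birthday"

# The whole piece reused in a parody -- also plausibly fair use.
parody = "happy birthday to you happy birthday to you (parody lyrics follow)"

criticism_score = match_ratio(criticism, reference)  # partial overlap
parody_score = match_ratio(parody, reference)        # complete overlap
```

Any threshold low enough to catch real infringement will also catch the parody, and the filter has no way to tell them apart; that judgment requires the human context the statute’s four-factor analysis assumes.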

Tech Mandates Will Stifle Competition

Any requirement to implement filtering or another technical measure would distort the market for internet services by privileging those service providers with sufficient resources to develop and/or implement costly filtering systems, reducing investment in new services, and impairing incentives to innovate.

In fact, the largest tech companies have likely already implemented any mandated technical measures, or something close to them, so the burden of this mandate will fall mainly on small and medium-sized services. If the price of hosting or transmitting content is building and maintaining a copyright filter, investors will find better ways to spend their money, and the current tech giants will stay comfortably entrenched.

Tech Mandates Put Your Security and Privacy at Risk

Virtually any tech mandate will raise security and privacy concerns. For example, when DNS filtering was proposed a decade ago as part of SOPA/PIPA, security researchers raised the alarm, explaining that the costs would far outweigh the benefits. And as 83 prominent internet inventors and engineers explained in connection with the site-blocking and other measures proposed in those ill-fated bills, any measures that interfere with internet infrastructure will inevitably cause network errors and security problems. This is true in China, Iran, and other countries that censor the network today; it will be just as true of American censorship. It is also true regardless of whether censorship is implemented via the DNS, proxies, firewalls, or any other method. Network errors and insecurity that we wrestle with today will become more widespread and will affect sites other than those blacklisted by the American government.

The desires of some copyright holders to offload responsibility for stopping online infringement to service providers large and small must give way to the profound public interest in a robust, reliable, and open internet.

Tech Mandates Give the Library of Congress a Veto Right on Innovation—a Right It Is Manifestly Ill-Equipped to Exercise

Bill proponents apparently hope to mitigate at least some of these harms through the designation process itself, which is supposed to include consideration of the various public interests in play, as well as any effects on competition, privacy, and security. Recognizing that the Librarian of Congress is unlikely to have the necessary expertise to evaluate those effects, the bill requires the Librarian to consult with other government agencies that do.

There are at least two fundamental problems here. First, at best this means a group of well-meaning D.C. bureaucrats get to dictate how we build and use technology, informed primarily by those who can afford to submit evidence and expertise. Startups, small businesses, independent creators, and ordinary users, all of whom will be affected, are unlikely to know about the process, much less have a voice in it.

Second—and this is perhaps the most cynical aspect of the entire proposal—it is modeled on the Section 1201 exemption process the Library already conducts every three years. Anyone who has actually participated in that process can tell you it has been broken from the start.

Section 1201 of the DMCA makes it illegal to “circumvent” digital locks that control access to copyrighted works and to make and sell devices that break digital locks. Realizing that the law might inhibit lawful fair uses, the statute authorizes the Library of Congress to hold a triennial rulemaking process to identify and grant exemptions for such uses. The supposed “safety valve” is anything but. Instead, it creates a burdensome and expensive speech-licensing regime that has no binding standards, does not move at the speed of innovation, and functions at all only because of the work of clinical students and public interest organizations, all of whom could put that energy to better causes than going hat in hand to the Copyright Office every three years.

What is worse, while the 1201 exemptions for lawful expression expire if they are not renewed, once adopted the tech mandates will be permanent until they are successfully challenged. In other words, the barriers to protecting fair use are higher than the barriers to impeding it.

Worse still, the Library of Congress will now be in charge of both designating technical mandates and designating when and how it’s OK to break them for fair use purposes. That is a terrifying power—and one far too great to put in the hands of a bunch of D.C. lawyers, no matter how well-meaning. It’s worth remembering that the Copyright Office didn’t grant a single meaningful exemption to Section 1201 for the first six years of that law’s operation. What innovative new services, and which potential challengers to today’s tech giants, could we lose in six years?

Remaking the internet to serve the entertainment industry was a bad idea ten years ago and it’s a bad idea today. This dangerous bill is a nonstarter.

Take Action

Tell your senators to oppose The Filter Mandate

Related Issues