The House Judiciary Committee heard testimony Thursday on a law that underpins the Internet as we know it today: the copyright notice-and-takedown system and the safe harbor for service providers that comes with it, set up in Section 512 of the Digital Millennium Copyright Act. This is the committee's eighth hearing in a series reviewing various aspects of copyright law in anticipation of a possible revision that's been dubbed "The Next Great Copyright Act."

As has become standard, this hearing brought together a handful of stakeholders with very different relationships to the law—two academics, a composer and a company that send takedown notices, a service provider that receives them, and Google—to air varying perspectives in the hope of reaching a balanced conclusion. While these hearings sometimes seem formulaic, what the public usually gets is a smattering of different takes on the law: some good, some bad, and some ugly.

First the good: several of the panelists, along with committee chairman Rep. Bob Goodlatte, recognized that abuse of DMCA takedown notices is a real and serious problem. One of the witnesses, Automattic's Paul Sieminski, outlined the issue and noted that, in the absence of enforceable penalties, copyright takedown notices can be used to silence legitimate speech. From his testimony:

The DMCA gives copyright holders a powerful and easy-to-use weapon: the unilateral right to issue a takedown notice that a website operator (like Automattic [the company behind WordPress]) must honor or risk legal liability. The system works so long as copyright holders use this power in good faith. But too often they don't, and there should be clear legal consequences for those who choose to abuse the system.

Automattic backs up that argument with legal action, too. It has gone to court under Section 512(f) to defend two of its bloggers from bogus takedown notices. Cases under that section are rare, in part because litigation is expensive and time-consuming, and because the stakes of losing a copyright suit can include enormous financial penalties. Automattic's are two of the highest-profile suits under that section since we began representing Stephanie Lenz in Lenz v. Universal, the long-running "dancing baby" case.

Then there's the bad. Years after we first saw members of Congress complain about "rogue" websites in their Google results during the Stop Online Piracy Act (SOPA) hearing, some are still trying that same ploy. Some legislators even referred to the problem of foreign rogue websites in particular, as if SOPA’s legislative meltdown had never occurred.

Google's representative at the hearing, Senior Copyright Policy Counsel Katherine Oyama, was clear that the hypothetical search terms that Congress members suggested are actually used relatively infrequently—but more importantly, that demoting certain search results is useless unless there are other matching search results to elevate in their place. In other words, if people are actually searching for a movie title plus the word "free"—and again, Google's records show they generally are not—the studio can already take advantage of its prominent search engine placement with a page explaining what free options are available.

That's good enough for Google, but we'll go one further: private voluntary agreements (for example, between Google and film studios) run the risk of chilling or limiting lawful speech, often without meaningful oversight and accountability. That was on display with the ridiculous suggestion during the hearing that Google should treat searches that include "free" or "watch" differently than other searches, as if the company should be eliminating common English words to appease a particular industry.

Finally, some proposals from witnesses and Congress members risked wiping out fair use in certain contexts altogether. One witness, Professor Sean O'Connor of the University of Washington School of Law, advocated expanding the current "notice-and-takedown" system into "notice-and-stay-down," under which a single accusation could persist as a permanent filter and prevent future uploads, lawful or not, from ever occurring.

Not only does that idea raise the stakes of false or overreaching accusations, but it overlooks the fact that uploads that are infringing in some contexts might be perfectly lawful fair uses in others. Rightsholders may like the idea of a permanent and automatic veto power on all future uploads, but it's not one granted by copyright law in the U.S.

Incidentally, this exact issue has recently been litigated in Spain, where a court held in January that YouTube cannot be forced to prevent future instances of infringing content from being uploaded.

Section 512 has its flaws, and parts of it are certainly due for review, but the safe harbor it describes has been essential for a generation of sites that embrace user-generated content to take root and grow. As Congress continues to conduct this review, it should look to the actual benefits of the safe harbor—and the actual costs of takedown abuse.