Copyright's robot wars have burst onto the scene of streaming video sites, silencing live feeds with bogus infringement accusations and no human oversight. Two examples from just the past week show the danger that lies ahead if copyright enforcement is left to bots alone, and sit alongside last month's Mars lander takedown as embarrassing results of the unchecked and lopsided “algorithmic copyright cops” that are becoming increasingly common online.
On Sunday, the live Ustream feed of the annual Hugo science-fiction award ceremonies was cut off in mid-stream after airing clips from nominated TV programs, including Doctor Who and Community. These clips were provided by the studios behind the programs, and would have been a clear fair use even without that explicit permission. But still, the stream went down and didn't come back.
Then on Tuesday, just after the speeches at the Democratic National Convention had concluded, YouTube showed a copyright error message on the stream, rendering the prominently embedded video temporarily unplayable. According to a YouTube spokesman, the message was the result of an "incorrect error message on the page," and did not interfere with the live-stream during the speech. Nevertheless, this highlights the potential danger of bots: at one of the most prominent political events of the presidential campaign season, an error occurred with all the hallmarks of a copyright takedown. We have asked YouTube for more information on why the error text had copyright messaging.
In the case of the Hugos, this wasn't a bot running off the rails. In fact, the system was working exactly as expected. Vobile, the third-party copyright filtering system used by Ustream, identified a matching clip from its database, and—lacking the context that any human oversight could have provided about fair use and licenses—decided the stream’s fate in a microsecond: termination. And because these bots generally operate outside of the Digital Millennium Copyright Act (DMCA), there is little accountability or opportunity for the uploader to remedy the situation.
Most copyright takedowns on the web are handled under the "notice-and-takedown" procedure set out by the DMCA, which provides a legal "safe harbor" from liability for service providers that comply. While the DMCA is far from perfect, and is itself subject to abuse, it still requires a human to swear under penalty of perjury that there is infringement, and allows for the material to return after a counter-notice.
But the automated copyright filters in use by Ustream, YouTube, and others go beyond the requirements of the DMCA and thus operate outside of it. As a result, users are left without the standard appeals process, and have only the recourse provided by the video platform. YouTube has an appeals system built into its Content ID system, but it comes with a host of its own problems. For its part, Ustream realized the error during the program but was, by its own admission, unable to lift the block in time to restart the stream. Ustream has since apologized and promised to “ensure fair use of copyright as permitted by the law.”
Following fair use principles is a laudable goal, but hard to accomplish with a bot. To be sure, a human can review video and readily determine most cases of fair use. Indeed, the DMCA process requires the reviewer to consider fair use. No reasonable human reviewer watching the Hugo Awards stream and seeing demonstrative clips of nominated shows would think to cut off the feed.
However, there is no copyright enforcement bot that is programmed to assess fair use. The algorithms look for matches of audio or video, and are designed to resist circumvention attempts by finding inexact matches. While no bot will handle these judgments well, programmers should at least follow the advice we gave in 2010:
insisting that the audio and video tracks both come from the same copyrighted work and that the entire (or almost entire) video is drawn from the same copyrighted work. Unless these conditions are met, "block" should not be an option available to copyright owners.
Requiring human intervention for any other kind of match would protect against most fair use takedowns. The Mars Curiosity problem may be trickier, because the news video matched the original NASA video exactly, but it can be avoided if content owners are careful to claim only content they originated.
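The rule described above can be sketched in code. This is a minimal, hypothetical illustration of the decision logic — block only when both the audio and video tracks match the same work and nearly the entire upload is drawn from it, and route everything else to a human reviewer. The class, function, and threshold here are illustrative assumptions, not any vendor's actual matching API.

```python
# Hypothetical sketch of the blocking rule: "block" is available only for
# near-total, same-work audio+video matches; any other match goes to a
# human reviewer instead of being taken down automatically.
from dataclasses import dataclass

@dataclass
class Match:
    work_id: str        # identifier of the copyrighted work matched
    audio_match: bool   # the audio track matched this work
    video_match: bool   # the video track matched this work
    coverage: float     # fraction of the upload drawn from this work

def enforcement_action(matches: list[Match], coverage_threshold: float = 0.9) -> str:
    """Return 'block' only for near-total, same-work matches on both
    tracks; otherwise 'review' (human intervention) or 'allow'."""
    if not matches:
        return "allow"
    for m in matches:
        if m.audio_match and m.video_match and m.coverage >= coverage_threshold:
            return "block"
    return "review"

# A Hugo Awards-style case: short demonstrative clips, low coverage —
# this rule would send it to a human rather than kill the stream.
print(enforcement_action([Match("doctor-who", True, True, 0.05)]))
```

Under this sketch, the Hugo stream's brief clips would never reach the automatic-block branch, while a full rebroadcast of a matched work still could.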