One of the less-heralded issues in a series of prominent cases (here, here, and here, for example) testing the limits of the Digital Millennium Copyright Act ("DMCA") safe harbor provisions is the question of when and how service providers must terminate the accounts of "repeat infringers." As a condition of safe harbor eligibility, the DMCA requires that service providers "adopt and reasonably implement" a repeat infringer policy that provides for termination of users' accounts "in appropriate circumstances." But what does this requirement mean? What does it take to "adopt and reasonably implement" such a policy? Who are "repeat infringers"? What do service providers need to do to comply with the law and protect their users' rights to post lawful content?

The right answers to these questions are crucial, because while termination may be needed to punish (or at least impede) large-scale infringers, improper termination can have drastic consequences for legitimate users.

Consider, for example, the effect of YouTube's termination policy on the animal-rights advocates Showing Animals Respect and Kindness (SHARK). SHARK videotapes rodeos in order to expose animal abuse, injuries, and deaths, and it had posted more than two dozen videos to YouTube to publicize that mistreatment. In December 2008, the Professional Rodeo Cowboys Association (PRCA) filed baseless DMCA takedown demands for 13 of the videos. YouTube promptly removed the videos and, following its policy, canceled SHARK's entire YouTube account, removing all of SHARK's uploaded videos from the site and leaving SHARK unable to post new ones. SHARK counter-noticed and the account was restored – but not before SHARK had been silenced for weeks in the middle of the end-of-year fundraising season.

To avoid similar events, service providers who care about free speech and their customers should consider the following as they develop their policies.

What Does the Law Require?

Courts interpreting the DMCA’s safe harbor provision have been careful to limit the burdens on service providers. For example, a service provider does not have to actively monitor posted content, nor must it "act on or address difficult infringement issues" or even determine on its own whether any given content is legal. Also, a service provider need not terminate any user’s account unless it has "actual knowledge" of multiple instances of infringing content, which only occurs when the provider receives a DMCA-compliant notification.

That said, service providers do have to maintain at least a "working notification system" and "a procedure for dealing with DMCA-compliant notifications," and they must "not actively prevent copyright owners from collecting information needed to issue such notifications."
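
What such a system looks like in practice is up to each provider, but the core check is simple enough to sketch. The Python snippet below is purely illustrative (the field names and the handle_notice routing are our invention, not anything a court or the statute prescribes); it merely verifies that a takedown notice supplies the elements required by 17 U.S.C. § 512(c)(3) before any content comes down.

    # Illustrative sketch only: field names are hypothetical, but the checklist
    # tracks the elements a takedown notice must contain under 17 U.S.C. § 512(c)(3).
    REQUIRED_NOTICE_ELEMENTS = [
        "signature",             # physical or electronic signature of an authorized person
        "work_identified",       # identification of the copyrighted work claimed infringed
        "material_identified",   # identification and location of the allegedly infringing material
        "contact_information",   # address, phone number, or email of the complaining party
        "good_faith_statement",  # good-faith belief that the use is not authorized
        "accuracy_statement",    # accuracy of the notice and, under penalty of perjury,
                                 # authority to act for the copyright owner
    ]

    def is_dmca_compliant(notice: dict) -> bool:
        """Return True only if the notice supplies every statutory element."""
        return all(notice.get(element) for element in REQUIRED_NOTICE_ELEMENTS)

    def handle_notice(notice: dict) -> str:
        """Route a takedown notice: act only on compliant notices."""
        if not is_dmca_compliant(notice):
            # A deficient notice does not, by itself, create "actual knowledge."
            return "request_corrected_notice"
        return "remove_or_disable_content_and_notify_user"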

Specifically, courts have found that service providers do not comply with the DMCA’s repeat infringer policy requirement when they:

Beyond the above, courts have so far provided little guidance. However, we know Congress enacted the DMCA in order to foster the growth of the Internet as a forum for speech and commerce. We also know that Congress understood that not all alleged infringements are equal: when it passed the DMCA, Congress stated "that there are different degrees of on-line infringement, from the inadvertent and noncommercial, to the willful and commercial." Keeping this in mind, along with the goal of combating online infringement, service providers should seek to implement policies that balance the interests of providers, customers, and content owners.

A Fair Repeat Infringer Policy

In order to protect their customers’ rights, service providers should avoid knee-jerk and over-simplified policies such as “three strikes (takedown notices) and you’re out.” Instead, before shutting down an account, service providers should do the following (a rough sketch of this workflow in code appears after the list):

  • Notice: Promptly notify the user or poster about the report(s) lodged against him, including information on how to challenge the notice and how to contact the content owner;
  • Information: Allow users – usually lay people without easy access to legal counsel – a fair opportunity to counter-notice. For example, service providers should offer clear information on counter-notice procedures, such as explicit instructions, easy-to-locate email addresses, and/or user-friendly web forms;
  • No Instant Termination: Upon receipt of a notice of infringement that could trigger termination of a user's account, notify the user and provide her with a meaningful opportunity to counter-notice (at least ten business days) before terminating her account (remember, specific content may still be removed in the meantime). If a counter-notice is received, the “strike” should be removed immediately, unless and until the content is found to be infringing; and
  • Trust: Create a system of additional protections for “trusted” users, such as users who have not posted any infringing material for a specified amount of time, or for whom noticed content represents only a small percentage of their total posts. Such protections could include additional "strikes" before termination (say, five rather than three) and a fast-tracked appeal procedure, such as a dedicated email address through which a user could request immediate review, by the service provider and the content owner, of content the user reasonably believes was taken down improperly. (Keep in mind that if a service provider does not believe content is infringing, it does not need the safe harbor.) This would provide some recourse when, for example, a political video is taken down two weeks before an election. In addition, trusted users could be provided with a longer window in which to send a counter-notice to prevent termination of their accounts.
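
Taken together, these steps amount to straightforward bookkeeping: record each strike, hold it open through the counter-notice window, drop it if a counter-notice arrives, and count only the strikes that survive against a threshold that is higher for trusted users. The Python sketch below is one hypothetical way to model that logic; the class and field names are invented for illustration, and the 14-day constant is only a rough stand-in for the ten business days suggested above.

    from dataclasses import dataclass, field
    from datetime import date, timedelta

    # Hypothetical sketch of the policy above, not a drop-in implementation.
    # The thresholds (three strikes, five for trusted users, roughly ten
    # business days to counter-notice) mirror this post's suggestions.

    COUNTER_NOTICE_WINDOW = timedelta(days=14)  # rough stand-in for ten business days

    @dataclass
    class Strike:
        received: date
        countered: bool = False  # a counter-notice removes the strike from the count

    @dataclass
    class Account:
        trusted: bool = False    # e.g., long history with few or no notices
        strikes: list = field(default_factory=list)

        def strike_limit(self) -> int:
            # Trusted users get extra strikes before termination is considered.
            return 5 if self.trusted else 3

        def active_strikes(self, today: date) -> int:
            # Count only strikes that were never countered and whose
            # counter-notice window has already closed.
            return sum(
                1 for s in self.strikes
                if not s.countered and today - s.received >= COUNTER_NOTICE_WINDOW
            )

        def should_terminate(self, today: date) -> bool:
            # No instant termination: a strike does not count against the user
            # until the counter-notice window has run out.
            return self.active_strikes(today) >= self.strike_limit()

Under this model, three uncountered strikes that have outlived the window would put an ordinary account up for termination review, while the same strikes on a trusted account would not.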

These steps won’t eliminate the problem of takedown abuse, but they should help service providers do their part to protect their customers while protecting themselves.
