Major entertainment companies are once again trying to expand copyright law to gain leverage over a wide variety of user-generated content sites. If they succeed, they would have a veto over Internet users’ access to the tools that allow us to remix, mash up, and participate in popular culture. EFF, along with the Center for Democracy and Technology and Public Knowledge, filed an amicus brief in the case of BWP Media v. Polyvore, urging the United States Court of Appeals for the Second Circuit to defend copyright law’s important protections for user-generated content platforms.

Polyvore’s platform lets its users collect images from other sites, then manipulate those images and remix them into new collages. The popular site has millions of users from around the globe and views its mission as “democratizing fashion.” In 2010, when the New Yorker covered the site, Polyvore boasted “1.4 million registered users, two hundred thousand of whom are…dedicated ‘creators.’”

For many of us, choosing our clothes is an act of self-expression. Whether we’re donning an EFF T-shirt or emulating our idols, the way we choose and mix our wardrobe expresses our aesthetic tastes and reflects how we want others to see us. Playing with, collaging, and remixing images of fashion and design objects give us a way to experiment with different possibilities and to create new aesthetic stories. By giving users the tools to create their own fashion and design collages, Polyvore lets us ditch the dictates of the fashion industry in favor of DIY assemblages we and our peers create ourselves.

The site has also become a popular resource for “casual cosplay,” supporting an entire community of users who create and share creative interpretations of their favorite characters’ costumes using everyday clothing and objects.

Big media gatekeepers have a long history of trying to use copyright law to maintain control over technology that enables ordinary people to take popular culture into their own hands. In the 1980s it was home video recording; in the 2000s, remote video storage. Since then, the battle has been waged over various forms of user-generated content platforms and search tools. BWP’s suit against Polyvore (with the Copyright Alliance and the MPAA weighing in to support it and assert their own pet theories) is just the latest effort to strip user-generated content sites of their safe harbors under Section 512 of the DMCA, and to hold those platforms responsible when any of their users infringes a copyright.

Without the safe harbors of Section 512, most online speech platforms could not exist. Even the big players that can afford to pay off the media cartels would face swarms of lawsuits from independent rightsholders seeking to cash in on the high cost of defending against a copyright claim and on the extraordinary statutory damages that copyright cases can yield, far beyond any measure of actual market harm.

That’s not to say that rightsholders with legitimate claims are left without a remedy. The law is clear: when a user is the one responsible for causing an online platform to copy infringing material, the user is liable, not the entity that created the tool. Plus, if a toolmaker goes so far as to make a tool that can only be used in an infringing way, or actively induces people to infringe, then it, too, could be liable under doctrines of “secondary liability” for copyright infringement.

The safe harbor of Section 512 gives online platforms a clear roadmap for avoiding potential liability, reducing the chilling effect of copyright threats that would otherwise demand a complex and costly defense.

But BWP also has a novel (and incorrect) theory to undermine Section 512 itself. The safe harbor requires that platforms comply with “standard technical measures” for managing copyrighted works and sets forth a process for establishing a technology as such a standard technical measure. Although no technology has ever gone through that process – something the Copyright Office recently recognized – BWP argues that metadata associated with digital images should qualify.

Not only does BWP’s metadata argument get the law wrong (the DMCA has pretty strict requirements for what counts as a “standard technical measure”), but requiring platforms to retain metadata would also be disastrous for user privacy. Photo metadata often contains private information that you might not want shared publicly – like your location, the time you were there, and the identity of the device that took the photo. Because of this, online safety guides often recommend that users be cautious about sharing photos with metadata, and some service providers opt to strip metadata from users’ uploaded photos to protect them from stalking, burglary, doxxing, or retaliation for political activities.
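
To make the privacy stakes concrete, here’s a minimal Python sketch – using the Pillow imaging library, with hypothetical file names – of both halves of the issue: listing the EXIF tags a camera embeds in a photo, and the kind of metadata-stripping step a platform might apply before publishing an upload.

```python
from PIL import Image, ExifTags

def show_metadata(path):
    """List the EXIF tags embedded in a photo -- these often include
    the capture time, the camera model, and (via GPSInfo) the location."""
    with Image.open(path) as img:
        for tag_id, value in img.getexif().items():
            name = ExifTags.TAGS.get(tag_id, tag_id)
            print(f"{name}: {value}")

def strip_metadata(src_path, dst_path):
    """Re-save a photo with only its pixel data, leaving EXIF behind --
    the kind of step some platforms take to protect uploaders."""
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copy pixels, not metadata
        clean.save(dst_path)

# Hypothetical file names, for illustration only.
show_metadata("upload.jpg")
strip_metadata("upload.jpg", "upload_clean.jpg")
```

If a court were to treat that embedded metadata as a “standard technical measure,” platforms could lose their safe harbor for performing exactly this kind of protective stripping.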

Fortunately, legal protections for online speech platforms are well-established, and their benefits for user expression and technological innovation have long been clear. Unfortunately, these legal protections continue to be targets for big content aggregators whose primary innovation is coming up with far-fetched legal theories to shake those platforms down. EFF has been there every step of the way, and will continue to protect the platforms we all use to communicate and express ourselves.