On May 20, 2015, the U.S. Department of Commerce's Bureau of Industry and Security (BIS) published its proposed implementation of the December 2013 changes to the Wassenaar Arrangement. What follows is a long post, as we're quite troubled by the BIS proposal. In short, we're going to be submitting formal comments in response, and you should too.
What is the Wassenaar Arrangement?
The Wassenaar Arrangement is a multi-national agreement intended to control the export of certain "dual-use" technologies. It's a voluntary agreement among 41 participating states that mostly regulates the export of guns, other conventional weapons (such as landmines), and related sensitive goods (such as fissile material). In December 2013, the list of controlled technologies was amended to include surveillance systems for the first time, in response to reports linking exports of Western surveillance technologies to human rights abuses in countries such as Bahrain, the UAE, Turkmenistan, and Libya.
The Wassenaar Arrangement isn't law on its own; it's not even a treaty. Its effectiveness is dependent on each participating state's individual implementation of the export controls it contains. The European Union's member states implemented the new rules over the course of 2014, with the rules going into effect at the start of this year. The United States, also a participant in Wassenaar, started work on its implementation in 2014, with a call for comments that closed in October. EFF signed on to comments written by Dartmouth Professor Sergey Bratus voicing concerns about the rules' broad scope and their potential effect on security research. Apparently, the Commerce Department didn't fully digest the comments it received last year, as the rules it proposed this month are a disaster.
The Wassenaar Arrangement includes controls for technology connected to "intrusion software." Under Wassenaar, "intrusion software" is defined as:
"Software" specially designed or modified to avoid detection by 'monitoring tools', or to defeat 'protective countermeasures', of a computer or network capable device, and performing any of the following:
a. The extraction of data or information, from a computer or network capable device, or the modification of system or user data; or
b. The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
Only the following categories are actually subject to export control:
4.A.5. Systems, equipment, and components therefor, specially designed or modified for the generation, operation or delivery of, or communication with, "intrusion software".
4.D.4. "Software" specially designed or modified for the generation, operation or delivery of, or communication with, "intrusion software".
4.E.1.c. "Technology" for the "development" of "intrusion software".
4.D.1.a. "Software" specially designed or modified for the "development" or "production" of equipment or "software" specified by 4.A. or 4.D.
4.E.1. "Technology" according to the General Technology Note, for the "development", "production" or "use" of equipment or "software" specified by 4.A. or 4.D.
Wassenaar provides a further narrowing of its definitions by including a number of exceptions designed to protect security research. These can be found in the "General Software" and "General Technology" notes. Notably, the controls are not intended to apply to software or technology that is generally available to the public, in the public domain, or part of basic scientific research. We have significant problems with even the narrow Wassenaar language; the definition risks sweeping up many of the common and perfectly legitimate tools used in security research.
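To see how easily the definition's second prong could describe everyday tooling, consider a toy sketch of our own (not taken from the rule text): monkey-patching, a routine technique in testing and debugging, literally modifies the standard execution path of a program to allow the execution of externally provided instructions. (It does not, on its own, satisfy the definition's anti-detection prong; the point is only how mundane the "execution path" language is.)

```python
# Illustrative sketch: redirecting a standard-library function so that
# "externally provided instructions" run in the normal execution path.
# This is ordinary test instrumentation, not an attack tool.
import json

original_loads = json.loads

def traced_loads(s, *args, **kwargs):
    # Our injected instructions: log the input, then defer to the original.
    print("parsing:", s[:40])
    return original_loads(s, *args, **kwargs)

json.loads = traced_loads  # modify the standard execution path

data = json.loads('{"ok": true}')
```

Every mocking library, debugger, and hot-patching framework does some version of this, which is why definitions pinned to "modification of the standard execution path" sweep so broadly.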
What's the problem with BIS' new proposal?
In spite of comments encouraging minimal, narrowly-written regulations, BIS has instead proposed an unworkably broad set of controls, going even further than the European Union implementation that took effect in January. Not only does the proposed implementation fail to carry over Wassenaar's exceptions, but it goes well beyond the Wassenaar text. Specifically, the BIS proposal would add to the list of controlled technology:
Systems, equipment, components and software specially designed for the generation, operation or delivery of, or communication with, intrusion software include network penetration testing products that use intrusion software to identify vulnerabilities of computers and network-capable devices.
Technology for the development of intrusion software includes proprietary research on the vulnerabilities and exploitation of computers and network-capable devices.
On its face, it appears that BIS has just proposed prohibiting the sharing of vulnerability research without a license. This is where things get confusing.
According to Randy Wheeler, Director of the Information Technology Controls Division of BIS and a participant in an open conference call to discuss the proposed implementation, "there is a policy of presumptive denial for items that have or support rootkit or zero-day exploit capabilities." She went on to say: "We generally agree that vulnerability research is not controlled, nor is the technology related to choosing a target or finding a target, controlled." However, she undermined her message by stating that any software used to help develop 0-day exploits for sale would be covered by the proposal. This is tremendously worrisome because the tools security researchers use to develop academic proofs of concept, demonstrating that the vulnerabilities they have found are valid, are the very same tools used to develop 0-days for sale. Indeed, the only difference between an academic proof of concept and a 0-day for sale is the existence of a price tag. In other words, BIS may think that it's not regulating vulnerability research, but the proposed rules could end up doing just that.1
The controls BIS is proposing aren't required by Wassenaar, nor are they included in other Wassenaar implementations. For instance, the UK's implementation does not attempt to control the export of exploits or "intrusion software" itself, while a plain reading of the BIS proposal seems to do just that. Similarly, the UK implementation doesn't affect jailbreaking, fuzzing, or vulnerability reporting, while the BIS rules could be interpreted to include them. In other words, the U.S. proposed implementation disregards the protections of Wassenaar's General Notes and goes much further than the equivalent UK rules.
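To make concrete how mundane the tools at stake are, here is a minimal random fuzzer, a sketch of our own for illustration, of the kind used routinely in vulnerability research and quality assurance. A broad reading of the proposed rules could treat sharing even a trivial tool like this across a border as a controlled export.

```python
# Minimal mutation fuzzer (illustrative sketch): flip one random byte of a
# seed input and record any mutated inputs that make the target raise.
import random

def fuzz_parser(parse, seed_input: bytes, trials: int = 200):
    """Mutate a seed input and report inputs that crash the parser."""
    rng = random.Random(0)  # fixed seed for reproducibility
    crashes = []
    for _ in range(trials):
        data = bytearray(seed_input)
        i = rng.randrange(len(data))
        data[i] = rng.randrange(256)  # flip one random byte
        try:
            parse(bytes(data))
        except Exception:
            crashes.append(bytes(data))
    return crashes

# Example target: Python's own JSON parser, with a valid seed document.
import json
crashing = fuzz_parser(
    lambda b: json.loads(b.decode("utf-8", "replace")),
    b'{"key": [1, 2, 3]}',
)
print(f"{len(crashing)} mutated inputs raised exceptions")
```

Real fuzzers (AFL, libFuzzer, and the like) are far more sophisticated, but they embody the same loop: mutate, run, record failures. They are standard defensive tools, not weapons.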
Where do we go from here?
BIS has posted a request for comments on this proposed rule and the comment period is open through July 20, 2015. BIS is specifically asking for information about the negative effects the proposed rule would have on "vulnerability research, audits, testing or screening and your company's ability to protect your own or your client's networks." We encourage independent researchers, academics, the security community, and companies both inside and outside the U.S. to answer BIS' call and submit formal comments. Researchers and companies whose work has been hindered by the European regulations, which are notably less restrictive than the U.S. proposal, are also encouraged to submit comments about their experience.
EFF will be submitting our own comments closer to the July 20 deadline, but in the meantime, we'd love it if those of you who are submitting comments would copy us (firstname.lastname@example.org) so that we can collect and highlight the best arguments both in our own comments and on this blog.
- 1. If the regulations go into effect as worded, and ultimately do restrict the sharing of vulnerability research, that would also raise First Amendment issues, which could be the basis for a legal challenge. However, it would be better to stop this terrible proposal before it gets to that point. Moreover, we're pretty sure that BIS won't be swayed by a First Amendment argument, as BIS believes it can leave that analysis to the courts. We therefore plan to leave a detailed First Amendment legal argument out of our comments.