Open Letter to Tech Companies Includes 10 Principles to Protect Users From NSA Sabotage

In the past nine months, our trust in technology companies has been badly shaken. Today, in collaboration with prominent security researchers and technologists, EFF presents an open letter to technology companies, urging them to protect users from NSA backdoors and earn back the trust that has been lost.

From the Snowden revelations emerge stories of collusion between government spy agencies and the companies whose services are integral to our everyday lives. Reuters, for example, has published disturbing allegations that RSA, an influential information security firm, accepted a $10 million contract from the NSA that included, among other items, an agreement to use what we now know to be an intentionally compromised random number generator as the default in its BSAFE cryptographic library.

A future where we cannot trust the very technologies meant to secure our communications is fundamentally unsustainable. It's time for technology companies to start helping users regain trust, with transparency and active opposition to illegal surveillance. Implementing the requisite changes in technical infrastructure and business practices may have short-term costs; however, the long-term cost of keeping users in perpetual fear of NSA sabotage is far greater.

How to Protect Your Users from NSA Backdoors: An Open Letter to Technology Companies

As security researchers, technologists, and digital rights advocates, we are deeply concerned about collaboration between government agencies and technology companies to undermine users' security. Among other examples, we are alarmed by recent allegations that RSA, Inc. accepted $10 million from the NSA to keep a compromised algorithm as the default in a security product long after its flaws were revealed. We believe that covert collusion with spy agencies poses a grave threat to users and must be mitigated by a commitment to the following best practices to protect users from illegal surveillance:

  1. Provide public access to source code whenever possible, and adopt a reproducible build process so that others can verify the integrity of pre-compiled binaries. Both open and closed source software should be distributed with verifiable signatures from a trusted party and a path for users to verify that their copy of the software is functionally identical to every other copy (a property known as "binary transparency"; see the first sketch after this list).
  2. Explain choices of cryptographic algorithms and parameters. Make best efforts to fix or discontinue the use of cryptographic libraries, algorithms, or primitives with known vulnerabilities, and disclose to customers immediately when a vulnerability is discovered. [1]
  3. Hold an open and productive dialogue with the security and privacy communities. This includes facilitating review and responding to productive criticism from researchers.
  4. Provide a clear and secure pathway for security researchers to report vulnerabilities. Fix security bugs promptly.
  5. Publish government request reports regularly (often these are called "Transparency Reports"). Include the most granular reporting allowed by law.
  6. Invest in secure UX engineering to make it as easy as possible for users to use the system securely and as hard as possible for users to use the system unsafely.
  7. Publicly oppose mass surveillance and all efforts to mandate the insertion of backdoors or intentional weaknesses into security tools.
  8. Fight in court any attempt by the government or any third party to compromise users’ security.
  9. Adopt a principle of discarding user data after it is no longer necessary for the operation of the business.
  10. Attempt to protect as much data in transit as possible with strong encryption in order to prevent dragnet surveillance. Follow best practices for setting up SSL/TLS on servers whenever applicable (see the second sketch after this list).
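
To make the first principle concrete, here is a minimal sketch, in Python, of the kind of check that reproducible builds and published, signed digests make possible for any user: confirming that the copy of the software they received is identical to the build everyone else received. The file name and digest are hypothetical placeholders; in practice the published digest itself would also be verified against the vendor's signature (for example, with an OpenPGP tool).

    # A minimal sketch, assuming a vendor that publishes a signed SHA-256
    # digest for each release: compare the digest of the downloaded copy
    # against the published value. The file name and digest below are
    # hypothetical placeholders.
    import hashlib
    import sys

    PUBLISHED_SHA256 = "0" * 64  # placeholder for the vendor's published digest

    def sha256_of(path):
        """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "example-app-1.0.tar.gz"  # hypothetical file
        actual = sha256_of(path)
        if actual == PUBLISHED_SHA256:
            print("OK: this copy matches the published build")
        else:
            print("MISMATCH: this copy differs from the published build")
            print("expected:", PUBLISHED_SHA256)
            print("actual:  ", actual)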
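As a rough illustration of the tenth principle, the second sketch uses Python's standard ssl module to stand up a server that accepts only modern protocol versions. The certificate path, key path, and port are hypothetical placeholders, and a production deployment would follow a current TLS hardening guide rather than this minimal example.

    # A minimal sketch of server-side TLS using Python's standard ssl module:
    # refuse obsolete protocol versions and serve a single connection. The
    # certificate/key paths and port are hypothetical placeholders.
    import socket
    import ssl

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.minimum_version = ssl.TLSVersion.TLSv1_2      # reject SSLv3, TLS 1.0, TLS 1.1
    context.load_cert_chain(certfile="server.crt", keyfile="server.key")  # placeholder paths

    with socket.create_server(("", 8443)) as listener:
        with context.wrap_socket(listener, server_side=True) as tls_listener:
            conn, addr = tls_listener.accept()             # handle one illustrative connection
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
            conn.close()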

Sincerely,
The Electronic Frontier Foundation in collaboration with*:

  • Stephen Checkoway, Assistant Research Professor, Department of Computer Science, Johns Hopkins University
  • Roger Dingledine, Project Leader, Tor Project
  • Brendan Eich, Founder, Mozilla
  • Matthew Green, Assistant Research Professor, Department of Computer Science, Johns Hopkins University
  • Nadia Heninger, Assistant Professor, Department of Computer and Information Science, University of Pennsylvania
  • Tanja Lange, Professor, Department of Mathematics and Computer Science, Technische Universiteit Eindhoven
  • Nick Mathewson, Chief Architect, Tor Project
  • Ruben Niederhagen, Department of Mathematics and Computer Science, Technische Universiteit Eindhoven
  • Eleanor Saitta, OpenITP / IMMI
  • Bruce Schneier, Security Technologist
  • Peter Schwabe, Assistant Professor, Digital Security Group, Radboud University Nijmegen
  • Christopher Soghoian, Principal Technologist, Speech, Privacy and Technology Project, American Civil Liberties Union
  • Ashkan Soltani, Independent Researcher and Consultant
  • Jon A. Solworth, Associate Professor, Department of Computer Science, University of Illinois at Chicago
  • Brian Warner, Tahoe-LAFS Project
  • Zooko Wilcox-O'Hearn, Founder and CEO, LeastAuthority.com

*Affiliations listed for identification purposes only.

[1] If disclosing a vulnerability is likely to cause additional harm to users, disclosure should be postponed until this is no longer the case.