Encryption is back in the headlines, with government officials insisting that they still need to compromise our security via a backdoor for law enforcement. Opponents of strong encryption imagine that there is a “middle ground” approach that allows for strong encryption but with “exceptional access” for law enforcement. Government officials claim that technology companies are creating a world where people can commit crimes without fear of detection.

Despite this renewed rhetoric, most experts continue to agree that exceptional access, no matter how you implement it, weakens security. The terminology might have changed, but the essential question has not: should technology companies be forced to develop a system that inherently harms their users? The answer hasn’t changed either: no.

Let us count the reasons why. First, if mandated by the government, exceptional access would violate the First Amendment under the compelled speech doctrine, which prevents the government from forcing an individual, company, or organization to make a statement, publish certain information, or even salute the flag. Code is speech, and requiring companies to write and distribute code that undermines their own products is exactly the kind of statement the government cannot compel.

Second, mandating that tech companies weaken their security puts users at risk. In the 1990s, the White House introduced the Clipper Chip, a plan for building backdoors into communications technologies. Security researcher Matt Blaze found enormous security flaws in the system, showing that a brute-force attack could compromise the technology.

Third, exceptional access would harm U.S. businesses and chill innovation. The United States government can’t stop the development of encryption technologies; it can merely push that development overseas.

Finally, exceptional access fails at its one stated task—stopping crime. No matter what requirements the government placed on U.S. companies, sophisticated criminals could still get strong encryption from non-U.S. sources that aren’t subject to that type of regulation.

There’s No Such Thing as a Safe Backdoor

Despite the broad consensus among technology experts, some policymakers keep trying to push an impossible “middle ground.” Last month, after years of research, the National Academy of Sciences released a report on encryption and exceptional access that conflated the question of whether the government should mandate “exceptional access” to the contents of encrypted communications with the question of how the government could accomplish this mandate without compromising user security. Noted crypto expert Susan Landau worried that some might misinterpret the report as providing evidence that a secure exceptional access system is close to being built:

"The Academies report does discuss approaches to ‘building ... secure systems’ that provide exceptional access—but these are initial approaches only…The presentations to the Academies committee were brief descriptions of ideas by three smart computer scientists, not detailed architectures of how such systems would work. There's a huge difference between a sketch of an idea and an actual implementation—Leonardo da Vinci’s drawings for a flying machine as opposed to the Wright brothers’ plane at Kitty Hawk."

And it didn’t stop with the NAS. Also last month, the international think-tank EastWest Institute published a report that proposed “two balanced, risk-informed, middle-ground encryption policy regimes in support of more constructive dialogue.”

Finally, just last week, Wired published a story featuring Microsoft’s former chief technical officer Ray Ozzie and his attempt to find an exceptional access model for phones that can supposedly satisfy “both law enforcement and privacy purists.” While Ozzie may have meant well, experts like Matt Green, Steve Bellovin, Matt Blaze, Rob Graham and others were quick to point out the proposal’s substantial flaws. No system is perfect, but a backdoor system deployed across billions of phones magnifies the consequences of any single flaw, and even the best and brightest in computer security don’t know how to make a system bug-free.

The reframing keeps coming, but the truth remains. Any efforts for “constructive dialogue” neglect a major obstacle: the government’s starting point for this dialogue is diametrically opposed to the very purpose of encryption. To see why, read on.

Encryption: A User’s Guide to Keys

Encryption is frequently described using analogies to “keys”: whoever has a key can decrypt, or read, information that is behind a “lock.” But if we back up and look at how keys actually work, we can see the problems with that metaphor.

In ancient times, encryption was achieved using sets of instructions, now called “unkeyed ciphers,” that explained how to both scramble and unscramble messages. These ciphers sometimes used simple rules, like taking alphanumeric text and rotating every letter or number forward by one, so A becomes B, B becomes C, and so on. Ciphers can also use more complex rules, like translating a message’s letters to numbers and then running those numbers through a mathematical equation to get a new string of numbers that, so long as the cipher is unknown, is indecipherable to an outside party.
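
To make the rotation rule concrete, here’s a toy Python sketch of that “rotate by one” cipher (our illustration, not anything to use for real secrecy):

```python
# A toy "unkeyed cipher": rotate every letter forward by one
# (A becomes B, B becomes C, and Z wraps around to A). There is
# no key; secrecy rests entirely on outsiders not knowing the rule.

def rot1(text: str, direction: int = 1) -> str:
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr(base + (ord(ch) - base + direction) % 26))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

assert rot1("HAL") == "IBM"                 # scramble
assert rot1("IBM", direction=-1) == "HAL"   # unscramble by rotating back
```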

As encryption progressed, early cryptographers started to use “keyed ciphers” with ever-stronger security. These ciphers use secret information called a “key” to control the ability to encrypt and decrypt.
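
In sketch form (again a toy of ours, not a real-world cipher), the rotation idea becomes “keyed” the moment the amount of rotation is a secret number shared only by the sender and receiver:

```python
# A toy "keyed cipher": the instructions can be public, because
# unscrambling now requires a secret number (the key).

def shift(text: str, key: int) -> str:
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr(base + (ord(ch) - base + key) % 26))
        else:
            out.append(ch)
    return "".join(out)

secret_key = 11                              # shared secret
scrambled = shift("attack at dawn", secret_key)
assert shift(scrambled, -secret_key) == "attack at dawn"
```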

Keys continue to play a major role in modern encryption, but there is more than one kind of key.

Some digital devices encrypt stored data, and the password entered to operate the device unlocks the random key used to encrypt that data. But for messages between people—like emails, or chats—all modern encryption systems are based on “public key encryption.” The advantage of this form of encryption is that the people communicating don’t have to have a secret (like a password) in common ahead of time.
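
That “password unlocks a random key” pattern can be sketched with the third-party Python cryptography package. The specific algorithm choices below (PBKDF2 for the password step, AES-GCM for everything else) are our illustrative assumptions; real devices implement hardened, often hardware-backed, variants of the same idea:

```python
# Sketch: the device's data is encrypted under a random key the user
# never sees; the user's password only "unwraps" that random key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

password = b"correct horse battery staple"
salt = os.urandom(16)

# Derive a key-encryption key (KEK) from the password.
kek = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                 salt=salt, iterations=600_000).derive(password)

# Encrypt the stored data under a random data key...
data_key = AESGCM.generate_key(bit_length=256)
data_nonce = os.urandom(12)
stored = AESGCM(data_key).encrypt(data_nonce, b"device contents", None)

# ...and keep that data key "wrapped" (encrypted) under the KEK.
wrap_nonce = os.urandom(12)
wrapped_key = AESGCM(kek).encrypt(wrap_nonce, data_key, None)

# Unlocking the device: re-derive the KEK from the entered password,
# unwrap the data key, then decrypt the data.
unwrapped = AESGCM(kek).decrypt(wrap_nonce, wrapped_key, None)
assert AESGCM(unwrapped).decrypt(data_nonce, stored, None) == b"device contents"
```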

In public key encryption, each user (a person, or an entity like a company, a website, or a network server) gets two related keys. (Sometimes more than one pair is generated.) There is one key to encrypt data, and another key to decrypt data. The key that encrypts data is called the “public key,” and it can be shared with anyone. It’s sort of like a public instruction set: anyone who wishes to send encrypted messages to a person can use that public instruction set to encrypt data according to those rules. The second key is called a “private key,” and it is never shared. This private key decrypts data that has been encrypted using the corresponding public key.
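
A minimal round trip with the same cryptography package shows the asymmetry (RSA with OAEP padding is one common choice among several):

```python
# Sketch: the public key encrypts; only the matching private key decrypts.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()        # safe to hand out to anyone

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(b"meet at noon", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"
```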

In modern encryption, these keys aren’t used for encrypting and decrypting messages themselves. Instead, the keys are used to encrypt and decrypt an entirely separate key that, itself, both encrypts and decrypts data. This separate key, called a session key, is used with a traditional symmetric cipher—it represents a secret set of instructions that can be used by a message sender and receiver to scramble and unscramble a message.
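
Putting the pieces together, here’s a minimal sketch of that hybrid pattern, with the same illustrative library and algorithm choices as above:

```python
# Sketch: a fresh symmetric session key encrypts the message;
# public-key encryption protects only the session key in transit.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The recipient's key pair (generated once; the public half is shared).
recipient_private = rsa.generate_private_key(public_exponent=65537,
                                             key_size=2048)
recipient_public = recipient_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encrypt the message with a fresh session key...
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual message", None)

# ...then encrypt the session key itself to the recipient's public key.
wrapped_session_key = recipient_public.encrypt(session_key, oaep)

# Recipient: the private key recovers the session key,
# and the session key recovers the message.
recovered = recipient_private.decrypt(wrapped_session_key, oaep)
assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"the actual message"
```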

Public key encryption ensures that a session key is secure and can’t be intercepted and used by outsiders. Private keys hold the secret to session keys, which hold the secret to encrypted messages. The fewer opportunities for private encryption keys to be stolen or accidentally released, the greater the security.

Yet this is precisely what exceptional access demands: more keys, more access, and more vulnerability. At its core, exceptional access erodes encryption security by granting law enforcement either its own set of private keys for every encrypted device and every individual who sends and receives encrypted messages, or by requiring the creation, and secure storage, of duplicate keys to be handed over.

And that’s why law enforcement’s proposals for a “responsible solution” are irresponsible. Any system that includes a separate channel for another party’s access is inherently less secure than a system without that channel. In encryption systems, the very existence of duplicate or dedicated extra keys makes those keys attractive targets for bad actors. It would be like creating duplicate physical keys for a bank vault: the risk of one of those keys getting lost or stolen is bad enough. Copying that key for law enforcement agencies in the U.S., and potentially around the globe, multiplies the risk.

There is no good faith compromise in the government’s exceptional access request. The “middle ground” between what law enforcement agencies want—bad encryption—and what users want—good encryption—is still just bad encryption.

In a 2017 interview with Politico (paywall), Deputy Attorney General Rod Rosenstein conceded that a device with exceptional access “would be less secure than a product that didn’t have that ability.” He continued:

“And that may be, that’s a legitimate issue that we can debate—how much risk are we willing to take in return for the reward?”

The answer to that question has to be informed by solid information about what we risk when we give up strong encryption. So this week EFF is bringing the nerds (aka technologists) to Washington, D.C. to host an informative briefing for Senate staffers. We need all policymakers to get this right, and not fall prey to rhetoric over reality.