In the digital world, strong encryption is how private conversations stay private. It’s also what keeps our devices secure. Encryption is under a new set of attacks by law enforcement agencies, which continue to seek a magic bullet—a technological backdoor that could circumvent encryption, but somehow not endanger privacy and security more broadly. But that circle can’t be squared, and at this point, the FBI and DOJ know it. That’s why, as the government has pushed forward with this narrative, it’s increasingly been backed by false claims.

Now, a group of prominent academics and policymakers has signed on to a deeply misguided report that attempts to reframe the debate along the lines that law enforcement agencies have long urged. The paper is the work of a small group convened by the Carnegie Endowment for International Peace, which claims to seek a more “pragmatic and constructive” debate about the “challenges” of encryption. Unfortunately, the report begins with the premise that the “problem” to be solved is that law enforcement agencies sometimes can’t access encrypted devices, then suggests those who disagree with the premise hold “absolutist” positions. It goes on to endorse a version of the discredited “key escrow” scheme that, as we have explained before, just won’t work.

The Carnegie report seeks to differentiate itself from earlier discussions by narrowing the areas of disagreement between law enforcement and privacy advocates and breaking the issues down into their “component parts.” That’s not a bad idea in itself. But in this case, separating out the components ends up just being a way to limit the scope of the damage to encryption, focusing on data at rest on mobile phones. And the report limits this intervention to the strategy it deems most palatable to those with privacy concerns: a system in which each phone has a decryption key specific to that phone. Once police fulfill proper legal process, such as getting a warrant, they’ll get access to the key on the device. Presumably, that will happen via a separate key held by the company that created the device, or by another external agent (the report says only that the key will be “held securely”).
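
To make the mechanics concrete, here is a minimal sketch of what such a per-device escrow arrangement could look like. It is written in Python with the widely used `cryptography` package; the names and the exact key-wrapping scheme are our own illustrative assumptions, since the report itself doesn’t specify how the key would be “held securely.”

```python
# Illustrative sketch only: one plausible per-device "key escrow" arrangement.
# Requires the third-party `cryptography` package; all names here are hypothetical.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The vendor (or another "secure" holder) keeps a long-lived escrow key pair.
vendor_escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
escrow_public_key = vendor_escrow_key.public_key()

# Each phone gets its own symmetric key that encrypts data at rest.
device_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(device_key).encrypt(nonce, b"user data at rest", None)

# A copy of the device key is wrapped under the escrow public key and stored
# where it can later be recovered without the owner's participation.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_device_key = escrow_public_key.encrypt(device_key, oaep)

# After legal process (a warrant, say), whoever holds the escrow private key
# can unwrap the device key and decrypt the data; the owner is not involved.
recovered_key = vendor_escrow_key.decrypt(wrapped_device_key, oaep)
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))
```

Whatever the implementation details, the structure is the same: a second key, outside the owner’s control, that can unlock the device. Everything then depends on how well that escrow key is guarded.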

But building new ways to break into encrypted devices—also known as backdoors—is just a bad idea. Narrowing down the situations and methods under which it takes place doesn’t change that fundamental calculation.

Breaking Encryption Hurts Privacy and Security

As we said when the National Academy of Sciences published a paper on this topic last year, there’s no substitute for strong encryption. If an additional decryption key exists, it can and will be misused. Putting it in the hands of the company that created the phone, and insisting on proper legal procedure, is no guarantee against misuse. Nor would it prevent an attack by an outside actor—a criminal who steals the keys, a rogue government agent who subverts legal process, or an insider at the key-holding company who abuses their access for personal gain.

Maintaining strong encryption—in which only the intended recipient of a message can see the message—isn’t an extreme or “absolutist” position. It’s a position that privacy- and security-enhancing technology should work properly, and shouldn’t be broken by design. It’s hard to search for “middle ground” in the debate when middle ground is, by definition, a security flaw.

It’s also not just U.S. government agencies that are interested in gaining access to mobile phones. Other governments, including repressive ones, will insist on having similar systems of access for their own police.

We can’t deny that, in certain cases, exceptional access would give law enforcement helpful evidence. But constantly calling encryption a “challenge” to criminal investigations is a circular and disingenuous argument. It’s not much different from the “challenge” to law enforcement presented by any unrecorded, face-to-face conversation between two human beings. On this basis, any human interaction that is not overseen and recorded for law enforcement could be cited as an investigative “challenge.” Privacy does present challenges, but it’s indispensable to our lives. Without privacy, we won’t have the free expression and free debate we need for democracy to thrive.

Moving Beyond Breaking Phones

The FBI and DOJ have spent years arguing to the American people that they should have access to the plaintext of every digital conversation that crosses our devices. But that ignores the many other techniques that make it possible to investigate, and draw conclusions about, what has happened in the past—including simple interviews that rely on memory.

One of the reasons for the Carnegie working group report’s narrow focus is, in fact, the astonishing amount of data police currently have access to. For instance, cloud services are excluded from consideration, dismissed as “a less worrisome area than encrypted phones or encrypted messaging.” The paper rightly points out that cloud data is already “a tool and source of data for law enforcement.” Even if more cloud data becomes encrypted—as EFF has urged—the adoption of Internet-connected devices will continue to generate data that’s accessible to law enforcement.

The paper also strategically leaves aside other methods of access, such as forced software updates. The authors correctly note that if software updates become a mechanism of access for law enforcement, consumers could lose trust in those updates. And the report acknowledges that this could be even more pronounced in vulnerable communities, citing “minority groups who fear law enforcement targeting.” These trust problems are real, and the “key escrow” system that the authors propose does not magically avoid them.

In the end, we’re disappointed that this thoughtful group chose to examine encryption solely as a “challenge” to police seeking a form of special access. We shouldn’t lose sight of the huge benefits that secure, private encryption provides us all.