In a recent Deeplinks post and in some of our other communications about the Apple case, we've referred to what the government wants Apple to do as creating a "backdoor." Some people have questioned the use of the term, but we think it's appropriate. Here's why.

The term "backdoor" has a long history. It was originally used—along with "trapdoor"—throughout the 1980s to refer to secret accounts and/or passwords created to allow someone unknown access into a system. People worried, for instance, that a malicious programmer or system administrator might leave behind a trapdoor that they would be able to use to get into a system long after they were officially working on it. Later, in the first round of the crypto wars, throughout the 1990s, privacy advocates often referred to the government's key escrow proposals—where the government, or private companies, would keep copies of people's decryption keys—as a "backdoor" into our encryption.

Such use of the term "backdoor" shows that it can broadly refer to any mechanism someone designs into a system to allow access by bypassing its normal security measures. And although the word "backdoor" has historically most often referred to secret ways of getting into a system, a backdoor doesn't need to be secret. The government's ability to bypass the Clipper Chip's security (the chip was the hardware centerpiece of those 1990s key escrow proposals) wasn't a secret; it was part of the system's basic design. But it was still a backdoor: a mechanism designed to allow access by bypassing security features.

A backdoor needn't stop being a backdoor just because it is well known and public. Take the most famous backdoor in recent cryptographic lore: Dual EC DRBG, a standardized random number generator. The NSA secretly designed it with a deliberate mathematical flaw that only the NSA, holding a secret backdoor key, could exploit to spy on anyone using it. That backdoor eventually became known to the public, but we still call it a backdoor.
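For readers curious about the mechanics, here is a simplified sketch (our notation, glossing over details such as the truncation of each output block) of how that flaw is generally understood to work. Each step of the generator updates its secret state s and emits an output r using two curve points P and Q fixed by the standard:

\[ s' = x(s \cdot P), \qquad r = x(s' \cdot Q), \]

where x(·) denotes taking a point's x-coordinate. If whoever chose the constants also knows a number d with P = d · Q, then from a single output r they can reconstruct the point A = s' · Q and compute

\[ x(d \cdot A) = x(d \cdot s' \cdot Q) = x(s' \cdot P), \]

which is exactly the state the generator will use next, letting the holder of d predict every "random" number it produces from then on.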

Backdoors that track the original definition of the word (secret accounts or passwords) are still widely used. Researchers have found built-in, secret ways of accessing some D-Link routers, Juniper firewalls, and Fortinet firewalls, among other products. All of these vulnerabilities were secretly and intentionally placed into the products by someone, and each provides a point of access that lets third parties get into the affected systems later on. Each also highlights the importance of transparency about what the software running on our devices really does.

What the FBI is trying to force Apple to do in the current case doesn't share all of the features of these classic backdoors, leading some people to question how well the term applies here. Some point to the fact that the government's attempt to undermine security in this case isn't clandestine. But as we note above, backdoors don't have to be secret. Others point out that the software in question would be developed after the fact, rather than ahead of time. But that doesn't make it any less of an intentionally designed security vulnerability. (Still others have suggested that the true backdoor lies in Apple's ability to approve software updates that change a locked device's security properties without erasing its encryption keys, but Apple says it never expected the government to make the demands the company is currently facing.)

We decided to use the term "backdoor" in this case because the FBI's requested software would intentionally weaken the device's security, and we think that intentional weakening of security is the most fundamental hallmark of a backdoor. The term seems all the more apt given Apple's assertion that it never expected the government to demand that it develop this software, and given the risk that the government will treat this case as precedent for demanding that Apple repeatedly break the security of other iPhones in the same way, or even for prohibiting Apple from rolling out future security improvements that would interfere with the company's ability to comply with such demands.
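To make concrete what "intentionally weaken" means here: the order asks Apple to produce software that, among other things, would disable the device's limit on failed passcode attempts and the delays between them. The back-of-the-envelope sketch below is only an illustration, not Apple's code; the roughly 80 milliseconds of key-derivation work per guess and the 4-digit passcode are assumptions we've chosen to keep the arithmetic simple. It shows why switching off those protections turns an effectively unguessable passcode into one that can be brute-forced in minutes.

    # Illustrative arithmetic only -- not Apple's implementation.
    # Assumptions: each passcode guess costs roughly 80 ms of on-device
    # key derivation, and the stock firmware erases the encryption keys
    # after 10 consecutive failures (with escalating delays before that).
    GUESS_COST_SECONDS = 0.080   # assumed per-attempt key-derivation time
    PASSCODE_SPACE = 10 ** 4     # a 4-digit numeric passcode

    # With the stock protections, an attacker gets at most 10 guesses
    # before the keys are wiped.
    odds_with_protections = 10 / PASSCODE_SPACE

    # With the requested software (no erase, no delays, guesses submitted
    # electronically), every possible passcode can simply be tried in turn.
    worst_case_minutes = PASSCODE_SPACE * GUESS_COST_SECONDS / 60

    print(f"Chance of success with protections intact: {odds_with_protections:.1%}")
    print(f"Worst-case brute-force time without them: {worst_case_minutes:.0f} minutes")

Longer passcodes and real-world hardware change the constants, but not the basic point: the protections the requested software would remove are precisely what make guessing impractical.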

Whatever you call it, the FBI's demand is deeply concerning for the future of technology and security. That's why so many technology experts and companies have weighed in to explain how it threatens the rights and safety of everyone who uses or develops technology.

It's unfortunate that our government officials didn't learn the central lesson of the first crypto war, namely that it is technically impossible to design a "backdoor" that doesn't compromise security, and are now pushing for new forms of backdoors to enable access to encrypted data.