This article was first published on Just Security.
Government Communications Headquarters (GCHQ), the UK’s counterpart to the National Security Agency (NSA), has fired the latest shot in the crypto wars. In a post to Lawfare titled “Principles for a More Informed Exceptional Access Debate,” two of Britain’s top spooks introduced what they’re framing as a kinder, gentler approach to compromising the encryption that keeps us safe online. This new proposal from GCHQ—which we’ve heard rumors of for nearly a year—eschews one discredited method for breaking encryption (key escrow) and instead adopts a novel approach referred to as the “ghost.”
But let’s be clear: regardless of what they’re calling it, GCHQ’s “ghost” is still a mandated encryption backdoor with all the security and privacy risks that come with it.
Backdoors have a (well-deserved) horrible reputation in the security community. But that hasn’t dissuaded law enforcement officials around the world from demanding them for more than two decades. And while the Internet has become a more dangerous place for average users, making encryption more important than ever, this rhetoric has hardly changed.
What has changed is the legal landscape governing encryption and law enforcement, at least in the UK. In 2016, Parliament passed the Investigatory Powers Act, which gives the UK government the legal ability to order a company like Apple or Facebook to tamper with security features in its products, while simultaneously prohibiting the company from telling the public about it.
As far as is publicly known, the UK has not attempted to employ the provisions of the Investigatory Powers Act to compromise the security of the products we use. Yet. But GCHQ’s Lawfare piece previews the course that the agency is likely to take. The authors lay out six “principles” for an informed debate, and they sound pretty noncontroversial.
Privacy and security protections are critical to public confidence. Therefore, we will only seek exceptional access to data where there’s a legitimate need, that access is the least intrusive way of proceeding and there is appropriate legal authorisation.
Investigative tradecraft has to evolve with technology.
Even when we have a legitimate need, we can’t expect 100 percent access 100 percent of the time.
Targeted exceptional access capabilities should not give governments unfettered access to user data.
Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.
Transparency is essential.
So far so good. I absolutely agree that law enforcement should only act where there’s a legitimate need and only when authorized by a court, in a way that evolves with the tech, that doesn’t have unrealistic expectations, that doesn’t enable mass surveillance, that doesn’t undermine the public trust, and that is transparent.
But unfortunately, the authors fail to apply the principles so carefully laid out to the problem at hand. Instead, they’re proposing a way of undermining end-to-end encryption using a technique that the community has started calling the “ghost.” Here’s how the post describes it:
It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved – they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.
Applying this idea to WhatsApp, it would mean that—upon receiving a court order—the company would be required to convert a 1-on-1 conversation into a group chat, with the government as the third member of the chat. But that’s not all. In WhatsApp’s UX, users can verify the security of a conversation by comparing “security codes” within the app. So for the ghost to work, there would have to be a way of forcing both users’ clients to lie to them, showing a falsified security code and suppressing any notification that the conversation’s keys had changed. Put differently, if GCHQ’s proposal went into effect, consumers could never again trust the claims that our software makes about what it’s doing to protect us.
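To see why honest clients would expose the ghost, consider how a security code works in principle: it is a fingerprint derived from the identity keys of everyone in the conversation, so adding a key changes the code. The sketch below is an illustrative toy, not WhatsApp’s or Signal’s actual safety-number algorithm; the key values and rendering scheme are invented for the example.

```python
import hashlib

def security_code(identity_keys: list[bytes]) -> str:
    """Toy fingerprint: hash the sorted identity keys and render the
    digest as groups of digits, similar in spirit to a messaging app's
    "security code" (NOT the real WhatsApp/Signal algorithm)."""
    digest = hashlib.sha256(b"".join(sorted(identity_keys))).digest()
    # Render the first 15 bytes as five 5-digit groups.
    groups = [int.from_bytes(digest[i:i + 3], "big") % 100000
              for i in range(0, 15, 3)]
    return " ".join(f"{g:05d}" for g in groups)

# Hypothetical identity keys for the two real participants and the ghost.
alice = b"alice-identity-key"
bob = b"bob-identity-key"
ghost = b"law-enforcement-ghost-key"

code_two_party = security_code([alice, bob])
code_with_ghost = security_code([alice, bob, ghost])

# Any change to the set of keys changes the code, so a truthful client
# would reveal the extra participant. The ghost only stays invisible if
# the client is forced to keep displaying the old two-party code.
assert code_two_party != code_with_ghost
```

Because the fingerprint commits to the full key set, the only way to hide the third key is to make the client display a code it knows to be false, which is exactly the lie described above.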
The authors of the Lawfare piece go out of their way to claim that they are “not talking about weakening encryption or defeating the end-to-end nature of the service.”
They’re talking about adding a “feature” that would require the user’s device to selectively lie about whether it’s even employing end-to-end encryption, or whether it’s leaking the conversation content to a third (secret) party. Is the security code displayed by your device a mathematical representation of the two keys involved, or is it a straight-up lie? Furthermore, what’s to guarantee that the method used by governments to insert the “ghost” key into a conversation without alerting the users won’t be exploited by bad actors?
Despite the GCHQ authors’ claim, the ghost will require vendors to disable the very features that give our communications systems their security guarantees in a way that fundamentally changes the trust relationship between a service provider and its users. Software and hardware companies will never be able to convincingly claim that they are being honest about what their applications and tools are doing, and users will have no good reason to believe them if they try.
And, as we’ve already seen, GCHQ will not be the only agency in the world demanding such extraordinary access to billions of users’ software. Australia was quick to follow the UK’s lead, and we can expect to see similar demands from governments around the world, from Brazil and the European Union to Russia and China. (Note that this proposal would be unconstitutional were it proposed in the United States, which has strong protections against the government forcing actors to speak or lie on its behalf.)
The “ghost” proposal violates the six “principles” in other ways, too. Instead of asking investigative tradecraft to evolve with technology, it’s asking technology to build investigative tradecraft in from the ground up. Instead of targeted exceptional access, it’s asking companies to put a dormant wiretap in every single user’s pocket, just waiting to be activated.
We must reject GCHQ’s newest “ghost” proposal for what it is: a mandated encryption backdoor that weakens the security properties of encrypted messaging systems and fundamentally compromises user trust.
GCHQ needs to give up the ghost. It’s just another word for an encryption backdoor.