The Guardian ran a sensational story on Friday claiming a backdoor was discovered in WhatsApp, enabling intelligence agencies to snoop on encrypted messages. Gizmodo followed up saying it's no backdoor at all, but reasonable, intended behavior. So what's really going on here?

The lost phone, lost message dilemma

At issue is how WhatsApp answers the question of what an application should do when someone's phone number changes (or they reinstall their app, or switch phones).

Suppose Alice sends a message to Bob encrypted with Bob's key K1. Alice's message is stored encrypted at the server until Bob can connect and download it. This behavior is required for any app that allows asynchronous communications (meaning you can send a message to somebody while they are offline), which nearly all popular messaging apps support.

Unfortunately, Bob just dropped his phone in a lake. Later on, Bob gets a new phone and reinstalls WhatsApp. On this new phone, the app will create a new key K2. There are two possible behaviors here:

  • Fail safe: The server can delete the queued message, since it was encrypted with K1, which no longer exists. Bob will never see the message. If Alice has turned on key change notifications, she will be warned that Bob is using a new key. She will be told that her message was not delivered and given the option to re-send it. This is what Signal does.
  • Proceed: The server will tell Alice's phone that Bob has a new key K2, and to please re-encrypt the message for K2. Alice's phone will do this, and Bob will get the message. If Alice has turned on key change notifications, she will then be warned that Bob's key has changed. This is what WhatsApp does.

Note that the second behavior makes the service seem more reliable: it's one less way a message can fail to be delivered.

The issue here is that the second behavior opens a security hole: Bob need not have actually lost his phone for the server to act as if he has. Acting maliciously, the server could claim that Bob's new key is a key the server itself controls. It would then tell Alice's phone about this new key without giving her a chance to intervene and prevent the message from being sent; her phone would automatically re-encrypt and re-send the message, which the server could now read. Alice would be notified and could later attempt to verify the new fingerprint with Bob, but by then it would be too late.

By contrast, the first, fail-safe behavior prevents this potential attack vector. On the reliability side, however, it also introduces another case in which messages can fail to be delivered.
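To make the two policies concrete, here is a minimal sketch in Python. It is not WhatsApp's or Signal's actual code; the function name, policy labels, and event strings are invented purely for illustration.

    # Hypothetical sketch of the two delivery policies described above.
    # Not real WhatsApp or Signal code; all names are invented.

    def deliver_after_key_change(message, new_key, policy, notifications_on):
        """Describe what happens to a queued message when the recipient's key changes."""
        events = []
        if policy == "fail_safe":
            # Signal-style: never encrypt to a key the sender has not yet accepted.
            if notifications_on:
                events.append(f"warn sender: recipient's key changed to {new_key}")
                events.append("offer sender the option to re-send")
            events.append(f"{message!r} NOT delivered")
        elif policy == "proceed":
            # WhatsApp-style: re-encrypt automatically and deliver, then warn.
            events.append(f"re-encrypt {message!r} to {new_key} and deliver")
            if notifications_on:
                events.append(f"warn sender: recipient's key changed to {new_key}")
            events.append(f"{message!r} delivered")
        return events

    # The same lost-phone scenario under both policies:
    for policy in ("fail_safe", "proceed"):
        print(policy, deliver_after_key_change("hello Bob", "K2", policy, notifications_on=True))

The only difference between the two branches is whether the sender's phone re-encrypts automatically or waits for the sender to act, which is exactly the trade-off between reliability and exposure described above.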

What to do if you use WhatsApp

If you are a high-risk user whose safety might be compromised by a single revealed message, you may want to consider alternative applications. As we mention in our Surveillance Self-Defense guides for Android and iOS, we don't currently recommend WhatsApp for secure communications.

But if your threat model can tolerate being notified after a potential security incident, WhatsApp still does a laudable job of keeping your communications secure. And thanks to WhatsApp's massive user base, using WhatsApp is not immediate evidence of secretive activity.

If you would like to turn on WhatsApp's key change notifications, go into Settings → Account → Security, and slide “Show security notifications” to the right.

In defense of security trade-offs

The difference between WhatsApp and Signal here is a case of sensible defaults. Signal was designed as a secure messaging tool first and foremost. Signal users are willing to tolerate lower reliability for more security. As anybody who's used Signal extensively can probably attest, these types of edge cases add up and overall the app can seem less reliable.

WhatsApp, on the other hand, was a massively popular tool before end-to-end encryption was added. The goal was to add encryption in a way that WhatsApp users wouldn't even know it was there (and the vast majority of them don't). If encryption can cause messages to go undelivered in new ways, the average WhatsApp user will see that as a disadvantage. WhatsApp is not competing with Signal in the marketplace, but it does compete with many apps that are not end-to-end encrypted by default and so don't have to make these security trade-offs, like Hangouts, Allo, or Facebook Messenger. We applaud WhatsApp for giving end-to-end encryption to everyone, whether they know it's there or not.

Nevertheless, this is certainly a vulnerability of WhatsApp, and they should give users the choice to opt into more restrictive Signal-like defaults.

But it's inaccurate to the point of irresponsibility to call this behavior a backdoor.

This is a classic security trade-off. Every communication system must make security trade-offs. Perfect security does no good if the resulting tool is so difficult to use that it goes unused. Famously, PGP made few security trade-offs, and it appears to be going the way of the dodo as a result.

Ideally, users should be given as much control as possible. But WhatsApp has to set defaults, and their choice is defensible.

Detecting bad behavior more easily with Key Transparency

Coincidentally, Google just announced the launch of its new Key Transparency project. This project embraces a big security trade-off: given that most users will not verify their contacts' key fingerprints and catch attacks before they happen, the project provides a way to build into messaging protocols a guarantee that a server's misbehavior will be permanently and publicly visible after the fact. For a messaging application, this means you can audit a log and see exactly which keys the service provider has ever published for your account and when.

This is a very powerful concept and provides additional checks on the situation above: Bob and anyone else with the appropriate permissions will know if his account has been abused to leak the messages that Alice sent to him, without having to verify fingerprints.
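As a rough illustration of what such an audit looks like, here is a hypothetical sketch. It does not use Google's actual Key Transparency API; the log is just an in-memory list of (date, key) pairs, and the dates and key names are invented for this example.

    # Hypothetical sketch: compare the keys the provider has ever published for
    # an account against the keys the account owner actually generated.
    # Not the real Key Transparency API; the log structure is invented.

    published_log = [
        ("2017-01-01", "K1"),      # Bob's original key
        ("2017-01-10", "K_evil"),  # a key the server substituted to read Alice's message
        ("2017-01-12", "K2"),      # Bob's genuine new key after reinstalling
    ]

    keys_bob_generated = {"K1", "K2"}

    def audit(log, own_keys):
        """Return every published key the account owner never generated."""
        return [(when, key) for when, key in log if key not in own_keys]

    for when, key in audit(published_log, keys_bob_generated):
        print(f"unexpected key {key} published on {when} -- possible server misbehavior")

The point is that the check requires no fingerprint comparison between Alice and Bob; any key the provider published that Bob never generated stands out on its own.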

It's important to note that transparency does not prevent the server from attacking: it merely ensures that attacks will be visible after the fact to more people, more readily. For a few users, this is not enough, and they should continue to demand more restrictive settings to prevent attacks at the cost of making the tool more difficult to use. But transparency can be a big win as a remedy against mass surveillance of users who won't tolerate any reduction in user experience or reliability for the sake of security.

Adding key transparency will not prevent a user from being attacked, but it will catch a server that's carried out an attack.

We are still a long way from building the perfect usable and secure messaging application, and WhatsApp, like all such applications, has to make tradeoffs. As the secure messaging community continues to work towards the ideal solution, we should not write off the current batch as being backdoored and insecure in their imperfect but earnest attempts.
