More lessons from "Facebook Research"
Last week, Facebook was caught using a sketchy market research app to gobble up large amounts of sensitive user activity after instructing users to alter the root certificate stores on their phones. A day later, Google pulled a similar iOS “research program” app of its own. Both programs represent a clear breach of user trust, a problem we have written about extensively.
This news also drew attention to an area where both Android and iOS could improve. Asking users to alter their root certificate stores gave Facebook the ability to intercept network traffic from users’ phones even when that traffic was encrypted, exposing otherwise secure Internet traffic and communications to Facebook. The way devices alert users to this possibility (the “UX flow”) could be improved dramatically on both Android and iOS.
To be clear, Android and iOS should not ban these capabilities altogether, as Apple has already done for sideloaded applications and VPNs. The ability to alter the root certificate store is valuable to researchers and power users, and should never be locked down for device owners. A root certificate allows researchers to analyze the encrypted data a phone’s applications send to third parties, exposing whether those applications are exfiltrating credit card numbers or health data, or peddling other usage data to advertisers. However, Facebook’s manipulation of regular users into granting this capability for malicious reasons demonstrates the need for clearer UX and more obvious messaging.
Confusing prompts for adding root certificates
When regular users are manipulated into installing a root certificate on their device, it may not be clear to them that doing so allows the certificate’s owner to decrypt and read any of their encrypted network traffic.
On both iOS and Android, users installing a root certificate click through a process filled with vague, inaccessible jargon. This is the entire explanation users get:
Android: “Note: The issuer of this certificate may inspect all traffic to and from the device.”
iOS: “Installing the certificate “<certificate name>” will add it to the list of trusted certificates on your iPhone. This certificate will not be trusted for websites until you enable it in Certificate Trust Settings.”
Regular users probably don’t know about the X.509 certificate ecosystem: who certificate issuers are, what it means to “trust” a certificate, or how any of this relates to the encryption of their data. On Android, the warning is vague about who has what capabilities: an “issuer … may … inspect all traffic.” On iOS, there’s no explanation whatsoever, even in the “Certificate Trust Settings,” of why this may be a dangerous action.
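What trusting a new root certificate actually grants can be illustrated with a deliberately simplified toy model (no real cryptography here; all of the names are invented for illustration):

```python
# Toy model of a device trust store. This is NOT real cryptography:
# real TLS validates signatures, not issuer names. It only shows the
# shape of the trust decision a root certificate controls.
from dataclasses import dataclass

@dataclass(frozen=True)
class Cert:
    subject: str   # the site the certificate claims to be, e.g. "bank.example"
    issuer: str    # the certificate authority that signed it

def chain_is_trusted(chain, trust_store):
    """A chain is accepted if its topmost certificate's issuer is a trusted root."""
    return chain[-1].issuer in trust_store

# The phone ships trusting only well-known certificate authorities.
trust_store = {"WellKnownCA"}

# A real site's certificate, issued by a public CA: accepted.
real = [Cert("bank.example", "WellKnownCA")]
assert chain_is_trusted(real, trust_store)

# An interception proxy forges a certificate for the same site, signed
# by its own root. Normally the phone rejects it...
forged = [Cert("bank.example", "ResearchRootCA")]
assert not chain_is_trusted(forged, trust_store)

# ...until the user is talked into installing that root. Now every
# certificate the proxy mints is accepted, and the proxy can decrypt,
# read, and re-encrypt all of the phone's "secure" traffic.
trust_store.add("ResearchRootCA")
assert chain_is_trusted(forged, trust_store)
```

In real TLS the chain is verified with digital signatures rather than name lookups, but the trust decision bottoms out the same way: any certificate chaining up to a root in the device’s store is accepted silently, with no further warning to the user.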
Security-compromising actions should have understandable messaging for non-technical users
The good news: it’s possible to get this sort of messaging right.
For instance, these dangers also apply in browsers, where the warnings to users are much clearer. Compare the messaging flow above for trusting root certificates on your phone to the equivalent warnings browsers show when you hit a website with a self-signed or untrusted certificate. Chrome warns in very large letters, “Your connection is not private,” and Firefox similarly announces, “Your connection is not secure.” Chrome’s messaging even lists the kinds of sensitive data that may be exposed: “passwords, messages, or credit cards.” Changing your browser’s root certificate store then involves multiple steps hidden behind an “Advanced” button.
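This “safe by default, explicit opt-out” pattern shows up outside of browser UI, too. As a small illustration, Python’s standard-library ssl module defaults to strict certificate verification, and turning it off requires two deliberate steps in a specific order:

```python
import ssl

# The default context verifies certificates and checks hostnames,
# just like a browser does before showing you a page.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

# Disabling verification (roughly what clicking through a browser
# warning amounts to) takes explicit, separate steps:
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
```

Notably, the order matters: Python refuses to set verify_mode to CERT_NONE while hostname checking is still enabled. It is exactly the kind of small guardrail, where the dangerous path demands deliberate effort, that mobile root-certificate flows lack.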
Another good example comes from Facebook itself: when you open a browser developer console on Facebook’s website, a big red “Stop!” appears to prevent users not familiar with the console from doing something dangerous. Here, Facebook goes out of its way to warn users about the dangers of using a feature meant for researchers and developers. Facebook’s “market research” app, Android, and iOS did none of this.
The answer should not be to vilify root certificates and their capabilities in general. Tools like these are invaluable to security researchers and privacy experts. At the same time, they should not be presented to general users without abundantly clear messaging and design indicating their potential dangers.