The legal dispute between Apple and the FBI might prove pivotal in the long-running battle to protect users' privacy and right to use uncompromised encryption. The case has captured the public imagination. Of course, EFF supports Apple's efforts to protect its users.

The case is complicated technically, and there is a lot of misinformation and speculation. This post will offer a technical overview, based on information gleaned from the FBI's court motion and Apple's security documentation.

What is Apple being asked to do?

Apple is being asked to assist the FBI's ongoing investigation of last December's San Bernardino mass shooting by providing software to unlock a phone used by (deceased) suspect Syed Rizwan Farook (though owned by his employer, the San Bernardino County Department of Public Health). Legally, the FBI is citing the All Writs Act, a general-purpose law first enacted in 1789 that can allow a court to require third parties’ assistance to execute a prior order of the court when "necessary or appropriate." Judges have questioned the application of this general-purpose law to unlocking phones.

Farook's phone is an iPhone 5c, a model sold in late 2013 and early 2014. This model supports disk encryption, meaning straightforward forensic techniques (e.g. physically removing the memory chips to examine them directly) cannot by themselves reveal the phone's data. The data on disk cannot be decrypted without knowing the correct cryptographic key. This key is generated by combining the user's passcode with a key baked into the hardware in a way that is designed to be difficult to extract. The only intended way to access the phone's memory is through the phone itself, using the correct passcode.
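
As a rough illustration of that design (and not Apple's actual algorithm; the function names, iteration count, and key sizes here are invented for the example), a disk key can be derived by stretching the passcode with a slow, iterated function and then entangling the result with a device-unique hardware key, so the derivation can only be performed on the device itself:

    # Illustrative sketch only -- NOT Apple's real key-derivation code.
    # The idea: entangle the user's passcode with a per-device hardware key
    # so that passcode guessing can only happen on that physical device.
    import hashlib
    import hmac

    HARDWARE_UID = bytes.fromhex("00" * 32)  # stands in for a key fused into the chip

    def derive_disk_key(passcode: str, iterations: int = 100_000) -> bytes:
        # Slow, iterated stretching of the passcode (PBKDF2 here; Apple uses its own scheme)
        stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"per-device-salt", iterations)
        # Mix in the hardware-bound key so the result cannot be computed off-device
        return hmac.new(HARDWARE_UID, stretched, hashlib.sha256).digest()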

In this case, the FBI is requesting that Apple create and digitally sign a special version of iOS which is modified in three ways, as specified on page 8 of the court order:

  1. iOS can be set to erase its keys after 10 incorrect passcode guesses. The FBI wants software with this feature disabled.
  2. iOS imposes increasingly long delays after consecutive incorrect passcode guesses to slow down guessing (this is commonly called rate limiting). The FBI wants software that accepts an arbitrary number of guesses with no delays.
  3. iOS requires individual passcodes be typed in by hand. The FBI wants a means to electronically enter passcodes, allowing it to automatically try every possible code quickly.

The FBI has told the court that its goal is to guess Farook's passcode to unlock his phone. If it simply started entering guesses, though, the device might erase its keys after ten incorrect attempts, at which point the data may never be recoverable. Hence, it wants Apple to write special "cracking software" to ensure that can't happen, and to make the passcode-guessing process easier and faster.
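
To see why each of the three requested changes matters, here is a toy sketch of the guessing attack the FBI wants to run. Everything in it is hypothetical (there is no public electronic passcode-entry interface; the device object and its try_passcode method are stand-ins), but it shows how auto-erase, escalating delays, and manual entry each block automated guessing:

    # Hypothetical sketch of the brute-force attack the FBI wants to run.
    # "device" stands in for the electronic passcode-entry interface requested
    # in item 3 above; no such public API exists.
    from itertools import product

    def brute_force(device, length=4, alphabet="0123456789"):
        for attempt in product(alphabet, repeat=length):
            passcode = "".join(attempt)
            # This loop is only safe if auto-erase (item 1) is off -- otherwise
            # guess number 10 could wipe the keys -- and only fast if the
            # escalating delays (item 2) are gone.
            if device.try_passcode(passcode):
                return passcode
        return None

At the 80 ms per guess enforced by the hardware (discussed below), all 10,000 four-digit passcodes could be tried in under 15 minutes once these software protections are removed.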

Why can't the FBI just write its own cracking software?

This is technically possible, although it would be a considerable amount of work for the FBI to reverse-engineer all of the details of Apple's encryption format. However, iPhones are designed to run only software that is digitally signed by Apple. Producing a digital signature is a cryptographic process requiring a secret signing key, and in the case of iPhones this signing key is known only to Apple. So even if the FBI wrote its own software, the phone wouldn't run it unless it were signed by Apple.
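
The mechanism is ordinary public-key signing, sketched below with a generic signature scheme (Apple's real boot chain uses its own formats, keys, and algorithms, so treat this purely as an illustration): the phone ships with only the verification key, and refuses any image whose signature doesn't check out.

    # Conceptual sketch of signed-firmware verification; not Apple's boot chain.
    # Requires the third-party "cryptography" package.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()   # held only by the vendor
    verify_key = signing_key.public_key()        # baked into every device's boot ROM

    firmware = b"...operating system image bytes..."
    signature = signing_key.sign(firmware)       # produced once, by the vendor

    def boot(image: bytes, sig: bytes) -> None:
        try:
            verify_key.verify(sig, image)        # raises if the signature doesn't match
        except InvalidSignature:
            raise SystemExit("refusing to run unsigned code")
        print("booting signed image")

    boot(firmware, signature)

Without the secret signing key, the FBI could build a modified image but could never produce a signature that the hard-coded verification key would accept.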

Would it be easy for Apple to write and test the requested cracking software?

Despite some claims to the contrary, no. The court document argues that because Apple is in the business of writing software, it must have the ability to produce this cracking software. However, writing any secure software is hard, and it is especially hard for low-level operating system software that interacts closely with hardware. Apple would also want to apply rigorous testing to any code it signs. This is a little bit like asking General Motors to build a new truck with a fifth wheel by next month. Of course they could theoretically do this with enough effort; they're in the business of building vehicles. But it would be expensive and time-consuming for them to be sure the new truck is safe.

The bottom line is, implementing any new security feature is not trivial and should not be treated as such. Apple is right to resist being asked to modify the security-critical portions of its software at the government's request. Building backdoors has been a security nightmare in the past.

Could this new cracking software be abused to unlock other phones in the future?

From a technical standpoint, it would be possible to craft the cracking software to run only on Farook's phone by checking its hardware ID, and the court order is actually quite specific that this is all that is being requested. Many commentators have claimed that binding the software to a specific phone is "easy," but again, writing secure software is never easy. For example, it might not be simple for the cracking software to determine for certain which phone it is running on. One risk is that the software attempts to run only on Farook's phone, but someone finds a way to modify other phones so that they fool the cracking software into running on them as well, turning a device-specific tool into a master key.
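
In its simplest form, such a check might look like the hypothetical sketch below (the identifier and function names are invented). The weakness is that the check is only as trustworthy as whatever reports the hardware ID to it:

    # Hypothetical sketch of binding cracking software to a single device.
    # The identifier is a placeholder; the point is that the software can only
    # trust what the hardware (or firmware) *reports* as its ID.
    TARGET_DEVICE_ID = "placeholder-id-of-the-target-phone"

    def should_run(reported_device_id: str) -> bool:
        # If other hardware or modified firmware can be made to report this ID,
        # the "single-device" tool starts to behave like a master key.
        return reported_device_id == TARGET_DEVICE_ID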

As Apple wrote to its customers, "while the government may argue that its use would be limited to this case, there is no way to guarantee such control." The government is telling Apple to try to limit the scope of the cracking software, but this is a burden Apple would reasonably like to avoid.

Would it be easy for Apple to sign the requested cracking software?

The answer any trained security engineer will give you is "it shouldn't be." It's important to realize that if Apple's iOS signing key were ever leaked, or used to sign a malicious piece of code, it would undermine the secure boot loading sequence of the entire iOS platform. Apple has worked very hard to try to limit its devices to run only Apple-signed firmware and OS code. There are pros and cons to this approach, but Apple considers this signing key among the crown jewels of the entire company. There is no good revocation strategy if this key is leaked, since its corresponding verification key is hard-coded into hundreds of millions of devices around the world.

While we don't know what internal security measures Apple takes with its signing key, we should hope they are very strict. Apple would not want to store it on Internet-connected computers, nor allow a small group of employees to abscond with it or secretly use it on their own. It is most likely stored in a secure hardware module in a physical vault (or possibly split across several vaults) and requires several high-level Apple personnel to unlock the key and sign a new code release. A rough comparison for the complexity involved in making high-assurance digital signatures is the DNSSEC Root KSK signing ceremony (for which video is available online), a complicated procedure involving dozens of people.
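
One simple way to force several custodians to cooperate is to split a key into shares so that all of them are required to reconstruct it. The sketch below shows the idea in its most basic form; it is not Apple's process, and real signing ceremonies rely on hardware security modules and far more elaborate threshold schemes:

    # Illustrative only: splitting a key so that ALL shares are needed to rebuild it.
    import secrets
    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, custodians: int) -> list:
        shares = [secrets.token_bytes(len(key)) for _ in range(custodians - 1)]
        shares.append(reduce(xor_bytes, shares, key))  # last share completes the XOR
        return shares

    def recombine(shares: list) -> bytes:
        return reduce(xor_bytes, shares)

    key = secrets.token_bytes(32)
    shares = split_key(key, 5)
    assert recombine(shares) == key      # all five custodians together recover the key
    assert recombine(shares[:4]) != key  # any four alone learn nothing useful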

Whatever Apple's process is, it's not something the company wants to undertake frequently; infrequent signing is what allows the process to be deliberately slow and costly. If the government begins routinely demanding new phone-specific cracking software, it could overwhelm the security of this process by requiring signatures far more often. This is another valid reason why Apple is right to fight this order.

Would writing this software make it easier to comply with requests in the future?

It would get easier. Much of the cost and difficulty of implementing this cracking software would not need to be repeated in the next case: assuming Apple developed a working version, it would simply be a matter of changing which phone ID the software was bound to and re-signing it. However, the difficult signing process would need to be repeated every time, and as Apple releases new devices, this code would need updating and maintenance (though as pointed out below, it will be less effective on newer phones).

Practically, writing this code would probably encourage more government requests (potentially from other governments around the world). Furthermore, in the US, once this code exists, it could be argued that the burden of complying with such requests is lower. This could matter legally because courts have interpreted the All Writs Act's "necessary or appropriate" limitation to require that assistance not be "unduly burdensome."

Could the FBI get this data some other way?

It's impossible to rule this out. There are many potential avenues for a security compromise in Apple's secure boot-loading process that could be used to evade passcode rate-limiting. Indeed, the hobbyist community has worked for years on ways to "jailbreak" iPhones, which enables them to run unsigned code. The FBI could also attempt to physically extract the encryption keys from the hardware, something government agencies have studied how to do in other contexts, though this is a messy process with some risk of destroying the keys before they can be read.

So it's possible there are other ways. This point is potentially relevant legally, as cases interpreting the All Writs Act require "an absence of alternative remedies." However, we lack firm evidence that the FBI has such a capability. Apple also probably doesn't want to argue that its phone is insecure so the authorities should just break into it some other way.

It might seem the FBI wouldn't go through such a public battle if they had a secret way to get the data. But even if they did, it's likely they wanted to take this case to court anyway: the high-profile nature of the San Bernardino shootings makes this a sympathetic case in which to create a legal precedent in their favor.

Has Apple really complied with similar orders in the past?

It's been stated that Apple has unlocked phones for the authorities 70 times in the past. However, this was a very different proposition, because those older phones did not use disk encryption, and Apple already had a software version to bypass the unlock screen (used, for example, in Apple stores to unlock phones when customers had forgotten their passcode). So this past history may be irrelevant to the case at hand: there is a real difference between writing new security-critical software and re-using software that already existed. Apple is not refusing to do something it has done before; the kind of assistance it provided in the past would not be relevant here.

Furthermore, even if Apple had written custom cracking software in the past, it might reasonably no longer want to do so, given the risks of writing, testing, and signing such software outlined above.

Would this case be different with a modern iPhone?

Maybe, but we don't know for sure how much. It appears from Apple's security whitepaper that the iPhone's encryption uses an iterated key-derivation function that requires the phone to take a minimum of 80 ms to check a passcode guess, regardless of what software is running; the calculation is designed to be inherently slow, so this floor cannot be removed by a software update. Apple says that passcode strength determines how long a guessing attack would take, and for long passcodes that delay alone might be enough to make guessing infeasible even with special cracking software. For example, if the passcode contained eight random lowercase letters, trying every combination at this rate would take more than 500 years.
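
The arithmetic behind that estimate (and the corresponding figures for simple numeric PINs) is straightforward, assuming the 80 ms minimum per guess:

    # Back-of-the-envelope guessing times at the hardware-enforced
    # minimum of 80 ms per passcode attempt.
    SECONDS_PER_GUESS = 0.080
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    for label, keyspace in [
        ("4-digit PIN", 10**4),
        ("6-digit PIN", 10**6),
        ("8 random lowercase letters", 26**8),
    ]:
        total_seconds = keyspace * SECONDS_PER_GUESS
        if total_seconds < SECONDS_PER_YEAR:
            print(f"{label}: {total_seconds / 3600:.1f} hours to try every code")
        else:
            print(f"{label}: {total_seconds / SECONDS_PER_YEAR:.0f} years to try every code")

A 4-digit PIN falls in minutes and a 6-digit PIN in about a day, while eight random letters push a full search past five centuries.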

Beginning with the iPhone 5s in late 2013 (though not the 5c), iPhones ship with a "secure enclave," a cryptographic coprocessor which stores keys in tamper-resistant hardware. The secure enclave imposes much longer timeouts of its own on passcode guessing attempts; after nine incorrect attempts, it permits only one guess per hour, a further 45,000 times slower than the 80 ms delay mentioned above. (Older devices that lack the secure enclave also apply a similar delay, but apparently do so from within the operating system.) It's possible, though not completely clear from Apple's documentation, that a software update would be able to modify this behavior; if not, this would completely prevent Apple from enabling faster passcode guessing on newer devices.
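
For a sense of how much the enclave's own delays slow an attack, here is a toy model of an escalating retry schedule. Only the "one guess per hour after nine failures" tier is taken from the paragraph above; the earlier tiers are illustrative, not Apple's exact values:

    # Toy model of an escalating passcode-retry delay enforced by secure hardware.
    # Only the final tier (one guess per hour after nine failures) comes from the
    # text above; the earlier tiers are made up for illustration.
    def retry_delay_seconds(failed_attempts: int) -> int:
        if failed_attempts < 5:
            return 0
        if failed_attempts < 7:
            return 60          # illustrative: one minute
        if failed_attempts < 9:
            return 15 * 60     # illustrative: fifteen minutes
        return 60 * 60         # documented above: one guess per hour

    # At one guess per hour, even a full search of 6-digit PINs takes over a century.
    print(10**6 / 24 / 365, "years to try every 6-digit PIN")

At that rate the software-imposed delays become almost irrelevant; the open question is whether Apple-signed code can change the enclave's behavior at all.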

There has been further speculation that the secure enclave could refuse to accept any software update (even one signed by Apple) unless the phone is first unlocked by entering the passcode, or alternatively, that a software update applied to a locked phone would erase all of the cryptographic keys, effectively making the phone's storage unreadable. Apple's security documentation makes no such guarantees, though, and there are indications that this isn't the case. So special cracking tools from Apple could potentially still modify the secure enclave's behavior to remove both the 10-guess limit and the delays between guesses.

It's important to realize, though, that even if today's secure enclave doesn't erase its keys when a software update is applied to a locked device, this could change in future devices. It's possible that someday Apple or another company will ship devices whose secure hardware controls the device encryption keys in a way that cannot be overridden even by signed software. In that future world, there really would be no way for Apple to assist in an investigation like this.

Summary

EFF supports Apple's stand against creating special software to crack its own devices. As the FBI's motion concedes, the All Writs Act requires that the technical assistance requested not be "unduly burdensome," but as outlined above, creating this software would indeed be burdensome, risky, and contrary to modern security engineering practice.