Apple has been served with a court order at the FBI’s request, demanding that it assist the government agency with unlocking an iPhone 5C that was used by Syed Rizwan Farook. Farook and his wife, Tashfeen Malik, killed 14 and injured 24 in an attack in San Bernardino, California on December 2, 2015.
In response, Apple CEO Tim Cook said that the FBI was demanding the equivalent of a backdoor and that complying with the FBI’s demand would undermine the security of all iPhones.
Whether you call it a “backdoor” or not, it’s important to recognize that the ordered changes to the iPhone operating system would not circumvent the core of the iPhone’s encryption. The court isn’t asking Apple to defeat the encryption in any way. Nor does the court require Apple to create a vulnerability that would jeopardize the security of any other phone. Rather, it’s asking Apple to do the one thing that Apple alone can do: use the iPhone’s built-in method of installing firmware written by Apple.
The FBI wants to search the iPhone 5C and has been granted permission to do so by the device’s owner, the San Bernardino County Department of Public Health (Farook’s employer). To perform this search, the FBI needs the device’s PIN. Without it, the government has no way of decrypting the iPhone’s storage and hence no way of examining any data stored on the device.
The encryption used by the iPhone to protect its storage is a multi-tiered system. At its core are two keys, one embedded in the hardware and the second derived from the PIN. The hardware key is used to generate a file system key that is in turn used to encrypt the file system metadata. That metadata includes an encryption key for each individual file. That per-file key is encrypted using (indirectly) an encryption key that is derived from a combination of the hardware key and the PIN key. As such, without the PIN key, it’s impossible to decrypt those per-file keys and hence impossible to decrypt files stored on the iPhone.
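The layered design described above can be sketched in a few lines. This is a hypothetical illustration, not Apple's actual algorithms or key sizes: PBKDF2 stands in for the device's slow, hardware-entangled PIN derivation, and a toy XOR "wrap" stands in for AES key wrapping.

```python
import hashlib
import hmac
import os

# Illustrative stand-ins for the two root keys described in the article
hardware_key = os.urandom(32)   # stands in for the key embedded in hardware

# PIN key: on the real device this derivation is entangled with the hardware
# key and deliberately slow; PBKDF2 here is purely illustrative.
pin_key = hashlib.pbkdf2_hmac("sha256", b"1234", hardware_key, 1_000)

# A key combining the hardware key and PIN key protects the file system
# metadata, which in turn holds the per-file encryption keys.
unwrap_key = hmac.new(hardware_key, pin_key, hashlib.sha256).digest()

def toy_unwrap(wrapped: bytes, key: bytes) -> bytes:
    # Toy XOR "key wrap" so the sketch stays self-contained; the real design
    # uses AES. XOR is its own inverse, so this both wraps and unwraps.
    stream = hashlib.sha256(key).digest()
    return bytes(a ^ b for a, b in zip(wrapped, stream))

per_file_key = os.urandom(32)
wrapped = toy_unwrap(per_file_key, unwrap_key)

# With the correct PIN key, the per-file key is recoverable; without it,
# there is no unwrap_key and the file contents stay inaccessible.
assert toy_unwrap(wrapped, unwrap_key) == per_file_key
```

The point of the structure is that every file ultimately depends on both root keys, so possessing the hardware alone (or the PIN alone) is not enough.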
There are minor nuances in where and how the hardware key is stored and where the different encryption operations are performed, and these vary between iPhone models, but the broad design holds for every iPhone running iOS 9.
It’s important to note here that the cryptography aspect is robust. The FBI is not asking for, and Apple almost surely could not provide, any kind of bypass or backdoor for the cryptographic parts of the system. There is no “master key” that can decrypt the files or otherwise break the dependence on the PIN key. The cryptography appears to be secure.
In practice, encryption isn’t usually defeated by cryptographic attacks anyway. Instead, it’s defeated by attacking something around the encryption: exploiting humans’ preference for weak passwords, tricking people into entering their passwords and then stealing them, and so on. Accordingly, the FBI is asking for Apple’s assistance with the scheme’s weak spot: not the encryption itself, but the Apple-coded limits on the PIN input system.
PINs, especially four-digit PINs, are highly susceptible to brute-force attacks. With four digits and hence only 10,000 possible combinations, it’s straightforward to simply try every number in sequence until you hit the right one. To combat this, the iPhone uses three specific techniques.
The first is that the iPhone imposes delays between PIN attempts. While the first four attempts can be entered back-to-back, the iPhone will force you to wait one minute before the fifth attempt, five minutes before the sixth, 15 minutes before the seventh and eighth, and a full hour before the ninth.
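Adding up the schedule above shows how effective the delays alone are. A quick tally (figures taken from the schedule in the text):

```python
# Lockout delays (in minutes) imposed before each of the first nine attempts,
# per the schedule above: four free tries, then escalating waits.
delays = [0, 0, 0, 0, 1, 5, 15, 15, 60]
print(sum(delays))  # 96 minutes of forced waiting across nine attempts
```

Over an hour and a half of waiting buys just nine guesses out of 10,000 possibilities, which makes an interactive brute-force attack hopeless.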
The second technique is that the iPhone can be configured to wipe the device after ten failed PIN attempts. When this option is turned on, the phone will discard its file system key after ten bad PINs, rendering all the file system metadata (including the per-file keys) permanently inaccessible.
The third and final technique is that the computation used to derive the PIN key from the PIN itself is slow, taking approximately 80 milliseconds.
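Even with the first two protections stripped away, that 80 millisecond derivation puts a floor under how long an exhaustive search takes. The arithmetic, using the figures from the text:

```python
# Even with the lockouts removed, the ~80 ms key derivation bounds how fast
# a brute-force search of the four-digit PIN space can go.
attempts = 10_000            # every four-digit PIN
seconds_per_attempt = 0.080  # ~80 ms per derivation
total_minutes = attempts * seconds_per_attempt / 60
print(round(total_minutes, 1))  # 13.3
```

Roughly a quarter of an hour to try every four-digit PIN: trivial once the delays and wipe are gone, which is exactly why those two software measures matter so much.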
It’s the first two of these mechanisms that the FBI is asking for assistance with. While the 80 millisecond delay is in some sense unavoidable (a faster system might perform the key derivation more quickly, but it’s not as if the iPhone hardware is readily upgradeable), both the escalating delays and the device-wiping functionality are arbitrary software decisions. The FBI is asking Apple to create custom iPhone firmware that removes the escalating delays and omits the device wipe. The FBI is additionally asking for a way to enter PINs other than typing them one after another on the touchscreen. In short, the FBI wants Apple to make a special version of iOS that is amenable to brute-force attacks on its PIN.
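The search the custom firmware would permit is conceptually simple: with the delays and wipe removed, just re-derive a key from every candidate PIN until one matches. A hypothetical sketch, where the salt and PBKDF2 parameters are illustrative stand-ins for the device-bound derivation:

```python
import hashlib

# Illustrative stand-in for the device-unique material mixed into the
# real PIN-key derivation (not Apple's actual scheme).
salt = b"device-unique-salt"

def derive(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

target = derive("0042")  # what the phone can check each guess against

# With no lockouts or wipe in the way, enumerate all 10,000 PINs in order
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if derive(p) == target)
print(found)  # 0042
```

The loop itself is trivial; everything that makes this attack impractical on a stock iPhone lives in the software the FBI wants removed.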
As long as the phone uses a PIN, this would ultimately let the FBI unlock it. If it’s locked with a secure password, unlocking the phone may well prove intractable even with the special firmware.
Such a firmware would not seem to be generally useful for attacking other iPhones, though. The FBI’s request is that the special firmware be tied to the specific device. Every iPhone contains a multitude of unique identifiers that are baked into its hardware (the serial number, the cellular radio IMEI, and the Wi-Fi and Bluetooth MAC), and the court order explicitly states that the custom firmware must be tied to the San Bernardino phone’s unique identifier, such that it can only run on that specific phone.
Assuming that this can be done (and done robustly), it means that even if the custom firmware were given to nation-states or even published on the Internet, it would not serve as a general-purpose way of performing brute-force PIN attacks. It would be useless on any device other than the San Bernardino device. To make such leakage less likely, the court order does allow for the possibility that the custom firmware might be used only at an Apple location, with the FBI having remote access to the passcode recovery system.
Such an approach is consistent with the way Apple already performs lock screen bypasses on devices running old versions of iOS; law enforcement sends the device to Apple, Apple does the data extraction using tools the company has explicitly created to perform the extraction, and law enforcement receives a FireWire or USB drive with the data. Apple’s custom tools never leave Cupertino.
Hypothetically, if the special firmware were to leak, what exactly would prevent people from making it work with a different unique identifier, or even with any unique identifier whatsoever? This concern strikes at the heart of the matter, and it’s why Apple is involved at all.
The FBI does not really need Apple to write a custom firmware that lets you brute force the iPhone PIN without risk of wiping the device or suffering lengthy timeouts. It’s much easier for Apple to write this code, of course, because Apple knows all about the iPhone, but there’s no doubt that the FBI could pay some enterprising reverse engineers and hackers to develop the software itself. The problem for the FBI is not so much the development of the software; it is getting that software to run on the iPhone.
The iPhone requires that its firmware have a digital signature that authentically demonstrates that the firmware was developed by Apple and has not been subsequently modified. The FBI does not have (and is not asking for) access to Apple’s signing key. It is instead asking for Apple to use its signing key to sign the custom firmware so that the iPhone will accept it and run it. It is this signature requirement that means the FBI cannot create the software itself.
It’s this same requirement that also means that iPhone users would be safe even if the special firmware leaked. Changing the embedded unique identifier within the special firmware would break the signature and thus cause targeted iPhones to reject the firmware. This is why complying with the court demand would not jeopardize the security of any other phones. The cryptographic safeguards don’t allow it.
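The property at work can be sketched briefly. Apple actually uses public-key signatures; HMAC with a secret key stands in here purely to show the mechanism: the signature covers both the firmware image and the device identifier it is locked to, so changing either one invalidates it.

```python
import hashlib
import hmac
import os

signing_key = os.urandom(32)   # stands in for Apple's private signing key

def sign(firmware: bytes, device_id: bytes) -> bytes:
    # The signature covers the firmware image AND the identifier of the
    # one device it is allowed to run on.
    msg = firmware + b"|" + device_id
    return hmac.new(signing_key, msg, hashlib.sha256).digest()

def device_accepts(firmware: bytes, device_id: bytes, sig: bytes) -> bool:
    # Each device checks the signature against its own identifier
    return hmac.compare_digest(sign(firmware, device_id), sig)

firmware = b"custom unlock build"
sig = sign(firmware, b"TARGET-PHONE-ID")

assert device_accepts(firmware, b"TARGET-PHONE-ID", sig)       # target phone
assert not device_accepts(firmware, b"SOME-OTHER-PHONE", sig)  # any other phone
```

Swapping in a different identifier produces a message that no longer matches the signature, and without Apple's signing key, no one can produce a fresh signature for the modified image.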
The security of these digital signatures is being taken for granted by the FBI; once again, the cryptography underpinning the system is sound, and the government is not asking for it to be bypassed or backdoored or otherwise attacked.
The FBI’s request does, however, put into sharp relief the parts that aren’t sound. The PIN lockouts and device-wiping measures are all “just software.” They’re not dependent on any particular mathematical feature of the algorithms, nor are they backed by years of analysis of the underlying mathematics. And because they’re “just software,” Apple has every ability to override them.
One could imagine ways in which iPhones could be made a little more resilient against this kind of thing, but they’re not straightforward. The court order suggests the use of the iPhone’s “DFU” mode. This is an extremely low-level mode designed for last-ditch recovery of the device. In this mode, the screen is not even activated or enabled; the phone has to be connected to a computer via USB to transfer a new firmware image. One could imagine ways in which even this mode could be PIN protected, perhaps even made to destroy the file system key if a correct PIN is not supplied, but this is tricky. One of the points of DFU mode is its simplicity: it does one thing as a fail-safe emergency measure. Making it more complex would jeopardize its ability to serve its fundamental purpose.
Overall, the FBI’s request could be seen as a testament to just how good encryption is. The FBI can’t attack the iPhone’s encryption directly, and it can’t bypass the firmware signature mechanism. There’s no existing backdoor to the crypto.
But what the iPhone does have is software lockouts, and the security of those lockouts is entirely up to Apple. Apple’s signing key gives the company wide power over the software-level protections built into iOS. The FBI knows this, and that is why it’s demanding the company’s assistance.
SOURCE: Peter Bright | Ars Technica