We’ve talked about encryption here before, and since this is such an interesting story, with so many broad implications, I thought it would be good to give a rundown of the actual cryptography at play here. If you are curious about the legal and ethical implications, well, maybe you can find a blog on legal blogs somewhere, or you can read this quick explainer of the legal pieces of the puzzle.
All of Apple’s security schemes are described in tediously technical detail in the iOS Security Guide, and what we can get from that is that the iPhone’s security system is multi-tiered and really complicated, but there are two important numbers to keep in mind: the UID, a 256-bit key unique to your device and baked into its hardware, and your PIN.
The first time you boot up and enter your PIN, your phone will use these two numbers to generate a third number called the passcode key — this process takes about 80 milliseconds and is personalized to your phone. The UID and the precise recipe for tangling up the UID and PIN are both stored in the so-called secure enclave of your phone’s hardware. On his cryptographic engineering blog, Matthew Green covers this and more in a nice plain(ish) language primer on iPhone security, at least up until iOS 8.
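The flavor of this derivation can be sketched with a standard slow key-derivation function. To be clear, this is a toy stand-in, not Apple’s actual recipe: the real function runs inside the secure enclave and is entangled with the hardware, but the idea of deliberately slow mixing of PIN and UID looks roughly like this:

```python
import hashlib

def derive_passcode_key(pin: str, uid: bytes) -> bytes:
    """Toy stand-in for the passcode-key derivation.

    The real recipe is hardware-entangled inside the secure enclave;
    here we just mix the PIN and UID with PBKDF2. The iteration count
    is what makes each derivation deliberately slow (Apple tunes the
    real thing to take roughly 80 ms on-device).
    """
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), uid,
                               200_000, dklen=32)

uid = bytes(32)  # stand-in for the device-unique 256-bit UID
key = derive_passcode_key("1234", uid)
print(len(key) * 8)  # 256-bit passcode key
```

The slow derivation matters because it puts a floor under how fast anyone can try PIN guesses, even with the phone in hand.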
Now, we can definitely give a hat-tip to Apple and say that cryptographically, this sucker is pretty airtight. There’s no way to open up the enclave or find some brute-force method to extract the 256-bit UID encryption key; the keyspace is just way too large, and it would take forever. Strictly speaking, there’s also no way to extract the PIN, since it’s in the secure enclave of your brain. But the fortress of the iPhone is not impregnable, for one reason: there are only finitely many PIN choices, in fact just 10,000 if we restrict to four-digit numbers. So if you can guess the PIN, the rest of the system falls into place for you. But that, as they say, is where they get ya.
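The back-of-envelope arithmetic makes the point: 10,000 four-digit PINs at roughly 80 ms per derivation is not much of an obstacle at all, provided nothing stops you from guessing:

```python
# Exhaustive search over all four-digit PINs, assuming only the
# ~80 ms per-attempt derivation cost stands in the way.
pins = 10 ** 4            # 0000 through 9999
seconds_per_try = 0.08    # ~80 ms per key derivation
worst_case = pins * seconds_per_try
print(f"worst case: {worst_case:.0f} s (~{worst_case / 60:.1f} min)")
# worst case: 800 s (~13.3 min)
```

In other words, the math alone buys you minutes of safety, not years; the real protection has to come from somewhere else.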
Apple is so clever that they have built a second barrier of security: you have to wait some number of minutes between failed PIN attempts, and you can only try 10 different PINs before you get locked out and the data on the phone gets wiped forever. This “some number of minutes” amounts to several days once you’ve made more than 4 attempts. So what the FBI is really asking is for Apple to write a tool to disable these lockout features. This seems like a feasible project, but because the phone will only run software cryptographically signed by Apple, such a program can only be written in-house at Apple.
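That lockout policy is easy to sketch in code. The delay schedule below is illustrative, not Apple’s exact table, but it captures the shape: a few free tries, then escalating forced waits, then a wipe at attempt 10:

```python
# Illustrative lockout policy: forced delay (in seconds) after the
# Nth failed attempt, and a data wipe at the 10th. The actual delay
# values on a real iPhone differ; this is just the shape of the idea.
DELAY_AFTER_ATTEMPT = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}
WIPE_AT = 10

def check_pin(guess: str, real_pin: str, attempts: int):
    """Return (status, new_attempt_count) for one PIN guess."""
    attempts += 1
    if guess == real_pin:
        return "unlocked", attempts
    if attempts >= WIPE_AT:
        return "wiped", attempts   # game over: data gone forever
    delay = DELAY_AFTER_ATTEMPT.get(attempts, 0)
    return f"locked for {delay}s", attempts

status, tries = check_pin("0000", "1234", 4)
print(status)  # fifth wrong guess: locked for 60s
```

A tool of the kind the FBI requested would, in effect, zero out `DELAY_AFTER_ATTEMPT` and remove the `WIPE_AT` check, leaving only the 80 ms derivation cost between a guesser and the data.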
What’s not totally obvious to me is how dangerous it would be to bring a tool like this into existence. Yes, in some sense it does open up a backdoor to the iPhone security system, but in another sense it may be ad hoc to the point of being useless outside the context of this one particular iPhone in question. I suppose the larger question is whether it’s appropriate for the FBI to make these sorts of demands on industry. A recent Pew Research poll shows that 51% of Americans say yes. Suffice it to say, it’s a complicated issue and a worthwhile reminder that mathematical cryptography is only as strong as the humans who hold the keys.