The Semantics of "Backdoor"
There’s some confusion around Tim Cook’s February 16th letter to customers. Here are the salient parts:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
And:
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
This has caused some amount of controversy and misunderstanding. As soon as the letter was posted, at least one astute developer noticed that Tim’s chief complaint — that a backdoor created to access the San Bernardino phone could be reused to access other phones — was factually incorrect.
Even before the letter, Dan Guido explained how it is technically feasible for Apple to comply with the FBI request without jeopardizing the security of all iOS users. The short version is this: Apple can create a version of iOS which can be brute-forced, authorize it for installation on only a single device, and install it just there, giving the FBI exactly what they want.
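To see why "a version of iOS which can be brute-forced" is all the FBI needs, consider what the passcode protections actually buy. A 4-digit passcode has only 10,000 possibilities; what keeps it safe is the escalating retry delay and the optional ten-attempt wipe, not the key space. Here is a toy sketch of that arithmetic — the `unlock_attempt` function is a hypothetical stand-in for the device's passcode check, not Apple's actual implementation:

```python
import itertools

def unlock_attempt(guess, secret):
    # Hypothetical stand-in for the device's passcode check. On a real
    # iPhone this check is gated by escalating delays and an optional
    # wipe after ten failures -- the protections the FBI wants removed.
    return guess == secret

def brute_force(secret):
    # With those protections patched out of the OS, a 4-digit passcode
    # falls to exhaustive search in at most 10,000 attempts.
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if unlock_attempt(guess, secret):
            return guess
    return None

print(brute_force("4519"))  # → 4519, a hypothetical passcode
```

The custom OS the FBI requested does exactly this in spirit: it leaves the cryptography alone and simply removes the rate limiting that makes exhaustive search impractical.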
But, Tim says they can’t.
So – is Tim lying? Can they or can’t they create a backdoor that accesses just a single device? Well, it’s more nuanced than that.
I believe that Apple’s view of what constitutes a backdoor is different from the FBI’s. The FBI’s request is strictly technical, but when Tim speaks of backdoors, he’s speaking of the process of a government requiring Apple to create an insecure OS and authorize it for installation on a single device, not the technical method by which that OS’s security is compromised. When you view things through this lens, the letter is entirely truthful.
If that process were established – one that requires Apple to create and authorize new, insecure versions of iOS – it would set a precedent that compromises the security of all users, which is true to Tim’s word.
“But that’s not what Tim said!”
It’s confusing, especially for technical people who understand iOS well, because Tim’s letter uses technical language and speaks obliquely about modern cryptography, where a backdoor to one encrypted message would indeed mean a backdoor to all encrypted messages. Specifically, here:
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.
“This tool” is not an intentionally insecure version of iOS. It is the process of creating such a tool.
“But Apple has done this for law enforcement agencies before!”
Yes, there are documented cases of Apple bypassing security on pre-iOS 8 devices to assist law enforcement agencies. The difference is that on pre-iOS 8 devices, not all user data was encrypted by default, and Apple already had the tools to extract it. The creation of a custom, insecure OS to be installed on the device was not necessary.[1]
“But what about the Secure Enclave / Secure Element, and the fact that this is an iPhone 5c without those features?”
None of that really matters, because as long as Apple maintains the ability to modify the OS on a locked phone[2], it’s possible for them to change it in a way which compromises user data. Apple has chosen to draw a line in the sand here, at “we will not craft a malicious or knowingly insecure OS,” because they believe that is what constitutes a backdoor.
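Apple’s ability to modify the OS on a locked phone rests on its firmware-signing process, and that same process is what makes per-device authorization possible: the signature can cover both the OS image and the device’s unique chip ID, so a ticket issued for one phone is useless on any other. The sketch below is a deliberately simplified illustration of that idea using an HMAC — the key, function names, and ECID values are all hypothetical, not Apple’s actual signing protocol:

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's private signing key.
SIGNING_KEY = b"apple-signing-key"

def authorize_firmware(firmware_hash, ecid):
    # The "ticket" covers both the OS image and the device's unique
    # chip ID (ECID), binding the authorization to one phone.
    return hmac.new(SIGNING_KEY, firmware_hash + ecid.encode(), hashlib.sha256).digest()

def boot_check(firmware_hash, ticket, device_ecid):
    # The device recomputes the expected ticket for ITS OWN ECID and
    # refuses to boot firmware whose ticket does not match.
    expected = hmac.new(SIGNING_KEY, firmware_hash + device_ecid.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(ticket, expected)

fw = hashlib.sha256(b"insecure-ios-build").digest()
ticket = authorize_firmware(fw, "0x1A2B3C")   # issued for one device
print(boot_check(fw, ticket, "0x1A2B3C"))      # True: the target phone
print(boot_check(fw, ticket, "0xFFFFFF"))      # False: any other phone
```

This is why, technically, an insecure build need not endanger every iPhone — and why Tim’s objection is to the precedent of being compelled to issue such tickets at all, not to the mechanics of issuing one.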
For that, they deserve a standing ovation.
-
[1] For more info, see the EFF comments here, The Daily Dot’s coverage here, and Motherboard’s coverage here ↩
-
[2] I wouldn’t at all be surprised to see iOS 10 not support locked device restores / DFU mode, simply to further prevent anyone, even themselves, from being able to compromise user devices. ↩