The FBI has the iPhone of the San Bernardino shooters, and would very much like to examine its contents. But they have a problem: the contents are encrypted; guess the wrong password ten times, and the phone will self-destruct like one of those tapes in Mission Impossible (that’s not a technically correct analogy, of course: the data would end up permanently encrypted, and there would be no smoke). The FBI thus got a judge, using the authority of the 1789 (sic!) All Writs Act (more on this another day; for some initial analysis, see here), to order Apple to disable the auto-destruct, which would allow the FBI to fire up its biggest computers to try to guess the password by brute force. In what is likely to be the beginning of a very long legal fight, Apple refused, arguing that there would be no way to open this individual phone without creating a back door that would enable access to all phones of that type.
The FBI insisted: no, they only wanted that one phone, which has a unique identifier. It’s been a long time since I did any computer programming, but to me Apple’s argument makes sense and the FBI’s doesn’t: you could write a program that disables the auto-destruct only when it encounters a specific serial number, but the technique for disabling the auto-destruct would have to exist in a generalizable form first – one that could be discovered or exploited by others (including the FBI, foreign intelligence agencies, and so on). In other words, the disabling technique would have to be coded first, and only then could the program be written to look for the one serial number. And if it can operate with one serial number, it can operate with any other, creating a de facto backdoor. Ben Thompson at Stratechery explains:
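The point can be made concrete with a toy sketch – hypothetical pseudologic, emphatically not Apple’s actual firmware, with an invented identifier for illustration. The “one phone only” restriction reduces to a single comparison sitting in front of a fully general capability:

```python
# Toy illustration (hypothetical; not Apple's firmware logic): a custom
# OS build that lifts the ten-guess limit "only" for one device must
# still contain the general lift-the-limit code path. The device check
# is just a one-line gate in front of it.

TARGET_ID = "EXAMPLE-DEVICE-ID"  # hypothetical unique identifier


def enforce_retry_limit(device_id: str) -> bool:
    """Return True if the ten-guess auto-erase limit should apply."""
    if device_id == TARGET_ID:
        return False  # limit disabled for "just this one phone"
    return True


# The disabling path exists for every device; only the comparison
# restricts it, and a comparison can be patched or re-targeted at any
# identifier by whoever obtains (or steals) the signed build.
print(enforce_retry_limit("EXAMPLE-DEVICE-ID"))   # False
print(enforce_retry_limit("SOME-OTHER-DEVICE"))   # True
```

The gate, in other words, is the cheap part; the dangerous part – the general disable capability – has to be written either way.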
“The judge has ordered Apple to build a custom version of the operating system using their signature that removes the 10-try limitation and the artificial delay between passcode entries (and adds a way to enter guesses via an external device, as opposed to having someone enter passcode guesses manually); this would allow the FBI to bruteforce the passcode and potentially gain access to the device. I say “potentially” because even with the five-second software limitation removed the 5C’s hardware needs 80-milliseconds to process each request, and we don’t know how long the terrorist’s passcode is: a 4-digit numeric passcode would only take 34 minutes to brute force, while an 8 digit alphanumeric passcode would still take over a million years.”
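Thompson’s arithmetic is easy to check, at least to order of magnitude, under the one figure the quote supplies (80 milliseconds of hardware time per attempt). The exact numbers depend on assumptions the quote doesn’t spell out – worst case versus average case, and which character set counts as “alphanumeric” – so treat this as a sanity check, not a reproduction of his figures:

```python
# Worst-case brute-force time at 80 ms per attempt (the hardware delay
# cited in the quote above). Results vary with the assumed character
# set, so these are order-of-magnitude checks only.

SECONDS_PER_ATTEMPT = 0.080


def worst_case_seconds(charset_size: int, length: int) -> float:
    """Time to try every passcode of the given length and alphabet."""
    return (charset_size ** length) * SECONDS_PER_ATTEMPT


# 4-digit numeric PIN: 10**4 = 10,000 attempts
pin4 = worst_case_seconds(10, 4)
print(f"4-digit PIN:  {pin4 / 60:.1f} minutes")  # ~13 minutes worst case

# 8-character alphanumeric (a-z, A-Z, 0-9 = 62 symbols): 62**8 attempts
alnum8 = worst_case_seconds(62, 8)
print(f"8-char alnum: {alnum8 / (3600 * 24 * 365):,.0f} years")  # hundreds of thousands of years
```

Either way, the qualitative point survives any reasonable choice of assumptions: short numeric passcodes fall in minutes or hours, while long mixed-character passcodes are effectively uncrackable at hardware-limited speeds.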
So, Apple correctly understands itself to be fighting for the right not to make what amounts to a backdoor to its iPhones; those without strong passcodes would be highly vulnerable to being cracked by anyone with enough computing firepower (one might note in passing that the backdoor would therefore probably not help very much against terrorists, who would use strong passcodes, but would be very useful to criminals attempting to scam the elderly). From CEO Tim Cook’s open letter announcing the refusal:
“Once created, the technique [for accessing the shooters’ iPhone – GH] could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable. The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.”
There are a variety of reasons why Apple might do this, beyond Tim Cook’s love of privacy and the security of Apple’s customers. One is that, as Will Oremus points out on Slate, of the big three Silicon Valley companies (Apple, Google and Facebook), Apple is the only one whose business model is not premised on selling off or otherwise distributing customer data. So this is a very good opportunity for brand differentiation. The leaders of places like Russia and China will also be watching: if Apple can be compelled by the U.S. government to enable access to its phones, then presumably it can be compelled by those governments too, and China in particular represents a very, very big market. Indeed, the FBI itself has a very poor track record when it comes to respecting privacy and keeping its searches within the bounds of legal authorization, and it’s easy to argue that the fight over encryption is very useful in getting us all to forget the incredible growth of surveillance – one might even argue that we now live in a surveillance state.
I want to argue here that there is a philosophically interesting constitutional argument that Apple could make as well. Let’s start with an incomplete version: it’s commonplace to say that people put “their entire lives” on their iPhones. Without parsing the meaning of that phrase too closely, it seems clear that a warrant to search an iPhone comes pretty close to something the authors of the Fourth Amendment were very afraid of: general warrants, a colonial practice which essentially authorized the search of anything having to do with a person – no need to specify what was being searched, or what specific item(s) they were looking for. A tempting analogy for a cellphone search is to a house (it’s in Tim Cook’s statement, above, although that argument has seen mixed results in court): the government isn’t interested in the house per se; it’s interested in the information it presumes can be found there.
I think there is something more going on. People use their cellphones in a way strikingly similar to the examples with which Andy Clark and David Chalmers introduce and discuss their extended mind hypothesis. After all, one of the original examples was of Otto, an Alzheimer’s patient who carried around a notebook into which he jotted down information important to him; Clark and Chalmers argue that this notebook is so integral to his cognition that we should consider it as part of his mind. The leap to cellphones is not hard here, because we outsource a lot more to the cellphone than Otto did to his notebook. In Natural Born Cyborgs, Clark makes the case that we have always embedded our cognition at least partly in the environment, especially in technologies the use of which we have completely naturalized. One of his examples is a watch: if someone asks you if you know the time, first you say “yes,” and then you check your watch. Similarly with the phone: if someone asks you how to get to MoMA (this is the Otto example), you say yes, and then open up GPS. As Clark says, “we are our own best artifacts, and always have been” (194).
To doubters, he essentially throws down a gauntlet: psychological studies show that when people gesture while talking, the gesturing is doing some of the cognitive work. We know that not all of our cognition happens in our brains, in other words (and, of course, there’s always a risk of a homunculus problem if you go the other direction: where in our brains are our minds located?). But if that’s true, why stop at the borders of the body, either? Clark defends this proposition in later work, and elsewhere draws a sharp distinction between the argument that the mind relies upon external support systems and his argument that the mind is partly constituted by these support systems. He argues that the weaker hypothesis – the one that says the mind relies on external support systems – obscures most of what is interesting, and here I think there’s a good example of that. In the Renaissance, memory techniques included locating thoughts in the rooms of one’s house – the idea being that the association with the familiar room would trigger the less familiar memory. The weaker argument thus says the house is essentially external scaffolding, and that in turn implies that it isn’t part of the memory, and that its location isn’t important. In other words, it lets you externalize the house: the mind relies on memory places and triggers. But these places are repositories that can be searched, subject to some limitations, because they are (merely) tools the mind uses. The stronger argument – the one Clark defends – says that the house is part of the mind, whether or not the house is located in your brain.
If this is the case, then the demand that Apple program a backdoor into the iPhone is tantamount to the demand that people’s minds be made available for inspection at any time by any computer with enough firepower to guess a password or otherwise overpower the mind’s resistance (truth serum, anyone?). And that is not just a Fourth Amendment problem – it is a Fifth Amendment problem, particularly if you think (as the late William Stuntz did) that the Fourth and Fifth Amendments were concerned to make the prosecution of thought crimes impossible by making the evidence necessary to prosecute them unattainable.
The San Bernardino shooters are dead, of course, and so the encrypted iPhone sits like some sort of revenant, containing some of the contents of their otherwise inscrutable, no longer existing minds. Certainly these shooters deserve no sympathy: they massacred colleagues who had very recently thrown them a baby shower, in an act that left that baby with no parents. Perhaps the fact that they are dead is an argument that the government should make: the dead have no rights. But this is tricky territory, and it seems to me that for those who are still living, the FBI’s position, when generalized (as inevitably it will be), stands for the idea that government has the right to access our (extended) minds.
UPDATE (2/19): Two interesting pieces on Slate today. One argues that this particular phone almost certainly has no useful information on it – and so we can assume that the government is trying to set a precedent for accessing phones, not anything else. The other has excerpts from the DOJ reply, which accuses Apple of grandstanding. Also, Orin Kerr (a well-known privacy law scholar) points out the phone belongs to the employer, which has consented to a search – so there are no Fourth Amendment issues in this case (my argument above is about the precedent potentially being set here). He also reiterates that we are at the beginning of a potentially long, drawn-out legal process.