Apple is resisting the FBI's request that the company write software to help unlock the iPhone of Syed Rizwan Farook, the perpetrator, with Tashfeen Malik, of the massacre in San Bernardino, California, on December 2, 2015. Apple is said to worry that if it lets the FBI into Farook's phone, it will open a global can of worms, setting a precedent for doing the same thing for less "friendly" governments. And a "back door" to individual phone data would compromise overall security, leaving phones vulnerable, in Tim Cook's words, to "hackers and criminals who want to access it, steal it, and use it without our knowledge or permission."
Since the appearance of the Snowden documents, it’s hard for many of us, at least on the level of sentiment, to root for the US government wanting access to phone data. Though the case is complex (and Apple has unlocked phones for the FBI before), the surveillance state is a remarkably frightening prospect, and even the very targeted, essentially forensic, aims of the FBI in the San Bernardino case understandably evoke worries.
But Apple’s battle with the FBI brings to mind Bob Dylan’s quip that “you’re gonna have to serve somebody.” We face something like the classic high-school English class choice between Orwell’s “Big Brother” and Huxley’s “Brave New World.” If the FBI concerns us, Apple should, perhaps, concern us even more.
As Hannah Arendt makes clear in The Human Condition, privacy never stands alone: it always has its co-dependents, especially the public, the political, and the social. Changes in the meaning of "privacy" mean changes in the meaning of the "public," and the other way around. The private and the public are interlocking political concerns.
In other words, whenever you are faced with a debate about privacy, also ask what its potential outcomes would mean for public life.
Apple is doing more than making things difficult for the FBI. It is contesting the very meaning of "privacy" by taking an absolutist, technologist stance on phone data, and the stakes for public life are high.
Ironically, the key difference between this case and other cases where the FBI has asked Apple for help in unlocking phones is that this case has unfolded in public. Public knowledge of FBI access to one phone could leave open the possibility, in the minds of consumers, that any iPhone could be opened to government surveillance at any time, even though the probability of this happening would not really increase. A feeling of vulnerability among global consumers, Apple seems to believe, could compromise iPhone sales.
So Apple has constructed a holy cause of sorts: "privacy." This commitment to privacy, however, is partial and limited. Apple will not rush to court on your behalf if someone is caught spying through your window with an iPhone-connected drone. Nor is Apple invested in stronger consumer privacy laws. Instead, in appealing to the sacrosanct status of privacy, Apple is both drawing on and reaffirming a limited and problematic notion of privacy.
In his letter to customers, Cook asserts Apple's commitment to protecting the "personal data" of customers:
Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data. Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
What specifically is Apple promising to protect as "private"? Some of the data on a particular device. Its vigilance does not extend to the data that leaves your phone and moves through cables and into and out of Apple's servers, data that is stored remotely "in the cloud," or the data that is collected by the myriad companies you digitally contact. (As I have argued elsewhere on this blog, we need to rethink the relationship of "metadata" to privacy.) Apple, in other words, is not fighting to protect much in the way of your data; it is not even offering to protect what you store on its iCloud service.
It is, however, out to protect your sense of "privacy."
Apple's commitment to a strong sense of privacy, whatever the facts, makes the way the company is defining "privacy" more, not less, significant: for inasmuch as it is vying for your sense of privacy, it is vying for the meaning of privacy.
What can we say about this sense of privacy? First of all, whatever privacy is, in Apple's eyes it is primarily an engineering problem. Apple's privacy is an engineer's construct, even an engineer's conceit. Many everyday senses of privacy follow this very limited idea of "data on my device." Though I've entered vital data online numerous times, I would be more likely to feel a violation of privacy at an "unauthorized" family member thumbing through the pictures on my phone than at a stranger using my date of birth and Social Security number to secure fraudulent credit. There's something about Apple's sense of "personal data" that gels very well with our sense that the gadgets we carry with us are "personal devices" rather than nodes in a massive economic and technological system.
But what about privacy's co-dependents, especially the "public"? Apple's narrow and problematic sense of privacy, if Apple sticks to it and if it were made the rule among tech companies, could have major public consequences, reshaping our experience of public life. First of all, Apple is explicitly pitting a forensic good, a good having to do with public justice, against the protection of privacy, and it is doing so in an absolutist fashion that undermines the delicate balance between certain rights and justice so vital to public life (just as the NSA did, but in reverse fashion).
In the case of Syed Rizwan Farook’s iPhone, we are talking about a specific and targeted forensic investigation—exactly what critics of the NSA call for. It is quite plausible that the data on Farook's phone may be critical in helping to forensically reconstruct the networks (if any) of which Farook was a part. The knowledge that would come out of such an investigation may not end up preventing another similar attack. Nevertheless, it represents an immediate public good both with respect to our sense of justice and to making sense of indiscriminate acts of political violence that are, in their very performance, meant to cripple or otherwise alarm the citizenry. My point here is simply that legally sanctioned and legitimate forensic police work represents a public good, and Apple is now pitting that good against the good of privacy—and privacy as Apple defines it.
Then there's terrorism itself. The effects of indiscriminate violence on behalf of political causes—terrorism—are public as much as private. They affect how individual bodies move through public spaces, be they airports, public buildings, city streets, or town squares. “Publicity is the oxygen of terrorism,” Margaret Thatcher once said. She was right about this, if not about its implications. As a political tactic, terrorism seeks to redefine, even destroy, public life. The irony here is that an absolute commitment to privacy threatens to do the same: If publicity is the oxygen of terrorism, privacy is the oxygen tank. The surveillance state is not the solution to terrorism. But an absolutist commitment to privacy, technologically enforced, puts public life at great risk.
But for me the biggest problem is the way the cause of privacy advocated by technologists undermines the aims of democratically minded actors and activists everywhere. An absolutist commitment to personal privacy, and the engineering of that privacy through practically unbreakable encryption, may protect dissidents and others from political oppression. But in such cases privacy is not an ultimate, sacrosanct good, but a means toward hoped-for public goods. Freedom of speech, freedom of assembly, freedom of movement—these are “ends” of the modern democratic project, and any claims to privacy need to be considered in light of the protection of these very public liberties.
Public life, not private lives, is at the crux of political freedom. On this both Orwell and Huxley agreed. Technologists should give it some thought.