Infernal Machine   /   January 29, 2015

The Public and Private, Once Again

Three surveillance cameras, Wikimedia Commons

In the wake of the Charlie Hebdo attacks, a political fire that has been burning for a long time is turning into a firestorm. British Prime Minister David Cameron has recently called for tech companies to provide government security services with encryption keys, so that authorities may legally access an individual’s data when warranted. The concern, now publicly shared by President Obama, is that terrorists are using the new encryption technologies being developed by companies like Apple, Google, WhatsApp, and Snapchat, especially “end-to-end” encryption, which “makes it nearly impossible for anyone to read users’ messages—even the company itself.”
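Though this is an essay about politics rather than mathematics, the “end-to-end” property has a concrete technical basis. A minimal sketch, using a toy Diffie–Hellman key exchange (the primitive underlying end-to-end protocols such as the one WhatsApp adopted; the tiny numbers here are purely illustrative, not secure), shows why the company relaying the messages cannot derive the key:

```python
# Toy Diffie-Hellman key exchange, the kind of primitive behind
# end-to-end encryption protocols. The parameters below are
# illustratively small and NOT cryptographically secure.
p, g = 23, 5                       # public prime modulus and generator

alice_secret, bob_secret = 6, 15   # secrets never leave each device

# Only these derived public values pass through the company's servers:
alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Each device combines its own secret with the other's public value,
# arriving at the same shared key for encrypting messages.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)

assert alice_key == bob_key == 2   # recovering this from the public
                                   # values alone is infeasible at
                                   # real-world parameter sizes
```

The server sees only `alice_public` and `bob_public`; with realistically large primes, recovering the shared key from those values is computationally infeasible, which is what renders a wiretap on the company’s infrastructure useless.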

And so, as The Economist recently put it in an editorial on the matter, we are confronted with the age-old dilemma of “liberty vs. security, once again,” or more precisely “privacy vs. security.”

There are a host of legal, technological, political, and, perhaps above all, economic issues at play here. I do not claim to know precisely how one balances liberty with security, let alone how one balances liberty with the tech companies’ push for profit maximization or governments’ desire to save face in the wake of acts of terror. But I do think that the scales are already rigged—that is, that these debates are taking place against a background of assumptions about privacy that are themselves problematic.

In calling privacy a right, we tend to do more than assert the necessity for its legal protection. We tend to carry with our idea of the right to privacy the metaphor of private space, even private property. Privacy as that which is bounded, set off from that which is public. Hence we have our private life and our public life, our private opinion and our public statements, our private information and our public profile, etc. In this very common way of thinking about things, the private and the public are two distinct realms, and the right to privacy is the guarantee of a wall around our private realm.

The privacy vs. security dilemma is embedded in this way of thinking: It has to do with when it is legitimately permissible for the government to break down the wall of privacy for the sake of security. It is a version of the broader dilemma of liberty within the quasi-utilitarian liberalism that underlies our assumptions about privacy. We are to be free, so long as we do not interfere with the freedom of others; but when we do so interfere, the state has the right to encroach on our freedom, indeed even on our privacy, in the name of preserving maximum freedom for the greatest number.

Indeed, in recent rebuttals by libertarians, some liberals, and the tech industry to the call by Cameron and Obama for preserving a “back door” option by which to access user data, we see the greatest-good-for-the-greatest-number argument used on behalf of super-encryption: Back doors, Cameron’s critics argue, can and will be used by the bad guys (criminals, hackers, the Russians, the Chinese) as well as the good guys, and the damage done by the bad guys could well be catastrophic. As Harvard’s Margo Seltzer recently said in The Financial Times,

If bad guys who are breaking laws cannot use encryption, they will find another way. It is an arms race and if governments say you cannot do this, that means the good guys can’t and the bad guys can. End-to-end encryption is the way to go.

Protecting privacy as an inviolable right, more sophisticated arguments go, is not only consistent with liberal societies, but also the most effective means of security—even if it means terrorists can communicate with little fear of being detected. It is often assumed here that an absolute right to privacy will neatly reconcile itself with, even produce, the greatest good for the greatest number (though the privacy of one’s data from the tech companies themselves remains rather more penetrable).

I think the super-encryption efforts of tech companies are socially and politically problematic. I think they are the wrong solution to the wrong problem. But in arguing so I am not interested in hypothetical calculations of the greatest good for the greatest number. Rather, I simply want to start with the manifest relationship of the private to the public. How do things work with respect to the private and the public?

Rather than starting with the regnant bugaboo, terrorism, let’s think about political corruption. Do politicians have an absolute right to the privacy of their deliberations and communications about public matters? Does the Speaker of the House, or the President, have an absolute right to the full and permanent protection of behind-the-scenes communications about matters of public consequence? If Legislator A and Donor K used WhatsApp to work out a deal for campaign donations in exchange for sponsoring legislation in the House of Representatives, would we, as citizens, accept the records of those conversations as being forever and irredeemably private, such that we simply could not ever access them?

I suspect that most of us, once we stop to think about it, would not be too comfortable with this already real-life scenario. What if the messages concerned bribes, threats, or other forms of backroom dealings? What if the President told the Speaker things that the latter was not authorized to know? What if the CEO of Company X was privy to the messages, too? Or what if the Speaker sent the President the CEO’s messages without the CEO’s knowledge? This is the stuff of scandal and corruption, and these are each instances where communications, though “private,” indeed have public importance. The public would have a right to know about them.

This is not because we are willing to “sacrifice” privacy for the integrity of our political system; it is not a version of “liberty vs. security, once again.” Rather this is because, even with the high premium we put on the right to privacy, we understand that the private stands in a flexible, dialectical, and dependent relationship with the public: When private acts have direct public consequences, they are not strictly private—they can be called to public account.

This is the case whether we are talking about political corruption or communication among persons who would commit acts of terror. More important, in calling private acts to public account, we are not breaking down the wall of privacy; rather, we are simply walking through the door from the private to the public the reverse way, so to speak. An exchange between the private and the public has already taken place. We are but re-tracing it.

What I find particularly troubling about the unbreachable encryption efforts of Apple, Google, and others is that they technologically (or, more properly, mathematically) prevent this kind of reverse traffic undertaken in the name of the public good. Rather, in the name of “privacy”—and, let’s be honest, in the name of corporate profits—tech companies are creating, in effect, not so much inviolable walls around privacy as something more like trap doors between the private and the public that can be gone through only one way. In such a scenario, it is only the public that will suffer.

The genuine political worry articulated by advocates of super-encryption is about Big Brother. As Wired writes of WhatsApp founder Jan Koum,

Growing up in Soviet Ukraine in the 1980s, WhatsApp founder Jan Koum learned to distrust the government and detest its surveillance. After he emigrated to the U.S. and created his ultra-popular messaging system decades later, he vowed that WhatsApp would never make eavesdropping easy for anyone. Now, WhatsApp is following through on that anti-snooping promise at an unprecedented scale.

But the United States and the United Kingdom are not the Soviet Union, and while both governments have aggressively engaged in very troubling illegal, large-scale dragnet surveillance in the last decade, we have not seen a corresponding police state develop in tandem with the data collection agencies. To the contrary, the greatest problem faced by American and British citizens is that of government secrecy, which has provided cover for illegal and otherwise questionable state surveillance programs. Alongside it stands a cultural problem: politicians’ repeated demands that intelligence agencies unfailingly connect the dots prior to a terrorist attack, or be held culpable when they do not. These demands cultivate a culture of self-preservation in intelligence communities, encourage them to lean always toward more aggressive actions rather than less aggressive ones, and open the door to all sorts of government contractors promising infallible technological fixes for what are, in the end, inherently political and social crises.

Encryption processes that simply block government surveillance outright are, in keeping with Silicon Valley’s longstanding delusion, but another supposed technological fix for what are political and cultural problems—whether the problem is the NSA or al-Qaeda and its affiliates. End-to-end encryption and its equivalents in no way address the real problems we face from a civil liberties perspective: government secrecy and the unrealistic expectations placed on counter-terrorism agencies. Worse, encryption offers a false substitute for real solutions—the moral equivalent of vigilante force when what we need is better government and law.

Ned O’Gorman is associate professor of communication and Conrad Humanities Professorial Scholar at the University of Illinois, Urbana-Champaign. He is the author of Spirits of the Cold War: Contesting Worldviews in the Classical Age of American Security Strategy and the forthcoming The Iconoclastic Imagination: Image, Catastrophe, and Economy in America since the Kennedy Assassination.

Editor's Note: Ned O’Gorman is also a contributor to The Hedgehog Review's Spring 2015 issue.