Consider four news stories from the past year. First, the Wall Street Journal reported last November that Google had quietly entered the medical-records business through an agreement with the health-care provider Ascension. This machine-learning venture—code-named “Project Nightingale”—harvested names, test results, diagnoses, hospital admissions, and prescription histories from more than 2,600 hospitals across twenty-one states without informing any of the patients or physicians whose records these were. Second, this past March, Washington’s governor, Jay Inslee, signed a law regulating the use of facial recognition within the state. Hailed by some as model legislation, the law was written by a state senator who also holds a management position at Microsoft. Third, in April, the Israeli spyware firm NSO alleged in court filings that, in 2017, WhatsApp (a Facebook-owned messaging service) attempted to buy NSO’s surveillance technology in order to spy on Apple devices. Finally, Apple had its own surveillance embarrassment in the summer of 2019, when a whistleblower revealed that the company had been farming out Siri recordings—with or without the user’s intentional activation of the virtual assistant—to contractors around the world for sound-quality tests. Among the recorded material: confidential medical information, drug deals, and the intimacies of couples between the sheets.
All of these examples come from a file I started to keep after reading Shoshana Zuboff’s The Age of Surveillance Capitalism, in which Zuboff argues that surveillance capitalism represents “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales.” It is a big book, an important book, and a conscious attempt—successful in my view—to write a magnum opus on the way capitalism has been quickly transformed as a result of the digital revolution. Your hospital records, your facial features, your Internet activity, call history, and app usage, your most private utterances: All of these facets of human life have become data that Big Tech can mine. As Zuboff shows, this captured data allows Big Tech to assemble digital pictures of what you like, where you go, who you know, who you are. That shadowy portrait then becomes the basis of predictions about where you’ll go next, what you’ll do, buy, and want next.
Zuboff goes even further than prediction, though. She draws on the behavioral science of B. F. Skinner to argue that Big Tech’s ultimate ambition is “guaranteed outcomes,” in which its products not only forecast our next steps but also steer them. You may or may not feel controlled by Google’s search results, YouTube’s recommendations about what to watch next, or Amazon’s suggestions about what might interest you; but it is hard to deny that the architecture of our decision making is shaped by these platforms, which, moreover, work not just from what we explicitly tell them but also from all of the “bread crumbs” we leave behind as we search, play, write, and snap.
One implication of this economic model is a drive among the major players to capture ever more data streams—and to defend them against rival data gatherers—in order to sharpen their predictions. Facebook’s attempt to break into Apple’s data hoard is a case in point. So is Project Nightingale. Its stated purpose is laudable: to help health-care providers determine what ails a patient, which drugs to prescribe, even which doctors should be assigned to a case. The trouble is that Google is also gathering information on all of the named parties through other channels.