Questioning the Quantified Life / Summer 2020

Into the Whirlpool

How Predictive Data Put Brainwashing on the Spin Cycle

Rebecca Lemov

Digital binary computer code whirlpool; Robert Eastman/Alamy Stock Photo.

One of the first attempts ever made to add up every single thing in one place occurred in Ireland from 1655 to 1656. William Petty’s Down Survey, so called because it “laid down” all important information, measured Ireland’s shape and size accurately for the first time. Yet the surveyors did well to proceed with caution in carrying out this probing numerical exercise, intended to assist appropriation of the land by English merchants and ex-soldiers, because only a few years earlier a data-gathering colleague had encountered wrathful Irish villagers who “would not have their country discovered.” So strongly did they resent this intrusion that they decapitated the would-be surveyor. Still, after the Down Survey, accurate tabulation proceeded more quickly, and, by the time of John Adams’s 1680 Index Villaris, 24,000 towns and villages back in England had submitted to the encompassing embrace of measurement.[1: As recounted in Paul Slack, “Government and Information in Seventeenth-Century England,” Past and Present 184, no. 1 (August 2004): 36–38, https://doi.org/10.1093/past/184.1.33. Note that the Irish villagers’ resentment arose because the surveyors were English and had recently occupied Ireland with a view to “enclosing” the Irish lands for appropriation by English merchants and ex-soldiers.] The sense that every person and every place should be amenable to being counted was not yet fully inculcated, but it would be.

Today, most people are frequently subjected to counting, of course. Our takeout orders, our opinions about those takeout orders, and the steps we take across the room—all of this, and so much more, is logged and fed into the “surplus behavioral data” stores that Shoshana Zuboff has so carefully traced in her examination of what she has labeled “surveillance capitalism.”[2: Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York, NY: PublicAffairs, 2019).] Stranger yet, efforts to gather up massive amounts of personal information—to count widely, deeply, and broadly—are today largely shrugged off. We know, and yet we don’t know. Data is siphoned almost constantly—we feed it into the machines, and the “computational factories,” as Zuboff calls them, go to work. Perhaps the most curious part of this arrangement is that surveyor demise rarely follows. Increasingly intimate details of our lives are surrendered, bits at the very edge of scrutability and far beyond the now-obliterated edges once called privacy. Until recently, objection was deemed eccentric. Professors who specialized in media studies continued to post pictures of their children’s violin lessons and soccer games online, hardly seeming to know why, even as they professed to know better. Meanwhile, in the name of increasing human agency, new techniques are subjecting your “choice points” to external controls, whether commercial or political. As a result, your likes and dislikes now end up shaping the choices with which you are presented, predicting your behavior with ever greater accuracy as time-on-device engagement and closer-to-real-time tracking take hold. It could all be said to amount to a mild if pervasive form of brainwashing.

Standardizing Thought

In the not-so-distant past, about halfway through the last century, brainwashing seemed like a slightly less subtle and more obviously sinister business. Before Marshall McLuhan was fully Marshall McLuhan—when he was not yet famous for being famous and was mostly a little-known literature professor from Toronto—he recounted a conversation he once had with an advertising expert. From this expert he learned that the key tool of the ad trade was to “standard[ize] thought by supplying the spectator with a ready-made visual image before he has time to conjure up an interpretation of his own.”[3: Marshall McLuhan, The Mechanical Bride: Folklore of Industrial Man (New York, NY: Vanguard Press, 1951), v.] In that instant before the process of making sense was completed, a presupplied image and, subsequently, a thought (not quite your own) could take hold. Thought was being standardized. There should be no mistake here, McLuhan commented: This was mental tyranny disguised as market research.

McLuhan’s response, during the 1940s, was to give lectures across Ontario in which he flashed slides of individual advertisements and offered the audience his own analyses of them, a corpus of material he began calling the “Folklore of Industrial Man.” These ads-plus-words look today a bit like long-winded Instagram posts in a lost jargon. Gathering them together in his first book, The Mechanical Bride (1951), McLuhan recalled that conversation with the advertising expert. The book gives equal space to pictures and words, putting up a series of “ready-made images” in the form of hosiery print ads, full-page spreads for funeral homes, and early stick-deodorant encomiums, then juxtaposing them with pun-filled verbal jousts—all somewhat dated today. In the book’s pages, McLuhan aims to “reverse that process” of thought standardization by making room for pause, disruption, and, ultimately, introspection.[4: Ibid., v–vi.] He describes a technology-and-dream-driven whirlpool into which the reader must enter in order to interrupt or understand the “helpless state” engendered by widespread propaganda.

In a 1943–46 study called “Mass Persuasion,” a young sociologist named Robert K. Merton located similar dynamics at work among radio listeners who found themselves helplessly stuck to their dials during a patriotic eighteen-hour CBS “radiocast,” a marathon led by the famous singer Kate Smith to sell war bonds. Merton and a team of social researchers devised new methods to investigate how persuasion worked at a mass level to move audience members to act in certain ways. Employing the recently debuted technique of content analysis (developed by Merton’s Columbia University colleague Paul Lazarsfeld), the researchers identified “stimulus patterns” in the Smith broadcast. Through intensive “focused interviews” with 100 people, they gathered minute-by-minute accounts (yielding data of a qualitative but analyzable sort) of listeners’ responses. This focused interview, in fact, was the precursor of the “focus group”—a term not yet coined for a technique that emerged directly from Merton’s radiothon study. “Fears, anxieties, and hopes” could be extracted from the structure of answers given in “non-structured” situations (that is, not using questionnaires but engaging in conversation closer in spirit to therapy than to fill-in-the-numbers polling). For a broader picture, researchers carried out simple polling interviews with almost a thousand people in the greater New York City area. What they discovered was surprising. Many listeners found themselves emotionally chained to the radio by a mix of obligation and suspense. Others seemed to have lost the ability to discern that they had a choice at all. This paralysis of will was a phenomenon the sociologist isolated as being of particular interest: “Listeners did not consider the alternatives of turning off the radio…tuning in another program, or merely failing to listen closely. The solution to their problem waited in a vestibule of consciousness, but was not admitted to awareness,” Merton and his team wrote, possibly cheekily—because their interpretation raised the specter of a dim public so fettered to their receivers that even the “solution” of turning their attention to something else was inconceivable. They were, in effect, pinned there.[5: Robert K. Merton, Marjorie Fiske Lowenthal, and Alberta Curtis, Mass Persuasion: The Social Psychology of a War Bond Drive (New York, NY: Harper, 1946), 27–28, emphasis added.] As one listener testified, “Usually I get tired of listening and I turn it off. It’s funny, whenever there is any commercial on, you turn it off, but then [with Kate Smith] I had it on all afternoon. I didn’t realize it at the time, but I had to keep listening.”[6: Ibid., 28.] CBS listeners could both know and not know they had been overtaken by a kind of compulsion.

Manchurian Candidates

In the same years McLuhan was touring with his slide show of delirious advertising images and Merton was studying irresistible compulsion, the word brainwashing entered the English language. At first it slipped in unnoticed. Its first use in the modern, alarming sense it continues to have—always in all-caps, as it were, followed by an implicit or explicit exclamation point, as in BRAINWASHING!—was in 1950, when Edward Hunter, a propaganda expert in the guise of a journalist who had worked for the CIA’s forerunner, the Office of Strategic Services, published an article in the Miami Daily News announcing a new threat. Hunter, who was based in Hong Kong, proclaimed that “brain-changing,” alternately “brain-washing” (the term being, he claimed, a translation of a common Chinese locution), could overcome the human mind’s best-laid defenses. A 1951 best-selling book by Hunter got the word circulating widely in North American culture.[7: Edward Hunter, Brain-Washing in Red China (New York, NY: Vanguard Press, 1951). See also Kathleen Taylor, “The Birth of a Word,” in Brainwashing: The Science of Thought Control (New York, NY: Oxford University Press, 2012), 3–4, and Maarten Derksen, “Brain-Washing and Menticide,” in Histories of Human Engineering: Tact and Technology (New York, NY: Cambridge University Press, 2017), 141. A recent blog essay argues, to the contrary, that Hunter was publicity mongering; see Marcia Holmes, “Edward Hunter and the Origin of ‘Brainwashing,’” Hidden Persuaders Project blog, May 26, 2017, http://www.bbk.ac.uk/hiddenpersuaders/blog/hunter-origins-of-brainwashing/.] Whether or not the newly communist Chinese used the phrase at all before 1950 (and it is unlikely they did), Hunter’s efforts meant that US citizens caught wind of a terrible method by which communist technology fiends—using behavioral psychology, hypnosis, propaganda, sleep deprivation, and other weapons not yet understood—could gain a foothold in the interior life of otherwise loyal Americans. One had only to look, for example, at American troops captured by the Chinese in the course of the Korean War, then ongoing. Downed pilots such as Lieutenant Colonel Frank Schwable were making ashen-faced confessions to outrageous crimes such as dropping bacteriological weapons on Chinese territory.[8: Debate resurfaced recently about the existence of actual, as opposed to falsely extracted, confessions of US bacteriological warfare campaigns operating in secret during this period. Errol Morris’s 2017 Netflix series Wormwood advanced the claim that such weapons were in fact dropped in China as part of a larger US biological weapons program. Subsequently, the New York Review of Books published Michael Ignatieff’s seeming endorsement of such claims in “Who Killed Frank Olson?,” The New York Review of Books, February 22, 2018, https://www.nybooks.com/articles/2018/02/22/who-killed-frank-olson/. In response, Milton Leitenberg, a scholar at the University of Maryland School of Public Policy, clarified: “The false allegation was disproved as long ago as 1998 when documents from the Soviet Central Committee Archives that had been sent to Mao Zedong and to Kim Il-Sung in the month following Stalin’s death in 1953 were obtained from the Soviet Presidential Archive.… More recently, Chinese documents became available, and additional Soviet-era documents were published by the Soviet archive RGANI.” Milton Leitenberg, “No, They Didn’t,” The New York Review of Books, March 22, 2018, https://www.nybooks.com/articles/2018/03/22/biological-warfare-korean-war/.] Soon GIs were arriving home in prisoner exchanges in such unusual condition that the Truman administration’s Psychological Strategy Board recommended that “the American public be prepared, beginning immediately, to face the grim realities of the brain-washing process,”[9: Suggested Guidance for Public Aspects of US Position on Korean Prisoner-of-War Talks (Washington, DC: Psychological Strategy Board, 1953), 3. Retrieved from the Central Intelligence Agency website: https://www.cia.gov/library/readingroom/docs/CIA-RDP80R01731R003200100001-9.pdf.] as seen in the confused and indoctrinated men. By 1952, Time magazine was breezily reattributing the word back to the Chinese: “Ai Tze-chi was Red China’s chief indoctrinator or, as he was generally called, Brainwasher No. 1.”[10: “China: Brainwasher at Work,” Time, May 26, 1952, http://content.time.com/time/magazine/article/0,9171,859632,00.html.] Not long after, CIA director Allen Dulles publicly warned of the threat of interior invasion by means of such techniques: People’s brains, he said, could be spun like records on a phonograph.

Brainwashing Goes Public

Thought interference, a technique that raised concerns among fledgling media scholars in 1951, and whose effects apparently were displayed by real-life American POWs soon thereafter, grew into an obsession. Whether because of targeted messages (to the folks at home) or targeted indoctrination (of soldiers away from home), the mind was no sanctuary, but was, rather, an arena for shaping and molding the self. The threat that some outside power could choreograph your own thoughts as precisely as the dance steps of a line of chorus girls was arguably at the heart of the Great Brainwashing Scandals that took hold in the United States by the mid-1950s.

Exposure to a cacophonous whirl of “messaging” caused its subject to lapse into a state of unusual receptivity to messages from other sources. In an extreme form, one sees this in the cult inculcation process, as famed deprogrammer Ted Patrick noted from the experience of getting himself inducted into a faux-Christian religious cult, the “Children of God,” in the late 1960s. As Patrick recalled, “The indoctrination is cleverly orchestrated, and while you are being queried directly about your financial status, you are at the same time listening to tape recordings of Bible verses, being exhorted by one member to pray and praise the Lord while being hugged by another member who also tells you he loves you, brother—so it is all very confusing and one does not use his powers of concentration or critical ability in a normal way. It’s a sort of mental and psychological blitzing. They confuse and harass you quite effectively in that you often don’t really know what you are doing or saying.”[11: Ted Patrick and Tom Dulack, Let Our Children Go! (New York, NY: E.P. Dutton, 1976), 41.] Mental and psychological blitzing is almost precisely what McLuhan described in The Mechanical Bride’s succession of ready-made images—the media whirlpool, the sense of “prolonged mental rutting,”[12: McLuhan, The Mechanical Bride, v.] and the overarching environment of chaotic pulls soliciting one’s love, one’s vanity, and one’s finances.

Cults entered the story of brainwashing decisively in the 1960s and 1970s, by which time the US military’s key “brainwashing” experts were no longer busy facing an enemy defined as threatening to the American psyche. Brainwashing experts from the Korean War now appeared in San Francisco’s jail and courtrooms to defend Patty Hearst, because they were the only ones considered competent to explain the mysterious behavior displayed by a young heiress evidently indoctrinated into embracing the ideology of a gang of would-be revolutionaries who had kidnapped her. Korean War–era mind-control experts such as Dr. Louis Jolyon West and Dr. Robert J. Lifton, both of whom examined Hearst, were well placed to recognize the commonalities between what had happened to American POWs two decades before (confinement, extreme distress, ill health, threats against one’s family, apparent abandonment by family and country, coercion extending to every single bodily movement and autonomous function, strong inducement to embrace a new ideology, and a canned and rhetoric-filled ideology at the ready) and what Hearst experienced after the Symbionese Liberation Army brutally abducted her from her Berkeley apartment. It was no wonder, then, that she appeared “zombie-like” in the courtroom, unable to mount a convincing defense despite representation by high-profile lawyers. As Dr. West commented, “In most of the POWs that I studied, the results were very similar to that in her case. In other words, the desired behaviors were achieved. They did what they were told to do, they said what they were told to say.” As Hearst commented, in the face of her overwhelming fear that her captors would kill her, “I accommodated my thoughts to coincide with theirs.”[13: Louis Jolyon West in United States v. Patricia Hearst, trial transcript, reproduced in The Trial of Patty Hearst (San Francisco, CA: The Great Fidelity Press, 1976), 257–258. Hearst is quoted in William Graebner, Patty’s Got a Gun: Patricia Hearst in 1970s America (Chicago, IL: University of Chicago Press, 2008), 19.]

Autoplayed and Awakened

Not long ago, a young and fairly liberal college student from West Virginia named Caleb Cain was “pulled into a far-right universe, watching thousands of videos filled with conspiracy theories, misogyny, and racism,” according to a subhead for a New York Times article about the student. (With the help of an online graphic, Times readers could follow Cain’s viewing habits as he jumped from video to video.) Like nearly 70 percent of YouTube watchers, he used the site’s “Up Next” function, which autoplays by default: Unless the viewer stops the video feed, it keeps rolling. This creates an onrush of content configured to match the user’s ongoing behavior profile. Behavioral data—compiled from the individual’s choices and a vast collation of other viewers’ clicks, likes, and other habits—then informs which videos come next, with the prime value being to optimize ways to keep the watcher watching. Time on site overrides ideological content. Within a few months, having dropped out of college and with a lot of time on his hands, Cain “fell down the alt-right rabbit hole,” as he himself recalled, becoming, in his own word, “brainwashed.”[14: Kevin Roose, “The Making of a YouTube Radical,” New York Times, June 8, 2019, https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html.] However, by following several algorithmically recommended alternative views that occasionally popped up, he eventually made his way leftward again, never abandoning YouTube. Decrying the circle of right-leaning YouTube personalities, which he sees as a “dangerous cult,” Cain now works as a deradicalization counselor, alerting young people to the risks of online manipulation.
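The underlying logic is simple enough to mock up. What follows is a minimal sketch, not YouTube’s actual system: the viewing log and the up_next function are invented for illustration, standing in for the vastly larger collaborative signals a production recommender would use. The point is only that an engagement objective surfaces whatever held similar viewers’ attention longest, regardless of how one video relates to the next:

```python
from collections import defaultdict

# Hypothetical viewing log: (user_id, video_id, seconds_watched).
events = [
    ("u1", "v_politics", 600), ("u1", "v_conspiracy", 540),
    ("u2", "v_politics", 580), ("u2", "v_conspiracy", 610),
    ("u3", "v_politics", 590), ("u3", "v_gardening", 45),
]

def up_next(user, events):
    """Recommend the unseen video with the highest average watch time
    among viewers whose histories overlap with this user's."""
    watched = {v for u, v, _ in events if u == user}
    neighbors = {u for u, v, _ in events if u != user and v in watched}
    totals, counts = defaultdict(float), defaultdict(int)
    for u, v, secs in events:
        if u in neighbors and v not in watched:
            totals[v] += secs
            counts[v] += 1
    # The objective is engagement alone: whatever kept similar viewers
    # watching longest comes next, whatever its content.
    return max(totals, key=lambda v: totals[v] / counts[v], default=None)

print(up_next("u3", events))  # -> "v_conspiracy"
```

Feed the output back in as the next viewing event and the loop closes: each watch updates the very profile that selects the following video.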

Some months after the West Virginian’s “brainwashed rightward by YouTube” story, another front-page article described large numbers of young Brazilians who were gradually nudged toward extremism by means of the same recommendation software. YouTube’s tinkering with its algorithms was a key factor in behavior change that led to ideological change. The shift in this case was one of priority: Videos were recommended because they kept people glued to the screen, maximizing viewing time, not because of any logical connection among their ideas. A Zika virus conspiracy, right-wing politics, and antifeminist abortion activism might flow one into the other via the recommender: Each had held others’ attention, so why wouldn’t it hold yours? As a political observer commented, “It feels like the connection is made by the viewer but the connection is made by the system.” Such viewers might find their most deeply held thoughts shaped by algorithmically designed interactions forged from behavioral data. A wave of newly hatched right-wing YouTubers began running for local offices alongside Brazilian presidential candidate Jair Bolsonaro. You could be “autoplayed” straight into a whole other set of beliefs, it seemed, all while believing yourself to be undergoing a political awakening.

Are there commonalities between these incidents of self-described brainwashing—some taking place in the early 1950s in remote POW camps, others in “Jesus freak” gatherings in the 1970s—and the data-driven belief nudges of the late teens of our own century? And how do we find out about these commonalities and potential differences? How do we know what they mean? Is quantification the culprit? If so, how do we think about it?

Dollars and Eyeballs

The general situation today gives grounds for both optimism and pessimism. A brief sketch of three current examples—which serve much as Kate Smith’s marathon bond-drive broadcast, the McLuhan girdle ads, the Korean War GIs, and the Children of God ecstasies did earlier—helps to mark watershed moments when we can see new arrangements of the power of persuasion (i.e., surveillance and control) coalescing.

The interior invasion of today differs from the interior invasion of yesterday. Unlike those of the 1940s and 1950s, today’s turning points are not generally being studied by university researchers but rather by insider research groups (e.g., those at Facebook or Google) that are themselves engaged in the very processes they are studying—or by journalists, who are doing most of the tracking, monitoring, and exposing. Despite the growing number of academics who study the history and social context of datafication, algorithms, artificial intelligence, and machine learning, targeted incursions into the realm of preference and belief are not well understood or addressed in scholarship—although the work of Sarah Igo, Jacqueline Wernimont, Luke Stark, Dan Bouk, and Shoshana Zuboff is starting to home in on tricky areas where coercion meets persuasion.[15: See especially Jacqueline Wernimont, Numbered Lives: Life and Death in Quantum Media (Cambridge, MA: MIT Press, 2018); Zuboff, The Age of Surveillance Capitalism; Sarah Igo, The Known Citizen: A History of Privacy in America (Cambridge, MA: Harvard University Press, 2018); and Dan Bouk, How Our Days Became Numbered: Risk and the Rise of the Statistical Individual (Chicago, IL: University of Chicago Press, 2015).] State attorneys general and other legal entities may subpoena records, but they are not in a position to assimilate the inner implications of algorithmic suasion.

The “Facebook Experiment” is a good place to start. Run in 2012 on almost 700,000 users, the experiment tested whether “mass emotional contagion” could be produced by altering the emotional valence of a user’s news feed. For one week, Facebook showed some users fewer positive posts in their algorithmically delivered feeds and found that those users in turn used 0.1 percent fewer positive words in their own posts. Publishing their findings in 2014 in the Proceedings of the National Academy of Sciences, the researchers concluded that contagion on a large scale had occurred.[16: Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences 111, no. 24 (June 17, 2014): 8788–90, first published June 2, 2014, https://doi.org/10.1073/pnas.1320040111.] Drawing from earlier studies (1992 and 2011) on the nature of emotional contagion, the researchers were especially interested in showing that emotion could be spread through visual stimuli alone (stimuli leading to emotional response, leading to a behavior change).
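The outcome measure behind that 0.1 percent figure was, at bottom, word counting. Here is a back-of-envelope sketch of the measurement alone, not of the experiment: the five-word lexicon and the sample posts are invented, whereas the actual study ran a standard sentiment dictionary (LIWC) over millions of posts, a scale at which even a 0.1 percent shift becomes detectable:

```python
# Sketch of the contagion measurement: compare the share of positive
# words in posts by users whose feeds were stripped of positivity
# against posts by an unaltered control group. All data invented.
POSITIVE = {"happy", "great", "love", "wonderful", "good"}

def positive_word_rate(posts):
    """Percentage of all words that appear in the positive lexicon."""
    words = [w for post in posts for w in post.lower().split()]
    return 100 * sum(w in POSITIVE for w in words) / len(words)

control = ["what a great day", "love this song", "good news today"]
treated = ["what a day", "this song again", "some news i guess"]

drop = positive_word_rate(control) - positive_word_rate(treated)
print(f"positive-word rate down {drop:.1f} points in the treated group")
```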

The response to the experiment was significant. News outlets pronounced the study unethical. Facebook claimed that its data use policy automatically opted every user into consenting to “data testing, analysis, research,” but that line in the user agreement was not added until four months after the experiment. And while Facebook chief operating officer Sheryl Sandberg offered a tepid apology for “poorly communicating” about the experiment, another Facebook executive asserted at the 2014 Aspen Ideas Festival that this type of experiment was at the core of innovation and the company’s mission of optimizing user experience. The Facebook research group that published the experiment largely ceased publishing in response to the controversy, but A/B testing—running batches of digital users through conditions (“A” and “B”) that differ by only a single variable—certainly remains the stock in trade of social media and Big Tech generally. Such testing allows whichever version tests better, A or B, to be adopted. Thus, Amazon and eBay may show slightly different versions of pages within their sites to different audiences in order to gauge relative click-through rates. Most marketing automation software now comes with the ability to run continuous A/B testing. Tiny elements are tailored almost continuously in response to user responses. One can see that, aside from dollars and eyeballs, the ever more exquisite responsiveness of such systems is an outgrowth of such built-in barometrics.
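In code, the core of such testing is modest: a deterministic split plus a significance check. The sketch below is a generic illustration, not any company’s pipeline; assign_variant and z_score are hypothetical names, and the click counts are invented:

```python
import hashlib
import math

def assign_variant(user_id, experiment="buy-button-color"):
    """Deterministically split users: hashing keeps each user in the
    same condition every time they return."""
    h = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(h, 16) % 2 == 0 else "B"

def z_score(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test on click-through rates; |z| > 1.96 is the
    conventional 95 percent bar for declaring a real difference."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Invented numbers: B's page drew clicks from 5.5 percent of viewers
# against A's 5.0 percent. At this volume the lift clears significance,
# so B ships, and the cycle restarts on the next tiny element.
print(z_score(clicks_a=5000, views_a=100000, clicks_b=5500, views_b=100000))
```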

Only a year after the Facebook experiment, in 2013, a political consulting/data analysis firm called Cambridge Analytica in effect scaled up emotional engineering, this time in secret. When 300,000 Facebook users installed a personality quiz app designed by Cambridge University researcher Aleksandr Kogan, they unwittingly opened themselves to large-scale harvesting of their personal data, as well as the personal data of their friends. This social-data harvesting ramified to create a data set covering what the New York Times later estimated to be fifty million Facebook users. Cambridge Analytica itself estimated the total at thirty million, while Facebook later put it at eighty-seven million, 70.6 million of them in the United States. With data on so many people’s public profiles, page likes, birthdays, and current relationship status (along with news feed, timeline, and message access in some cases), engineers at Cambridge Analytica created “psychographic” profiles assessing each user’s personality tendencies—where they fell in relation to the so-called Big Five traits of agreeableness, neuroticism, openness to new experiences, extroversion, and conscientiousness. In essence, a little “mini you” thumbnail portrait was built for each user—or, more to the point for Cambridge Analytica’s purposes, each potential voter.
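The mechanics of such a profile can be caricatured in a few lines of code. In the sketch below, the pages, weights, and the mini_you function are all invented stand-ins: real psychographic models were regressions fit on the users who had actually taken the quiz, with coefficients then extrapolated to everyone else whose likes had been harvested.

```python
# Toy "psychographic" scoring: page likes are mapped to Big Five trait
# estimates through a weight table. All pages and weights invented.
TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]

# One row of hypothetical coefficients per page, stand-ins for what a
# regression fit on quiz-takers' answers would learn.
LIKE_WEIGHTS = {
    "PhilosophyDaily": [0.9, 0.0, -0.2, 0.1, 0.2],
    "ExtremeSports":   [0.3, -0.1, 0.8, 0.0, -0.3],
    "PlannerAddicts":  [-0.1, 0.9, 0.0, 0.2, 0.1],
}

def mini_you(likes):
    """Sum per-page weights over a user's likes: the thumbnail portrait."""
    profile = dict.fromkeys(TRAITS, 0.0)
    for page in likes:
        for trait, weight in zip(TRAITS, LIKE_WEIGHTS.get(page, [0.0] * 5)):
            profile[trait] += weight
    return profile

# A profile scoring high on openness and extroversion might be served
# one ad variant; one high on neuroticism, a fear-framed variant.
print(mini_you(["PhilosophyDaily", "ExtremeSports"]))
```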

Combining all of this data (the psychographic “mini you” was only a small part of the bigger picture), Cambridge Analytica then went about microtargeting audiences. Let’s be clear: There is nothing terribly different about this approach compared with standard political targeting practices of the past; the difference is in precisely how micro the political ads could and did become, something we may never exactly know. Cambridge Analytica worked to sway the vote in several key elections. Hired in 2015 by Texas Republican senator Ted Cruz’s presidential campaign (which reportedly was not blown away by the effectiveness of the British firm’s efforts), it was later employed by Brexit strategists and the Trump 2016 general election team. The specter of a perfect and easily manipulable psychological double of each voter, created out of data, was frightening but not accurate. In a sense, though, the actual effects of Cambridge Analytica’s actions were far more disturbing because they were harder to trace. (The effects of its campaigns are ultimately unmeasurable because there are no records, at least not records available to the public.) Combining machine learning and intensive digital mining, I would argue, crosses a threshold in the history of society’s perilous course from mass persuasion to micropersuasion to hyperpersuasion. Moreover, Facebook users were never asked whether they agreed to sell their intimate data to political marketers to be turned back on themselves as a “psychological warfare” weapon (as a whistleblower would call it), although blanket user agreements often do sanction ongoing experimentation and A/B testing to optimize performance.

Cambridge Analytica’s public face, Alexander Nix, boasted openly in 2016 about his company’s success in strongly swaying support to Cruz, who finished second in the Republican presidential primary race despite his manifest unlikability and unfortunate facial hair. Nix’s talk at the 2016 Concordia Annual Summit, “The Power of Big Data and Psychographics,” still runs on YouTube today as a self-promotional banner in the wind.[17: Alexander Nix, “The Power of Big Data and Psychographics,” September 19, 2016, video, https://www.youtube.com/watch?v=n8Dd5aVXLCc.] Yet it was only in early 2018 that the Facebook–Cambridge Analytica data raid became a major political scandal, when whistleblower Christopher Wylie corroborated journalists’ claims about the use of Cambridge Analytica data in the Brexit Leave campaign. Wylie spoke about how he and his fellow engineers sought to create a “loss of agency” in users by fomenting their “inner demons” and then manipulating them in order to sway decision-making behavior. This came to concern him.

A recent TED talk by Carole Cadwalladr, one of the Guardian journalists who exposed Cambridge Analytica’s work, included screenshots that Facebook was legally forced to release, showing micropersuasive ads that otherwise left no trace.[18: Carole Cadwalladr, “Facebook’s Role in Brexit—and the Threat to Democracy,” April 2019, video, TED Talk, https://www.ted.com/talks/carole_cadwalladr_facebook_s_role_in_brexit_and_the_threat_to_democracy.] Compared with how relatively easy it was for Marshall McLuhan and Robert K. Merton to study the effects of ads and broadcasts on their intended audiences, it is now much harder to expose and explain the kind of micropersuasion that relies on the algorithmic building of responsive message systems. If a unique message can be uniquely targeted to each unique user via hypertargeting, little or no shareable record is likely to remain.

Persuasive loops today bind tighter than they did in the midcentury moment. We are not just in the whirlpool watching the objects on the walls. We are part of it. In May 2017, a leaked document from Facebook’s Australian division “suggested the company had offered advertisers the ability to target advertisements to teenagers based on real-time extrapolation of their mood.”[19: Luke Stark, “Algorithmic Psychometrics and the Scalable Subject,” Transmissions blog, July 23, 2018, https://sites.library.queensu.ca/transmissions/algorithmic-psychometrics-and-the-scalable-subject/.] It’s not the “extrapolation” that is the most grievous cause for alarm, but the “real-time.” Streams of data are reaching tipping points of informational richness such that predictions need be only good enough, not perfect, to succeed.

What is often neglected in this tightening of the persuasive loops is a change in the person who is targeted by them. Just as earlier studies identified a mass consumer who was immersed in the problems of being individualized and being seen as part of a large “mass,” today the user is less individual and less mass. That person comes to resemble the very shifting patterns that are assembled out of the data. That person comes in and out of focus as she triggers responses within different domains where she wanders. That person is destabilized. This is what some scholars refer to as being “datafied.” We are shifting aggregates in a sea of data floes, at one with the whirlpool that spins around us.