Too Much Information   /   Spring 2015

Uneasy in Digital Zion

Manipulation…or product development?

Chad Wellmon and Julia Ticona

[Illustration: Gary Waters/Ikon Images/Getty]

During the summer of 2014, two Cornell University scholars and a researcher from Facebook’s Data Science unit published a paper on what they termed “emotional contagion.” They claimed to show that Facebook’s news feed algorithm, the complex set of instructions that determines what shows up where in a news feed, could influence users’ emotional states. Using a data set of more than 689,000 Facebook accounts, they manipulated users’ news feeds so that some people saw more positive posts and others more negative posts. Over time, a slight change was detected in what users themselves put on Facebook: Those who saw more positive posts posted more positive posts of their own, while those who saw more negative posts posted more negative ones. Emotional contagion, the authors concluded, could spread among people without any direct interaction and “without their awareness.” [Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences 111, no. 24 (2014): 8788–8790.]

Some critics lambasted Facebook for its failure to notify users that they were going to be part of a giant experiment on their emotions, but others thought it was cool. [For a good discussion of the research ethics implications of this experiment, see Jaron Lanier in the New York Times; http://www.nytimes.com/2014/07/01/opinion/jaron-lanier-on-lack-of-transparency-in-facebook-study.html.] Sheryl Sandberg, Facebook’s chief operating officer, just seemed confused. “This was part of ongoing research [that] companies do to test different products, and that was what it was; it was poorly communicated,” she said. “And for that communication we apologize. We never meant to upset you.” [R. Jai Krishna with Reed Albergotti, “Sandberg: Facebook Study Was ‘Poorly Communicated,’” Digits (blog), Wall Street Journal, July 2, 2014; http://blogs.wsj.com/digits/2014/07/02/facebooks-sandberg-apologizes-for-news-feed-experiment/.] Facebook wasn’t experimenting with people; it was improving its product. That’s what businesses do: They serve their customers by better understanding their needs and desires. Some might call it manipulation. Facebook calls it product development.

The cofounder of the online dating site OkCupid, Christian Rudder, responded to the uproar with a snide blog post: “We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed…. But, guess what, everybody: If you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.” [Christian Rudder, “We Experiment on Human Beings!,” OkTrends (blog), July 28, 2014; http://blog.okcupid.com/index.php/we-experiment-on-human-beings/.]

Rudder went on to describe some of OkCupid’s experiments, which ranged from removing pictures from profiles to facilitate “blind dates” to altering the algorithm that calculates matches to tell “bad matches” that they were “good matches.” Like Sandberg, Rudder claimed not to understand why people would be so worried about Facebook or OkCupid conducting a few experiments. “We’re only beginning to understand how much we can learn about ourselves and others from the data that is constantly being harvested from us,” said Salon journalist Andrew Leonard after interviewing Rudder. “The more we know, the better armed we are to navigate the future.” [Andrew Leonard, “OkCupid Founder: ‘I Wish People Exercised More Humanity’ on OkCupid,” Salon, September 12, 2014; http://www.salon.com/2014/09/12/okcupid_founder_i_wish_people_exercised_more_humanity_on_okcupid/.]

In their responses, Sandberg and Rudder revealed how either clueless or cynical they are about the central importance of social media platforms to users’ emotional lives. Facebook and OkCupid are not simply efficient tools for sharing baby pictures and finding dates in a new city; they are social spaces in which actual people pursue desires and craft lives. Even in the early days of the ARPANET, the Internet forerunner that was designed as an efficient tool for sharing information, networked technologies were, as psychologist Sherry Turkle has said, “taken up as technologies of relationship.” [Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (New York: Basic Books, 2011), 157.]

Saving the Internet From People?

While they acknowledge the extensive experimentation and engineering that goes into improving their sites, Rudder, Sandberg, and other digital media moguls have a vested interest in maintaining the notion that social media like OkCupid and Facebook are just tools that humans use as they see fit. Social media, they insist, are simply means for communicating and connecting with friends and family. Reflecting on OkCupid’s experiments that claimed to show evidence of widespread racial prejudice and a blatant preference for good looks over personality among its users, Rudder concluded that people, not the media, were the problem. “I do wish,” he said, “that people exercised more humanity in using these tools.” Listening to Rudder, one might believe it’s the Internet that needs to be saved from people. [Leonard, “OkCupid Founder.”]

There is something incongruous in these sometimes giddy, sometimes nonchalant responses of social media’s elite to revelations of data collection and experimentation. On the one hand, Sandberg and Rudder readily acknowledge the massive scale on which Facebook and OkCupid operate, as well as their relative secrecy. In fact, both often celebrate such scales of knowledge as beacons of a new age. At the same time, they exhort those who use social media and digital technologies to think of them as mere tools facilitating desires and intentions that always have been and will remain subject to human control. According to these conflicting accounts, we are both radically determined and radically free.

Confusion about our digital technologies and their use is not limited to the masters of Silicon Valley. A recent survey by the Pew Research Center found that while Americans say they are increasingly worried about using social media to share private information, most are unwilling to change their behavior. Fifty-five percent of those surveyed, for example, said they would share information about themselves in order to use the services and tools of digital technology and Internet companies. [Claire Cain Miller, “Americans Say They Want Privacy, but Act as if They Don’t,” New York Times, November 12, 2014; http://www.nytimes.com/2014/11/13/upshot/americans-say-they-want-privacy-but-act-as-if-they-dont.html.] The New York Times referred to this as a “paradox,” but it might be more accurately described as an increasingly widespread unease with the ubiquity, power, and demands of digital technologies.

Shifting Boundaries

Such unease is what the co-author of the present article, Julia Ticona, encountered when she interviewed more than seventy working adults, ranging in age from eighteen to sixty-two, about their interactions with their personal devices. Ticona asked her subjects to evaluate not only their own interactions with those tools but also those of friends, colleagues, and family members. The responses often revealed feelings of guilt and shame, as well as curious disconnects between feelings of autonomy and control and those of helplessness and impotence.

The ubiquity and inescapability of the new media environment proved to be a constant theme. Like other subjects, who tended to speak not of particular devices but of digital technology in general, John [this name is a pseudonym, as are all others used to identify the study subjects], a fifty-six-year-old financial manager, described his sense of the enveloping and almost irresistible power of these technologies:

Well, first of all, it’s [technology is] here; you’re not going to change it. It’s here to stay. I can sit here and bitch all day about it, but I’m not going to change it. So I think you’ve got to deal with it. You’ve got to learn how to use it, and if you learn how to use it correctly, it’s a great resource. It’s a great tool.

Similarly, for Lucy, a middle-aged nurse practitioner, the casual conflation of different devices and platforms—cell phones and online social networks—made complete sense within the broader ecology of digital life, which she experienced as a ubiquitous web of connectivity. Like John, she described it as second nature––fixed, immutable, and, quite simply, our new reality.

But it is just this omnipresent character of digital technologies that gives people pause. Among the interview participants, there was a vague sense that traditional boundaries and norms surrounding public discourse and privacy were shifting in ways they couldn’t understand. So while they expressed gratitude, optimism, and even excitement about the new environment, citing access to endless information and the ability to connect, they also admitted to feelings of embarrassment, guilt, and contempt. In our digital environment, said Lucy, we have no shame: “People don’t seem to really care that all their business is out there, whether they’re talking on the phone at the grocery store, whether they’re putting everything on Facebook, like nothing is sacred anymore.”

Despite the unease Lucy and other interviewees said they experienced, they seemed to view themselves as autonomous individuals within the digital environment. The overwhelming majority voiced little awareness that technologies, or the companies that create them, structure digital experiences or put information derived from user engagement with the media to ends that the users themselves can’t imagine. Most saw Facebook not as a company with its own agendas and ends, but simply as another tool to manipulate.

The gap between self-image and reality is clearly wide, as shown by the subjects’ struggle to make sense of their own habits and identities in this new environment. For example, when asked to discuss her experiences with personal technologies, Cate, a thirty-two-year-old medical data entry clerk living outside Buffalo, New York, spoke for many interviewees when she said she felt “sucked into” Facebook or the Web. Lamenting their individual failures of will and self-control, they unwittingly echoed OkCupid’s Rudder in expressing the view that they would be better people if they just exercised more willpower. Cate went on to say:

If I go on Facebook or something, all of a sudden I will think of an old friend that I haven’t seen in a while and I will look them up. I’ll spend half an hour looking through their page, seeing what they are doing. All of a sudden, I’ll look at the clock and be like, “Oh my God. I just spent a half an hour of my life doing this.” I am like, “What is wrong with me?” It makes me feel kind of embarrassed sometimes that I am sucked into it.

Cate wanted to see herself as an autonomous agent, able to use her digital technologies with purpose and control. But she consistently got lost, her attention dissolving as she was drawn into the endless stream of information on her Facebook news feed. Mindlessly checking pictures, status updates, and videos through clicks and scrolls, she had little idea that she was feeding a constant stream of data to Facebook’s servers and making herself a node in the network.

For Cate, the experience of being “sucked in” was even more fraught when her two-year-old son came into the picture:

Sometimes, he will be busy playing and doing something, and I’ll be sitting on the couch with my phone like ten minutes. All of a sudden, I will look up and he is looking at me. I’m like, “Oh. What am I doing?” I’ll throw my phone down and go play with him…. I want him to have my full attention and feel like he is the most important thing, which when it comes down to it, he is…. If he is busy doing something and all of a sudden…I look up and he is looking at me with that look like, “What are you doing?” Then I feel horrible.

When Cate got “sucked in” while spending time with her son, she was confronted not only with the limits of her own autonomy but with feelings of guilt about the kind of undivided attention she felt a mother ought to give her child. Throughout these episodes, she experienced the online space she gets “sucked into” as separate from her offline life.

Attempts to maintain boundaries between, and control of, these online and offline spaces take different forms. Whereas Cate tried to put her phone out of sight, Salena, a thirty-two-year-old manager at a nonprofit in the suburbs of San Francisco, had developed a plan that relied on technology to limit technology:

[I] get lost in the Internet. I’ll be like, “this is interesting,” and I’ll read it, and then I’ll Wikipedia something else, and then that’ll lead me to something else, and then I just got sucked into the Internet basically…. That’s something that I had to really fight against. Before I go on the Internet now, I have a plan for what I’m going to do, and I stick to it…. I use Asana for this. I’ll be like, “I need to pay my taxes,” and so I’m just going to go and pay my taxes. I’m just trying to be real intentional about everything I do now instead of just getting sucked into things.

Asana is a software program that is described by its developers as bringing the “speed, complexity, and scope of our modern work” under control by managing to-do lists, documents, and calendars. [“Email Is Holding Your Team Back: Get Organized with Asana,” Asana, Inc., accessed December 31, 2014; https://asana.com/product.] While Asana is typically used as a workplace tool, Salena adapted it to make her Internet use more “intentional.” Like Cate, Salena described her online activities as distinct from her real life offline. To maintain this digital divide, she vigilantly guarded these borders lest she lose herself in the consuming vastness of the Web.

Tess, a forty-year-old substitute teacher living outside Buffalo, sheepishly described another method of self-control designed to keep her tendency to stay on Facebook “for hours” in check. She simply avoided other social media:

Tess: I can stay on it [Facebook] for hours! [laughs]…. It does suck you in. I’m trying to stay away from that pin-in-trest?

Interviewer: Pinterest?

Tess: I’m staying away from that because I heard that can be addicting so I won’t even get into that! [laughs]

Like Tess, many of the people Ticona interviewed described their interactions with digital technology in terms of addiction, which they understood almost exclusively as a subjective experience characterized by a lack of self-control. Addiction, so conceived, is a pathological condition triggered by some latent property of a particular object—a drug, a technological artifact, a game, or, in Tess’s case, a website. By holding to such a strictly individualistic, psychological explanation, people fail to comprehend the complex phenomenology of their predicament.

According to psychiatrist Howard Shaffer, the word addiction doesn’t so much describe a pathology as it does a relationship between a person and “the object of their excessive behavior.” The term describes a “relationship between organisms and objects within their environment.” [Howard J. Shaffer, “Understanding the Means and Objects of Addiction: Technology, the Internet, and Gambling,” Journal of Gambling Studies 12, no. 4 (1996): 465–66.] Understanding addiction in these more relational and environmental terms requires us to consider addiction as neither a simple subjective pathology nor a latent feature of an object. It is, says Natasha Dow Schüll, a cultural anthropologist at the Massachusetts Institute of Technology, a “relationship that develops through repeated interaction between a subject and object, rather than a property that belongs solely to one or the other.” [Natasha Dow Schüll, Addiction by Design: Machine Gambling in Las Vegas (Princeton, NJ: Princeton University Press, 2012), 17.] If addiction refers to the ways in which people and objects act together, then in their descriptions of being “sucked into” a separate space of experience, many of Ticona’s interviewees described a blurring of distinctions between subject and object. “There’s no boundaries, no boundaries,” as John somewhat plaintively put it.

We can’t make sense of these addictive relationships without understanding both a particular subject’s experience and the features of an object that might facilitate such repeated interaction. In Tess’s case, Facebook’s design features encouraged her to keep interacting: all those buttons, status updates, constantly refreshed news feeds, and, perhaps above all, the numbers that compel us to keep clicking. One more Friend! One more Like! One more Share!

Yet absent from almost all of the interviews was any mention of what the assorted designers, marketers, programmers, and business leaders behind digital and social media technology might be trying to do beyond simply connect people. Any loss of agency or control, the subjects reported, was their own fault. They explained negative feelings such as guilt, shame, and embarrassment as resulting not only from a loss of agency but also from the dissolution of the boundaries between their digital and non-digital lives. None of the respondents seemed to see how such an implicitly dualist framework might be inadequate in helping them explain or inhabit the confusing space opened up by the new technologies.

The Danger in Our Dualism

There are, of course, consequences to the widespread confusions that accompany our interaction with digital and social media. Believing that we as individuals are solely responsible for our technology-suffused lives, we risk overlooking the ways in which our individual incapacity to say no to Facebook is a cultural incapacity, one that Facebook is not only keen to exploit but also eager to preserve.

While lost in the vortex of our social media and digital networks, most people don’t seem to notice that they are helping create another, digital self. With our Google searches and our Facebook status updates, likes, and tweets, we are leaving traces of our desires and longings. This networked self lives on as a digital composite to be statistically analyzed and sold to commercial interests. These digital traces persist in databases, searchable by our name but detached from our physical and even temporal being, at a scale that is unprecedented. (There were certainly traces of human lives and longings in print, but those traces were on a different scale, both in terms of who left them and how many traces could be collected and organized.) Facebook defends its policy of requiring real names and prohibiting pseudonyms as a way to ensure online civility, but the company also has a direct financial interest in making sure the digital traces it has collected match the name on our credit cards. Transparency serves capitalism, not just democracy. Facebook and Google turn their users into commodified subjects of data. [Siva Vaidhyanathan, The Googlization of Everything (and Why We Should Worry) (Berkeley: University of California Press, 2011).]

The experiences of disorientation and loss of control in a hyper-networked world so often described as negative by the interviewees are indirectly celebrated by the technology industry’s champions. For them, the transcendent state of being is connectivity, and transparency is the highest virtue. Our digital technologies enable a networked self, always connected, always open. As the prolific author and high-flying consultant Jeremy Rifkin puts it, “Connecting everyone and everything in a neural network brings the human race out of the age of privacy, a defining characteristic of modernity, and into an age of transparency. While privacy has long been considered a fundamental right, it has never been an inherent right.” [Jeremy Rifkin, The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism (New York: Palgrave Macmillan, 2014), 75.]

Techno-utopians such as Rifkin present us with a false and harmful dichotomy between total transparency (real names only!) and a disconnected privacy. From their perspective, sharing information online is an unqualified good: It is a virtue to be open and engaged with the world. Withholding information is a social transgression: It is a vice to be guarded and private. To question the merit of total transparency is to advocate a secret and suspicious “untraceable anonymity,” in the words of Harvard computer science researcher Judith Donath, a total-transparency skeptic. [Judith S. Donath, “We Need Alter Egos Now More Than Ever,” Wired, April 25, 2014; http://www.wired.com/2014/04/why-we-need-online-alter-egos-now-more-than-ever/.]

Our Digital Ecosystem

Putting total transparency and complete anonymity in opposing camps is a tactic of technophiles to obscure the true aims of total transparency and to denigrate concerns about privacy as the petty desire to conceal shameful secrets and guilty longings. In positing this dichotomy, technophiles disregard a more subtle and complex need for limits on what we expose, limits historically provided by the restraints of physical and temporal reality and cultural norms and institutions. Anchored in time and space, the thinking goes, my words and responses don’t circulate as widely or indiscriminately as might my digitized words. The materiality of our digital ecosystem operates at a remove from our bodies, one that ensures that every Facebook comment, every Amazon review, every search, will endure on a server. It allows for a permanence and centralization that is unprecedented in scope and scale. We may experience our Facebook and Twitter feeds and our digital social lives as dynamic and in flux, but on another scale, that of the server, our digital selves are fixed in massive archives we know little about.

Moreover, these new digital archives are neither private nor public—they are corporate, owned by technology companies generally unconstrained by the norms and institutions of democracies. But because many of these services are free to users, they are easily misconstrued as public utilities. Facebook, we tell ourselves, is a public message board that belongs to us. When the people interviewed by Ticona mentioned privacy, they did so not in terms of corporations harvesting data but in terms of an audience of friends and family. Privacy became a question of modesty and humility: “I can’t believe she posted a picture of herself wearing that,” rather than “I’m worried that Facebook is going to sell my information to Walmart.” Yet the desire for multiple and limited publics, the desire for ephemerality, has not been lost. Popular new apps such as Snapchat and Yik Yak allow users to share messages and pictures with a more limited audience and put an expiration date on those messages.

Even when interviewees were encouraged to discuss the ways in which the design of digital technologies might structure experiences or encourage certain behaviors, they continued to frame their explanations in strictly individual terms that only reinforced the assumed divide between the lives they led on and offline. Jane, a forty-seven-year-old food-service worker employed by a chain fast-food restaurant in Washington, DC, described some of the greatest challenges raised by digital technologies:

Jane: I heard of things being posted, like fights and people abusing animals. It’s just too bad we can’t just go find them and arrest them, and take it down….

Interviewer: What do you think it is that keeps us from doing that?

Jane: It’s probably hard to track them down…. I guess they could use IP addresses, but does that mean anything either?… It could be an Internet café somewhere in New York City; you don’t know. It’s very hard. It would take a lot of manpower, a lot. You’re not just walking the street, monitoring the World Wide Web; that’s crazy…. It’s out of control.

That video-hosting companies might in some way be responsible for the content posted on their sites was not something Jane could imagine. In fact, video-hosting sites do extensive content moderation, often at the expense of their own employees’ emotional health. [Adrian Chen, “The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed,” Wired, October 23, 2014; http://www.wired.com/2014/10/content-moderation/.] But most people seem to see such problems as ones requiring greater self-control among technology users or even individually focused therapeutic interventions such as “digital detoxes.”

Avoiding technology may sound like a noble feat of asceticism, but it is neither possible nor desirable. Technologies are part of us. They help constitute what it is to be human. To pretend otherwise is naive and self-defeating. Unplugging from our digital devices, as writer Casey N. Cep points out, “doesn’t stop us from experiencing our lives through their lenses, frames, and formats.” [Casey N. Cep, “The Pointlessness of Unplugging,” The New Yorker, March 19, 2014; http://www.newyorker.com/culture/culture-desk/the-pointlessness-of-unplugging.] The idea of “unplugging” assumes that a brief hiatus from your favorite device or app will have a cleansing effect. But who among us can truly manage a life without technologies? And whether we use social media or not, our lives are already enmeshed in a social reality that is constantly being reshaped by it.

Agency in a Digital Age

To think of social media and digital technologies as elements of a built environment designed to keep us close requires an expanded understanding of human agency in the digital age. In her 2012 book on machine gamblers in Las Vegas, Addiction by Design, Natasha Dow Schüll offers a helpful analogy. While casino operators want gamblers to think that addiction is the result of moral failings or some biological imbalance, those same operators spend their days creating gambling machines designed to solicit and maintain attention. Everything from the screen colors to the lighting and layout of the casino floor is designed to maximize users’ attention.

Facebook, OkCupid, and Google are not so different. They have material interests in incorporating certain features and scripts into their products to keep us attached through repeated interactions. Artist and software designer Benjamin Grosser has recently shown how Facebook’s widespread and fully integrated use of numbers may encourage users to reimagine themselves and friendships in quantitative terms and constantly seek out more clicks and shares. [Benjamin Grosser, “What Do Metrics Want? How Quantification Prescribes Social Interaction on Facebook,” Computational Culture, November 9, 2014; http://computationalculture.net/article/what-do-metrics-want.] In our new digital environment, our changing conceptions of friendship and intimacy are not just our own; they are manufactured, manipulated, and monetized at unprecedented scales and with unprecedented effect.

When Facebook experiments on its users, it experiments with people’s lives, because digital technologies have come to play increasingly central roles in them, even if there is widespread lack of awareness of the broader environment in which personal interactions with these technologies take place. One of the researchers for Facebook’s “emotional contagion” experiment seemed surprised to hear that people’s emotional lives had become wrapped up in social media: “That was one of the things that caught me off guard, even though maybe in hindsight it shouldn’t have.” [Jeff Hancock, quoted in Mary L. Gray, “MSR Faculty Summit 2014 Ethics Panel Recap,” August 19, 2014; http://marylgray.org/?p=301.]

Are we simply unable not to participate in these systems and technologies? As Italian philosopher Giorgio Agamben put it in a different context, today’s digital human, “deprived of the experience of what he can not do…believes himself capable of everything, and so he repeats his jovial ‘no problem,’ and his irresponsible ‘I can do it,’ precisely when he should instead realize that he has been consigned in unheard-of measure to forces and processes over which he has lost all control. He has become blind not to his capacities but to his incapacities, not to what he can do but to what he cannot, or can not, do.” [Giorgio Agamben, Nudities, trans. David Kishik and Stefan Pedatella (Stanford, CA: Stanford University Press, 2011), 44. For a similar comparison, see Evgeny Morozov, “Every Little Byte Counts,” New York Times, May 16, 2014; http://www.nytimes.com/2014/05/18/books/review/the-naked-future-and-social-physics.html.]

In our contemporary digital ecology, we have struck a Faustian bargain with technology companies. They provide important benefits and services, but at a price that many of us are increasingly uneasy with. The pervasive influence of digital technologies is most obvious in their ability to separate us from what, in Agamben’s formulation, we can not not do. Our digital and physical lives, our online and offline selves, are radically and inseparably intertwined. The traces we leave while we are absorbed in the flow of the network are central to our lives both on and off the screen. To remain ignorant of this is to deny the character of our contemporary social life and indulge in an impoverished understanding of our own agency.

So what can we do? If digital detoxing is not an option, do we just decide which services are worth turning our data lives over to? In Ticona’s interviews, she found not so much a false consciousness or a fundamental paradox as a collective unease about the widening gap between what digital and social media claim to be and what people experience them to be, about the tradeoffs digital technologies and platforms like Facebook require: Give us data and we will connect you.

Our current ethic, the way in which we imagine human, technologically enabled agency (as if there were another kind), is insufficient. “Control” is not just about how much time we spend on our iPhone or on Facebook, but, rather, about mastery of an ethical space, of the way we live within our socio-technological environment. We generally have a hard time imagining the pernicious (or, at least, suspect) effects of Google and Facebook, the ways in which we are experimented on and tracked, because we often imagine them to be impenetrable “black boxes.” [Trevor Pinch and Wiebe Bijker, “The Social Construction of Facts and Artifacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other,” in The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, ed. Wiebe Bijker, Thomas P. Hughes, and Trevor Pinch (Cambridge, MA: MIT Press, 1987), 21.] And, as Ticona’s interviews demonstrate, all responsibility is individualized. The animating assumption is that the self can be fashioned individually through gritty self-determination and autonomy.

An Ethics of Scale

To appreciate the possibilities and limits of human agency in our digital age, we need an ethics of scale that can help us make sense of the ways in which our individual practices are enmeshed in a socio-technical environment that is designed to “suck” us in even as it encourages us to blame ourselves for our failures of will. We need sharper and broader accounts of how the individual use of social media is always part of a much broader data collection and analysis effort.

Inversely, we need to understand how such large-scale aggregation and analysis depend, at the most basic level, on individual interactions. We need accounts that relate the micro and macro and tease out the gaps between the self that is shaped by data processing and the self that is shaped by our daily individual efforts to relate to others both online and offline. We need to zoom from the distant to the close, to toggle back and forth between big and small data, so that we might discern how human agency emerges along this vast and complex scale. We need, that is, a new ethic that will help us perceive the whole and hold dear the individual. [On this need for scale, see Scott Weingart, “The Moral Role of DH in a Data-driven World”; http://www.scottbot.net/HIAL/?p=40944.]

Such an ethic might help us better understand that arguments for privacy are not quaint longings for an era now past or manifestations of a self-serving desire to conceal shameful secrets. Our confusions about privacy are inextricable from similar concerns about “the public.” The increasing unease with social media and the aggregation of personal data is a sign of broader transformations in the cultural and normative structures that give shape to our public and private lives. This unease also reflects our individual and social capacities to understand the character of emerging forms of public spaces, and to decide which ones warrant our participation. Social media companies like Facebook imagine a monolithic public sphere that stands transparently opposed to an anonymous private sphere. Facebook wants to be the gatekeeper of our social lives and the central agent of our digitally mediated social interactions. But in order to foster a dynamic and open public sphere, we cannot sustain the mistaken assumption that our online and offline activities take place in separate spheres of experience.

We don’t have to impute malicious motives to Mark Zuckerberg and other tech titans to question their grand designs. The commercially successful corporate giants that structure most of our online engagements should not be seen as public utilities simply because their costs are hidden to us users. We need to develop, use, and encourage alternative spaces for expression and continue to explore ways in which we can build a more equitable and critical ecology for our digital lives.