Part Four: Opacity in Personal Chrono-tech
As a conclusion to this series on the limits of black box metaphors in critiques of obscured technological systems, I want to offer a brief example of an alternative approach. Earlier this year, I presented this material as a lecture. Since then, a new black box has entered the marketplace—Apple’s Watch. I have not yet interacted with Apple’s “most personal device,” but I expect (largely merited) critiques about how the Watch embeds Apple’s system ever deeper in the daily routines of users. With both fewer buttons and less screen real estate with which to interact, the inputs and outputs for this system will probably be more passive and less obtrusive, even as the background software and hardware processes grow more complex. What new routines and rhythms of attention will the Watch afford, and on what algorithmic processes of surveillance, marketing, or communication will this attention depend?
We will need new audits. We will need to know, as with the iPhone, what information this new device is storing and sharing, and with whom. The Watch’s role in collecting medical data should give us particular pause in this regard. But when considering constraints on agency and freedom, we shouldn’t limit our analysis to revealing the processes at work “inside” this device. The processes by which we live with such devices deserve as much attention as the routines at work in the operating system. And we can learn a great deal about this device’s role in our lives without ever peering inside the system.
As a prompt in this direction, I’ll offer a brief tour of objects that, like the Watch, “want” to be a part of our everyday rhythms of attention, yet make “seamful” rather than seamless opacity a foregrounded aspect of our interaction with them.
Take, for example, Sejoon Kim’s Vague Clock. In contrast to Apple’s Watch, it offers the time not “on demand” (with the raise of an arm), but “on exploration” (with the caress of a hand). The clock’s nearly opaque fabric makes reading the time at a glance almost impossible. Instead, the laborer at her desk must get up and not only tap the clock face but explore it, changing a two-dimensional plane into a three-dimensional form.
The speculative designs of Anthony Dunne and Fiona Raby are also instructive here. Their 2007/08 series of objects entitled DO YOU WANT TO REPLACE THE EXISTING NORMAL? includes The Risk Watch, a watch whose opaque face carries a small nipple in place of any visible marks of temporal passage. When held to an ear, the nipple activates a small device that speaks a number which “corresponds to the political stability of the country you are in at that time.” Dunne and Raby state about this body of work that “if our desires remain unimaginative and practical, then that is what design will be.” The Risk Watch gives us what we want—a sort of single-app Apple Watch—in a way that invites us to examine both the desires we bring to personal tech and the processes we trust to grant them.
Dunne and Raby’s approach to opacity might also call to mind the NoPhone, a project launched last year via Kickstarter that achieved some unexpected, if modest, financial success. The NoPhone, billed as “a technology-free alternative to constant hand-to-phone contact,” is simply a brick of black plastic molded to the size and shape of an iPhone. Used as a replacement for one’s phone, the device aspires to deliver a different sort of “reveal,” catching the user in the act of relentless phone-checking. Like Ben Grosser’s Facebook Demetricator, the NoPhone calls to mind counter-addiction regimes, but does so with some humor, and with a desire to cast human habits into the spotlight.
Another provocative neighbor to Apple’s Watch is the Durr, a product of the Norwegian studio Skrekkøgle. As with the NoPhone, the Durr’s designers create personal technologies that use opacity to reveal something about the user’s daily activities. In this case, however, the object also introduces a modest new machinic process into the picture. Like the NoPhone or the Vague Clock, the Durr presents a wholly opaque face where a screen or dial might normally reside. Inside the object, however, sits a small vibrating motor that operates at five-minute intervals.
For a few months now, I’ve been replacing my usual watch with a Durr for a day or two each week, with enlivening effects. The Durr reveals not only my habits of watch-checking, but the relative speed at which time passes in relation to the intensity and direction of my attention. Checking email, I can’t believe how fast the Durr is going. Traveling across town on foot, the durations seem broad and wide. Five minutes is just long enough to forget the thing in many cases, just too long to be counted by the human attention clock. Its opacity depends as much on me as on the device itself. As such, wearing the Durr casts my other machinic attention regimes in a new light and invites me to reorient my body accordingly.
I could go on to mention a dozen different life-management and attention-management tools, simple things like www.donothingfor2minutes.com, or “productivity” apps such as Freedom, which disables a device’s internet for set periods of time. Where such efforts serve behavior-modification regimes, they should surely be set in the historical context of disciplinary, labor, or even religious regimes.
Set next to the growing number of algorithm-auditing efforts, however, such attention-modification works serve a different function. They show how, in the quest to understand the influence of machinic processes on human agency, there is much to be learned without ever “unboxing” the technologies at hand. As we move forward with the vital work of monitoring and interpreting the multitude of new processes at work behind our technologies of attention, we should take great care not to stop our efforts at the algorithmic reveal. We should insist on the co-presence of at least two other bodies of work in the growing intellectual spaces devoted to the critique of algorithms: that of critical race, gender, and labor studies, which reveals the differently structured lives on which the new algorithms depend, and that of design, art, and play, which casts human action and desire toward interfaces in a new light.
Kevin Hamilton is an artist and researcher at the University of Illinois, Urbana-Champaign, where as an Associate Professor he holds appointments in several academic units across theory, history, and practice of digital media. He is currently at work with Infernal Machine contributor Ned O’Gorman on a history of film in America’s nuclear weapons programs; other recent work includes a collaboration with colleagues at Illinois’ Center for People and Infrastructures on the ethics of algorithms in internet and social media platforms.