Part III: Toward Other Hermeneutics
I want to make clear here that I believe we need to keep pushing for new research—new policies and practices that help ensure just algorithmic processes at work inside our infrastructures. (See posts one and two of "Beyond the Reveal.") If our search engines, pricing structures, law enforcement or trade practices depend on or enact unlawful, unethical, or unjust algorithmic processes, we need to have ways of stopping them. We need accountability for these processes, and in some cases that will also mean we need transparency.
But, as urban studies scholar Dietmar Offenhuber points out in Accountability Technologies, accountability isn’t inextricably linked to transparency. In fact, some forms of revelation about opaque processes may do more harm than good to the public. If we make information access a priority over “answerability and enforcement” when it comes to just algorithmic infrastructures, Offenhuber warns, we may not achieve our goals.
So there may be times when "opening the box" is not the best path to dealing with potentially unjust systems. And it is almost certainly the case that our black box metaphors aren't helping us much in research or advocacy when it comes to charting alternatives.
In my own collaborative work on a Facebook user study, my co-authors and I focused primarily on a question directed to users: “Did you know there’s a black box here, and what do you think it’s doing?” The results of this study have set us on a path to at least learning more about how people make sense of these experiences. But in some ways, our work stands to get stuck on the “reveal,” the first encounter with the existence of a black box. Such reveals are appealing for scholars, artists, and activists—we sometimes like nothing better than to pull back a curtain. But because of our collective habit of establishing new systems to extricate ourselves from old ones, that reveal can set us on a path away from deliberative and deliberate shared social spaces that support our fullest goals for human flourishing.
I confess that at this point, I bring more cautions about black box hermeneutics than I bring alternatives. I’ll conclude this post by at least pointing to a path forward and demonstrating one possible angle of approach.
My critique of black box metaphors so far leads me to the following questions about our work with technologies:
- How else might we deal with the unknown, the obscured, or the opaque, besides "revealing" it?
- Do we have to think of ourselves as outside a system in order to find agency in relation to that system?
- Can an interface serve to facilitate an experience that is more than cognitive, and a consciousness not ordered by the computational?
As Bethany Nowviskie pointed out in a response to this post delivered in lecture form, we already have at least one rich set of practices for addressing these questions: interpretive archival research. Are not the processes by which a corpus of documents comes to exist in an archive as opaque as any internet search ranking algorithm? Isn't part of the scholar's job to account for that process as she interprets the texts, establishing their meaning in light of their corporeal life? And aren't multiple sensoria at work in such a process, only some of which are anticipated by the systems of storage and retrieval at hand? Understood as "paper machines" and technologies in their own right, the histories of how scholars and readers built their lives around epistles, chapbooks, encyclopedias, and libraries surely have much to offer our struggles to live with unknown algorithms.
We might also, however, look to the realms of art, design, and play for productive alternatives. Take, for example, the latest black box to take techno-consumption by storm: the Apple Watch. This object is almost certainly headed toward integration into users' lives as a facilitator of new daily routines and systems, especially among the quantified-self set. Other writers on this blog have already helpfully set the new box in the context of its precedents in meditative practices or contemporary tech labor. But as we work to understand how these new systems involve us in new, opaque processes, a glance at some more intentionally opaque neighbors might help. In my next post, I'll set a few recent objects and experiences alongside the Apple Watch to compare how they invite distinct incorporation into the rhythms of daily attention, thought, and action.
Kevin Hamilton is an artist and researcher at the University of Illinois, Urbana-Champaign, where as an Associate Professor he holds appointments in several academic units across theory, history, and practice of digital media. He is currently at work with Infernal Machine contributor Ned O’Gorman on a history of film in America’s nuclear weapons programs; other recent work includes a collaboration with colleagues at Illinois’ Center for People and Infrastructures on the ethics of algorithms in internet and social media platforms.