Ned O’Gorman, in his response to my 79 theses, writes:
Of course technologies want. The button wants to be pushed; the trigger wants to be pulled; the text wants to be read—each of these wants as much as I want to go to bed, get a drink, or get up out of my chair and walk around, though they may want in a different way than I want. To reserve “wanting” for will-bearing creatures is to commit oneself to the philosophical voluntarism that undergirds technological instrumentalism.
We’re in interesting and difficult territory here, because what O’Gorman thinks obviously true I think obviously false. In fact, it seems impossible to me that O’Gorman believes what he writes here.
Take for instance the case of the button that “wants to be pushed.” Clearly, O’Gorman does not believe that the button sits there anxiously, as a finger hovers over it, thinking “oh please push me please please please.” Clearly, he knows that the button is merely a piece of plastic that, when depressed, closes an electrical circuit, sending a current through wires on its way to detonating a weapon. Clearly, he knows that an identical button—buttons are, after all, to adopt a phrase from the poet Les Murray, the kind of thing that comes in kinds—might be used to start a toy car. So what can he mean when he says that the button “wants”?
I am open to correction, but I think he must mean something like this: “That button is designed in such a way—via its physical conformation and its emplacement in contexts of use—that it seems to be asking or demanding to be used in a very specific way.” If that’s what he means, then I fully agree. But to call that “wanting” does gross violence to the term, and it obscures the fact that other human beings designed and built that button and placed it in that particular context. It is the desires, the wants, of those “will-bearing” human beings that have made the button so eminently pushable.
(I will probably want to say something later about the peculiar ontological status of books and texts, but for now just this: Even if I were to say that texts don’t want, I wouldn’t thereby be “divesting” them of “meaningfulness,” as O’Gorman claims. That’s a colossal non sequitur.)
I believe I understand why O’Gorman wants to make this argument: The phrases “philosophical voluntarism” and “technological instrumentalism” are the key ones. I assume that by invoking these phrases O’Gorman means to reject the idea that human beings stand in a position of absolute freedom, simply choosing whatever “instruments” seem useful to them for their given project. He wants to avoid the disasters we land ourselves in when we say that Facebook, or the internal combustion engine, or the personal computer, or nuclear power, is “just a tool” and that “what matters is how you use it.” And O’Gorman is right to want to critique this position as both naïve and destructive.
But he is wrong if he thinks that this position is entailed in any way by my theses; and even more wrong to think that it can be effectively combated by saying that technologies “want.” Once you start to think of technologies as having desires of their own, you are well on the way to the Borg Complex: the conviction that resistance to technological change is futile. We all instinctively understand that it is precisely because tools don’t want anything that they cannot be reasoned with or argued with; grant them wants, and they become agents whose demands can only be met, never contested. From there it is easy to be intimidated by the sheer scale of technological production in our era, and eventually to talk even about what algorithms do as though algorithms aren’t written by humans.
I trust O’Gorman would agree with me that neither pure voluntarism nor purely deterministic defeatism is an adequate response to the challenges posed by our current technocratic regime—or to the opportunities offered by human creativity, the creativity that makes technology intrinsic to human personhood. He seems to think the dangers of voluntarism so great that they must be contested by attributing to tools what can only be a purely fictional agency. I believe that the conceptual confusion this creates costs us a necessary focus on human responsibility and leaves us unable to confront the political dimensions of technological modernity.