If you listen to the machine telling you how to get out of it, you only get sucked into it more, like a con artist who lulls you into trust by telling you he is conning you. The promised liberation from technology is usually just another technology that you don’t recognize as such. This is one reason why a fuller appreciation of our diverse techniques is so vital.
In the quest to understand the influence of machinic processes on human agency, there is much to be learned without ever “unboxing” the technologies at hand. As we move forward with the vital work of monitoring and interpreting the multitude of new processes at work behind our technologies of attention, we should take great care not to stop our efforts at the algorithmic reveal.
In some ways, our thinking about our technologies and algorithms risks getting stuck on the “reveal,” the first encounter with the existence of a black box. Such reveals appeal to scholars, artists, and activists alike; we sometimes like nothing better than to pull back a curtain. But because of our collective habit of establishing new systems to extricate ourselves from old ones, that reveal can set us on a path away from the deliberative and deliberate shared social spaces that support our fullest goals for human flourishing.
Maybe, by structuring our engagement with Facebook’s opaque processes through the black box metaphor, we have set ourselves up to construct a new black box, and to ignore the ways in which our relations to others, within and without the present system, have been changed by our newfound awareness.
More than a mere Taylorist repeater of actions, the new ideal worker of postwar Human Factors research not only acts but perceives, following learned evaluative routines that correlate sensation to action. The ideal postwar laborer is not a person of a particular physical build, conditioned to perform particular motions, but rather a universalized collection of possible movements, curated and selected according to mathematical principles.
Many of the creators of these technologies want users to attribute a certain power to these algorithms and have shielded the details from view. Ultimately, I think the most appropriate response is a kind of intellectual humility in the face of technology we are detached from, without letting that humility veer into fear, veneration, or mockery. Only then can we engage with algorithms free of undue emotion and try to see, even if only a bit, what they are actually doing.