We are only now beginning to understand why the unstated norms that shape the design and use of computational algorithms need to be made more explicit—and therefore subject to critical public debate. For now, Google and Facebook don't have mastheads or editorial pages. The names of the people who make judgments about what an algorithm does are hidden. All we have is the veneer of mechanical neutrality and the invocation of an objectivity that operates beyond the human.
The virtual dimensions of assembly may yield insights into how we understand more traditional assemblies and the legal protections we assign to them.
We need to think more about the process of cultural modeling. How do we model a cultural subset through a data set (a generation, contemporary television), and how do we model a cultural practice or concept through a particular measurement? These aren't easy questions, but answering them is the prerequisite for correcting the just-so stories of journalistic cultural criticism.
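To make the two questions concrete, consider a minimal sketch in Python. Everything in it is an assumption made for illustration: a toy corpus stands in for the data set, a pair of year boundaries stands in for "a generation," and a type-token ratio stands in for a measurement of a cultural concept such as lexical range.

```python
# A hypothetical sketch of the two modeling moves named above. The corpus,
# the date boundaries, and the choice of type-token ratio as a proxy for a
# cultural concept are all invented for illustration.

corpus = [
    {"title": "Work A", "year": 1995,
     "text": "the sea the sky the long quiet sea"},
    {"title": "Work B", "year": 2012,
     "text": "signal noise signal feed scroll feed"},
    {"title": "Work C", "year": 2015,
     "text": "scroll feed like share scroll repeat"},
]

def model_subset(works, start, end):
    """First move: operationalize a 'generation' as a date-bounded slice."""
    return [w for w in works if start <= w["year"] < end]

def type_token_ratio(text):
    """Second move: operationalize a concept (say, lexical range) as a
    single measurement: distinct words divided by total words."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens) if tokens else 0.0

for work in model_subset(corpus, 2010, 2020):
    print(work["title"], round(type_token_ratio(work["text"]), 3))
```

Each choice in the sketch (the tokenizer, the date boundaries, the proxy measure) is itself an interpretive decision of exactly the kind the argument asks us to make explicit.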
We have a difficult time imagining the future of the humanities beyond the anxieties of professors and the failures of university administrators. And when we invoke the humanities, we are actually speaking about a beleaguered and exhausted profession. There are only professors and their disciplines here. And both are trapped, as Nietzsche would say, in a "castrated" passive voice: "The humanities are compelled . . . ." There are no agents in this drama, just put-upon, passive professors.
If we think of Facebook and Google and the computations in which we are enmeshed merely as information-processing machines, we concede our world to one end of the scale, a world of abstracted big data and all-powerful algorithms. We forget that the internet, like any technology, is both a material infrastructure and something we do.