With the recent NLRB ruling in the Northwestern football case, in which college football players were deemed eligible to unionize, the question of work on campus has reared its ugly head once again. However narrowly the ruling was about the behemoth that is NCAA sports, it was yet another sign of the increasing professionalization of all things academic. Even those students who were thought to be playing games are working. Whether it's graduate students or student-athletes, there is a lot more work going on on campus than people had previously thought. In the good old days, work was something you did on the side while you studied. Now, it might be the only thing you do.
In response to the decision, voices from inside the university were familiarly hypocritical. There’s just too much money at stake not to preserve the myth of the amateur or the apprentice. (In fact, it’s appalling how much money is in play in a system that still largely uses medieval guild models to understand itself.) Voices in the press, on the other hand, were typically nostalgic. Wouldn’t it be nice if this weren’t the case? Isn’t there some way of restoring the work-study balance so we can imagine real student-athletes and not financially indebted apprentices? Remember Greece?
Rather than banish the idea of work like some unclean vermin out of a Kafka story, we should take the opportunity to look it more squarely in the face, and not just in the world of college sports. Work is everywhere on campus today. It's time we accepted this fact and started rethinking higher education accordingly, beginning with that most hallowed distinction between working and studying that informs almost everything else we do.
To many, conflating work and study will seem to collapse categories that are traditionally understood as opposites. That’s why we have two words for them, after all. But they're also supposed to be sequentially related. You don't just do one instead of the other; you are supposed to move from the one to the other.
Imagining work as part of study, rather than its opposite or its aftermath, challenges this sequence, and especially our assumptions about how well it works. We largely operate with what we could call a mimetic theory of learning, in which students imitate, usually poorly, “real” work, and at some point presumably stop simulating and start producing. No one knows when this is supposed to happen, or how. The failure rates in graduate school, and the difficulties most undergraduates face in moving from school to jobs, testify to how rarely this process actually puts people to work. And yet we insist on keeping "work" at bay.
My point is that the growing presence of work in the academy might in fact be good for learning. Amid all the anxiety about business models entering the university, whether it’s students who mostly just play sports, graduate students who spend a lot of time teaching, or faculty who are subject to the market demands of student-consumers, the thing we haven't considered is the act of learning itself and the vast majority of students who engage in it. What happens to them when learning looks a lot more like work?
In the humanities, to take the example I'm most familiar with, when students write papers for a class they are writing papers that are supposed to resemble good papers and usually do not. There is a theory of learning-by-simulation that governs the practice. (I’ll ignore exams, since we all know those are just compromises with the devil of overpopulation.) Students’ papers, and in most cases the grown-up ones they are supposed to emulate, are socially worthless. They don’t produce anything. They are poor copies, in Plato’s sense. Students may or may not drop by to pick them up at the end of the semester and, if they don’t, we throw them in the trash. A mundane detail, yes, but a telling one.
“But what about the growth!” say the pundits. How else are students supposed to reach their full, expressive intellectual capacities without these exercises? Yes, it is like exercise: you practice to simulate the real game. It’s not as though kids playing basketball in sixth grade aren’t imitating their grown-up counterparts; that’s what learning is.
But here’s the difference. What the Northwestern ruling shows us is that by the time students reach higher education, they are no longer imitating. Their playing is work in its own right and is being compensated accordingly. (Well, at least the coaches and administrators are being compensated.) Why not have the same expectations for the rest of our students? And certainly for graduate students, who are way past the sell-by date of learning by imitating? Why keep alive the pretense of simulation when in practice all sorts of work are already going on, with potential for even more?
Because we’ve been doing everything on a simulator deck for so long, it’s hard to imagine what this might look like. But try to picture more assignments and activities that aim to solve real-world problems, even at the simplest of levels. Wikipedia pages are an obvious first step. Why not teach composition in this public space? Why not have students make new knowledge while honing their writing skills, building a suite of pages around an overlooked cultural topic? Along the way, they would also learn what it means to edit, a skill most of them very much need.
It would also teach them how to relate to other people’s writing. As we move more and more from a book-based world to a digital infrastructure, why not teach students how to contribute to that process? I don’t mean have them correct OCR transcriptions instead of talking in class (too Matrix-y?). But encoding texts and tagging them with information that can be used by others are certainly activities that require a good deal of thought and critical judgment. At the same time, that information is useful to others: think of the boon to historians of having more material available in digital archives. Perhaps students might apply their new-found knowledge about the novel (Jane Austen, no doubt) by writing an algorithm that looks for something textually interesting in the myriad pockets and corners of the Internet. There is so much to analyze out there; why not do it rather than pretend to? I'm imagining something like the Maker Movement, though less artisanal in spirit and more analytical and applied.
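To make that last suggestion concrete, here is a minimal sketch of what such an assignment might look like, written in Python against a public-domain Project Gutenberg text. The novel, the URL, and the question being asked (how often each major character is named) are my own illustrative choices, not anyone's actual syllabus.

```python
# Toy version of the kind of assignment described above: pull a
# public-domain novel from Project Gutenberg and ask a simple,
# checkable question of it -- here, how often each major character
# is named across the whole text.
import re
import urllib.request
from collections import Counter

# Pride and Prejudice, plain-text edition on Project Gutenberg (illustrative choice).
URL = "https://www.gutenberg.org/files/1342/1342-0.txt"


def fetch_text(url: str) -> str:
    """Download the plain-text novel and trim the Gutenberg boilerplate."""
    raw = urllib.request.urlopen(url).read().decode("utf-8")
    start = raw.find("*** START")
    end = raw.find("*** END")
    return raw[start:end] if start != -1 and end != -1 else raw


def name_counts(text: str, names: list[str]) -> Counter:
    """Count whole-word occurrences of each character name."""
    counts = Counter()
    for name in names:
        counts[name] = len(re.findall(rf"\b{name}\b", text))
    return counts


if __name__ == "__main__":
    novel = fetch_text(URL)
    characters = ["Elizabeth", "Darcy", "Jane", "Bingley", "Wickham", "Lydia"]
    for name, count in name_counts(novel, characters).most_common():
        print(f"{name:10s} {count:5d}")
```

Run as a script, it prints a ranked tally of names. The tally itself matters less than the fact that students are posing a question to a real text with real tools and producing an answer someone else could check, extend, or dispute.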
The detractors will, of course, have seizures about all of this. They will say it overlooks the disinterested nature of learning and undoubtedly marks the end of the life of the mind. My colleagues in the humanities might argue that it smacks of a science system that prioritizes laboratory doing over seminar simulation. (The sciences, after all, have been doing this for a long time.) Prepping for cubicle life is no way to think about Aristotle, they would insist; whither the critical thinkers? Once those ivory towers are opened to the world of work, there will be no end to it.
This is where I think the either/ors have it wrong. First, I’m not a big fan of pretending. We can pretend that work hasn’t infiltrated campus, but that doesn’t make it so. For those on the outside looking in, like the judicial system, we are in a serious state of denial. Ostrich-politics, as they say in Germany, isn’t a very effective or attractive technique (not least because of one's embarrassing posture and the rear-end exposure).
Second, and I think more important, we can do this in ways that are intellectually valuable. We need to get over our fear of the simple: simple things, when brought together over time, add up to much more. Witness Wikipedia. You can imagine any number of smaller-scale forms of student-curated information. In fact, this might be a very useful way to frame a new kind of work-study stance. Instead of apprentices simulating their superiors, students become curators, making and sharing knowledge simultaneously, at different levels of scale.
I think the most compelling reason is also the least appetizing: in the end we must. Higher education has simply become too expensive to justify itself as a pasture for the world’s elite. Or to put it another way, that’s all it is unless we do something differently. It’s time to get to work.