On our recent return to the United States after a decade away in our other language, my family was struck by a change in American English. The parts of speech were sliding around. Nouns became verbs, verbs became nouns, and both became passive and adjectival. This confused us. If someone sent a text message that read, “I’ve been hammocked on a treed hill,” should we send help? Was getting hammocked like getting jacked? And what dog could tree a hill?
It was “okay” to twist usage; people “got it.” If a moisturizer ad read, “WRINKLE RESULTS IN ONE WEEK!” only a low-value consumer would wonder where on her face the wrinkle would appear.
Process was chic; agency was uncomfortable. Things happened automatically even when we did them. We no longer followed trends; trends simply trended. We didn’t take care of ourselves; we engaged in self-care. We couldn’t check ourselves in, but we could use self check-in. We didn’t obey public health orders; we were in lockdown.
As a family from two countries, we were hyphenated Americans. This sounded like heart trouble. We flew to a non-towered airport, self-concealed in a low-rise urban area, and groaned over the health-care options in our new-employee on-boarding pre-package.
We wondered: Were we condensing phrases to terms because we were typing with our thumbs? Had we come to expect listeners and readers to autocomplete and fill in syntax? Had work jargon saturated private life because Americans worked such long hours? Had a generation told by daycare providers that they were good toy-picker-uppers grown up to make a norm of behaviorist verbing? Had the passive constructions by which one avoids assigning blame (or credit) in the workplace made naming who did what seem rude?
Or did sounding technical have a political flavor? Did it announce, “I believe that science is real,” as some lawn signs in our new neighborhood did, along with other tenets of what apparently was a new, progressive Nicene Creed?
Or were people just preening, using pseudoterms to sound savvy?
What did talking like this do?
If your mom exclaims, “Oh, you picked up all your toys!” you have pleased your mom. If your mom says, “You are a good toy-picker-upper,” it means that your performance met standards. If you have thin hair, it means you don’t have thick hair. If you are follically impaired, you have a hair disability. A toy-picker-upper is an economic phenomenon, and a follically impaired scalp a medical one. Neither is personal.
Self-words used to be reflexive. What we self-did, we did on our own. We applied ourselves to a task. We self-applied ointment. Self-regard was how we saw ourselves. Now, to self-apply means to use an online application. Why do we self-apply online but simply apply on paper? The “self” comes into play when we consent to use an automated process from a third party.
That we may lack choice in the matter is elided, along with the interaction that the process replaces. When we self-do something now, we often take on a task that someone used to get paid for. Conversely, in the case of pejorative self-words like self-treatment, we usurp a task that professionals claim the right to supervise, for a fee. Or, like self-rising flour or self-sealing envelopes, we simply operate, or fail to operate, according to someone else’s design. When we self-comply, self-manage, or self-monitor, we bow to rules imposed by management in entities that we often cannot name. Our agency is restricted to compliance. Somebody, somewhere, as a matter of policy, is presenting compulsion as choice.
Detached from agency, the meanings of new terms drift. Nonprofit organizations alert supporters to “donation opportunities,” though “a chance to give” has half the syllables. Now, “donation opportunity” may also mean the organization’s chance to land a gift from a donor. From there, the donors themselves become “donation opportunities.” A chance to be good ends up as a sticky note on your back.
It seems important to be clear on who is doing what. What happens to ethics when agency is blurred? Using words that sound like terms for processes lets writers off the hook for cruelties that conscience might otherwise repress: It is science, not spleen. The term self-slaughter has come into vogue as a synonym for suicide, though it adds lurid insult to injury for the mortally depressed. Sounding clinical means never having to say you’re sorry.
The impulse to paint actions as process often has a squeamish quality, as if frank interaction were taboo. Physicians call giving their patients bad news “serious illness communication.” Prescribing daily walks is a “home-based, walking exercise behavior change intervention.” Wikipedia defines a photo shoot as “the process taken by creatives and models that results in a predetermined visual objective being obtained.” A battered spouse does not get hit, but rather “experiences concussive events.”
This squeamishness sounds like fear, as though we will be legally challenged, socially shunned, or eliminated by budget cuts unless we can cite—or sound as though we are citing—research or policy. A music reviewer insists that music “tells foundational stories, teaches emotional intelligence, and cements a sense of belonging.” A journalist reviewing board games calls them “playfully partnered activities.” She consults “an intimacy behavior therapist” for the affirmation that games stimulate neural pathways. Even Louis Menand, in The New Yorker, feels compelled to note that “art and literature have cognitive value” in an article on the tensions in today’s universities between the humanities and science, technology, engineering, and mathematics.
The implication is that what we do must be sanctioned by metrics of productivity. We shouldn’t goof around at home unless it’s to grow our neurons. Where does this stern notion come from?
In his magnum opus The Enchantments of Mammon: How Capitalism Became the Religion of Modernity, Eugene McCarraher traces the roots of the idea of self-realization, of finding joy in maximizing one’s capacities, to management dogmas concocted a century ago at the business schools of Harvard, Wharton, and Sloan. The idea went hand in glove with Taylorism: If workers work harder in their own interest, then employers’ goals ought to seem to align with that interest. Since maximal profits conflict with pay, a more abstract union had to be preached. Thus in the 1920s, Mary Parker Follett, commonly referred to as “the mother of modern management,” promoted the idea of a “crescent self” within each worker that would wax, brighten, and blossom under benign managerial cultivation.
Political theorist Wendy Brown and theologian Kathryn Tanner arrive, from different points, at the same conclusion as McCarraher: The dogma of self-realization underlies our current outlook. We are encouraged to see ourselves, Brown says in Undoing the Demos, “as self-investing human capital,” entrepreneurs of ourselves. In Christianity and the New Spirit of Capitalism, Tanner explains that “what your employer wants—the maximally efficient use of your capacities—is also what you want, what you yourself value, because you see it as part of your own individual efforts at self-realization, not something you are forced into by a foreign power through external imposition.”
The idea that corporate and personal goals are one is inculcated in young minds through teaching methods known as PBL, which stands for process-based learning, problem-based learning, or project-based learning, take your pick. Freed from droning lectures and the mandate not to wiggle, kids split into groups to observe, brainstorm, and experiment their way to reinventing the Pythagorean theorem. In the process, they acquire durable skills and a glorious sense of self-efficacy.
Teachers no longer impart knowledge; they facilitate knowledge acquisition. What students acquire is capacity, not least for teamwork. You no longer just flunk a test on material that you could later master by borrowing a book. You fail to “develop life skills” that are “vital in the marketplace,” among them “working with others.”
In the phrase self-directed learning we see the same elision of agency as in the term to self-apply. Teachers must purchase, follow, and regularly update curricula devised by remote experts. Following soporifically abstract “taxonomies” of “objectives,” teachers use “guiding words” and “question stems” to keep students on track. They encourage the use of “process verbs” that hail from adult work spheres.
Why shape students’ language with “process verbs” and “question stems”? For one thing, it guides kids to return results that fit with assessment criteria. What kind of creativity, innovation, and entrepreneurship do we venerate, exactly?
“Thrifting is probably going to be one of the biggest fashion phenomena of the 2020s,” Vanessa Friedman announces in a 2021 “Here to Help” column in the New York Times. To thrift means to shop for used things but also to sell or consign, as in “thrifted items.” Never mind the ambiguous agency. “Welcome to the age of recommerce. It’s one of my favorite new terms,” Friedman writes. She consults two experts on the best online deals; one says to include “vintage” in your Google search “so you know you’re getting pre-owned.”
Pre-owned is hot. Never mind that the hype flips thrift itself—the attempt to stretch a dollar, to not “grow the economy,” to survive outside the Market—into a billion-dollar boom. Just as repairing and making things at home is now the “DIY (Do-It-Yourself) industry,” which “aims to help customers improve their homes,” thrifting is now big commerce with a quirky name that we can google in order to “buy in.”
And here we see the source of the term-creation meaning-kidnap crisis. To create new terms, twist old ones, and tweak usage makes the new language fragment or term stick out in a “searchable” way. The very irksomeness is saleable: A torqued, awkward phrase or word can be tracked as a data point and therefore as a commodity. Call it “thrifting” and be sure to add “vintage.”
Thrift is “a thing” now. It’s having “a moment,” one of two time units that are currently “a thing.” If a moment is a saleable buzz, a decade is a market category. Seen algorithmically, we too are categories: Gen X, Gen Z, constellations of data points that corporate marketing professionals cull to identify us, guide our behaviors, and reap profits.
You might think that to embrace the jostling, competitive language trends of data-gathering would alienate us from ourselves. You might think we would assimilate the notion that we need to “buy in” to the very things that we ourselves make and do. You might even think it could make us feel that our very lives were “high-value” or “low-value.”
That’s “okay,” though. Growth serves all. If corporate America staged and won the “mommy wars” by getting two workers for the price of one, recouping the second family wage as debt, that only serves to grow the numbers. If corporations want to turn intimate details of women’s health, including images of their genitals, into a booming “fem-tech industry,” we can rejoice; it’s only what females want: “There’s definitely an increasing appetite for anything in the world which is technology, and a realization that female consumer power has arrived,” exults Michelle Tempest of the Candesic health care consultancy, in the New York Times. Exploitation and self-realization are one; the cheetah and the gazelle carcass are one meme.
In a global economy of oligarchs, to portray music and games as tools to increase productivity may constitute what Timothy Snyder, in his 2017 book On Tyranny, called “obeying in advance.” When we meekly appeal to “the numbers,” we affirm corporate authority to gauge the value of our acts, to authenticate or reject our social selves.
But isn’t that “okay” because, after all, doesn’t commerce happen in “the private sector”? Isn’t it only governments that can oppress us?
In Anatole France’s Revolt of the Angels, published in 1914 on the brink of the twentieth century’s European conflagrations, a fallen angel assures us that Satan’s war with God is “all about business.” The twentieth century’s collapses into belligerence grew out of a political tug of war in which both the right and the left insisted on governing society on the basis of productivity as reflected in “hard data.”
The National Socialist German Workers’ Party, a.k.a. the Nazis, believed that its enthusiasm for sports, health, technology, pedagogy, dog breeding, murderous racism, and resolutely inhumane medicine represented social science that would ring in a millennium of enlightened prosperity for those with “good genes.” Vanquishing this lethal ideology required that we ally with the Union of Soviet Socialist Republics, whose pursuit of historical materialism had already killed millions of people in the name of enlightened social engineering.
The problem was not only that the science was wrong. The problem was that many people believed that science had obviated the need to consider the arena known as right and wrong. Given human fallibility, was it not better simply to be guided by metrics of success or failure? At the Nuremberg trials, movers and shakers behind the horrors explained that they were not responsible: They had not done right or wrong, only facilitated a process.
The word for man in my family’s other language, Icelandic, refers, like the English word man, both to the male sex, as in “man and woman,” and to all people, as in “mankind.”
In 1980, Vigdís Finnbogadóttir, a French teacher and theater director, became the first woman to run for the presidency of Iceland. Vigdís was, and still is at ninety-two, a dignified, principled, and physically lovely person. Inevitably, her fleshly garment was “the story”: “The thing” was, she was a woman.
At last, when asked on a televised panel, “Why should people vote for you because you’re a woman?” she replied, “I don’t want anyone to vote for me because I am a woman! I want people to vote for me because I am a man.”
The grammar was clear. She meant a man, one man. Yet she also meant mankind. She demanded that her physical female self be judged as a man like any other.
It was as startling as it was incontrovertible: She was a being of our ancient kind. She was not a category. She was, she was. She demanded, Look at what I say and do.
We do things for, by, and with some reason. Since language is common ground, with a deep, rich soil of historical meaning, we can describe our acts and reasons in common terms. As a child, you picked up your toys not because you were a good toy-picker-upper but because your mother made you. (Trust me, you were not “a self-starter” in this regard.) She made you clean up because it hurts to step on blocks. You obeyed her because you needed her. You did it well in order to please her. You wanted to please her because you loved her. No one knows why we love, but we do.
It is right to act on the basis of love, not hate. It simply is.
If anyone asks you for the data on that, or to run the numbers, REBEL!
You will not be alone.
This article was not self-written. I wrote it here at my desk, with difficulty.