The Cultural Contradictions of Modern Science   /   Fall 2016   /    Book Reviews

Rise of the Machines

Leann Davis Alspaugh

Anxiety has played an unexpectedly large role in the history of word processing.

The arrival of a new computer usually inspires equal parts anticipation (more power! more features!) and trepidation (will I lose all my files?). It doesn’t typically involve a crane and the removal of a window. But it did on the day in 1969 when British novelist Len Deighton took delivery of an IBM MT 72. As Deighton watched workmen pry out a window of his Georgian home and lift the 200-pound magnetic tape typewriter into his second-floor study, he had a moment of doubt: “I was beginning to think that I had chosen a rather unusual way to write books.” Deighton’s second thoughts perhaps had less to do with how the machine was damaging his home than with how its use might change his writing. As digital humanities scholar Matthew Kirschenbaum demonstrates in Track Changes, anxiety has played an unexpectedly large role in the history of word processing.

Initially, word processing equipment was ridiculously expensive. Early adopters such as Deighton—whose 1970 book Bomber Kirschenbaum considers the first novel written on a word processor—and science-fiction writer Jerry Pournelle were willing to pay as much as $10,000 for bulky, noisy machines. For this privilege, they had to absorb highly technical owner’s manuals, work with cumbersome equipment and tiny screens, master complex keystroke commands, outsmart glitch-ridden software, and trust their work to comically limited and finicky storage media. None of this comported easily with word processing’s promise of fluent, stress-free work.

In addition, new users learned that publishers were often reluctant to accept manuscripts prepared on a word processor. It was as though their authors had somehow cheated or were less creative than the ink-stained wretches hunched over battered typewriters. Even on the West Coast, the birthplace of word processing software and computer technology, there was suspicion about word processing’s implications for the authenticity of an author’s work. In 1978, Kirschenbaum relates, Bonnie MacBird was working on a screenplay for a science-fiction film when she visited Xerox’s Palo Alto Research Center. There she met the legendary computer developer Alan Kay, originator of the term “personal computer.” (A year later, another key meeting would take place between Kay and one Steve Jobs.) Kay excitedly showed MacBird the Alto system, with its Gypsy interface featuring WYSIWYG—What You See Is What You Get—printing capabilities. This early word processing technology boasted a variety of formatting options and fonts. But for her Tron screenplay, MacBird insisted on using Courier, a font that produces letters resembling those made by a traditional typewriter, because, as she recalled, “only something that looked like it was hot off a writer’s typewriter would be received with interest by the studios.”

Authors also worried that using a word processor might take away autonomy or disrupt the writing process. John Hersey (Hiroshima, A Bell for Adano) contended that “unless you write with a pencil you haven’t chosen the words.” There were also very real fears about the stability of the equipment and the mysterious processes involved in saving one’s work. Jimmy Carter, for instance, saw these fears come to excruciating fruition when he lost most of his post-presidential memoir after pressing the wrong key on his $12,000 Lanier word processor. What Kirschenbaum calls “techno-fatalism” persists even now, in spite of word processing’s ubiquity. We don’t really know how it works, yet we trust it implicitly.
