The Post-Modern Self / Spring 2017

Knowing Together

The Emergence of the Hive Mind

David Bosworth

Montage of (left) ENIAC (Electronic Numerical Integrator and Computer), Moore School of Electrical Engineering, University of Pennsylvania, 1946; PJF Military Collection/Alamy and (right) bee hive; Shutterstock.

Homegrown terrorism, global warming, rampant economic inequality, financial corruption and corporate tax-dodging on an epic scale: The American project is surely in need of rescuing. But what sort of person can step in now to save our day? Despite the old narrative preferred by the NRA, the armed descendants of Natty Bumppo and John Wayne—who reliably emerged from their self-reliant solitude in the American wilderness to rescue their communities—are hardly equipped to defend us today against ISIS recruiters, cyber-thieves, or corporate polluters. To a postmillennial generation whose vision of the wilderness is streamed through a smartphone and whose members are (in the digital sense at least) almost never alone, the “lonesome cowboy” of American lore must seem a strange dude, indeed: nostalgically attractive in certain ways, perhaps, but also ludicrously irrelevant to the crises at hand—in no plausible way a moral metaphor for effective action.

All heroic myths exaggerate. But when an old paragon falls out of sync with a people’s changing reality, he becomes less an admirable guide than a figure of fun, his once virtuous parable the butt of populist parodies. In 1605, a similar crisis was poignantly inscribed in the tragicomic character of Don Quixote, who strove to revive medievalism’s romantic rituals and chivalric code in a proto-modern era of Machiavellian schemers and Puritan plain-speakers, provoking in his contemporary readers an intensely ambivalent reaction of nostalgic compassion and condescending laughter.

Four centuries later, we have entered an era as ambivalent and contested as his. In ways that are by turns thrilling, confusing, and frightening, America has been shedding many of the core presumptions of its foundational past as the first modern society, struggling through a cultural transition between ruling common senses a century in the making. We are undeniably post-modern now, in the literal sense that we are living after the height of the modern era, but we are doing so without having achieved as yet a coherent consensus as to what ought to succeed it.

Despite the scapegoating of our culture wars, the skirmishes of which have been the feverish symptoms of this deeper incoherence, the primary forces undermining our self-conception as a modern people have been self-induced and are ironically inseparable from our collective investment in material progress. We do “make sense” through the evidence of our senses, and our radically empowered technological tools, now reflexively embraced by every subset of our fractious body politic, have changed the ways that evidence is arrayed, weighed, and routinely shared. Completing a process that began with the earliest electronic media, the daily use of our digital devices with their search engines, shareware, and wiki-empowered social networking sites has been revising our expectations as to what seems natural, right, and delightful to behold. And it has been doing so in ways that undermine many of modernity’s core beliefs about the good as well as the true and the beautiful—about who a hero is and what he or she ought to do.

In a deliberate play on the word science, the intellectual achievements of which have been the hallmark of the modern era, I have been calling these post-modern ways of assessing the world conscientious thinking. The history of the word—combining a prefix (con) meaning “with” or “together” with a root (scientia) meaning “learning” or “knowledge”—defines both the sort of reasoning it describes and how that reasoning differs from the default practices of the modern mind. Whereas the science of modernity strove to know the world through an isolation of parts and specialization of thought, post-modern thinking now aims to know with. It selects con-scientious methods that can account for the “togetherness” of experience, naturally preferring interrelation over isolation, hybridity over purity, and the authority of consensus over the sovereignty of individual expertise.

Shifting Away from the Autonomous Self

Collaborative, interdisciplinary, multisensory, and multicultural, the conscientious mind strives in multiple ways to combine thought with feeling; the familiar with the foreign; this medium, genre, or discipline with that. In contrast to modern reasoning, whose primary metaphor was the atom (including the social atom of the lonesome hero), the post-modern mind prefers to attend to an entire field of interrelated effects.

This ongoing revision can be found in every discipline, from biology and anthropology to art and physics, and the disputes it has sparked are not limited to strictly intellectual pursuits. The contention between modernity’s scientific and post-modernity’s conscientious modes of reasoning has erupted in every domain of American life, revising the ways classrooms are run, money is made, crimes are committed, mates are chosen, and children are raised. Our default conceptions of space and time, and of the proper relationship between self and society, have been changing at an ever-accelerating rate. From that long list I am most interested here in the last—that is, the shift away from the modern conception of the autonomous self and, with its decline, the unanticipated emergence of the post-modern person in American life.

An Existential Joke

“We’re all in this together…by ourselves,” Lily Tomlin once quipped, her joke encapsulating an enduring tension in the human condition that, ancient or current, each tribe, clan, or nation-state has had to adjudicate. It expresses an ever-present potential conflict between our social allegiances and our self-centered appetites, both of which are native to us, and either of which the prevailing ethos of any society might choose to favor in its own place and time. To an extent unmatched in any other era, the emergence of liberal modernity in the West endorsed a life lived and comprehended (in much the same sense that books then were being silently read) “by ourselves,” and as a consequence, it fashioned over time—first in Europe and then more purely in America—an atomistic scheme that preferred and protected individual rights, intellectual specialization, an aesthetic point of view, and an economy of entrepreneurial free agents.

Initially, however, this assertion of an undomesticated individualism led to widespread anomie and anarchy; as the sorts of devastation initiated by Shakespeare’s villains demonstrated, a feudal conception of community could not withstand the sociopathic machinations of the Machiavellian man. It was only after a toxic period of civil and sectarian warfare that the moral imagination of the West found the cultural means to safely adopt modern reasoning, restraining it in ways that managed to sustain a wholly new version of the existential equilibrium expressed in Tomlin’s one-liner. The Protestant Reformation’s intense insistence on the necessity of moral introspection; the entrepreneurial economy’s creation of and willing submission to contractual law; the mythic invention of a lonesome hero who, though radically independent, always came to the rescue of his endangered community: These newly crafted cultural forms protected the quality of “togetherness” even as they licensed modernity’s emphasis on social, economic, and intellectual atomism.

But even the most astutely stable of social systems fall, and when they do, some of the forces they repressed and ideas they discounted can re-emerge from the basement of the collective unconscious, roughly hewn but newly invigorated. Which is to say that the evolution of human consciousness and culture proves to be cyclical as well as progressive, a feature that the modernized mind, future fixated, has difficulty apprehending. In many obvious ways, our post-modern life bears no plausible likeness to the pre-modern period. Yet on the fundamental issue of how to adjudicate that abiding tension in the human condition between our collective and individual natures, contemporary life increasingly endorses the medieval reading more than the modern one.

The broad subordination of economic labor within huge corporations and government bureaucracies; the merger of individual savings into both private mutual funds and Social Security; the near-instant evocation of mass rage or pity via satellite-relayed images (as provoked, for example, by 9/11, the most recent Haitian earthquake, and the Newtown and Orlando massacres); the proliferation of cell phones, e-mail accounts, and social websites, with their dramatic boost in the sheer quantity of human interactivity and the establishment of so-called online communities: In so many ways, the actual experience of everyday life is tacitly insisting on the reality and the value of “togetherness” over the reality and value of pursuing happiness “by ourselves.”

We are increasingly conducting our lives in a social and informational togetherness, while still evaluating them according to a traditional American ethos based on the value of living “by ourselves.” As in the 1760s, when the colonials still thought they believed in the British monarchy but were already acting less like royal subjects than independent citizens, our customary beliefs and current behaviors no longer cohere, and we await an equivalent revolution in self-conscious recognition that can render our lives harmonious again.

From Closed Book to Facebook

The emergence of modernity’s atomistic worldview was inseparable from the rapid spread of European literacy in the late sixteenth and seventeenth centuries. Unlike conversation or public recitation, the exchange of knowledge in silent reading occurred “in here,” within the private sphere of the reader’s own mind; it was under that reader’s control, in a solitary process that encouraged enhanced analysis, independent thinking, and self-reflection. To read required withdrawing one’s attention from the immediate sensory and social environment, a process that, over time, deepened the lingering sense (as much a feeling as a thought) that one was, inherently, more apart from than a part of both the natural and the social worlds: that we were, at our core, less in this “together” than “by ourselves.” This new conception of sovereign self-sufficiency was captured in its most optimistic tenor by the Elizabethan poet Sir Edward Dyer:

My mind to me a kingdom is;
Such perfect joy therein I find
That it excels all other bliss
Which God or Nature hath assigned.[1]

[1] As a contemporary of Shakespeare, a friend to Sir Philip Sidney and fellow courtier, Edward Dyer lived at the height of England’s literary renaissance. His poetry was much admired in his time, but little of it has survived. And although “My Mind to Me a Kingdom Is” is commonly attributed to him, his authorship of it is by no means certain.

Not all of Dyer’s contemporaries found this new condition of psychological segregation from the natural and social worlds quite so blissful, and the new internal sense of the self’s rightful authority over the mind’s “kingdom” frequently clashed with the subservient role that most of the educated were forced to play in a still largely feudal political economy.

So it was that even as the print revolution extended the ideological reach of governments, literacy was cultivating the characteristic moods and messages of a newly sovereign self: its private ambitions and discrete point of view, its sense of social estrangement and personal entitlement, its disruptive demands for religious, economic, and political freedoms from a static social order ill-designed to grant them. The double challenge that followed was how to moralize this intellectually powerful but potentially antisocial individualism, turning the Machiavellian man into a responsible citizen, and how to reorganize society to both admit and refine the freedoms that new citizen required.

After Shakespeare’s soliloquies, the literary artifacts that most clearly reveal the psychological evolution of this modern sense of self are the secular diaries and spiritual journals that began to appear in the seventeenth century. These self-reflective instruments of a newly literate middle class were not intended to be read by others, much less broadly published. As complements to the silent reading experience, they supplied instead a separate mental space where the individual could actively explore his hopes and fears, making sense of the world and of himself without threat of ridicule or official censure. As a self-generated version of the bound book, the locked diary is the mundane object that best captures the underlying character of this modernizing self: its sense of segregation and self-containment, its insistence on the dignity of privacy as both natural and right, and its consequent need to strictly control the border between personal expression and public revelation.[2]

[2] The two great cross-Atlantic examples of the emergence of the private diary in the seventeenth century are the ones kept by Samuel Pepys (1633–1703), a member of Parliament, and Samuel Sewall (1652–1730), a judge in the Massachusetts Bay Colony, who participated, much to his later shame, in the Salem witchcraft trials.

Weakened during the television era, that wall has now been obliterated by the Internet and the ethos of transparency that has arisen with it. By 2011, there were already more than 181 million active blogs online, with thousands more added each day.[3] As of the third quarter of 2015, Facebook had 1.55 billion active users, while the individual adults in this cohort had, on average, 338 “friends” each, to whom they were exposing their opinions, photos, and daily activities.[4] In the same year, Twitter was trending toward 300 million active users, who, like so many fireflies, were fighting the anonymous night of mass society by lighting up the Internet with their every whimsical opinion in 140 characters or less.[5] Meanwhile, to inform and entertain YouTube’s 1.3 billion viewers, 432,000 hours of new videos, recording activities from car repair to self-mutilation, were being uploaded every day.[6] The trend is unequivocal and, absent authoritarian intervention, irreversible: Rather than bind, contain, and lock away its innermost thoughts and feelings, the single self today is driven to project them into our highly porous public sphere of virtual information. The modern pursuit of privacy, as encapsulated in the locked diary, has rapidly flipped into the post-modern imperative of publicity, as pursued through the online posting.

[3] For the number of blogs, see “Buzz in the Blogosphere: Millions More Bloggers and Blog Readers,” Newswire, March 8, 2012, http://www.nielsen.com/us/en/insights/news/2012/buzz-in-the-blogosphere-millions-more-bloggers-and-blog-readers.html.
[4] For Facebook, see Statista, “Number of Monthly Active Facebook Users Worldwide,” accessed November 30, 2016, http://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/.
[5] For Twitter, see Statista, “Number of Monthly Active Twitter Users Worldwide,” accessed November 30, 2016, http://www.statista.com/statistics/282087/number-of-monthly-active-twitter-users/.
[6] For YouTube, see Statistic Brain, “YouTube Company Statistics,” accessed November 30, 2016, http://www.statisticbrain.com/youtube-statistics/.

Even the old medium must now submit to the new imperative of self-exposure. After re-reading a boxful of her old teenage journals, Sarah Brown began sending painful and embarrassing excerpts to a group of friends online, and this evolved in 2005 into a Brooklyn-based reading series called “Cringe,” in which volunteers would share their adolescent diaries, journals, letters, and poems with an appreciative crowd. Given its own Facebook page, the idea rapidly spread to many other cities. Brown eventually published an anthology of representative excerpts, and others who were drawn to the notion of outing their teenage angst created a performance piece, “Get Mortified,” that has been staged in theaters from coast to coast.

These amateur authors and their audiences are not just laughing at the juvenile insecurities captured by their formerly secret jottings; they are also lightly mocking the ethos of privacy that kept them secret for so long. The adult fondly condescending to the child she once was coexists with the cybercitizen condescending to the declining values of a literate modernity—just as, four centuries ago, Cervantes’s readership was nostalgically laughing at the chivalric code. In the Internet age, when transparency has become imperative and “sharing” the new norm, everyone knows that we are, primarily, all in this together rather than by ourselves.[7]

[7] For the history of “Cringe” readings, which, as of 2016, were still occurring in various locales, see Cringe, “Que Sera Sera: Cringe,” accessed November 30, 2016, http://www.queserasera.org/archives/. The “Get Mortified” crew has turned the concept into a multidivisional industry with films, books, and a television interview show on the SundanceTV channel, in addition to their live performances. See Mortified Nation, “The Mortified Yearbook,” accessed June 25, 2016, https://web.archive.org/web/20140228205725/http://www.getmortified.com/about/yearbook.php.

From Couple to Cluster, Pair to Pod

What does it mean in the practical sense to have 338 friends, the average number for an adult Facebook user? Although ambitious extroverts in the past did strive to sustain a wide circle of acquaintances, their ways of doing so were constrained by the media that were available for personal communication. Prior to literacy, face-to-face conversations and messages orally conveyed secondhand through travelers were the only ways to interact with family, friends, and colleagues. Close relationships were limited to one’s immediate locale, and could not be easily sustained if one party moved away. Once literacy became widespread and postal services were established, the personal letter greatly extended the reach of social interaction, making tele-communication between family and friends practical for the first time. Years later, the telephone added the spontaneity and emotional richness of the spoken word, transforming conversation itself into a telecommunication. In each of these instances, however, the social connection was primarily point to point, one to one—writer to reader or speaker to auditor. The grammar of interaction strongly enforced by both the weekly letter and the daily phone call was binary; we telecommunicated primarily in couples—a couple of friends, a pair of lovers, siblings, or colleagues.

As their metaphorical names suggest, the Internet and the Web have dramatically expanded the possible patterns for social interaction at a distance. With just a few strokes on one’s keypad, any single digital message, verbal or visual, can be disseminated in multiple directions at once, potentially going viral and lighting up smartphones around the world. One-to-one communication is still common, but as the surge in blogs, podcasts, chatrooms, listservs, interactive games, and networking sites has demonstrated, what people can do, thanks to technological advances, they will do. Intensely social creatures, human beings will, it seems, seize every opportunity to expand the range and variety of their interactions. With that expansion, the psychological shield previously mustered to protect the intimacy of the couple (whether friends or lovers) from the intrusive crowd has begun to dissolve, and the once private dialogue between pairs, like the old monologue of the diary, is increasingly shared with the larger group, clique, posse, or pod. As the old pop song put it, “one” may be “the loneliest number that you’ll ever do.” But, in the digital era especially, “Two can be as bad as one. / It’s the loneliest number since the number one.”[8]

[8] Harry Nilsson, “One,” on Aerial Ballet (record album), RCA, 1968.

Group texts and e-mails, webpage postings, video and photo exchanges on YouTube, Snapchat, and Instagram, smartphone apps like Google Hemisphere that allow friends to keep track of one another’s geographical location in real time: The technological ease of social connectivity has magnified the urgency to make (and keep in touch with) as many friends as possible. We are becoming the paparazzi of our own lives, each Facebook update and selfie upload a new episode in a “reality TV show” staged online. This trend, too, is measurable in a way. The media research company Nielsen reported that in 2010 Americans spent close to a quarter of their online time using social media, an astounding 43 percent increase from the preceding year. By 2016, time spent on those sites had crept even higher.[9]

[9] See Andrew Keen, “Sharing Is a Trap,” Wired, April 3, 2011, http://www.wired.co.uk/article/sharing-is-a-trap. The proportion of online time devoted to social media use had risen to 30 percent by 2016. See Jason Mander, “Social Media Captures 30% of Online Time,” globalwebindex, June 8, 2016, http://www.globalwebindex.net/blog/social-media-captures-30-of-online-time.

Here, the question naturally arises whether more is necessarily better. Doubting the depth and sincerity of the connections made through our social media, critics remind us that the dominant platforms are not as free as they might seem. In the cautionary words of Jaron Lanier, one of the pioneers of virtual reality, “The whole idea of fake friendship is just bait laid by the lords of the cloud to lure hypothetical advertisers,” who will then try to use the voluminous data gleaned to target consumers and induce viral conformity via peer pressure.[10] And along with the joys of enhanced sociability—the “bliss” of belonging to the hive mind as opposed to Dyer’s isolated kingdom—familiar forms of bad behavior like bullying, shaming, shunning, slander, exhibitionism, and fraud are also radically empowered online.

[10] Jaron Lanier, You Are Not a Gadget: A Manifesto (New York, NY: Alfred A. Knopf, 2009), 54.

Hiding behind an online identity, a vicious mother taunts the rival of her teenage daughter; a callow college freshman, mistaking cruel for cool, posts a cell phone photo of his new roommate caught in a homosexual embrace; a digital sadist, providing both emotional endorsement and practical advice, encourages a distraught woman to commit suicide. The flaws of human nature don’t disappear in the fleshless precincts of the virtual world and, as on many a frontier, may actually be accentuated in the early phases of its development.

For good or for ill, though, the current shift from private monologue and intimate dialogue toward virtually accessible forms of multilogue appears to be irreversible. The emerging consensus that two can seem as lonely as one, for example, is evident in today’s mating scene, where group dating, once a practice largely limited to younger teenagers, has become popular with singles well into their twenties, and where commercial websites are turning the blind date, too, into a collective activity.[11]

[11] For an example of a commercial website set up to facilitate this trend of group dating, see Grouper (www.joingrouper.com).

From Enlightened Self-Interest to Subconscious Conformity

Rather than track new patterns of behavior, my last example of the erosion of atomistic individualism by conscientious thinking concerns a new understanding about the motives and mechanics of individual decision making. Although this discovery has emerged from what might be best classified as sociological research, it has implications that touch on issues vital to psychology, economics, medicine, politics, law, and ethics. The research was conducted by Nicholas Christakis and James Fowler, and was derived from data collected by the Framingham Heart Study, which has been tracking the vital statistics and psychological states of the residents of one Massachusetts town for over five decades. The researchers were initially interested in the impact of social contacts on health habits, and the richness of the Framingham data allowed them to track the long-term behavior of more than 12,000 individuals.

The results, as reported in Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives, were startling, and have further undermined modernity’s presumptions about the individual as a rational and self-reliant decision maker. As clearly tracked on the researchers’ graphs, health habits spread rapidly through the separate social networks of the Framingham population: Whom one knew strongly affected what one chose to do—overeat or not, smoke or not—and highlighted the power of emulation in human behavior. Further study showed that the influence of these social networks was not limited to health decisions, leading the authors to conclude that

our connections affect every aspect of our daily lives…. How we feel, what we know, whom we marry, whether we fall ill, how much money we make, and whether we vote all depend on the ties that bind us. Social networks spread happiness, generosity, and love. They are always there, exerting both subtle and dramatic influence over our choices, actions, thoughts, feelings, even our desires.

More startling still, according to the authors,

our connections do not end with the people we know. Beyond our own social horizons, friends of friends of friends can start chain reactions that eventually reach us, like waves from distant lands that wash up on our shores.[12]

[12] Nicholas A. Christakis and James H. Fowler, Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives (New York, NY: Little, Brown, 2009), 7.

Our misery or happiness, our good or bad health, and our indifference or commitment to political participation are not only contagious; according to Christakis and Fowler, they are mysteriously influenced at a distance by the decisions of people we never meet. The persistence of this influence within the social networks could be traced through “three degrees of separation,” so that the habits of a man’s sister’s neighbor’s wife had a statistically significant effect on his own behavior. If she quit smoking, though out of sight and out of mind, his chances of doing the same were increased by nearly a third.[13]

[13] Ibid., 7.
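The shape of this finding is easier to see with a toy model. The sketch below is purely illustrative: it assumes an invented random network, invented imitation probabilities, and a simple contagion rule, and it is not the Framingham data or Christakis and Fowler's statistical method. It only shows how a habit that spreads by local imitation can remain measurably associated with the behavior of contacts two and even three steps removed.

```python
# Illustrative only: a toy contagion model on an invented network, not the
# Framingham analysis. All parameters below are assumptions for the sketch.
import random
from collections import deque

random.seed(42)

N, K = 2000, 6               # people, and roughly how many acquaintances each has
P_SEED, P_COPY = 0.05, 0.30  # initial quitters; chance of imitating a quitting contact per round

# Build a simple random social network: link each person to K random others.
neighbors = {i: set() for i in range(N)}
for i in range(N):
    while len(neighbors[i]) < K:
        j = random.randrange(N)
        if j != i:
            neighbors[i].add(j)
            neighbors[j].add(i)

# Seed a few quitters, then let the habit spread by imitation for a few rounds.
has_quit = {i: random.random() < P_SEED for i in range(N)}
for _ in range(3):
    snapshot = dict(has_quit)
    for i in range(N):
        if not snapshot[i] and any(snapshot[j] for j in neighbors[i]):
            if random.random() < P_COPY:
                has_quit[i] = True

def distances_from(src, max_d=3):
    """Breadth-first search: shortest network distance from src, up to max_d."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == max_d:
            continue
        for v in neighbors[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# For contacts exactly d steps away, compare a person's quit rate when that
# contact has quit versus when the contact has not.
counts = {d: [0, 0, 0, 0] for d in (1, 2, 3)}  # [quit & contact quit, contact quit,
                                               #  quit & contact not,  contact not]
for i in random.sample(range(N), 400):         # sample people to keep the loop quick
    for j, d in distances_from(i).items():
        if 1 <= d <= 3:
            k = 0 if has_quit[j] else 2
            counts[d][k] += has_quit[i]
            counts[d][k + 1] += 1

for d in (1, 2, 3):
    a, b, c, e = counts[d]
    if b and e and c:
        lift = (a / b) / (c / e) - 1
        print(f"{d} degree(s) out: a quitting contact -> {lift:+.0%} change in your own quit rate")
```

Run with different seeds, the measured association in this toy network tends to shrink with each added degree of separation while remaining visible at the third, which is the qualitative pattern the authors report.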

Rooted in statistical analysis, these results contradict the pristine formulas of modernity’s neoclassical economics, fatally undermining that theory’s conception of the individual consumer as a solitary and rational calculation-machine. And if economic man does not exist as such, then neither can the model of collective behavior based on him. Instead, society’s “invisible hand” (not merely the market’s) influences our decision making in ways that resemble the graphs reported by Christakis and Fowler: sometimes rationally, sometimes not, with the drive to conform (whether productively or self-destructively) serving as a perennial force. And as is suggested by the contagious behavior evident in both the market’s boom-bust cycles and the Framingham data, self-interest, as it is actually practiced, includes a powerful inclination to mimic those around us—a drive to belong that is not only non-rational but frequently subconscious. Although we may be capable of describing our individual behaviors on a daily basis, we remain largely in the dark about our actual motivations and the many external sources that may be shaping them—some, apparently, at a three-degree remove.

The implications of these findings—which are linked to many other recent studies in both the natural and social sciences too numerous to list here—are profound.[14] The evidence Christakis and Fowler have uncovered challenges long-trusted presumptions, procedures, and policies—not just the pet theories of our economic establishment but the institutionalized logic of modernity itself. In revealing that at every level “we are all in this together,” connected and interacting in multiple ways, these findings are especially disturbing to an American nation whose political economy and moral imagination have been rooted in the belief that we are, primarily, pursuing happiness “by ourselves.”

[14] For an extensive review of how even the natural sciences have been affected by postmodern logic, see my essay “Conscientious Thinking and the Transformation of the Modern Sciences,” Raritan 33, no. 3, http://raritanquarterly.rutgers.edu/files/article-pdfs/bosworth_raritanxxxiii3_web.pdf.

What, then, do freedom and responsibility, reward and punishment, mean—in the courts, in the market, in the home and on the street, in the annals of fame or the accounts of the divine—if each of us is ceaselessly influenced at a subconscious level by the choices of those around us? As humming members of the hive and history’s children, how many ghosts are invisibly conspiring in each decision we make? Who gets the credit, who the blame, and, from both a personal and a public policy perspective, how do we initiate restorative change?

Alternatives to Modernity

The questions that arise from the findings of Connected are troubling ones, and insomuch as the medium of modern thought has been undermined along with its messages, we lack a consensus on how to address them. In the recent past, America has bred two extreme and unattractive critiques of modernity’s scientific worldview: a religious fundamentalism, now evident in all the Abrahamic religions, and a radical postmodernism, most commonly found on campuses and in art studios.[15] Although they have emerged from opposite sides of the cultural spectrum, both are afflicted with the temper of an undiscriminating absolutism. While the members of one group worship the idol of a fail-safe knowledge, imagining that they hold something like the whole truth in their scripture-laden hands, those in the other bow to a fail-safe ignorance, excusing themselves from the obligations of meaningful language and ethical behavior.

[15] For a more elaborate analysis of the philosophical origins and secret similarities of religious fundamentalism and radical postmodernism, see my essay “Conscientious Thinking: Fundamentalism, Nihilism, and the Problem of Value during the Demise of the Scientific Worldview,” Georgia Review 60, nos. 3–4 (2006): 712–39.

What these opposing groups share is a fearful flight from the subtle grading and shading of the middle way. They reject those partial disclosures of human knowing that mark the bounds of our thinking species, and by so doing, they deny the double bind of vulnerability and accountability that calibrates the mental reach of a creature whose freedom and intelligence are both inescapably real and perpetually limited.

Nevertheless, these movements prove diagnostically useful. They have come into being for credible reasons, responding to the very real inadequacies of a worldview in decline. Yes, the rigid beliefs of today’s fundamentalists (whether Christian, Judaic, or Islamic) ironically enact the very idolatry their scripture forbids. But the scientific worldview they oppose is, finally, a dehumanizing philosophy incapable of providing any higher meaning to our lives, and just as nature abhors a material vacuum, human nature abhors a spiritual one. Although the nihilism so playfully pronounced by the radical postmodernist thinker does supply an easy alibi for avoiding the agonizing trials of striving to be truthful, striving to be good, the Enlightenment expectation of a scientific certainty in human affairs has been an exceptionally disastrous delusion, complicit in political crimes on a once unthinkable scale, and our conventional thinking desperately needs a reinfusion of humility.

When taken as a whole—crudely juxtaposed, unrestrained, and unassimilated—these reactionary critiques do seem monstrous. As visions of our collective future, they emerge from the oceanic dark of the collective unconscious like two thunder-footed beasts out of Tokyo Bay. Such menacing and repulsive images are characteristic, however, of every epochal transition in human consciousness and culture, every middle phase in the rudimentary articulation of a new identity.

This was true of early modernity’s Machiavellian man, at once so vital and so venal, as irresistible as he was appalling. And it was also true of his contemporary enemy (and secret brother), that early version of the Protestant self which viewed its own emerging inner powers as potentially appalling: “Was ever heart like mine? / A sty of filth, a trough of washing swill, / A dunghill pit, a puddle of mere slime.”[16]

[16] Edward Taylor, “Still I Complain, I Am Complaining Still,” in Major American Poets, ed. Oscar Williams and Edwin Honig (New York, NY: New American Library, 1962), 38.

Yet the rough makings of democracy’s well-tuned citizen—at once materially ambitious and morally self-disciplined, so beautifully aligned to express and refine the powers unleashed by modernity’s new place—abided embryonically in both those figures, the Machiavellian schemer perpetually on the make and the Protestant seeker pathologically obsessed with his own fallen state. Transitional periods such as theirs and ours demand a conscientious integration of those crude-but-vital traits unleashed by their technological innovations, and in the early modern era, the West was awaiting the mutually self-moderating marriage of the Machiavellian man’s egregious self-promotion with the early Puritan’s excessive self-loathing.

Now, however, the poles of concern have been reversed. Whereas in the seventeenth century the challenge confronting the West was how to both license and tame the new social and intellectual atomism spurred by the printed book, the crisis today is how to express and restrain the messy, multifarious togetherness of the digitally interconnected field. What we can do through using our new technologies will have to be disciplined by new communal understandings of what we should do. Today, it is not the rogue individualist who threatens the social order but the sociopathic collective, whether the new gang of cyber-thieves or vandals, the virally vicious digital mob, or, more profoundly given its ever-increasing global power, the post-modern corporation, which has been adopting all the technological tools of togetherness while applying them for purely self-serving ends—which, like the Machiavellian plotter of old, has been usurping the authority of public governance while shirking that authority’s traditional responsibilities.

Populist voices on both the left and the right have decried the ineptitude and injustices of our current nation-state, and not without real reasons. But before we simply celebrate today’s often digitally empowered “disruptions” of its authority, we need to take a very careful look at the ethical nature of the often ad hoc groups that are aiming now to take its place. That sobering exercise might remind us that the success of democratic governance is rooted in qualities of self-restraint, of stoic tolerance and civil discourse, qualities that are all too rare in today’s blogosphere.

Will our communal wisdom ever catch up with our accelerating cleverness? Can we achieve a civilizing conscientiousness appropriate to the age without imploding first into another round of anarchic fury, culture war turning into civil war, as occurred in Europe during the 1600s? I lay no claim to prophetic powers, but on this I will insist: Rather than the threat of Islamic terrorism or even the rise of China’s authoritarian capitalism, the greatest long-term challenge to the survival of American democracy is the necessity that our nation reform itself for a post-modern age that its own inventions have been generating.