Youth Culture / Spring 2009 / Essays

Wither Adulthood?

James Davison Hunter

“Les différents degrés des Ages” (c. 18th century). Via Wikimedia Commons.

Much has been made of the nature, content, and valorization of contemporary youth culture. The transformations in our notions of childhood and the culture that surrounds it have been staggering and worth all the ink that has been spilled on the subject. In a culture fixated on youth, it is not out of place to call attention once again to the fact that the prominence bestowed upon children and the world they inhabit today would have been unimaginable—in ways both amusing and repugnant—to our forebears, not only in the West but in other parts of the world as well. But understanding youth culture is not only an exercise in analyzing its internal discourse, structure, and development; it is also a matter of understanding the transformations that have been linked to it. Perhaps the most important of these are changes in the meaning of adulthood. Indeed, the poignancy of change in contemporary youth culture in the late modern West becomes even clearer when set against the changing nature of adulthood.

Before Age

It is fair to say that in most societies, the nature of childhood made sense only in relation to adulthood, that is, as a prelude to adulthood. And adulthood, in turn, was not just a matter of chronological age. As John Gillis, among many other historians, has noted, in Western history prior to the industrial revolution, little attention was paid to distinctions between ages.[1] While the church kept records of birth, baptism, marriage, and death, average people did not know their own age with any precision—there was little to compel them to keep careful track. There were no age limits on entry into most occupations and, by the same token, no mandatory retirement ages. Vocation depended only upon an ability to carry out the task.

[1] John Gillis, Youth and History: Tradition and Change in European Age Relations, 1750–Present (New York: Academic, 1974).

The same can be said of military service. The mercenary armies of the early European states cared only about the ability to perform as a soldier. Age was also irrelevant to political power, since political standing was a privilege of the propertied classes. The relationship of property to power spilled over into the practices of early democracy, where landholding, religion, and race, along with age—first set at 21—defined the parameters of suffrage.

As to education, schooling was not universal because literacy was unimportant to the economy, and for this reason, education was not graded according to years. For those who enjoyed the privilege of literacy and learning—a small number indeed—younger and older boys sat together in classes.

In short, children and adults lived daily life together with little regard for the distinctions of age. They saw life less as a progression of chronological years or even distinct periods organized by age clusters (for example, infancy, childhood, adolescence, adulthood, middle age, and old age) than as a fairly nebulous chain of being. After the age of seven, children entered the adult world, and maturity within it occurred gradually and largely imperceptibly. In fact, the chronologization of the life course into set stages that were more or less age-specific only really emerged in popular consciousness in the nineteenth century.

Adulthood Postponed

To understand these cultural transformations, one must begin with the account made famous by the French social historian Philippe Ariès in his masterwork, Centuries of Childhood. Though medical science had long made more nuanced distinctions in periodizing age, prior to the seventeenth century young people between the ages of seven and fifteen were generally seen not as children but as diminutive adults. One important reason for this was religious: the age of reason and accountability, according to medieval theology, was seven. Accordingly, as Ariès notes, the art of the period portrayed the young not as children but, rather, as adults in miniature. Their dress, expressions, and mannerisms were all those of grown-ups. But by the seventeenth century, young people were being represented with distinct clothing and appurtenances. Childhood was coming to be recognized as a stage of life that required a certain kind of attention from parents and institutions. The cultural justifications for this were found in the educational theories of Erasmus in the early sixteenth century, Comenius in the early seventeenth century, and Locke in the late seventeenth century, all of whom condemned the use of punishment and advocated respect for the young, as well as age-appropriate methods in learning. By 1762, when Rousseau published his controversial novel, Émile, childhood was coming to be seen as a time of innocence, playfulness, and happiness, and children themselves as objects of sentiment rather than objects of utility. Children had their own physical and psychological needs, and therefore, they required special care and consideration, including the need to be protected from the harsh realities of the world. Accordingly, they acquired their own food, clothing, recreation, music, and space.

The recognition of childhood as a distinct period of life, its prolongation, and the culture of sentimentality that arose with it, marked an important shift in Western culture. It was coterminous with the advance of modernity, and it was slow in coming. What developed first among the aristocracy and upper classes in the 1600s was, by the end of the nineteenth century, becoming a reality among common people.

There were several factors that created conditions in which a world unique to the young and their needs could emerge. The dramatic decline in infant mortality in the last half of the nineteenth century and the gradual separation of the world of work from the world of domesticity accomplished through industrialization were certainly critical. Probably the main factor making this social transformation possible, however, was the creation of a surplus economy and the relative but expanding economic freedom and security it provided. Agrarian economies, of course, were never more than a few months from starvation. Young people, after the age of six or seven, were required to participate in economically productive labor and, as soon as they were able, to do the labor of a fully grown man or woman. This continued to be true of the early industrial period as well.

It was not until the 1830s that efforts to protect children in the workplace began to appear. Massachusetts, for example, was viewed as progressive when, in 1842, it became the first state to impose a ten-hour workday limit on children.[2] Other states followed, though the laws were not consistently enforced, for the survival of the family and the community still depended upon the labor of the young. Young people who did not contribute to the economic well-being of the household were a luxury none but the wealthiest could afford. Yet over a period of centuries—starting with the upper classes, moving to the expanding middle classes, and finally reaching the working classes—the young became less and less necessary for the economic survival of the family and community. Still, as late as 1910, less than 10 percent of all seventeen-year-olds in the U.S. graduated from high school, and most of these were women.[3] Not until 1918 did all states have compulsory attendance laws.[4]

[2] See the Child Labor Public Education Project at the University of Iowa Labor Center’s website for a brief history of child labor laws in the United States: http://www.continuetolearn.uiowa.edu/laborctr/child_labor/about/us_history.html.
[3] “Table Bc258–264—Public and Private High School Graduates, By Sex and as a Percentage of all 17-Year-Olds: 1870–1997,” Historical Statistics of the United States Millennial Edition Online: http://hsus.cambridge.org/HSUSWeb/search/searchTable.do?id=Bc258-264.
[4] National Center for Educational Statistics, Digest of Education Statistics, 2004 (U.S. Department of Education Institute of Education Sciences, 2004): http://nces.ed.gov/programs/digest/2004menu_tables.asp.

The age considered appropriate to childhood continued to expand through the twentieth century. Federal law finally imposed a minimum age for employment and limits on hours of work in 1938. By 1940, half of all seventeen-year-olds graduated from high school. This figure continued to increase, reaching 77 percent in 1969, where it has remained fairly constant ever since.[5]

[5] “Table Bc258–264.”

It is interesting how language came to reflect this slow social transformation. The word “adult” did not come into usage in the West until the middle of the sixteenth century and did not enter the popular vernacular until the 1870s. What existed prior to this point were gender-specific cognates of adulthood: man, gentleman, woman, lady, and so on. Similarly, the word “adolescence” had been in existence since the fifteenth century, and yet it was only in 1904 that the psychologist G. Stanley Hall popularized the “discovery” of adolescence. It was not until 1921 that the word “teenage” came into use and not until 1941 that the word “teenager” became a popular term. These words derived from an earlier word, “teener,” but even that was not in use until the 1890s.[6]

[6] For a brief history of the etymology of the word “teenager,” see: http://www.etymonline.com/index.php?term=teenage.

The point is that it was not just that the period appropriate to childhood and youth expanded or that a distinctive language and culture surrounding it developed. As significantly, if not more so, adulthood and the maturity it implied were postponed. But they were postponed, not nullified. Through the twentieth century, mandatory conscription, the demands of entering the work force in an expanding economy, the traditional moral obligations surrounding marriage and family, and the like, all meant that the responsibilities of adulthood were inevitable.

A Further Postponement

Needless to say, all of the factors that created conditions for the expansion of childhood and adolescence have remained in place to the present. If anything, they have been reinforced and prolonged by other interrelated developments.

Perhaps the most significant change has occurred in the conditions brought about by the further evolution of capitalism. The prelude for this was a massive expansion of higher education in the decades following World War II. The GI Bill, passed in 1944, provided the resources to make it possible for veterans to go to college, and about half of all World War II and Korean War veterans and nearly three-fourths of all Vietnam War veterans took advantage of the opportunity.[7] The number of those entering higher education swelled accordingly. Prior to World War II, only 8.4 percent of all 18- to 24-year-olds attended college. By 1949, this figure had nearly doubled, to 15.2 percent. A decade later the number had grown to 23.8 percent, and by 1975, 40 percent of all 18- to 24-year-olds were in higher education.[8] In short, just as the growth in the number of young people staying in high school postponed the age of entering the workforce for more and more people in the first half of the twentieth century, the growth in the number of young people going to college extended that delay in the second half of the twentieth century.

[7] United States Department of Veterans Affairs, “History of the Department of Veterans Affairs”: http://www1.va.gov/opa/feature/history/history6.asp.
[8] “Table Bc523–536—Enrollment in Institutions of Higher Education, By Sex, Enrollment Status, and Type of Institution: 1869–1995,” Historical Statistics of the United States Millennial Edition Online: http://hsus.cambridge.org/HSUSWeb/search/searchTable.do?id=Bc523-536.

The expansion of higher education created the labor conditions necessary for a transformation in capitalism from a market dominated by the production of goods through industry and manufacturing to one dominated by the production of knowledge, ideas, and services. The consequences of this massive shift have been gradual but profound.

Consider this contrast. Through the better part of the twentieth century, corporate life—in firms large and small—was defined by a highly rationalized and compartmentalized division of labor, organized within a vertical structure of authority and oriented toward long-term, stable productivity. In such a framework, employees could envision a secure and enduring future that would last into retirement. Even manual workers could envision the possibility of owning a home and providing for their family, with the realistic expectation that their children would have it better than they did. As Roland Marchand argued in his book Creating the Corporate Soul, the size and power of many corporations—measured by their number of employees, the magnitude of their production, their capital resources, their national scope in distribution, and their capacity for political influence—transformed the configuration of social institutions within which many Americans lived. In terms of the stability of social life, the corporation increasingly occupied the place once inhabited by the church and synagogue, the neighborhood, and the family. As such, it played an ever-larger role as the channel by which the young, especially men, would mature into adulthood. Corporations (or the corporate model applied to other sectors of the economy) provided the stable progression within which life would be lived.

This situation has changed considerably. At the start of the twenty-first century, faith communities, neighborhoods, and families are no stronger than they were in the 1950s, and, arguably, they are considerably weaker as institutions. And while corporations are no less strong than they were at mid-century, a new structure and culture of corporate capitalism has, for all sorts of complex reasons, come into being, one with vastly different consequences for the young maturing into adulthood.

For one, the bureaucratic structure and ethos of the old corporation had lost its intellectual standing by the time of its greatest institutional strength. Scholars, from Max Weber and Lewis Mumford to Jacques Ellul and Theodore Roszak, had discredited the old model as suffocating to individuality, creativity, and imagination. This critique was established not only among intellectuals and within the counter-culture but, as David Franz has demonstrated, among mainstream management theorists as well.[9] At the same time, hierarchical models of power among executives and managers were also being undermined by new sources of foreign capital, hedge funds, and mutual funds—capital flows that required a greater flexibility and dynamism capable of providing greater returns on investment. For these reasons and others, corporations have gradually shed the top-down, pyramidal organizational model in favor of structures that provide flexible and collaborative work environments. For all of the benefits of this transformation in the structure and culture of modern organizations, the cost was a deterioration of the stability of these structures and a weakening of their paternalistic ethos.

[9] David Franz, “The Ethics of Incorporation,” Diss., University of Virginia, 2008.

The ideal worker of the early and mid-twentieth century offered loyalty to the company and, in return, was rewarded with the expectation of progress up the ladder of responsibility and remuneration. Today, the worker who prospers tends to repudiate dependency, operates well in environments of mobility and ambiguity, and capitalizes on loose and fluid networks of social relationship. As Richard Sennett has argued in The Culture of the New Capitalism, the young adult entering the work force faces greater uncertainty in the job market, greater ambiguity about the skills needed to succeed, greater instability in the job itself, and even less certainty about the path to success.[10]

[10] Richard Sennett touches on this idea throughout both The Culture of the New Capitalism (New Haven: Yale University Press, 2006) and The Corrosion of Character (New York: Norton, 1998).

All of these changes have been concurrent with the growth in postgraduate education. In a postindustrial economy, specialized, postgraduate education is both highly prized and much needed, and yet for the young adult entering the workforce, it also represents a response to the problem of uncertainty in the job market and ambiguity in a career path. One simply becomes more marketable, increasing the chances of acquiring a well-paying job and keeping it. The number of those pursuing advanced degrees has swelled accordingly. In the thirty years between 1976 and 2006, the total number of those enrolled in graduate or professional education increased 63 percent (from 1.57 million to 2.57 million).[11] The greatest part of this growth was from the influx of women into graduate and professional education. But here again, the responsibilities typically associated with adulthood have been deferred through this development.

[11] National Center for Educational Statistics, “Participation in Education: Table 11–1,” Digest of Education Statistics, 2007 (U.S. Department of Education Institute for Education Sciences, 2007): http://nces.ed.gov/programs/coe/2008/section1/table.asp?tableID=872.

Yet another related factor in the transformation of adulthood is the delay in marriage. The increased participation in higher and postgraduate education (especially among women) and the decline in traditional sexual mores in the second half of the twentieth century both contributed to this trend. The median age of marriage in 1960 was 22.8 for men and 20.3 for women. Nearly a half century later, in 2006, the median was 27.5 for men and 25.9 for women.[12] At the same time, cohabitation rates increased. From 1960 to 2006, the number of cohabiting unmarried couples increased 1,357 percent, from 439,000 to 6.4 million.[13] Paralleling this increase was the rise in women’s age at the birth of their first child. On top of all of this are some indications that the number of young adults, aged 25 to 34, living with their parents is increasing. One recent survey found that 65 percent of all college graduates expected to move back into their parents’ home upon graduation.[14] The point, of course, is that rites and commitments long considered marks of maturity and adulthood have been postponed.

[12] Jeffrey Jensen Arnett, Emerging Adulthood: The Winding Road from the Late Teens Through the Twenties (Oxford: Oxford University Press, 2004) 5.
[13] “The State of Our Unions 2008: The Social Health of Marriage in America,” The National Marriage Project (2009): http://marriage.rutgers.edu/Publications/SOOU/2008update.pdf.
[14] As cited by Alexandra Robbins, “Statistics on the Quarterlife Crisis, Twentysomethings, and Young Adults,” Quarterlife Crisis (2005): http://www.quarterlifecrisis.biz/qc_stats.htm.

Here again, the further prolongation of childhood and adolescence is reflected linguistically in the neologisms coined to describe the change. Terms such as the “odyssey years,” “emerging adulthood,” “boomerang kids,” “Peter Pan syndrome,” “adultolescence,” “kidults,” and “thresholders” all point to the same phenomenon—a period of high mobility without stable employment or settled commitments. The ambiguity accompanying this “new” stage in life is even immortalized in popular culture in such productions as the Broadway musical Avenue Q and the film Garden State.

All of this suggests that adulthood is not just being postponed; its very meaning is being transformed. What are the dynamics of that transformation?

Adulthood as a Normative Ideal

Clearly adulthood has never been a matter defined by chronological age alone. At least in the West and arguably cross-culturally, the meaning of adulthood has long been more spatial than temporal, more hierarchical than linear. Maturity, in other words, was a social status and thus represented a higher place than childhood in the social and cultural hierarchy. It was a matter of honor that conferred respect and admiration, and therefore, it was something to be prized.

The nature of adulthood as a kind of social and moral status is clearest in the rites of passage or initiation that used to inaugurate it. Rites of passage have both a biological dimension and a social dimension, and it has been the social that has mattered most in determining the time of transition. The variation in age for these rites attests to this. Rites of passage are significant in part because they change the status of the young person into an adult; they are even more significant for the meaning they give to the changes that are occurring.

Comparative anthropology has done a good job of showing the wide range and ceremonially complex nature of such rites. Yet despite the variability, these rites tend to share a common structure. For one, rites of passage are never self-invented or administered by the young for the young, but rather are rituals that emanate from the community through its adult members, who guide the process in all its ceremonial detail. Alongside this is the fact that these rites are highly gendered. Boys and girls have their own ritual paths, and, accordingly, adult men and women play different roles in the process.

A second feature is the commonality of a three-stage process, though the stages vary considerably from tribe to tribe and society to society. The first stage is separation. This is a moment of symbolic death to one’s youth, and it is represented through hardship, physical pain, literal separation, tests of endurance, and the like. It is this stage that has made rites of passage legendary—as in the mythical “killing the wild boar with one’s bare hands.” In most cases, the trials are not borne in isolation but are highly ritualized social events. The second stage is a period of transition between statuses—a phase in which the initiate is recognized as neither a child nor an adult. In our own time, this is akin to engagement, where one is neither single nor married. The third stage is a ritual reincorporation through which the novices are symbolically reborn and reintegrated into the community, but this time as men and women. The young have been instructed in the accumulated wisdom of the community, and they can now assume their place among the other adults.

The third characteristic of these rites of passage is that the ceremony itself is imbued with sacred significance. In turn, the status one achieves by enduring the ritual also carries a sacred quality. What is sacred is the authority of the group and the ideals implicit in the new identity.

As remote as such rituals may seem to us, they point to something that is very important in social life and human experience. This is that adulthood has long been regarded as a moral ideal, a good to be desired and sought after. The qualities of manhood and womanhood embodied the virtues of the social order and as such were something to aspire to. That there is variation in these virtues goes without saying. Even in the complex traditions and communities that have made up Western civilization, one can find a range of different expressions—communities rooted in honor celebrated excellence and heroism, those rooted in knowledge and understanding celebrated wisdom, those rooted in faith celebrated godliness, and so on. What is not variable, so far as anyone can tell, is what these transitional rites say about the importance and value of becoming an adult.

In Western cultural history, maturity was regarded as a status, and not just a matter of age. Someone who was senior was not necessarily older, but rather one of a higher rank. As John Gillis has observed,

only those with the status of a master or mistress of a household were accorded the full status of maturity. The other members of the household, even those who were of the same or older ages, remained “boys” or “girls,” terms that defined their subordinate place in the household hierarchy rather than indicated their actual age.[15]

[15] John Gillis, “Life Course and Transitions to Adulthood,” Encyclopedia of Children and Childhood in History and Society (Farmington Hills: Gale, 2008): http://www.faqs.org/childhood/Ke-Me/Life-Course-and-Transitions-to-Adulthood.html.

Maturity, then, was in part defined by the status conferred by occupying a certain position in the social structure.

Wither Adulthood?

These reflections highlight two discernible yet still emerging patterns. The first concerns the continuous postponement of adulthood. As Thomas Szasz once observed, “adulthood is the ever-shrinking period between childhood and old age. It is the apparent aim of modern industrial societies to reduce this period to a minimum.”[16] It is not just that the period appropriate to childhood can be expanded indefinitely; adulthood, too, seems to be a stage that can be pushed off indefinitely, for there is less and less, structurally, to compel the transition.

[16] Thomas Szasz, The Second Sin (New York: Doubleday, 1973) 54.

This tendency is rooted in the dynamics of the contemporary political and cultural economy, but it is also reflected in what exists of American rites of passage. Such rites do endure, in a manner of speaking, but they tend to exist only in particular segments of society. In the military, for example, boot camp certainly functions like a traditional rite of passage—it is a highly ritualized process, guided by adults, in which a young person endures separation (certainly confronting his or her mortality), transition, and reintegration, and through it, achieves the status of membership in a group whose authority is well established if not undisputed. One can also find surrogates of these rites in religious communities; for example, in the First Communion and Confirmation within the Christian community or in the Bar Mitzvah and Bat Mitzvah within the Jewish community. Given the highly privatized nature of religious belief and practice in America, these are not capable of integrating all of life. Nor are they oriented toward integration within a cultural mainstream because there is no cultural mainstream anymore within which to be integrated. Graduation exercises are offered up as a kind of rite of passage. Marriage still exists as a rite of passage for some, though given its optional nature and its loss of sacramental meaning for many, it too is less significant than it used to be.

Apart from the military, then, what rites of passage exist also tend to be relatively weak, especially for boys. Symbolic death enacted through physical hardship is not a part of any contemporary rite of passage. The instability of social life due to social and geographic mobility means that the cohesion of community is weak as well, though more for some demographics than others. The often highly ritualized rites of passage one finds in gangs or, by contrast, the ad hoc functional equivalents of rites of passage one finds in binge drinking, fast driving, reckless sexual behavior, or extreme sports tend to demonstrate the larger point. These are, in many respects, negative rites in that they are practiced outside of any mainstream sanction, without the presence, involvement, and guidance of adults, and without the infusion of sacred meaning. All of this suggests that adolescence and young adulthood exist in a state of semi-permanent liminality. Adulthood is simply not inevitable.

Nor is it necessarily desirable. It is here, in the changing moral status of maturity, that we find the second pattern. Consider this matter first in the gender-specific ways in which maturity has traditionally been understood. What had been taken for granted about masculinity and femininity, about what it meant to be a “real” man or a “true” woman, is now a matter of some confusion, dispute, and choice. The recent history of the institutions of moral formation—whether in religious bodies or youth organizations, or among family experts and developmental psychologists—reflects this breakdown in the old certainties. In sociological terms, gender identity has been deinstitutionalized. As a consequence, powerful interests in the spheres of politics and civil society have contested gender. Indeed, much of the culture war in the United States over the past forty years took form over precisely these issues, in the effort to shore up or undermine traditional ideals of gender identity.

What is most striking, in my view, is the disappearance of moral idealism surrounding traditional notions of masculinity and femininity. Outside of the remnants of social conservatism, the ideals that constituted such notions are without intellectual defense. This is perhaps especially true for notions of masculinity. In this regard, consider a contrast. In 1895, Rudyard Kipling wrote his famous poem, “If” (published in 1910), a celebration of the clearheadedness, confidence, independence, truthfulness, compassion, ambition, hard work, and natural sincerity that constituted the ideal man. Translated into over 25 languages, the poem was wildly popular. Though the qualities associated with manliness were asserted boldly, it was the anxieties rather than the confidence of the Victorian age that were crystallized in this paean to manhood. This was an affirmation of qualities and virtues that emerged at precisely the moment when all such qualities and virtues were beginning to be threatened, for doubt, conformity, falsehood, avarice, and pretentiousness were all built into the uncertainties of the late nineteenth and early twentieth centuries. A century later, political theorist Harvey Mansfield published a theoretical equivalent to Kipling’s “If” in his book Manliness.[17] Though popular among traditionalists, Mansfield’s argument was ridiculed by the intellectual mainstream. “A cry of desperation,” said one; a “cartoonish view of modern masculinity,” said another.[18] As the Saturday Night Live sketch “Hans and Franz” shows, traditional ideals of manliness are at best a matter of some good-natured ridicule. There is no likelihood of going back to such ideals, but it is not clear where our notions of masculinity and femininity are going either.

[17] Harvey C. Mansfield, Manliness (New Haven: Yale University Press, 2006).
[18] Robin Lakoff, a professor of linguistics at Berkeley, and Anne Norton, a political scientist at the University of Pennsylvania, as quoted in Christopher Shea, “The Manly Man’s Man,” The Boston Globe (12 March 2006): http://www.boston.com/news/globe/ideas/articles/2006/03/12/the_manly_mans_man/.

Conclusion

For the better part of the twentieth century, people had a pretty good idea of what it meant to be an adult and saw it as something worth pursuing. Steady employment, marriage, and home ownership were some of the outward marks of adulthood; responsibility, commitment, loyalty, and hard work were the inward virtues that made these possible. Adulthood was not something one fell into but something one aspired to. Not to mature into adulthood in these ways was a matter of some shame.

This is far more dubious today. The characteristics that have made adulthood recognizable and desirable have been deinstitutionalized. Adulthood is a destination one cannot quite locate, the passage to which is no longer clear. The loss of social consensus over the meaning of adulthood means that the very category of adulthood has become opaque. Among many, it is also unappealing.

Comedian Bob Newhart has said, “I think you should be a child for as long as you can. I have been successful for 74 years being able to do that. Don’t rush into adulthood, it isn’t all that much fun.”[19] What he said in jest appears to reflect a new cultural reality in which adulthood has lost much of its appeal.

[19] Rebecca Murray, “Bob Newhart Plays Papa to an Oversized Elf,” About.com (2009): http://movies.about.com/cs/elf/a/elfbobnewhart.htm.

The new cultural reality is reinforced by dynamics played out at the far end of life as well. As the average life span expands into the eighties, the elderly have more discretionary years and money than ever before in history. But rather than treating these years as a time of senior citizenship, influential forces in the market orient the elderly back toward their youth. It is not an accident that retirement is now at times referred to as a “second adolescence.” It shares some of the same structural features as the first: a time of active sexuality, self-absorption, play, and even identity crisis. Needless to say, this does not describe the experience of the elderly as a whole, but it does describe a significant part, and for good reason: powerful market forces reinforce the idea of retirement and old age as a second adolescence. Just as marketers orient the elderly population toward services that maintain freedom, amusement, and youthfulness, financial advisors insist that their clients create the wealth necessary to pay for it all.

In both structural and cultural terms, then, adulthood is undergoing a transformation just as significant as that of youth, and the two are inextricably linked. The meaning and outcome of these transformations are unclear. One possibility is the redefinition of adulthood such that it no longer exists as a moral end to which one aspires; rather, it becomes just a life stage of limited duration in between different forms of adolescence. Another possibility is the negation of adulthood altogether, at least as a meaningful social and moral category; adulthood, in any historically or anthropologically meaningful sense, may yet become obsolete. A third possibility is that we are witnessing a transition toward something new, and that economic, political, and cultural changes yet unseen will create conditions for the remaking of adulthood. The malleability in the social and moral meaning of maturity that we witness through history and anthropology makes this last possibility plausible, but it is too early to tell. Whatever these transformations turn out to be, they will be enormously consequential even as they unfold.