Coming of Age in Times of Uncertainty. Harry Blatterer
role in the emergence of adulthood as a separately conceived life phase (Pilcher 1995).
Adulthood emerged in public consciousness and entered the cultural vocabulary of everyday life as the achievable (and indeed desirable) end to adolescent immaturity during the Second World War. In the U.S. a fascination with being grown up emerged in popular culture. Reader's Digest, McCall's, and Vital Speeches of the Day were some of the publications with a wide readership that concerned themselves with what it meant to be adult. A 1952 issue of Reader's Digest, for example, invited young readers to complete a quiz in order to find out whether or not they were indeed grown up (Jordan 1978: 197). So, since its entry into the vernacular during the Civil War, adulthood had come to signify something solid to aim for, a life stage that held the promise of fulfilled wishes and achieved aspirations. Accordingly, a number of words, phrases, and practices associated with adulthood as social status began to settle and eventually became taken for granted and commonplace. Directives like “Don't be childish!” and “Grow up!” and turns of phrase such as being “set in your ways” or having “settled down,” are linguistic devices associated with adulthood. They are also figures of speech that enact social asymmetries and put adult “human beings” in a more powerful position vis-à-vis those who, like children, are perceived and treated as “human becomings” (Qvortrup 1994).
“Maturity” acts as a central metaphor encompassing normative achievements and attributes of adulthood. Although the term is most closely associated with biological development, maturity tends to be used to describe individuals' social and psychological competencies and dispositions. While being mature does not necessarily make a person an adult in the eyes of others (a child may be “mature for her age,” just as an adult may be deemed immature), when linked to adulthood, maturity denotes an end state to biological, psychological, and social development. The notion of social maturity adds an extra dimension. It takes as its starting point the premise that adulthood is constituted not so much by the significance individuals attribute to their own attitudes and actions, but by the kinds of social validation these attract. Just as the interpretation of biological and psychological maturity is culturally specific, as Margaret Mead's classic work Coming of Age in Samoa (1928) has shown, maturity is subject to socially constructed and acknowledged forms of meaning. Its plural meanings (biological, psychological, and social) are, for example, institutionalized in law. To appropriate the thinking behind James and Prout's (1997) social constructivist stance on childhood: the maturity of adults is a biological fact of life, but the ways in which this maturity is understood and made meaningful is a fact of culture.1
Notions of maturity hold an important place in the self-understanding of entire societies that share the liberal European tradition. The obvious example here is Immanuel Kant's (1724–1804) statement, “Enlightenment is humankind's emergence from self-incurred immaturity” (1975 [1784]). In his critical analysis of this text, Michel Foucault spells out the synonymy between history and individual development. He maintains that Kant defines the historical process “as humanity's passage to its adult status,” to “maturity” (1994: 308–9, 318). Similarly, historian Norman Davies comments, “Europeans reached the ‘age of discretion’…with medieval Christendom seen as the parent and Europe's secular culture as a growing child conceived in the Renaissance” (Davies 1996: 596). Common perspectives of human development from a state of childlike dependence to adult independence parallel our understanding of modernization as a process of emancipation from dogma, tradition, and authority. This direct link between historical process and individual maturation has consequences for the social-scientific appraisal and treatment of young people to this day. The clearest case, again, is Hall's early interpretation of adolescence, where the individual's development was said to recapitulate the historical maturation of the human species as a whole. Along with a new emphasis on personal and social development, certain practices emerged as symbolic and constitutive of adulthood.
Adult Practices
Picture this: a man and a woman in their mid-twenties. The woman holds a baby in her arms; a small child clings to the man's hand. The woman wears an apron, the man his work overalls. A “Sold” sign perches on the fence that surrounds the freshly painted house. A generously sized car sits in the driveway. No one could ever mistake the man and woman in this romanticized picture for adolescents, and few would be tempted to suggest that they were not adults. Many would, as if by reflex, assume the man to be husband to the woman and father to the child. But something about this image jars against the present. Just as the choice of frame for a painting or a photograph helps integrate representation and reception, so too does the right time frame. With this in mind, I suggest that no period in the history of Western societies has been more conducive to the institutionalization of a particular model of adulthood (of which the above, romanticized image is one possible representation) than the era historian Eric Hobsbawm (1995) has called the “Golden Age,” namely the time between the end of the Second World War and the oil crises of the early 1970s. No period has provided more favorable conditions for this model to become lived experience for a majority; no period has shown a more faultless synthesis of ideal and reality. Following Lee (2001), I call this synthetic construct “standard adulthood.”
After the Second World War the industrialized economies experienced unprecedented affluence and stability. The period from about 1945 to the early 1970s saw a concerted effort by business, government, and unions to prevent a recurrence of the Depression, the harrowing experience of which still haunted decision makers. Although the wealthier nations each had their own macroeconomic agendas, public spending, full employment, and universal social security provisions were given priority to ensure internal demand and hence economic expansion. The then-prevailing mode of management and organization, what came to be known as Fordism, has since come to denote more than that: it signifies a once-prevalent “total way of life” that congealed around goals of long-term stability and economic growth (Harvey 1989: 135). Typically, businesses valued employee loyalty, which was generally rewarded with promotions in hierarchically constituted organizations. For employees and families this meant that there were plannable career paths with predictable milestones on the way, and a known destination: retirement on guaranteed government pensions. In the world of work the accumulation of experience with age was viewed as a valuable asset and was seen to increase, rather than inhibit, job security (Lee 2001: 12–13). According to one sociologist's interpretation of the time—characteristically exaggerated for illustrative purposes—these economic and work-related aspects alone “created a society in which people's lives were as highly standardized as the sheet steel from which the cars were welded together” (Beck 2000: 68).
These social conditions corresponded to a value system that remained unchallenged in its normative validity until the rising discontent of the 1960s. Open same-sex relationships were extremely risqué and hence rare, and same-sex parenthood (as opposed to guardianship) was unimaginable. The heterosexual nuclear family prevailed as the ideal. It is during this time that early marriage and family formation came to be lived experience for many adults.2 Add to this the opportunities provided by the labor market, and a picture emerges that one commentator draws with clarity:
[O]nce ‘adult’ and employed, one could expect to stay ‘the same’ for the rest of one's life in a range of ways; one's identity was stabilized by sharing the work environment with more or less the same people throughout one's working life; the geographical area one lived in would remain the same since the organization one belonged to had set down firm roots in that area; and, even if one were dissatisfied with one's job, one would not have to seek a position with another organization (in another place with different people) because time and effort would bring the reward of career progression. (Lee 2001: 12–13)
Flexibility—first a buzzword in the New Capitalism (Sennett 1998, 2006) and now a taken-for-granted imperative in all social relations—was as yet a far-off reality. Becoming adult was a matter of following a life course that resembled a veritable march through the institutions of marriage, parenthood, and work. By today's standards these objective markers of adulthood were relatively fixed, achievable, and supported by an overarching value consensus. There was a high degree of fit between norms and social practice. Sharply delineated structures of opportunity rested on culturally and socially reproduced normative foundations that were, for a time, rarely questioned. With full-time long-term work