professionalization of teacher training got a huge boost in prestige and public acceptance with the founding in 1887 of the national citadel of teacher training, Columbia University Teachers College. By the early twentieth century, most normal schools had mutated into state teachers colleges, and, after World War II, into comprehensive public liberal arts colleges dominated by their teacher education programs. To this day, a majority of American elementary schoolteachers as well as a large share of high school teachers are graduates of such teacher education programs, the descendants of the normal schools of the common school era.
The common school movement introduced many other familiar features of public schools. While originally dedicated only to making elementary schooling universally free and attendance at it mandatory, by the end of the nineteenth century the movement had extended its reach to encompass universal secondary education. The common school template fostered the strict assignment of students to classrooms by age (something quite novel in the nineteenth century) and by academic ability. In the wake of the movement’s influence, many other American public school characteristics also came to be nationally consistent. Although schools fell under the jurisdiction of state education departments, with almost no federal government input until recently, their curricular content, subject for subject, became relatively uniform across the country. This was most likely due to the influence of national (but nongovernmental) accreditation organizations that first arose at the end of the nineteenth century; the standardized doctrines of the teachers colleges; and the sales policies of textbook publishers, which profit from nationwide adoption of their offerings. Even the physical specifications of American schools quickly became homogenized, with consistent class and classroom sizes, and a familiar repertoire of ancillary facilities: gymnasiums, auditoriums, cafeterias, and so on.
By the early twentieth century, the common school movement had succeeded in making its vision of access to a professionally delivered, practical education for all Americans a reality in every state. But it took a while for the education establishment to agree on the precise format and academic content of a typical school system’s elementary (and later, secondary) schools. In the case of elementary education, there was always general agreement (even internationally) on the educational foundations to be taught. However, in the early decades of the twentieth century, John Dewey and other “progressive” educators challenged the rigidly structured rote learning and focus on academic content practiced in most American elementary schools at the time—and still pervasive in the rest of the world—charging that it was pedagogically ineffective and, worse, stifled child development and creativity.6 By the middle of the twentieth century, the progressives had won this argument, first in the training of teachers in state colleges, and then in the classrooms of most districts. In a strong backlash, a powerful countermovement against progressivism arose in the 1950s, led by such national figures as Harvard President James Bryant Conant, University of Chicago President Robert Maynard Hutchins, and Admiral Hyman Rickover, all of them calling for higher academic standards and more rigorous instruction.7 This debate continues to the present day, with the teacher training establishment and the National Education Association still promoting progressivism, and a host of prominent critics opposing it, including luminaries such as the late Boston University President John Silber, former Assistant Secretary of Education Chester Finn, and several notable public school chancellors in Chicago, New York, and Washington.
The spread of American secondary schools at the turn of the last century raised other questions. First, at what age should elementary education end? The research on early adolescent psychological development that emerged at the time, together with the high dropout rates of children who went to four-year high schools directly after eight years of elementary school, persuaded educators that there had to be a transitional, intermediate institution. Thus, beginning in 1909 in Columbus, Ohio, and 1910 in Berkeley, California, secondary education came to be articulated into two segments: a transitional school for early adolescents (called junior high or middle school) and high school for older teens. But the precise cutoff ages at which to begin the transition (ten, eleven, or twelve) or to end it (fourteen or fifteen) have yet to be settled. Given the uncertainty of the subjects to be taught—and how they are to be taught—in the transition years, and the painfulness of teaching anything to early adolescents, few districts have been happy with any of the many variations they have tried out over the years.
More seriously, educators needed to agree on what should be taught—and to whom—in high school. From their widespread introduction in the early twentieth century until the 1960s, the national consensus was that high schools should sort students by academic ability, as measured by standardized aptitude or IQ tests (administered in elementary or middle school), and tailor each student’s coursework to reflect his or her intellectual and occupational capacities. This sorting led inexorably to channeling students of different abilities—and, to a lesser extent, boys and girls—into different academic and, ultimately, career paths. The cognitively gifted were to be prepared for college; the less gifted taught enough to function competently in lower- and mid-level white-collar jobs (such as girls being groomed for secretarial work); and the least academically oriented (mainly boys) trained for vocational or technical trades, either in comprehensive high school vocational tracks or in specialized vocational schools.8
There were certain undeniable benefits to having academically stratified high schools. They could require rigorous courses for college preparatory students, assuring a very high degree of academic readiness for the college-bound, and they supplied the middle and lower tiers of the labor market with an army of competent secretaries, clerks, and skilled tradesmen. They were also reasonably effective in matching the abilities of young people to career paths in which they were likely to succeed. And though American high schools were internally stratified academically, in other respects they were consistent with the country’s democratic, egalitarian values. Unlike in the stratified secondary school systems of Europe, all local children attended the same high school, and the American high school experience always included a wide range of social and athletic pursuits open to all students—ones in which students lacking cognitive or social advantages could excel. On the negative side, scholastic stratification was bound to reinforce prevailing patterns of class and racial differentiation, especially for African Americans and poor children generally.
PARADIGM SHIFTS
The rapid diffusion across the United States of a standardized national template for universal education, beginning 150 years ago, can be seen for most of American history as a great triumph, and is largely responsible for the United States being, until the day before yesterday, so far ahead of the rest of the world in giving all of its citizens a sound education. But this historical paradigm is no longer adequate. If the United States wants to remain the world’s best-educated country, it must revamp the entire set of institutions and policies, from preschool through college, that constitutes the American educational system.
Over the last five decades, with accelerating force, America’s long-settled educational paradigm has been transformed by four major developments. The first was a new self-consciousness regarding the nation’s international standing on indicators of educational achievement. Until the Cold War, without giving the matter much thought, Americans were supremely complacent about the superiority of their educational system relative to that of any other country. For much of the last century, most of the world was barely literate, and even the richer countries of Europe clearly lagged behind the United States in how well their people were educated (as indicated in the data cited earlier). However, the launch of the world’s first space satellite, Sputnik, by the Soviet Union in 1957 was a wake-up call that suddenly cast into doubt America’s vaunted global technological superiority and, by extension, the quality of its schooling in science and mathematics. The immediate result of this new sense of vulnerability was a significant increase in public school funding and instructional effort in these subjects. The long-term effect was to instill in Americans a gnawing anxiety and inferiority complex about the quality of their schools in any international matchup.
The second, and perhaps most important, paradigm shift affected the way professional educators (and the public) looked at—and took responsibility for—the academic success or failure of public school students. Before the 1960s, everyone unquestioningly believed not only that every American child’s academic (and, therefore, career) prospects were shaped by some combination of his or her genetic endowment and family status, but also there