The Smart Society. Peter D. Salins
American Education
THE KEYSTONE OF A SMART SOCIETY
Building a smart, high-human-capital society begins with making sure that all children get a good education. A century and a half ago, within decades of its independence, America became the world’s most human-capital-rich—i.e., smartest—country by establishing the world’s best education system and having the world’s most-educated people. That is no longer true. While most young Americans today—across all ethnic and socioeconomic categories—are better educated than their parents or grandparents, they are not necessarily educated well enough for the contemporary, globalized knowledge economy. In contrast to only a few decades ago, they are no longer better educated than their peers in other advanced—and some not-so-advanced—countries. Some of this international comparative deficiency is due to “the rise of the rest,” but, whatever the reason, it is a condition that should rightly concern all Americans. Also, the quality of education across the United States varies far too widely—among Americans of different backgrounds and among the fifty states. How America’s educational shortcomings can be remedied is the subject of this and the following three chapters.
Americans can once again become the best-educated people in the world through a few strategic interventions at key points in the schooling trajectory. One set of reforms should be aimed at closing two key academic performance gaps plaguing the country’s K–12 public schools: the vast gulf separating the achievement of disadvantaged American youngsters of all ethnic groups from that of the great majority of non-disadvantaged mainstream children (which I discuss as the “Megagap” in chapter 3), and the smaller but much more pervasive one separating the achievement of mainstream students from that of the most privileged American youngsters and their mainstream foreign peers (the “Mainstream” gap discussed in chapter 4). Reforms of higher education should be aimed at enabling all qualified American high school graduates to enroll in (and complete) college or other relevant post-secondary training, and at making sure that this experience is academically or professionally rigorous.
AMERICA’S HISTORIC LEAD
What is considered to be the right number of years to spend in school and what should be taught there obviously change over time. Two hundred years ago, when Americans first laid the groundwork for a system of universal education—universal in that it was not limited to the children of the rich—being an educated person required about six years of school, what we today would call an elementary education. One hundred years ago, to be well educated meant completing high school. After World War II, the educational gold standard became a college education. Now, increasingly, to be at the educational frontier requires a graduate or professional degree.
Whatever the contemporary educational threshold, Americans always got there first: building the institutional infrastructure, providing for public funding in whole or in part, and establishing the ancillary quality-control processes (like testing and accreditation). As noted in the previous chapter, even before the Colonies won independence from England, American schools were accessible to all children in their catchment areas, at a time when England and continental European countries restricted schooling to children of the aristocracy or the more affluent members of the merchant and professional classes. Throughout the nineteenth and twentieth centuries, from the Land Ordinance of 1785 through the G.I. Bill of 1944, American education for students of all ages expanded by leaps and bounds, always staying far ahead of even the most enlightened European countries and somewhat ahead of America’s neighbor to the north, Canada (table 2.1).
From the time of the revolution through the early decades of the nineteenth century, most American schoolchildren were taught in what educational historians label “district schools” because they were organized and paid for by local school districts, and generally territorially compact enough so that all pupils could walk to the nearest school. Characterized now as “one-room schoolhouses,” these district schools were unquestionably quite primitive by later educational standards, having fairly rudimentary 3-R (reading, writing, arithmetic) curricula, mixing children of all ages and abilities in the same classrooms, and employing teachers whose own education was quite limited. Nevertheless, even these basic educational facilities were revolutionary for the time. They were open and usually free to all local children and they brought together children not only of all ages but of all social classes as well, because all but the very richest families sent their children to them.1
Table 2.1
Public Elementary School Attendance in the United States and Selected Countries, 1870–1900 (percent of population ages 5–19)
Sources: For U.S.: Thomas D. Snyder, ed., 120 Years of American Education: A Statistical Portrait (National Center for Education Statistics, January 1993). For Canada: B. R. Mitchell, International Historical Statistics: The Americas, 1750–2000 (New York: Palgrave Macmillan, 2003). For Europe: B. R. Mitchell, International Historical Statistics: Europe, 1750–2000 (New York: Palgrave Macmillan, 2003).
In one of the most striking educational developments of the young United States, district schools throughout the new states were determined to stifle at an early age the kind of speech-related class distinctions that then and now have plagued the people of England. Strongly influenced by Noah Webster, they consciously and universally propagated a uniform way of writing (including spelling) and speaking that we today recognize as Standard American English.2
Beginning in the 1830s, spurred on by the enormously influential educational reformer from Massachusetts, Horace Mann, and responsive to the growing educational requirements of a rapidly industrializing country, a powerful national education reform movement strove to substantially upgrade the curricular content, teaching effectiveness, and time span of America’s public schools.3 In Mann’s own words,
After the State shall have secured to all its children, that basis of knowledge and morality, which is indispensable to its own security; after it shall have supplied them with the instruments of that individual prosperity, whose aggregate will constitute its own social prosperity; then they may be emancipated from its tutelage, each one to go whithersoever his well-instructed mind shall determine.4
This effort, referred to by education scholars as “the common school movement,” gained great momentum and spread like wildfire across the new nation. Illinois imposed universal schooling along these lines in 1825, followed by New York in 1830 and Massachusetts in 1837, and by the 1860s every northern, midwestern, and frontier state had done the same. Only in the South was the movement thwarted before the Civil War; there, affluent whites sent their children to private schools, while poor whites and blacks (then enslaved) received a meager district school education at best. In the Reconstruction period after the Civil War, unsegregated common schools open to both white and black schoolchildren were established in all southern states, but with the collapse of Reconstruction after 1875 and the adoption of Jim Crow laws throughout the South, these schools became segregated and remained so until the passage of civil rights legislation in the 1960s.
Many of the characteristic aspects of American schools today, most particularly elementary schools, were set in stone by the common schools of the nineteenth century. Among the most distinctive at the time—and the most enduring—was the belief that teachers should receive specialized training, which resulted in the establishment of “normal schools” to prepare them with supposedly scientific methods of teaching (i.e., “pedagogy”). America’s first normal school was launched in Boston in 1838, and by the end of the century the lion’s share of the national teaching force