Wheat Belly. William Davis, MD

wildly in open plains. Meals of gazelle, boar, fowl, and ibex were rounded out with dishes of wild-growing grain and fruit. Relics like those excavated at the Tell Abu Hureyra settlement in what is now central Syria suggest skilled use of tools such as sickles and mortars to harvest and grind grains, as well as storage pits for stockpiling harvested food. Remains of harvested wheat have been found at archaeological digs in Tell Aswad, Jericho, Nahal Hemar, Nevali Cori, and other locales. Wheat was ground by hand, then eaten as porridge. The modern concept of bread leavened by yeast would not come along for several thousand years.

      Natufians harvested wild einkorn wheat and stored seeds to sow in areas of their own choosing the following season. Einkorn wheat eventually became an essential component of the Natufian diet, reducing need for hunting and gathering. The shift from harvesting wild grain to cultivating it from one season to the next was a fundamental change that shaped subsequent human migratory behavior, as well as development of tools, language, and culture. It marked the beginning of agriculture, a lifestyle that required long-term commitment to permanent settlement, a turning point in the course of human civilization. Growing grains and other foods yielded a surplus of food that allowed for occupational specialization, government, and all the elaborate trappings of culture (while, in contrast, the absence of agriculture arrested development of other cultures in a lifestyle of nomadic hunting and gathering).

Over most of the ten thousand years that wheat has occupied a prominent place in the caves, huts, and adobes, and on the tables of humans, what started out as harvested einkorn, then emmer, followed by cultivated Triticum aestivum, changed gradually and only in fits and starts. The wheat of the seventeenth century was the wheat of the eighteenth century, which in turn was much the same as the wheat of the nineteenth century and the first half of the twentieth century. Riding your oxcart through the countryside during any of these centuries, you’d see fields of five-foot-tall “amber waves of grain” swaying in the breeze. Crude human wheat-breeding efforts yielded hit-and-miss, year-over-year incremental modifications, some successful, most not, and even a discerning eye would be hard-pressed to distinguish the wheat of early twentieth-century farming from its centuries of predecessors.

      During the nineteenth and early twentieth centuries, as in many preceding centuries, wheat therefore changed little. The Pillsbury’s Best XXXX flour my grandmother used to make her famous sour cream muffins in 1940 was little different from the flour of her great-grandmother sixty years earlier or, for that matter, from that of a distant relative two or three centuries before that. Grinding of wheat became mechanized in the twentieth century, yielding finer flour on a larger scale, but the basic composition of the flour remained much the same.

      That all ended in the latter half of the twentieth century, when an upheaval in hybridization methods transformed this grain. What now passes for wheat has changed, not through the forces of drought or disease or a Darwinian scramble for survival, but through human intervention.

      Wheat has undergone more drastic transformation than the Real Housewives of Beverly Hills, stretched, sewed, cut, and stitched back together to yield something entirely unique, nearly unrecognizable when compared to the original and yet still called by the same name: wheat.

      Modern commercial wheat production has been intent on delivering features such as increased yield, decreased operation costs, and large-scale production of a consistent commodity. All the while, virtually no questions have been asked about whether these features are compatible with human health. I submit that, somewhere along the way during wheat’s history, perhaps five thousand years ago but more likely sixty years ago, wheat changed in ways that yielded exaggerated adverse effects on human health.

      The result: A loaf of bread, biscuit, or pancake of today is different from its counterpart of a thousand years ago, different even from what our grandmothers made. They might look the same, even taste much the same, but there are fundamental biochemical differences. Small changes in wheat protein structure, for instance, can spell the difference between a devastating immune response to wheat protein versus no immune response at all.

       WHAT HAPPENED TO THE FIRST WHEAT-EATERS?

      After not consuming the seeds of grasses for the first 99.6 percent of our time on this planet, we finally turned to them for sustenance ten thousand years ago. Desperation, caused by a shortage of wild game and plants due to a natural shift in climate, prompted Neolithic hunter-gatherers to view seeds of grasses as food. But we cannot save grass clippings gathered from cutting our lawns to sprinkle on top of a salad with a little vinaigrette; likewise, we found out the hard way that, when ingested, the leaves, stalks, and husks of grasses are tasteless and inedible, wreaking gastrointestinal havoc like nausea, vomiting, abdominal pain, and diarrhea, or passing through the gastrointestinal tract undigested. The grasses of the earth are indigestible to humans (unlike herbivorous ruminants, who possess adaptations that allow them to graze on grasses, such as multi-compartment stomachs and spiral colons that harbor unique microorganisms that break grasses down).

      It must have taken considerable trial and error to figure out that the seeds of grass, removed from the husk, then dried, pulverized with stones, and heated in water, would yield something that could be eaten and provide carbohydrate nourishment. Over time, increased efficiencies in harvesting and grinding allowed grass seeds to play a more prominent role in the human diet.

      So what became of those first humans who turned to the seeds of wheat grass to survive?

      Anthropologists tell us that there was an explosion of tooth decay and tooth abscess; microorganisms of the mouth and colon changed; the maxillary bone and mandible of the skull shrank, resulting in crooked teeth; iron deficiency anemia became common; the frequency of knee arthritis doubled; and bone length and diameter decreased, resulting in a reduced height of five inches in males, three inches in females.1, 2, 3, 4

The explosion of tooth decay, in particular, is telling: Prior to the consumption of the seeds of grasses, tooth decay was uncommon, affecting only 1 to 3 percent of all teeth recovered. This is extraordinary, as non-grain-eating humans had no fluoridated water or toothpaste, no toothbrushes, no dental floss, no dentists, no dental insurance card, yet had perfectly straight, healthy teeth even to old age. (Yes, ancient humans lived to their fifties, sixties, and seventies, contrary to popular opinion.) When humans first turned to grains—einkorn wheat in the Fertile Crescent, millet in sub-Saharan Africa, and maize and teosinte in Central America—tooth decay exploded: 16 to 49 percent of teeth showed decay and abscess formation, as well as misalignment, even in young people.5

      Living in a wild world, hunting and gathering food, humans needed a full set of intact teeth to survive, sometimes having to eat their food raw, which required prolonged, vigorous chewing. The dental experience with wheat and grains encapsulates much that is wrong with their consumption. The amylopectin A carbohydrate that provides carbohydrate calories may allow survival for another few days or weeks, but it is also responsible for the decline in dental health months to years later—trading near-term survival in exchange for long-term crippling changes in health at a time when mercury fillings and dentures were not an option. Over the centuries, human grain consumers learned they had to take extraordinary steps to preserve their teeth. Today, of course, we have a multi-billion dollar industry delivered by dentists, orthodontists, toothpaste manufacturers, and so forth, all to largely counter the decay and misalignment of teeth that began when humans first mistook the seeds of grasses for food.

      WHEAT BEFORE GENETICISTS GOT HOLD OF IT

Wheat is uniquely adaptable to environmental conditions, growing from Jericho, 850 feet below sea level, to Himalayan mountainous regions 10,000 feet above sea level. Its latitudinal range is also wide, extending from as far north as Norway, 65° north latitude, to Argentina, 45° south latitude. Wheat occupies sixty million acres of farmland in the United States, an area equal to the state of Ohio. Worldwide, wheat is grown on an area ten times that figure, or twice the total acreage of Western Europe. After all, Domino’s has lots of pizzas to sell at $5.99.

      The first wild, then cultivated, wheat was einkorn, the great-granddaddy of all subsequent wheat. Einkorn has the simplest genetic code of all wheat, containing only fourteen chromosomes. Circa 3300 BC, hardy, cold-tolerant