1 Starving for Progress
In the late 1940s, anglers who fished the waters of the Hudson River near Orangetown, New York, noticed something odd about the trout they were reeling in: every year, the fish were getting larger. Fishermen rarely complain about big fish, but because the creatures in question were being hooked downstream from Lederle Laboratories, a pharmaceutical company, some may have wondered whether the phenomenon was entirely natural. Eventually, the fish stories reached the upper management at Lederle, where they piqued the curiosity of Thomas Jukes, a brilliant biophysicist and expert in the new field of vitamin nutrition, who decided to investigate. Jukes knew that Lederle discharged its factory wastes in great piles near the river. He also knew that one such waste product was a residual mash left over from the fermentation process that Lederle used to make its hot new antibiotic tetracycline. Jukes surmised that the mash was leaching into the river and being eaten by the fish, and that something in the mash—Jukes dubbed it a "new growth factor"—was making them larger.
Initially, Jukes suspected the factor might be vitamin B12, a newly identified nutrient that was known to boost growth in laboratory animals. The vitamin was a byproduct of fermentation, so it was very likely to be in the mash. But when Jukes and a colleague, Robert Stokstad, tested the mash, they found something quite unexpected, and even world-changing: although B12 was indeed present, the new growth factor wasn’t that vitamin but the tetracycline itself. When mixed with cornmeal and fed to baby chickens, even tiny doses of the amber-colored antibiotic boosted growth rates by an unprecedented 25 percent.
Jukes wasn’t sure why this was happening. He speculated (correctly, as it turned out) that the tetracycline was treating the intestinal infections that are routine in closely confined farm animals, and that calories that normally would have been consumed by the chicks’ immune system were going instead to make bigger muscles and bones. In any case, the phenomenon wasn’t limited to baby chickens. Other researchers soon confirmed that low, subtherapeutic doses of tetracycline increased growth in turkeys, calves, and pigs by as much as 50 percent, and later studies showed that antibiotics made cows give more milk and caused pigs to have more litters, more piglets per litter, and piglets with larger birth weights. When the discovery was announced to the world in 1950, Jukes’s new growth factor was the closest thing anyone had ever seen to free meat, and a welcome development amid rising concerns over food supplies in war-torn Europe and burgeoning Asia. As the New York Times put it, tetracycline’s "hitherto unsuspected nutritional powers" would have "enormous long-range significance for the survival of the human race in a world of dwindling resources and expanding populations."
Jukes’s discovery would indeed have enormous long-range significance, although not quite in the ways the Times envisioned. By the middle of the twentieth century, the global food system was in the throes of a massive transformation. In even the poorest of nations, thousand-year-old methods of farming and processing were being replaced by a new industrial model of production that could generate far more calories than had been possible even a generation earlier—and which seemed poised to end the cycle of boom and bust that had plagued humanity for eons. But the great revolution was incomplete. For all our great success in industrializing grains and other plants, the more complex biology of our cattle, hogs, chickens, and other livestock defied the mandates of mass production. By the early twentieth century, meat—the food that humans were built for and certainly the food we crave—was still so scarce that populations in Asia, Europe, and even parts of the United States suffered physical and mental stunting, and by the end of World War II, experts were predicting global famine.
Then, abruptly, the story changed. In the aftermath of the war, a string of discoveries by researchers like Thomas Jukes in the new fields of nutrition, microbiology, and genetics made it possible to produce meat almost as effortlessly as corn or canned goods. We learned to breed animals for greater size and more rapid maturation. We moved our animals from pastures and barnyards into far more efficient sheds and feed yards. And we boosted their growth with vitamins and amino acids, hormones and antibiotics (it would be years before anyone thought to ask what else these additives might do). This livestock revolution, as it came to be known, unleashed a surge in meat production so powerful that it transformed the entire food sector and, for a brief time, allowed many of us to return to the period of dietary history that had largely defined us as a species—and where the story of the modern food economy properly begins.
By most accounts, that narrative started about three million years ago, with Australopithecus, a diminutive ancestor who lived in the prehistoric African forest and ate mainly what could be found there—fruits, leaves, larvae, and bugs. Australopithecus surely ate some meat (probably scavenged from carcasses, as he was too small to do much hunting), but most of his calories came from plants, and this herbivorous strategy was reflected in every element of Australopithecus’s being. His brain and sensory organs were likely optimized to home in on the colors and shapes of edible (and poisonous) plants. His large teeth, powerful jaws, and oversize gut were all adapted to coarse, fibrous plant matter, which is hard to chew and even harder to digest. Even his small size—he stood barely four feet tall and weighed forty pounds—was ideal for harvesting fruit among the branches.
So perfectly did Australopithecus match his herbivorous diet that our story might well have ended there. Instead, between 3 million and 2.4 million years ago, Australopithecus got a shove: the climate began to cool and dry out, and the primeval jungle fragmented into a mosaic of forest and grasslands, which forced our ancestors out of the trees and into a radically new food strategy. In this more open environment, early humans would have found far less in the way of fruits and vegetables but far more in the way of animals, some of which ate our ancestors, and some of which our ancestors began to eat. This still wasn’t really hunting, but scavenging carcasses left by other predators—yet now with an important difference: our ancestors were using stone tools to crack open the leg bones or skulls, which other predators typically left intact, to get at the calorie-rich, highly nutritious marrow and brains. Gradually, their feeding strategies improved. By around 500,000 years ago, the larger, more upright Homo erectus was using crude weapons to hunt rodents, reptiles, even small deer. Erectus was still an omnivore and ate wild fruit, tubers, eggs, bugs, and anything else he could find. But animal food—muscle, fat, and the soft tissues like brains and organs—now made up as much as 65 percent of his total calories, almost the dietary mirror image of Australopithecus.
On one level, this shift away from plants and toward animal food was simple adaptation. All creatures choose feeding strategies that yield the most calories for the least effort (anthropologists call this optimal foraging behavior), and with fewer plant calories available, our ancestors naturally turned to animal foods as the simplest way to replace those calories. But what is significant is this: even if the m...