In June 1822, a man named Alexis St. Martin
was accidentally shot in the stomach at Fort
Mackinac, Michigan. When a surgeon named William
Beaumont arrived on the scene, the situation
was dire. As Beaumont described it, “A large
portion of the side was blown off…[there
was a] perforation made directly into the
cavity of the stomach, through which food
was escaping.” Gross.
Beaumont eventually nursed his patient back
to health, so successfully that St. Martin
was able to one day paddle his family in a
canoe from what is now Wisconsin to Montreal.
Though St. Martin was able to live a relatively
unencumbered life after the accident, the
wound never completely closed up. For the
rest of his life, as Richard Wrangham describes
in his captivating book, “Catching Fire,”
“ … the inner workings of his stomach
were visible from the outside.”
Hi, I’m Justin Dodd. Welcome to Food History.
Other than morbid curiosity, why should we
care about a guy whose guts were publicly
available information? Well, in addition to
being an effective surgeon, Beaumont knew
an opportunity when it arose. In St. Martin,
he had an almost-literal window into the workings
of the human body. The information he gleaned
through observation of his unique subject
gave us insights that might otherwise have
been impossible to learn about the digestive
system. In an indirect way, they might even
help us understand the incredible contributions
made to humanity by cooking. They give us
a jumping off point to discuss some of the
foods that have had the biggest impact in
the history of humanity, all part of an attempt
to answer a fascinating question offered up
by one of our audience members.
It’s a story that takes a couple million
years and involves intrepid volunteers chewing
raw goat meat, international trade, and the
birth of agriculture. Not in that order. Let’s get started!
“Which food has done the most to shape human
development?” When an audience member named
Mike posed that question to us in the comments
of our video about nachos, it got us thinking.
Like a lot of interesting questions, it’s
kind of impossible to answer, but it raises
a number of intriguing questions of its own.
How do you define human development? Heck,
how do you define food?
For our purposes, if only to keep this video
from being nine hours long, food is something
you eat. As in, no beverages. Sorry water,
milk, tea, and beer—although, side note,
our next episode is going to be all about
beer, so don’t go crying into your unspecified
fermented beverage just yet. In fact, if you
have a cool fact about beer, drop it in the
comments below.
Even sticking to solids, we’re left with
a crowded field of contenders. We batted ideas
around in the office (back when being in the
office was a thing); we asked the AskFoodHistorians
subreddit for their insight, since, you know,
it’s right there in their name. They pointed
us to some excellent books, which we’ll
pull from throughout this video. We’re even
gonna bring in the author of one of those
books a little bit later for his perspective.
We’re going to mostly talk about just four
types of food today—tubers, meat, sugar,
and cereal grains (especially wheat and barley)—but
the choices were made with a healthy dose
of humility. You could make a convincing argument
for rice or maize, salt or pepper. No single
ingredient could ever tell the entire history
of human development, or of food’s role
in it, but hopefully each one of our choices
tells us something interesting about the way
that food and humanity have influenced one
another.
OK, back to Beaumont, our inquisitive doctor
from the top of the video. He studied his
hole-y subject on and off for years, often
introducing different foodstuffs tied to a
string directly into St. Martin’s fistula.
Yeah, the hole in his abdomen. He’d then
pull the strings back out to note, among other
things, the time it took to digest various
items. He drew a couple conclusions that would
prove illuminating, even decades later. For
one, tender food with greater surface area—what
he called “minuteness of division”—was
digested faster. And cooked food, including
potatoes and meat, was processed dramatically
faster than raw food.
Almost two centuries later, Professor Richard
Wrangham came to St. Martin’s story from
the perspective of a biological anthropologist—before
he became fascinated with cooking, much of
his work focused on the differences between
human beings and other primates, especially
chimpanzees. Wrangham uses Beaumont’s takeaways
as one piece of what he would eventually call
the “cooking hypothesis,” a fairly persuasive
argument that cooking is perhaps the defining
difference between us and our Homo habilis
ancestors.
The cooking hypothesis says, an ape became
human because it learned to cook. Cooking
transformed us, our ancestors I should say,
partly because it gave a lot of energy, and
that energy was available for new activities.
Like, traveling farther. Like, having babies
faster. Like, having a better immune system
that gave us better defense against diseases.
It also made our food softer, which meant
that the mouth could be smaller, the teeth
could be smaller, the gut could be smaller
because the food was also more digestible.
And at the same time by the way, cooking meant
that fire was being used to heat the food,
and the acquisition of the control of fire
meant that our ancestors could, for the first
time, safely sleep on the ground, defended
by the fire. So they no longer had to adapt
to climbing in trees. So that meant that they
could really fully adapt, for the first time,
to walking and running on the ground.
In a 1999 piece published in Current Anthropology,
Wrangham and his colleagues began to make
this argument with a particular focus on tubers,
rhizomes, and corms—think potatoes, cassava,
taro root, and yams—which they call “underground
storage organs.” Sexy.
They were basically arguing that everyone
had placed too much emphasis on meat-eating
to explain the physiological and social changes
that happened around 1.8 million years ago
and led to the emergence of Homo erectus.
Wrangham’s argument is really about focusing
on a technology (fire) over an ingredient
(meat). But unfortunately for the purposes
of this video, fire isn’t a food (although
there are fire eaters, so maybe it is? No. That's stupid.
Disregard that.) Tubers are basically a way
for us to discuss cooking-with-fire separately
from meat-eating.
And it’s not like the researchers chose
“underground storage organs” arbitrarily.
For one, by the researchers’ estimation,
tubers would’ve been a much more plentiful
food source than meat—one that was unavailable
to many other animals, since they likely would
have required digging sticks to access. They
also pointed towards evidence that ancestors
a million years before the emergence of Homo
erectus may have consumed meat without the
corresponding physiological changes. Finally,
they analyzed the hypothetical impact of increased
meat consumption versus cooking underground
storage organs and determined that cooking
would provide a greater potential increase
in daily energy intake, even over a dramatic
rise in meat-eating. Did you get all that?
The piece also suggested that cooking tubers
could help explain the emergence of male-female
bonding. Cooking changes the site of consumption.
Rather than foraging and eating what you find
in the place you find it, cooking involves
bringing food back to a central location.
It trades the security of eating your food
immediately for the extra efficiency of letting
the fire do some of the digestion for you.
It’s a system fraught with risk for the
one doing the cooking. Cooked food, the researchers
argue, would be a more tempting target for
would-be thieves, given its increased tenderness
and relatively easy locatability. They outline
a hypothetical path to male-female bonding
that centers around females protecting their
tuber-centric resources by forming alliances
with males.
This also, in their understanding, helps explain
the persistent division of labor found between
the sexes. The most traditional division—men
hunt, women gather—is ultimately unworkable
for Wrangham with a raw food diet. He imagines
an unsuccessful hunter returning empty-handed
at nightfall. Even if his female partner had
gathered tubers for him to eat, he would have
an entire night of chewing ahead of him. Amongst
apes, and by Wrangham’s association, amongst
our pre-cooking apelike ancestors, a large
part of the day is spent chewing. (His interest
in methods to ease this burden led Wrangham
to conduct “ … an informal experiment
in which friends and I chewed raw goat meat
…” with avocado leaves to help accelerate
the breakdown of the raw meat. Note to self:
ignore all party invitations from Richard
Wrangham.) Wrangham sees the use of fire,
and the reduction in chewing time it entails,
as a way to free up time in the day (and,
indeed, time at night now illuminated by those
fires). Hunting could then grow from a sporadic
activity brought about by opportunity, as
it generally functions amongst modern primates
like chimpanzees, into a consistent enterprise
diversifying the hominid larder.
Reading Wrangham’s counter-narrative is
fascinating, but it received far from universal
acceptance upon its publication. In a devastating
academic putdown in the comments of that 1999
article, Professor C. Loring Brace thanks
the study’s authors for inviting his comments
despite the fact that, quote, “ … they
were fully aware of the fact that I look upon
their gambit as belonging more to the realm
of anthropological folklore than to that of
science.” He goes on to damn them with the
faint praise that their hypothesis “ … may
not be science, but ... has the makings of
an absolutely charming story.” Savage.
Critically, Brace argues, the researchers
ignored evidence for the use of certain tools
that the hominids they’re discussing did,
in fact, have access to. Those tools would
provide an alternate means of externalizing
digestion. Basically, instead of requiring
fire to help digest meat, you could make a
proto-steak-tartare by cutting meat off a
scavenged aurochs and bashing it with a club.
Evolutionary biologists Katherine D. Zink
and Daniel E. Lieberman published a study
in 2016 looking into exactly this kind of
mechanical processing, by having people eat,
among other things, more raw goat. What's with
the eating of the raw goat? Did I miss this viral trend?
They concluded that cooking wasn’t necessary
to bring about the physiological changes that
Wrangham was trying to account for, and that
“ … meat eating was largely dependent
on mechanical processing made possible by
the invention of slicing technology.” So in
their understanding, cooking might still have
a place in this pivotal moment of human development,
but our definition of cooking might have to
expand to encompass preparation methods outside
of applying heat.
There’s a big reason this explanation is
appealing. The cooking hypothesis requires
a degree of fire control by hominids dating
back 1.8 million years. At the time of
that 1999 piece, the most compelling evidence
for the human control of fire dated back only
about 250,000 years. Intriguingly, the intervening
years have seen new archeological discoveries
that provide evidence for controlled use of
fire well before that time frame, with compelling
evidence pointing as far back as a million
years ago. Unfortunately for Wrangham, that
leaves about 800,000 years of supposed fire
use unaccounted for in the archaeological
record. If we accept even earlier estimates,
going back as far as 1.6 million years, there’s
still a gap in the record. Wrangham suggests
that the absence of evidence isn’t necessarily
evidence of absence. And while he’s logically
correct, fire does have a habit of leaving
visual traces of its existence. You know,
burnt ground, rings of stones, that kind of
thing. As Anna K. Behrensmeyer, paleoecologist
at the Smithsonian National Museum of Natural
History said, back in 1999, “I think there
would be evidence if it were [behind] as important
an evolutionary leap as [Wrangham’s team]
suggests.”
My point was, we need to have this theory
in order to encourage people to look for the
evidence. And so it's been very exciting seeing
that the evidence for the control of fire
has been pushed farther back. The evidence
is getting in the right direction. Personally
I'm disappointed because I thought within
ten years we would totally nail it. Give us
another ten years and we'll see it all the
way back.
That doesn’t mean we have to throw out Wrangham’s
insights, or his fascinating book, which I
definitely recommend. But it does mean that
we might be better off thinking about a range
of technological advances, from the development
of tools to better hunting practices to the
use of fire. That makes identifying a single
food’s impact a bit tricky. Tubers and other
underground storage organs were probably important,
perhaps critically so, but they may not deserve
any type of singularly exalted status in our
discussion of cooking.
Meat is obviously incredibly important to
our development as a species—along with
cooking, broadly defined, we might very well
say it’s what made us human. But I have
to admit, on a qualitative level, the life
of a hominid from 1.8 million years ago seems
to share more with the life of his apelike
predecessor than it does with mine. They’re
both living outside, they probably can’t
conceive of language or art, like the Matrix
... Maybe we need to come closer to the present
to identify a food that shaped life as we
know it today.
When we began considering the question at
the heart of this video, one of the first
places my mind went was the spice trade. The
search for spices like pepper (which is a
dried berry, incidentally) has undoubtedly
had a huge impact on human development and
exploration. It’s arguably responsible for
permanently connecting Europe to its neighbors
in the east and west. To give just a small
taste (pun very much intended) of the ways
the spice trade shaped the world: The Dutch
famously traded Manhattan to the British for
an Indonesian island with some nutmeg trees
on it. That’s admittedly an oversimplification,
but it speaks to the enormous value placed
on spices at this time. There are equally
fascinating stories we could tell about cumin
or pepper—we’ll have to devote a whole
video to the spice trade at some point—but
it’s hard to say that one spice jumps out
far above the others in importance.
Salt could also make a super-convincing case
for one of the most important foods ever,
but its impact on the world is so far-ranging
and constant throughout history that it’s
a bit hard to tell a single story of its role
in human development. We can’t live without
it, but is there one pivotal change it's responsible
for? It’s hard to say.
Like salt, sugar is both an ingredient we
can buy in the store and a naturally occurring
chemical compound. If we want to be really
annoying, we could crown sugar our most important
food champion hands-down simply for the vital
role photosynthesis plays in the food chain.
Super-simplified 6th grade science version:
plants take in sunlight, create sugar, provide
oxygen. Animals eat plants, life thrives,
Keanu Reeves is born. Everybody’s happy.
As biochemist Albert Szent-Gyorgyi described
it, “What drives life is thus a little electric
current, kept up by the sunshine.”
But if anything drove me to discuss sugar,
it’s actually an oversight we made in a
previous episode. When we discussed chocolate,
we stuck mostly to its earliest history and
the cool technological advances that eventually
gave us things like Hershey’s bars.
But I think a fair criticism we read in the
comments was that we ignored the role that
slavery played in chocolate, from its early
production in Europe’s Central and South
American colonies all the way up to today.
And if this series is mostly about celebrating
the fun and interesting aspects of the intersection
of food and history, I don’t think that
means it can’t also acknowledge the truly
horrific ways food can impact humanity.
Perhaps no food better represents these awful
contradictions than sugar. Sidney Mintz became
the “father of food anthropology” largely
through his seminal work on sugar, “Sweetness
and Power.” Like the title suggests, Mintz
looks at sugar not from a primarily culinary
lens, but, in his words, as “an old commodity”
“basic to the emergence of a global market.”
A major part of that global market was the
triangle trade connecting Western Europe,
Africa, and the so-called “New World”
of the Americas. Sugar was an incredibly valuable
resource for centuries, and the riches it
produced are inextricable from the labor of
enslaved Africans in sugar plantations.
Though evidence of sugarcane domestication
dates back as far as 10,000 years ago in New
Guinea, it didn’t make its way to Europe
in a big way until around the time of the
Crusades, when Christian soldiers returned
home with “sweet salt.” By then, large
swaths of Asia and the Middle East had mastered
the art of growing and refining sugar, using
it to create desserts and medicinal concoctions,
and even sugar sculptures that functioned
as saccharine status symbols.
“The true age of sugar” began when it
was introduced to the new world, according
to Marc Aronson and Marina Budhos’s book,
“Sugar Changed the World.” And while one
could perhaps accuse the authors of some geographical
bias in that pronouncement—it was certainly
serving an important cultural role before
that in Asia—it’s impossible to deny
that the role of sugar changed immeasurably
when it arrived in the New World via Christopher
Columbus. Yes, that Christopher Columbus.
In fact, according to historian Jack A. Goldstone,
“The first documented revolt of African
slaves in the Americas broke out around Christmas
1521 in southeastern Hispaniola on a sugar
estate owned by the eldest son of Christopher
Columbus.”
Growing and harvesting sugar is a labor-intensive
process, and the rise of sugar plantations
caused an intensification of the African slave
trade. Millions of human beings were enslaved
and brought to the New World to work on sugar
plantations—some 5 million to the Caribbean
alone.
Even in the context of slavery, work on the
sugar plantations was particularly back-breaking.
By the estimate of a white Barbados planter
named Edward Littleton, someone forced to
work on the sugar plantations had a lifespan
of somewhere between ten and seventeen years,
on average.
The impact on indigenous populations was also
devastating. Though some indigenous people
initially were pressed into labor on the sugar
plantations, “In the Caribbean the [indigenous]
population became virtually extinct within
a generation” of European contact, according
to Professor Linda A. Newson, through a combination
of brutal treatment and the introduction of
Old World diseases.
In the United States, Louisiana’s sugar
industry rose in step with its reliance on
the labor of enslaved people, making the state
the country’s second-richest in per capita
wealth. There, the inhumane working conditions
led to a pattern of “deaths exceeding births,”
according to historian Michael Tadman.
Even after slavery was abolished, “plantation
labor overshadowed black people’s lives
in the sugar region until well into the 20th
century,” according to John C. Rodrigue.
By the way, these last few quotes come from
a great piece by Khalil Gibran Muhammad in
the New York Times. There’s a link to it in
the description below.
The interplay of sugar, wealth, and power
wasn’t limited to the Caribbean or American
South. Many historians argue that the United
States’ annexation of Hawaii was tied closely
to sugar production and the cheap labor it
relied on. The McKinley Tariff, passed in
1890, made Hawaiian sugar uncompetitive in
the American market. Whether the white Hawaiian
planters’ motivation was purely monetary
or was tied into a fear of losing their dominance
over the poorly paid and numerous Asian laborers
working in Hawaii, by 1893 the queen of Hawaii
had been deposed and by 1898 Hawaii was annexed
by the United States.
We continue to feel the impact of sugar in
our world today, whether we look to the inequality
faced by the descendants of enslaved people
or to the deleterious effects of obesity and
other health problems associated with excessive
sugar consumption (health problems, it’s
worth pointing out, that disproportionately
affect African-Americans.)
It’s clear that sugar changed the modern
world—perhaps irrevocably so. But that leaves
a question unanswered: how did the modern
world arise?
In his book, “Against the Grain,” James C. Scott
lays out the generally-accepted story of civilization:
sedentism, or the practice of living in one
place for a long time, arose from the cultivation
of cereal grains like wheat and barley—especially
the need to irrigate arid climates, which
takes lots of time and labor. Scott then spends
a couple hundred pages dunking on most of
the assumptions underpinning that story, and
questions whether it’s at all appropriate
to view the “civilizing process” as one
of generally uninterrupted progress.
Archaeological evidence actually indicates
that sedentism predates crop field cultivation
of grains by several millennia. Early settlers
probably cultivated wild grains for thousands
of years as part of a diverse food production
strategy, but the shift to a near monoculture
of deliberately domesticated cereal grains
seems to have arisen much later. (Cereal grains,
by the way, are just the edible grains
of grasses. They account for about 50% of
worldwide caloric consumption today, for both
human beings and livestock).
Research from landscape archaeologists like
Jennifer Pournelle suggests the “arid”
Southern Mesopotamia was, in fact, vastly
different during the first instances of sedentism
in the region. Rather than a dry area surrounded
by rivers in need of labor-intensive irrigation,
higher sea levels at the time rendered it
“ … a forager’s wetland paradise,”
in Scott’s understanding, full of diverse
food sources. He notes that other early settlements,
from coastal China to Teotihuacan near present-day
Mexico City, also benefited from natural wetland
abundance.
In those regions, incidentally, the dominant
cereal grains would’ve been rice and maize,
respectively. Today, more people rely on rice
for sustenance than any other grain, and corn
is the most produced grain worldwide, by tonnage.
So while we’re going to follow Scott’s
book and focus on wheat and barley for the
insights they can give us about the emergence
of the first known states in Mesopotamia,
it’s worth remembering that the story has
near analogues throughout the world with different
cereal grains. You can practically add a parenthetical
“and rice and corn” to everything
I’m going to say about wheat.
Scott’s central premise is that grain domestication
didn’t lead to sedentism as much as it led
to statehood, which he defines along a “stateness”
continuum consisting of things like city walls,
social hierarchy, soldiers, and—critically—taxation.
Grains like wheat are uniquely disposed to
taxation, in Scott’s telling. They are “visible,
divisible, assessable, storable, transportable,
and ‘rationable.’” Tubers may offer
similar caloric density, but they could be
safely hidden underground for years from the
tax-man’s prying eyes. Lentils were grown
widely, but don’t have a determinate harvest—because
they can be picked at various times, they’re
worse candidates for taxation.
“Against the Grain” draws out some fascinating
connections between grains, taxation, and
statehood. The earliest writings from Mesopotamia,
for example, were almost single-mindedly concerned
with state administration, especially as it
pertains to the rations and taxation of barley.
More than 500 years separate this type of
administrative writing and literary or religious
writing in the archaeological record, suggesting
the critical role state-based accounting played
in the emergence of written language.
The other most frequent topics from early
Mesopotamian tablets pertain to population.
Here, too, Scott shows how grain might’ve
influenced societal priorities. While acknowledging
that slavery and war predate the early grain-based
states, Scott sees highly organized farming
and the rise of states as an incentive for
both. In a society based around agriculture,
an increase in population can provide a more
or less direct increase in food production.
Early legal codes are filled with injunctions
discouraging and punishing people fleeing
the state, and warfare of the time seems less
interested in conquering territory than in
increasing the population to produce for the
state.
Scott sees the emergence of city walls as
the twofold result of a society based around
agriculture. On the one hand, large quantities
of stored grains would need to be protected
from the so-called “barbarians” outside
the city walls. Just as importantly, though,
the walls kept the productive laborers of
a city in. When Scott uses the term “barbarian,”
he does so with tongue in cheek, aware that
hunter-gatherer societies generally existed
alongside early agricultural states, interacting
and trading with them and enjoying a quality
of life that was not necessarily any worse,
in his estimation. Scott views the frequent
collapse of early states not necessarily as
a tragedy, but at times perhaps even an emancipation
from the control of elite rulers within the
state.
Beyond the hours of labor required to maintain
a state store of grain and the drawbacks of
being forced to give up a portion of your
wheat, these early states likely contributed
to the spread of so-called “crowding diseases”
like cholera, smallpox, and measles. Scott
points to a confluence of factors arising
from domesticated grain cultivation that would
carry greater risk of disease: increased population
density (and the greater concentration of
feces that it entails); an increase of potentially
disease-carrying domesticated animals; and
a relatively monocultural diet, whose effects
we can see in comparing early farmers’ skeletal
remains with their hunter-gatherer contemporaries
(according to one study, for example, adult
height decreased during the transition from
hunter-gatherer to agriculture). Of course,
today we know that there’s no serious risk
posed by zoonotic diseases in a highly connected society ... right?
Given all these drawbacks, why would people
allow themselves to be “civilized” in
the first place? Scott points to climate change
occurring around 3500 to 2500 BCE as one explanation.
As the Mesopotamian region dried up, relatively
low-labor “flood retreat” agriculture,
making use of annual river flooding, was no
longer a viable method of farming. There were
fewer animals to hunt, and fewer crops to
forage. Water now had to be carried or transported
through dug canals, incentivizing people to
live closer to the source, and by extension
to one another. As people relied increasingly
on grain and the security it could afford,
it became a nearly self-fulfilling prophecy
for the state to tax their production, use
the proceeds to help develop new methods to
increase productivity, and continually grow
the state until its eventual collapse due
to disease, drought, warfare or natural disaster.
Man, humanity is weird!
Scott doesn’t go as far as an author like
Jared Diamond, who called the move away from
nomadism “the worst mistake in human history,”
but it’s hard to read his book as anything
but deeply suspicious of the state. His critics
have accused him and his state-skeptical colleagues
of romanticizing the life of the hunter-gatherer.
While modern hunter-gatherer societies can
be seen, in some ways, to evince more egalitarian
principles than their state-bound counterparts,
there is also compelling evidence of high
homicide rates and infant mortality amongst
these populations. And while some interesting
research has indicated that hunter-gatherers
didn’t suffer from famine as frequently
as previously thought, it’s hard to be certain
how food-secure hunter-gatherers were at the
time of early statehood.
For our purposes, though, a moral reading
of the move to agriculture is beside the point.
Whether you view it as an unmitigated boon
to humanity or, in the words of Yuval Noah
Harari, “ … history’s biggest fraud,”
agriculture may not have caused sedentism,
but it certainly accelerated it immeasurably
through grain farming and the taxation that
was born from it.
Any attempt to answer a question as broad
as “what food has done the most to impact
human development?” is likely to reveal
as much about the person answering it as anything
else. If I were making this video in Mexico,
I might say maize. If I were making it 500
years ago, I might say pepper, and also I might say, "what is video?"
So while the real answer may be “there is
no answer,” that would probably feel like
a total copout at this point. Instead, I’ll
reveal my own biases. I’m coming to you
via the Internet, which was created in part
through publicly funded research. I give away
a healthy chunk of every paycheck to taxes,
and taxes are partially responsible for funding
the public transit agency that I use to get
to work everyday—when my life isn’t being
interrupted by a zoonotic virus spreading
rampantly through a highly connected world,
that is. Industrial farming may be killing
our planet, and yet the specialization of
labor that agriculture helped usher in gives
us Keanu Reeves movies. Life is complicated.
If nothing else, the last few months have
shown me how much we rely on one another.
James Scott’s book tells us that that interreliance
didn’t necessarily have to take the form
of state-building, but—perhaps because of
grains like wheat—that is the form it took.
So sitting here in mid-2020, that feels like
as good an answer as any. Plus, you know:
bagels. Thanks for watching.
