Why Did Humans Adopt Agriculture? A Biochemical Hypothesis

The transition from hunter-gatherer societies to agricultural civilizations represents one of the most consequential shifts in human history. Yet despite decades of research, there is no universally accepted explanation for why this transition occurred. A paper published in Australian Biologist by Greg Wadley and Angus Martin of the University of Melbourne’s Department of Zoology proposed a provocative hypothesis: that the psychoactive properties of cereal grains and dairy products, not their nutritional value, were the primary incentive for the adoption of agriculture and the subsequent emergence of civilization.
The Paradox of Agriculture
The conventional narrative of agricultural development assumes it was an inherently progressive step: humans discovered that planting seeds produced crops, leading to food surpluses, larger populations, and eventually civilization. However, paleopathological evidence has systematically challenged this view.
Health deteriorated in populations that adopted cereal agriculture. Hunter-gatherers typically performed less work for equivalent food intake, maintained better health, and experienced less famine than early farmers. Archaeological studies of skeletal remains show that early agricultural populations suffered from increased malnutrition, dental disease, infectious illness, and reduced stature compared to their foraging predecessors.
Moreover, many hunter-gatherer groups were fully aware of agricultural methods but deliberately chose not to adopt them. As researcher Mark Cohen summarized the central puzzle: “If agriculture provides neither better diet, nor greater dietary reliability, nor greater ease, but conversely appears to provide a poorer diet, less reliably, with greater labor costs, why does anyone become a farmer?”
The Discovery of Exorphins
In the late 1970s, researchers investigating possible links between diet and mental illness made a surprising discovery. Psychiatrist F. Curtis Dohan found that symptoms of schizophrenia were partially relieved when patients were placed on diets free of cereals and milk. He also found that individuals with coeliac disease, who have higher than normal intestinal permeability to wheat proteins, were statistically more likely to develop schizophrenia.
Research groups led by Zioudrou and Brantl independently identified opioid-active peptides in wheat, maize, barley, and bovine and human milk. These compounds, named exorphins (exogenous morphine-like substances), are produced during the digestion of cereal and milk proteins. Cereal exorphins proved more potent than bovine casomorphin, which was in turn more potent than human casomorphin.
Additionally, researchers identified an analogue of MIF-1, a naturally occurring dopaminergic peptide, in wheat and milk. This compound occurs in no other exogenous protein. Subsequent studies showed that exorphins were comparable in potency to morphine and enkephalin, were absorbed through the intestinal wall, and produced measurable effects including analgesia and anxiety reduction.
Mycroft and colleagues estimated that normal daily intake of cereals and milk could produce approximately 150 milligrams of the MIF-1 analogue, noting that such quantities are orally active and that half this amount “has induced mood alterations in clinically depressed subjects.”
Food Intolerance and Addictive Behavior
Research into food intolerance provided additional evidence for the drug-like properties of agricultural staples. In clinical trials, wheat triggered reactions in more than 70 percent of subjects tested, followed by milk at 60 percent. The list of the most problematic foods corresponds closely, and in roughly the same order, to the list of foods that became dietary staples after the adoption of agriculture.
Strikingly, approximately 50 percent of food intolerance patients exhibited cravings, addiction, and withdrawal symptoms related to the offending foods. The withdrawal symptoms resembled those associated with conventional drug addictions. Patients frequently craved the exact foods that caused their symptoms, a pattern inconsistent with nutritional need but entirely consistent with chemical dependence.
These foods were not significant in the human diet before agriculture, which rules out any explanation based on evolved nutritional requirements. The pre-agricultural diet, inherited from primate ancestors, consisted primarily of fruits, nuts, vegetables, tubers, and meat, none of which exhibit these pharmacological properties.
The Exorphin Hypothesis
Wadley and Martin proposed that these two unresolved problems, the unexplained adoption of agriculture and the unexplained drug-like properties of agricultural staples, might in fact solve each other. Their hypothesis was as follows:
Climatic changes at the end of the last glacial period increased the size and concentration of wild cereal patches in certain regions. People who consumed significant quantities of these newly available grains discovered their rewarding pharmacological properties. Processing methods including grinding and cooking were developed to increase the palatability and quantity that could be consumed, which in turn amplified exorphin intake.
The chemical reward provided by exorphins created an incentive to protect, cultivate, and settle around cereal patches. This reward, experienced several times a day at every meal, facilitated the behavioral changes necessary for civilization: tolerance of crowded sedentary living, willingness to perform regular labor for the benefit of non-kin, and acceptance of hierarchical social structures.
The Correlation Between Cereals and Civilization
The hypothesis is supported by a striking geographical correlation. To a very good approximation, every civilization that emerged had cereal agriculture as its subsistence base, and wherever cereals were cultivated, civilization appeared. The earliest civilizations in Mesopotamia, Egypt, the Indus Valley, and China all relied primarily on cereal crops.
Groups that practiced vegeculture (cultivation of fruits, tubers, and root vegetables) or no agriculture at all did not develop comparable civilizations. This pattern holds across continents and millennia: major civilizations in southwest Asia, Europe, India, East Asia, Central America, and North Africa all stemmed from cereal-cultivating populations.
The rarer nomadic civilizations were based on dairy farming, the other major source of dietary exorphins. Meanwhile, populations in tropical Africa, Australia, New Guinea, the Pacific, and much of the Americas, which relied on non-cereal agriculture or foraging, did not develop the hierarchical state structures characteristic of cereal-based civilizations.
Cereals as Ideal Behavioral Facilitators
The researchers noted that cereals possess qualities that distinguish them from conventional drugs and make them uniquely suited to facilitating civilization. Cereals serve as both a food source and a pharmacological agent. They can be stored and transported easily. They are consumed in frequent small doses rather than occasional large ones, and they generally do not impair work performance.
Crucially, the desire for exorphins can be confused with ordinary hunger, masking the pharmacological dependence behind what appears to be simple nutritional need. These properties make cereals, in the researchers’ assessment, “the ideal facilitator of civilization” and may also explain why their pharmacological properties went unrecognized for so long.
The alcohol hypothesis proposed by earlier researchers, suggesting that beer production motivated cereal cultivation, is compatible with the exorphin model. However, exorphins are present in all cereal products including bread and porridge, not just fermented beverages, making the exorphin explanation more broadly applicable. The early cultivation of opium poppies alongside cereals further supports the idea that chemical reward played a central role in the agricultural transition.
Implications for Understanding Human Behavior
The exorphin hypothesis does not refute existing models of agricultural origins based on climate change, population pressure, or resource concentration. Rather, it adds a previously unconsidered factor that addresses the central puzzle: why cereal agriculture was adopted despite its apparent disadvantages, and how it led to the profound behavioral changes associated with civilization.
If the hypothesis is correct, the implications extend well beyond prehistory. Modern humans still derive approximately two-thirds of their calories and protein from cereals. Methods of artificial reward have diversified since the Neolithic to include a wide range of pharmacological and non-pharmacological stimuli. The researchers suggested that civilization not only arose from the self-administration of artificial reward but continues to be maintained through this mechanism among contemporary populations.
This perspective reframes fundamental questions about human social organization. The willingness of billions of people to live in crowded conditions, perform repetitive labor, and accept subordinate positions in hierarchical institutions may be less a matter of rational choice or cultural evolution than of continuous, low-level pharmacological modification of behavior through dietary exorphins consumed several times daily from childhood onward.