It’s not easy to keep track of all the different diets out there, but you may still have heard of the “fast diet,” also known as “intermittent fasting” or the “5:2 diet.” Regardless of what you call it, this popular method of slimming down rests on the assumption that it is normal, and even healthy, for our bodies to periodically experience food shortages. Another popular eating regime, the Paleo diet, has been advocated by nutritionists who believe that modern humans should restrict themselves to the foods available to, and commonly eaten by, our ancient ancestors. This advice is grounded in the idea that our genomes differ little from those of our predecessors, and are therefore built to process not frequent and/or large quantities of “calorically dense foods,” but only the freshest and most natural ingredients. Both the intermittent fasting and Paleo diets are thought to promote not only weight loss, but also improvements in general health.
One of the reasons that eating regimes like these are so appealing is that they sound as though they are grounded in fact–in data collected by doctors, anthropologists, and evolutionary biologists. In reality, however, the logic behind these diets rests on many assumptions. One of these, the focus of a recent project conducted by researchers at the Universities of Roehampton and Cambridge, is the idea that our hunter-gatherer ancestors frequently experienced food shortages–and even famines. In addition to prompting many a diet, this idea has also shaped theoretical models of the evolution of cognition and various life history traits in humans.
But what if early humans didn’t experience high food insecurity–or, at least, what if the food insecurity they did experience was not that different from what modern agriculturalists are periodically subjected to? Unsurprisingly, researchers have asked this question before, but their analytical methods may not have allowed them to draw truly informed conclusions. For example, past studies have focused predominantly on a handful of societies that experienced rapid increases in diabetes and obesity rates after swapping traditional diets for more “westernized” ones. Further, focal groups were characterized according to early ethnographic reports that may not have accurately or fully documented the dietary practices of the societies in question. Finally, and perhaps most importantly, previous work has tended not to account for the fact that contemporary hunter-gatherer groups predominantly live in marginal habitats, and that habitat, in general, is hugely influential in determining both the availability of food and the choices locals make about sustenance.
Cumulatively, these flaws prompted the researchers behind the current study to conduct a new set of analyses designed to more rigorously “explore relationships between subsistence and famine risk.” Specifically, they wanted to make three comparisons: first, between hunter-gatherers in warmer (more productive) and colder (more marginal) habitats; second, between hunter-gatherers and agriculturalists, regardless of habitat; and, third, between hunter-gatherers and agriculturalists, with habitat quality controlled for.
Rather than focusing on only a subset of societies at opposite ends of the dietary spectrum, the researchers used a global sample of 186 “cultural provinces,” including 36 hunter-gatherer societies, and incorporated data on eight variables that each measured a different aspect of famine. They used two metrics to account for habitat quality. The first was effective temperature (ET), which is derived from the warmest and coldest temperatures of an area and also reflects plant growth (lower ETs generally indicate less abundant vegetative resources). The second was net primary productivity (NPP), higher values of which are associated with greater availability of food items.
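The article doesn’t spell out how ET is calculated. A common formulation in hunter-gatherer research is Bailey’s effective temperature, which combines the mean temperatures of the warmest and coldest months; the sketch below assumes that formulation (the study itself may have used a different one), and the input values are purely illustrative.

```python
def effective_temperature(warmest_c: float, coldest_c: float) -> float:
    """Bailey's effective temperature (ET).

    An assumed formulation: the article says only that ET combines the
    warmest and coldest temperatures of an area, so this standard equation
    from hunter-gatherer ecology is used here for illustration.

    warmest_c: mean temperature of the warmest month, in degrees C
    coldest_c: mean temperature of the coldest month, in degrees C
    """
    return (18.0 * warmest_c - 10.0 * coldest_c) / (warmest_c - coldest_c + 8.0)

# Illustrative values only (not data from the study):
print(effective_temperature(30.0, 22.0))   # 20.0 -- warm, productive habitat
print(effective_temperature(15.0, -20.0))  # ~10.9 -- cold, marginal habitat, below the 13-degree cut-off
```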
As you might expect, groups living in warmer climates experienced significantly less frequent–and less persistent–famines than those living in colder areas. Because of this, the researchers restricted their two subsequent analyses to groups living in habitats with an ET at or above 13 degrees C; this ensured that the analyses were comparing apples with apples, so to speak. Their comparison of agriculturalists and hunter-gatherers revealed that food conditions were more favorable for the latter than for the former. Hunter-gatherers living in warm climates were particularly well off, though both types of society were equally likely to face periods of short-term and seasonal starvation (neither of which was very common). Similar patterns emerged when the analysis was re-run with both ET and NPP included to control for overall habitat quality. Taken together, these results indicate that–contrary to popular belief–hunter-gatherers are actually less likely than agriculturalists to experience famine.
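The article doesn’t describe the statistical machinery behind these comparisons. As a purely illustrative sketch of the key step–restricting the sample to sufficiently warm habitats before comparing subsistence types–the following uses invented records and hypothetical field names, not the study’s data.

```python
from statistics import median

# Invented records with hypothetical field names (not the study's data):
# each society has an effective temperature (ET) and a famine-frequency score.
societies = [
    {"subsistence": "hunter-gatherer", "et": 17.5, "famine_freq": 1},
    {"subsistence": "hunter-gatherer", "et": 11.0, "famine_freq": 3},
    {"subsistence": "agriculturalist", "et": 16.0, "famine_freq": 2},
    {"subsistence": "agriculturalist", "et": 14.5, "famine_freq": 3},
]

# Step described in the article: keep only societies with ET >= 13 C, so that
# hunter-gatherers and agriculturalists are compared in comparably warm habitats.
warm = [s for s in societies if s["et"] >= 13.0]

# Then summarize famine frequency separately for each subsistence type.
for group in ("hunter-gatherer", "agriculturalist"):
    scores = [s["famine_freq"] for s in warm if s["subsistence"] == group]
    print(group, "median famine-frequency score:", median(scores))
```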
In other words, regularly skipping meals and eating extremely low-calorie diets may not faithfully mimic an actual Paleo diet; rather, this way of eating may more accurately reflect conditions in agricultural societies that periodically experience food shortages associated with drought and flooding. Faced with these sorts of environmental instabilities, hunter-gatherers tend to have more freedom to relocate to more productive habitats. This sort of cultural adaptability was also highlighted by the comparison between warm-climate and cold-climate hunter-gatherers, the latter of which engaged in a variety of behaviors (e.g., development of specialized hunting technology, food storage, infanticide, and trade) that maximized survival during times of want.
Given the great importance of culture in mediating the relationship between humans and food, the authors point out that there is no reason to assume that our modern food-related health crises are necessarily rooted in our genes. That is the premise of the “thrifty genotype” theory, which suggests that a subset of our hunter-gatherer ancestors possessed a set of genes conferring better survival and reproduction in the face of unpredictable feast-famine cycles.
There may, indeed, be such a genotype, but it is not necessarily an ancient development (it could have evolved since the dawn of the modern agricultural age) and it need not be the only explanation for the prevalence of diabetes and obesity in western societies. Equally–if not more–logical is the idea that some cultures began to favor both “immediate return behavior” (use of resources as soon as they become available, as generally observed in hunter-gatherers living in warmer climates) and a taste for high-calorie foods.
Centuries ago, when humans were more active and had fewer resources right at their fingertips, this pairing of behaviors would have been adaptive; today, say the authors, it is a liability. They don’t go on to make any recommendations about modern weight-loss regimes (after all, they aren’t that kind of doctor), but it isn’t hard to read between the lines: though the intermittent fasting and Paleo diets have undoubtedly worked for some people, they probably aren’t as scientifically grounded as some of their advocates might like you to believe.
Berbesque, J.C., Marlowe, F.W., Shaw, P., and Thompson, P. 2014. Hunter-gatherers have less famine than agriculturalists. Biology Letters 10: 20130853 (online advance publication).