
Racehorses are reaching the finish line faster

Scientists have just confirmed a pattern that long-time fans of elite horse racing may have noticed over the years: British racehorses are getting faster. This is particularly true of equine sprinters, some of whom may be running nearly 13% faster than their predecessors who competed in 1850 (when detailed records were first collected). Over the last 15 years alone, short-distance racers have increased their speed by approximately 0.11% per year, a cumulative improvement that works out to roughly 1.18 seconds over a typical sprint race, or more than seven horse lengths.
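For readers who like to check the arithmetic, the back-of-the-envelope sketch below shows how a 0.11% annual speed gain compounds into a time saving of roughly a second over a sprint. The race distance, baseline winning time, and horse-length conversion are illustrative assumptions on my part, not figures from the paper.

```python
# Back-of-the-envelope check: how much time does a 0.11% per-year speed gain
# save over 15 years? The race distance and baseline time below are
# illustrative assumptions, not values from Sharman & Wilson (2015).
furlong_m = 201.168                 # metres in one furlong
distance_m = 6 * furlong_m          # assume a 6-furlong sprint
baseline_time_s = 70.0              # assumed baseline winning time (seconds)

annual_gain = 0.0011                # 0.11% speed increase per year
years = 15

baseline_speed = distance_m / baseline_time_s          # m/s
new_speed = baseline_speed * (1 + annual_gain) ** years
new_time_s = distance_m / new_speed

time_saved = baseline_time_s - new_time_s
horse_length_m = 2.4                # conventional horse length, approximate
lengths_gained = (time_saved * new_speed) / horse_length_m

print(f"Cumulative speed gain: {((1 + annual_gain) ** years - 1) * 100:.2f}%")
print(f"Time saved over {distance_m:.0f} m: {time_saved:.2f} s")
print(f"Roughly {lengths_gained:.1f} horse lengths at the new speed")
```

With these assumed numbers the result lands close to the figures quoted above, which is reassuring but no substitute for the paper's actual analysis.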

These findings result from analyses of over 616,000 race times run by nearly 70,400 horses in the UK between 1850 and 2012. Researchers from the University of Exeter’s Penryn Campus compiled the massive dataset in the hopes of resolving contradictory patterns reported by previous studies. In particular, they were curious about the suggestion that horses are approaching a speed plateau.

Image courtesy of Froggerlaura

Lead author Patrick Sharman, from the University of Exeter’s Centre for Ecology and Conservation, said: “There has been a general consensus over the last thirty years that horse speeds are stagnating.” However, this seems counterintuitive, given evidence that swiftness is passed down through the generations, and therefore can be selected for—and potentially enhanced—during the breeding process.

Whereas previous studies have predominantly focused on results from middle- and long-distance elite races (8-12 and 14-20 furlongs, respectively), the current dataset also looks at sprint competitions (5-7 furlongs). It also takes into account variations in environmental conditions that can influence horse performance. The full dataset, which focused on races run on flat turf in the UK, included information on race course, ground softness, number of runners, name/age/sex of each horse, race distance, method of timing (automatic or manual), year of race, and, of course, horse speed.
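To give a rough sense of what “taking environmental conditions into account” can look like in practice, here is a minimal sketch of a regression of speed on year with race conditions included as covariates. The column names and synthetic data are my own illustrative assumptions; the actual study fits far more sophisticated models, to far richer data, than this.

```python
# A minimal sketch of estimating a year-on-year speed trend while holding
# race conditions constant. Column names and synthetic data are illustrative
# assumptions; they do not reproduce the paper's dataset or models.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "year": rng.integers(1850, 2013, n),
    "going": rng.choice(["firm", "good", "soft"], n),      # ground softness
    "distance_f": rng.choice([5, 6, 7], n),                # sprint distances (furlongs)
    "n_runners": rng.integers(4, 20, n),
    "timing": rng.choice(["manual", "automatic"], n),
})
# Synthetic speeds (arbitrary units) with a small upward trend over time
df["speed"] = 16 + 0.002 * (df["year"] - 1850) + rng.normal(0, 0.5, n)

# Speed regressed on year, with race conditions as covariates
model = smf.ols(
    "speed ~ year + C(going) + C(timing) + distance_f + n_runners",
    data=df,
).fit()
print(model.params["year"])   # estimated change in speed per year
```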

Although the improvements in speed were most noticeable for sprinters, modest increases were also seen amongst middle- and long-distance runners. However, year-on-year changes were not linear, indicating that they have not been consistent over time. Alterations to riding style appear to be responsible for dramatic improvements in performance during the early 1900s and in the final quarter of the 20th century. The first period of rapid improvement seems to have resulted from jockeys’ shortening their stirrups and crouching while riding. The second increase in speed has been associated with further shortening of the stirrups, though increased commercialisation of the sport around this same time may also have led to an influx of imported horses that improved the quality of the gene pool.

Image courtesy of Anthony92931

Despite this tantalizing connection between genetics and speed, the study’s authors are cautious about definitively declaring that decreases in racing time result from human-caused equine evolution. For one thing, there are certain potentially confounding factors—such as horse diet and individual jockeys’ racing techniques—that they were not able to consider in the study. For another, the field does not currently have enough information on equine genetics to fully understand the impact of our horse husbandry techniques on thoroughbred racing performance.

However, the authors suggest that this could be rectified by a detailed analysis of thoroughbred pedigrees. Alastair Wilson, coauthor of the study, said: “The next step is to find out how much of this improvement is actually down to genetic change. It could be that breeders are effectively targeting the genes that make horses fast – at least over the shorter sprint distances – and we are seeing the result of this. Alternatively, current improvement could be driven entirely by non-genetic factors, for instance improvements in training. At the moment we just don’t know. However, by analysing the performance data in conjunction with knowledge of how horses are related to each other, we should be able to determine just how important genes are for determining speed, and whether the genetic makeup of the thoroughbred population is changing through time”.

In the meantime, don’t expect sprinters to slow down any time soon. Middle- and long-distance runners may be nearing their maximal speeds, but short-distance racers could continue to break records for years to come.

Sharman, Patrick and Alastair Wilson. 2015. Racehorses are getting faster. Biology Letters 11(6).

A little disturbance could be a good thing

When you want to maximize biodiversity in a habitat, you should aim to minimize human disturbance there–right? That’s certainly the concept at the heart of many conservation initiatives, and is a tenet espoused by the majority of wildlife preservationists. There is, however, a growing literature documenting the ways in which anthropogenic influences may currently give, or may previously have given, certain communities a biodiversity boost.

This has prompted a number of researchers to seek out systems where human disturbances have occurred in a repetitive way over thousands of years. In these places, wildlife would have had the chance not only to adapt to anthropogenic influences, but to actually thrive under human disturbance regimes. Exactly this scenario is described by M. Kat Anderson in her 2005 book Tending the Wild, which focuses on the ways that Native Americans interacted with the California wilderness prior to the arrival of Europeans. A similar discussion can be found in a new review paper by the University of Montana’s Stephen Siebert and Jill Belsky, who describe the ecological impacts of swidden in Bhutan.

The kingdom of Bhutan. Image courtesy of Merriam-Webster.

Swidden is a slash-and-burn technique used to open up forested areas for agricultural activity. Although swidden had disappeared from Bhutan by 1997, it was previously widespread throughout the country; as many as 200 million swidden farmers are thought to continue the practice throughout the remainder of the eastern Himalayas. Two types of swidden were common: tseri, in which crops were grown for 1-2 years before the field was left to grow over with trees and shrubs for 2-8 years; and pangshing, in which crops were grown for 2-3 years before the field was left to grow over with grass for 6-20 years. Tseri was typically used to produce maize, millet, rice, and vegetables, and could be found at lower altitudes; pangshing was better for wheat, buckwheat, barley, and greens, and was used at higher altitudes. Another major difference between the two techniques is the amount of disturbance required to prepare and tend fields. During tseri, disturbance was minimal because plowing was infrequent and crop maintenance was performed by hand; during pangshing, the top 6 cm of soil were removed, dried, and burned.

If you think these techniques sound fairly invasive, then you’re not alone. The Bhutanese government, wanting to preserve the amazing biodiversity that can be found in its country, passed the Bhutan Forestry Act in 1969. This spelled the beginning of the end for swidden, which many people viewed as antiquated, dangerous, and ecologically unsound. However, it may have provided a service to a variety of wildlife species that needed more sunlight than thick tree canopy allowed into the forest, but also more cover than was provided by completely open meadows. In other words, swidden may have acted as an intermediate disturbance–something more intense than, say, the toppling of a single disease-ridden tree, but less intense than the felling of a whole hillside’s worth of trees during a mudslide.

A patchwork habitat resulting from swidden. Here you can see forests in multiple stages of succession, fallow fields, and current growth of rice crops in a swidden landscape in Yunnan Province. Image courtesy of Wikimedia Commons.

Swidden farmers left in their wake a variety of patches of different ages–and, therefore, at different stages of succession. The entire landscape would have been a mosaic of areas with closed canopies, open canopies, and everything in between. These conditions would have provided habitat for shade-lovers, sun-lovers, shy species that prefer to stay hidden in the undergrowth, and bolder ones not afraid to venture out into the open. From a single habitat–forest–the swidden farmers were creating multiple types of new habitat, each with its own niches. This, in turn, would have facilitated biological heterogeneity–probably not just in terms of the number and type of species present, but also the roles of those species in the local food web, and the functions that the entire ecosystem could collectively perform.

It’s difficult to know for sure, because swidden disappeared from Bhutan well over a decade ago and wasn’t well quantified before that time. As a result, the authors can’t make a clear before-and-after comparison that would allow them to see whether biodiversity levels have dropped since the discontinuation of swidden. (With proper permission, however, it would be possible to investigate this experimentally.) In other countries where swidden is still used, researchers have noted that fallow plots provide habitat for different types of species than those found in the adjacent forest; as a result, the swidden-and-forest patchwork is more biodiverse than tracts of uninterrupted forest alone. We also know that fallow-type habitats are used by sambar deer, wild pigs, and gaur–potential prey items for tigers. This suggests that swidden could facilitate the coexistence of predators and humans even in disturbed, densely populated areas.

Ironically, the curtailing of swidden has not led to a uniform decrease in disturbance across Bhutan’s forests, some of which are shown above. Some areas are protected from human activities, but other areas have been subjected to intense logging efforts, which may, over the long run, be more ecologically harmful than traditional slash-and-burn practices. Image courtesy of Rashid Faridi.

The best way to know this for sure is to explore the possibility using rigorous scientific methods. That’s one of the reasons the authors have used Bhutan as a case study. They suggest that the country missed a trick by outlawing swidden without collecting more data and thinking about the situation more critically. Their hope is that conservationists and policy makers will do better elsewhere–not just in places with swidden practitioners, but in any area where humans continue to use traditional land management techniques that have been employed for thousands, if not tens of thousands, of years.

Although we tend to think of humans as separate from nature, we do, in fact, have a long history of interacting with species, communities, ecosystems, and entire landscapes; removing us from the equation may have unintended consequences. The authors advise us to keep that in mind, and to consider the possibility that a little bit of disturbance might actually be a good thing–both for humans and for wildlife.

Stephen F. Siebert and Jill M. Belsky. 2014. Historic livelihoods and land uses as ecological disturbances and their role in enhancing biodiversity: an example from Bhutan. Biological Conservation 177:82-89.

 

Feast, famine, and the evolution of the human diet

It’s not easy to keep track of all the different diets out there, but you still may have heard of the “fast diet,” also known as “intermittent fasting” or the “5:2 diet.” Regardless of what you call it, this popular method of slimming down is based on the assumption that it is normal and healthy for our bodies to periodically experience food shortages. Another popular eating regime, the Paleo diet, has been advocated by nutritionists who believe that modern humans should restrict their foods to those available to, and commonly eaten by, our ancient ancestors. This advice is grounded in the idea that our genomes are not very different from those of our predecessors, and are therefore suited to processing not frequent and/or large quantities of “calorically dense foods” but, instead, only the freshest and most natural ingredients. Both the intermittent fasting and Paleo diets are thought to promote not only weight loss, but also improvements in general health.

Examples of what Berbesque et al. refer to as “calorically dense foods” (image courtesy of Sott.net)

One of the reasons that eating regimes like these are so appealing is that they sound as though they are grounded in fact–in data collected by doctors, anthropologists, and evolutionary biologists. In reality, however, the logic behind these diets relies on many assumptions. One of these, the focus of a recent project conducted by researchers at the Universities of Roehampton and Cambridge, is the idea that our hunter-gatherer ancestors frequently experienced food shortages–and even famines. In addition to prompting many a diet, this concept has also influenced theoretical models about the evolution of cognition and various life history traits in humans.

But what if early humans didn’t experience high food insecurity–or, at least, what if the food insecurity they did experience is not that different from what modern agriculturalists are periodically subjected to? Unsurprisingly, this is a question that researchers have previously asked, but their analytical methods may not have allowed them to draw truly informed conclusions. For example, past studies have focused predominantly on only a handful of societies that experienced rapid increases in diabetes and obesity rates after swapping traditional diets for those that were more “westernized.” Further, focal groups were characterized according to early ethnographical reports that may not have accurately or fully documented the relationships between various societies and their dietary practices. Finally, and perhaps most importantly, previous work has tended not to account for the fact that contemporary hunter-gatherer groups predominantly live in marginal habitats, and that habitat, in general, is hugely influential in determining both the availability of food and the choices locals make about sustenance.

Samoan data has been included in previous dietary studies because of increases in type II diabetes and obesity in Samoa since the switch from a traditional diet to one that is more “westernized” and agriculture-based (Image courtesy of Faifeau).

Cumulatively, these flaws prompted the researchers behind the current study to conduct a new set of analyses designed to more rigorously “explore relationships between subsistence and famine risk.” Specifically, they wanted to make three comparisons: first, between hunter-gatherers in warmer (more productive) and colder (more marginal) habitats; second, between hunter-gatherers and agriculturalists, regardless of habitat; and, third, between hunter-gatherers and agriculturalists, with habitat quality controlled for.

Rather than focusing on only a subset of societies at opposite ends of the dietary spectrum, the researchers utilized a global sample including 186 different “cultural provinces”; this included 36 hunter-gatherer societies and incorporated data on eight different variables that each measured a different aspect of famine. They used two metrics to account for habitat quality. The first was effective temperature (ET), which is derived from measurements of both the warmest and coldest temperatures of an area, and also reflects plant growth (with lower ETs generally indicating less abundant vegetative resources). The second habitat variable was net primary productivity (NPP), higher values of which are associated with greater availability of food items.

As you might expect, groups living in warmer climates experienced significantly less famine–and less persistent famines–than those living in colder areas. Because of this, the researchers decided that their two subsequent analyses would only include data from groups living in habitats with an ET at or above 13 degrees C; this ensured that the analyses were comparing apples with apples, so to speak. Their comparison of agriculturalists and hunter-gatherers revealed that food conditions were more favorable for the latter than for the former. Hunter-gatherers living in warm climates were particularly well off, though both types of society were equally likely to face periods of short-term and seasonal starvation (neither of which was very common). Similar patterns were found when this analysis was re-run with both ET and NPP included in order to control for overall habitat quality. Taken together, these results indicate that–contrary to popular belief–hunter-gatherers are actually less likely than agriculturalists to experience famine.
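As a schematic illustration of the comparison described above, the snippet below filters a table of societies to warmer habitats and then contrasts a famine measure between subsistence types. All values and column names are invented for illustration; they are not the cross-cultural data analysed by Berbesque and colleagues.

```python
# Schematic of the comparison: restrict to warmer habitats (ET >= 13 C),
# then compare a famine measure between hunter-gatherers and agriculturalists.
# The data frame is entirely made up; it does not reproduce the study's sample.
import pandas as pd

societies = pd.DataFrame({
    "subsistence": ["hunter-gatherer", "hunter-gatherer", "agriculturalist",
                    "agriculturalist", "hunter-gatherer", "agriculturalist"],
    "effective_temp_c": [14.2, 11.8, 15.0, 13.5, 16.1, 12.4],   # ET
    "npp": [1.1, 0.6, 1.3, 1.0, 1.5, 0.7],                      # net primary productivity
    "famine_frequency": [0.2, 0.8, 0.5, 0.6, 0.1, 0.9],         # hypothetical score
})

warm = societies[societies["effective_temp_c"] >= 13]
print(warm.groupby("subsistence")["famine_frequency"].mean())
```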

The Paleo diet aims to emulate the eating practices of our ancient predecessors, both in terms of quantity and type of food items eaten. However, hunter-gatherers likely had access to more foods, and to a wider variety of foods, than we give them credit for–though that is no reason to avoid a diet heavy in vitamin-containing ingredients (image courtesy of Activ8).

In other words, regularly skipping meals and eating extremely low-calorie diets may not faithfully mimic an actual Paleo diet; rather, this way of eating may more accurately reflect conditions in agricultural societies that periodically experience food shortages associated with drought and flooding. Faced with these sorts of environmental instabilities, hunter-gatherers tend to have more freedom to relocate to more productive habitats. This sort of cultural adaptability was also highlighted by the comparison between warm-climate and cold-climate hunter-gatherers, the latter of which engaged in a variety of behaviors (e.g., development of specialized hunting technology, food storage, infanticide, and trade) that maximized survival during times of want.

Given the great importance of culture in mediating the relationships between humans and food, the authors point out that there is no reason for other researchers to assert that our modern food-related health crises are necessarily related to genes. This is the idea behind the “thrifty genotype” theory, which suggests that a subset of our hunter-gatherer ancestors possessed a certain set of genes that conferred better survival and reproduction rates in the face of unpredictable feast-famine cycles.

There may, indeed, be such a genotype, but it is not necessarily an ancient development (it could have evolved since the dawn of the modern agricultural age) and it need not be the only explanation for the prevalence of diabetes and obesity in western societies. Equally–if not more–logical is the idea that some cultures began to favor both “immediate return behavior” (use of resources as soon as they become available, as generally observed in hunter-gatherers living in warmer climates) and a taste for high-calorie foods.

It may be the case that our dietary and lifestyle choices–rather than our genes–should get most of the blame for our current struggles with obesity and weight-related health problems (image courtesy of Dolores Smith).

Centuries ago, when humans were more active and had fewer resources right at their fingertips, this pairing of behaviors would have been adaptive; today, say the authors, it is a liability. They don’t go on to make any recommendations about modern weight loss regimes (after all, they aren’t that type of doctor), but it isn’t hard to read between the lines: Though the intermittent fasting and Paleo diets have undoubtedly worked for some people, they probably aren’t as scientifically grounded as some of their advocates might like you to believe.

Berbesque, J.C., Marlowe, F.W., Shaw, P., and Thompson, P. 2014. Hunter-gatherers have less famine than agriculturalists. Biology Letters 10:20130853 (online advance publication).