Category Archives: behavior

Behavioral plasticity in pumas may be a boon to smaller predators

In many areas where human activities have resulted in the decline of top predators, smaller carnivores–species such as foxes, raccoons, and rats–benefit from no longer being prey items, as well as from the decreased competition over resources in the habitat. Until recently, this “mesopredator release” was thought to occur only when apex carnivores–things like tigers, wolves, and wild dogs–had been locally extirpated, leaving vacancies in the food web that their smaller brethren could step in to fill. However, a new study on pumas suggests that mesopredator release could happen by another mechanism altogether: changes to the feeding behaviors of top predators.

Scientists from the University of California, Santa Cruz discovered this by collaring and tracking 30 pumas between 2008 and 2013. Every four hours, the collars collected information on the location of the cats, and the resulting data points were plotted on a map. Where points were clustered, the researchers suspected a kill site and went to investigate for signs of predation. It wasn’t logistically possible to visit every potential kill site, so the researchers created computer models into which they could plug variables associated with confirmed kills–factors such as how long the pumas stayed there, whether they were there at night or during the day, and how far the cats strayed from the area. These traits were then used to estimate the likelihood that each putative kill site was, in fact, the location of a puma kill.
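The classification step described above–scoring each GPS cluster on residence time, day/night activity, and distance traveled–can be sketched as a simple logistic model. The coefficients and variable names below are illustrative placeholders, not the fitted values from the actual puma study:

```python
import math

def kill_site_probability(hours_at_site, prop_nighttime, max_km_from_site,
                          b0=-4.0, b_hours=0.15, b_night=2.0, b_dist=-0.5):
    """Score a GPS cluster as a likely kill site with a logistic model.

    The coefficients here are made-up placeholders for illustration;
    the real study fitted its model to confirmed kill sites.
    """
    z = (b0 + b_hours * hours_at_site + b_night * prop_nighttime
         + b_dist * max_km_from_site)
    return 1.0 / (1.0 + math.exp(-z))

# A long, mostly nocturnal stay near the cluster should score higher
# than a brief daytime pass-through far from it.
likely = kill_site_probability(30, 0.8, 0.5)
unlikely = kill_site_probability(4, 0.2, 3.0)
```

Clusters scoring above some probability threshold would then be treated as kills when tallying each cat’s hunting rate.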

Close-up of a puma. Image courtesy of Bas Lammers.

What made this information so interesting was that the hunting data could be plotted on maps showing human housing densities. The study area, the Santa Cruz Mountains of California’s Central Coast region, encompassed a range of anthropogenic sites, from rural areas with fewer than 1 house per hectare, to suburban locations with nearly 10. Along this disturbance gradient, the hunting behavior of pumas–specifically, female pumas–varied. As housing density increased, these huntresses spent up to 42% less time consuming prey. Their site fidelity was 36% lower, and the farthest distance traveled from the kill site was up to 31% higher.

To make sense of these numbers, it is necessary to understand a bit about the feeding strategies of apex predators like pumas. They can expend quite a lot of energy while stalking, chasing, and taking down prey. These calories can be replaced by eating whatever has just been caught, but big carnivores don’t have large enough stomachs to accommodate many of their prey items–things like deer, for example–in a single sitting; instead, the hunters have to drag the body somewhere safe and revisit it until they’ve had their fill.

Mountain lion eating. Image courtesy of Benjamint444

The current results suggest that this is a much trickier prospect for female pumas living in anthropogenically disturbed areas. These cats weren’t able to spend as much time eating, and seemed to roam much farther from where they had stashed their kills; the low site fidelity values suggest that many of the animals left the area altogether. Given these figures, it is, perhaps, unsurprising that some of the hunters in these areas were estimated to kill as many as 20 more deer per year than pumas living in the most rural areas. These data suggest that predators in high-human-density areas must target more prey because disturbances interrupt their feeding, forcing them to abandon kills and start over.

Males seem to get off easy because they already spend less time at kills; they are adapted to eat quickly and head back out to patrol the borders of the large territories they defend. The size of their home ranges (up to 170 square km) also means that if humans become disruptive in one area, the cats can withdraw to more natural spots for a bit of privacy. Given that male territories contain only approximately 16 houses per square km, this isn’t too hard to do. Female pumas, however, don’t have as much flexibility; their territories are smaller (as small as 51 square km) and may comprise only exurban or suburban land; their home ranges sometimes contain as many as 27 houses per square km.

Mountain lion kitten about to be outfitted with a tracking device. Image courtesy of the NPS.

Female pumas must also take care of kittens, a responsibility that requires them to bring down even more prey. To adequately feed their young, mothers may need to make more than a dozen additional kills per year. For females living in high-human-density areas, where disruptions to feeding sessions are already inflating hunting rates, this could be untenable; these mothers could begin to lose weight, suffer poor health, or even be driven to abandon their young. Indeed, the researchers provided anecdotal evidence of the last of these possibilities, suggesting that puma populations in human-disturbed sites may only be viable so long as they are replenished by young pumas migrating in from more rural areas.

While this is bad news for pumas, it is potentially great news for mesopredators. Female pumas are leaving a larger number of kills for longer periods of time, giving scavengers more of an opportunity to swoop in and have a free meal. This increased source of nutrition could allow the ecosystem to sustain larger populations of middle predators and give individual animals the energy boost they need to live longer and/or procreate more successfully. Beneficiaries could include a range of species, from raccoons and foxes all the way up to coyotes. Additional work would need to be conducted to explore whether these species are more common or more successful in more human-dense areas, and, if so, whether those patterns can be directly attributed to puma behavior rather than other characteristics of anthropogenic environments.

Image courtesy of Tony Hisgett.

As the authors point out, “behavioral responses are often overlooked as ecosystem drivers in modified systems, overshadowed by population declines and extirpations.” Their current study, however, shows that behavioral flexibility can allow species to persist in modified environments–but that this persistence may come at a cost, and have widespread implications for the habitat.

Source material: Smith, Justine A., Yiwei Wang, and Christopher C. Wilmers. 2015. Top carnivores increase their kill rates on prey as a response to human-induced fear. Proceedings of the Royal Society B 282: 20142711.

 

A century of anthropogenic influences on black bear diets in Yosemite National Park

One of the main reasons people visit natural areas such as national parks is to have close encounters with free-living animals. In many such places, humans have developed a bad habit of using food scraps to either draw particular animals closer, or to create places where easy access to appealing foods ensures a reliable stream of animal visitors. Sometimes–as in the case of unattended trash bins in parking lots and behind hotels–there is no intention to feed the animals, but it happens anyway. Regardless of the exact circumstances, this sharing of food can be detrimental both to individual organisms and entire ecosystems; as a result, parks often devote a great deal of time and money to management practices designed to minimize wildlife access to anthropogenic food items.

Whether or not these management schemes work is another question–one at the heart of a recent research project conducted by an international team of scientists working in California’s Yosemite National Park. The researchers used both museum specimens and samples collected from living animals to explore changes in the diets of American black bears (Ursus americanus) in Yosemite between 1890 and 2007. This 117-year period encompasses four major anthropogenic disturbance regimes during which bears had access to varying levels of artificially introduced fish and food scraps; during these regimes, there were also differences in the degree to which bears were either encouraged to eat, or actively prevented from eating, these food items.

An American black bear (Ursus americanus). Image courtesy of Wikimedia Commons.

Because the researchers had no way of observing all the feeding behaviors of every bear included in the study (especially those that lived in the 19th century), they relied on stable isotope analysis to provide information on which types of foods comprised the bears’ diets. Stable isotopes are versions of atoms–in this case carbon and nitrogen–that have extra neutrons but do not undergo radioactive decay. The ratios of “normal” to “heavy” isotopes (12C vs. 13C, 14N vs. 15N) vary among different food sources in different regions, and can therefore be used as a sort of dietary fingerprint. This information can be extracted from a variety of animal products, including fecal samples, blood plasma, and–as in the current study–bone and hair.
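Isotope ratios like these are conventionally reported in delta (δ) notation: the per-mil deviation of a sample’s ratio from that of a reference standard. A minimal sketch, assuming the conventional VPDB standard for carbon (the sample ratio below is invented for illustration):

```python
def delta_per_mil(r_sample, r_standard):
    """Convert a raw isotope ratio (e.g. 13C/12C) to delta notation (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# VPDB is the conventional reference standard for carbon isotopes;
# its accepted 13C/12C ratio is approximately 0.0112372.
R_VPDB = 0.0112372

# A hypothetical bone-collagen measurement, chosen to fall in the
# range typical of terrestrial plant-based diets.
sample = delta_per_mil(0.0110, R_VPDB)
```

More positive δ15N values generally indicate feeding higher on the food chain (e.g., on hatchery fish), while shifts in δ13C can track inputs like human food scraps–which is how the study linked tissue chemistry to diet.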

Unsurprisingly, the scientists found that the isotopic composition of Yosemite bear tissues has changed over time, and that this pattern is associated with variations in the consumption of anthropogenic foods. Between the first and second focal periods (1890-1922 and 1923-1971), for example, they saw an increase in 15N associated with the availability of non-native fish from Yosemite-based hatcheries; closure of the last hatchery in 1956 resulted in a subsequent decrease in 15N.

Black bear eating anthropogenic food out of an unsecured food locker at a Yosemite campsite. Image courtesy of the U.S. National Park Service.

Carbon isotopes, on the other hand, were fairly stable early on, but rose significantly during the second and third (1972-1998) study periods. This was predominantly associated with the closure of bear-feeding platforms. Bears that had developed a taste for anthropogenic foods went searching for them at their source, in concession areas and campgrounds. This habit not only increased consumption of 13C, but also led to a variety of human-bear conflicts that sometimes resulted in the killing of “problem” animals.

At the beginning of the fourth focal period (1999-2007), the U.S. government initiated an annual funding scheme designed to improve human-bear relations. Bear-proof trash and storage receptacles were installed throughout Yosemite, outreach programs were designed to teach visitors about the hazards of feeding bears, and particularly aggressive bears were hazed, relocated, and even occasionally killed. The isotope analysis suggests that these efforts have been effective: Both 15N and 13C levels decreased between the third and fourth periods, and are fairly similar to the (more or less) pre-anthropogenic-disturbance values measured at the very beginning of the first focal period.

Yosemite National Park. Image courtesy of Travel Top.

Indeed, plants and animals–black bears’ natural food sources–currently make up the majority (64-92%) of most Yosemite bears’ diets. However, a few sneaky individuals are still finding ways to dine out on anthropogenic foods, as evidenced by the fact that 8-36% of some animals’ meals are coming from human sources.
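Diet percentages like these come from mixing models. The study used more sophisticated multi-source methods, but the core idea can be illustrated with a two-source linear mixing equation; the end-member delta values below are invented for illustration, not taken from the paper:

```python
def natural_diet_fraction(d_sample, d_natural, d_anthropogenic):
    """Two-source linear mixing model.

    Given delta values (per mil) for a bear tissue sample and for the
    two dietary end members (natural foods vs. human foods), solve
        d_sample = f * d_natural + (1 - f) * d_anthropogenic
    for f, the fraction of the diet from natural sources.
    """
    return (d_sample - d_anthropogenic) / (d_natural - d_anthropogenic)

# Hypothetical end members: natural foods at -22 per mil 13C,
# anthropogenic foods at -16 per mil (human foods are often enriched
# in 13C because of corn-based ingredients).
f_nat = natural_diet_fraction(-21.0, -22.0, -16.0)
```

A sample falling between the two end members yields a fraction between 0 and 1; with more than two food sources, Bayesian mixing models are typically used instead.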

Preventing these indulgences (not to mention the cravings that drive them) would be beneficial to humans and bears alike. Particularly aggressive bears do occasionally injure humans and damage their belongings: Over the past 20 years, there have been 12,000 reported conflicts, 50 injuries, and approximately $3.7 million in property damage. Bears, like other wildlife consuming anthropogenic foods, may also be susceptible to long-term health problems associated with high cholesterol and fat intake–though there is also evidence that high-calorie, high-protein anthropogenic foods increase reproductive success over the short term.

Yosemite has a long history of human-bear interactions, as shown by this archival photo. Image courtesy of De Anza College.

Perhaps even more worrying is our lack of knowledge about the potential ecosystem-level effects of bears’ dietary fluctuations. Changes in feeding preferences can destabilize food webs and disrupt vital ecosystem processes such as seed dispersal and nutrient cycling. Although modern bears may be returning to “normal” eating practices, their behaviors over the past several decades may have had significant long-term impacts on Yosemite and the wildlife that dwell within it. The researchers urge further study, both here and in other anthropogenically impacted systems, to improve our understanding of whether, and how, human nutrients may shape even those landscapes that we often regard as being more or less “undisturbed” wilderness.

Hopkins III, J.B., Koch, P.L., Ferguson, J.M., and Kalinowski, S.T. 2014. The changing anthropogenic diets of American black bears over the past century in Yosemite National Park. Frontiers in Ecology and the Environment 12(2):107-114.

 

Feast, famine, and the evolution of the human diet

It’s not easy to keep track of all the different diets out there, but you still may have heard of the “fast diet,” also known as “intermittent fasting” or the “5:2 diet.” Regardless of what you call it, this popular method of slimming down is based on the assumption that it is normal and healthy for our bodies to periodically experience food shortages. Another popular eating regime, the Paleo diet, has been advocated by nutritionists who believe that modern humans should restrict their foods to those available to, and commonly eaten by, our ancient ancestors. This advice is grounded in the idea that our genomes are not very different from those of our predecessors, and are therefore built to process not frequent meals and/or large quantities of “calorically dense foods,” but only the freshest and most natural ingredients. Both the intermittent fasting and Paleo diets are thought to promote not only weight loss, but also improvements in general health.

Examples of what Berbesque et al. refer to as “calorically dense foods” (image courtesy of Sott.net)

One of the reasons that eating regimes like these are so appealing is that they sound as though they are grounded in fact–in data collected by doctors, anthropologists, and evolutionary biologists. In reality, however, the logic behind these diets relies on many assumptions. One of these, the focus of a recent project conducted by researchers at the universities of Roehampton and Cambridge, is the idea that our hunter-gatherer ancestors frequently experienced food shortages–and even famines. In addition to prompting many a diet, this concept has also influenced theoretical models about the evolution of cognition and various life history traits in humans.

But what if early humans didn’t experience high food insecurity–or, at least, what if the food insecurity they did experience is not that different from what modern agriculturalists are periodically subjected to? Unsurprisingly, this is a question that researchers have previously asked, but their analytical methods may not have allowed them to draw truly informed conclusions. For example, past studies have focused predominantly on only a handful of societies that experienced rapid increases in diabetes and obesity rates after swapping traditional diets for those that were more “westernized.” Further, focal groups were characterized according to early ethnographical reports that may not have accurately or fully documented the relationships between various societies and their dietary practices. Finally, and perhaps most importantly, previous work has tended not to account for the fact that contemporary hunter-gatherer groups predominantly live in marginal habitats, and that habitat, in general, is hugely influential in determining both the availability of food and the dietary choices local people make.

Samoan data has been included in previous dietary studies because of increases in type II diabetes and obesity in Samoa since the switch from a traditional diet to one that is more “westernized” and agriculture-based (Image courtesy of Faifeau).

Cumulatively, these flaws prompted the researchers behind the current study to conduct a new set of analyses designed to more rigorously “explore relationships between subsistence and famine risk.” Specifically, they wanted to make three comparisons: first, between hunter-gatherers in warmer (more productive) and colder (more marginal) habitats; second, between hunter-gatherers and agriculturalists, regardless of habitat; and, third, between hunter-gatherers and agriculturalists, with habitat quality controlled for.

Rather than focusing on only a subset of societies at opposite ends of the dietary spectrum, the researchers utilized a global sample including 186 different “cultural provinces”; this included 36 hunter-gatherer societies and incorporated data on eight different variables that each measured a different aspect of famine. They used two metrics to account for habitat quality. The first was effective temperature (ET), which is derived from measurements of both the warmest and coldest temperatures of an area, and also reflects plant growth (with lower ETs generally indicating less abundant vegetative resources). The second habitat variable was net primary productivity (NPP), higher values of which are associated with greater availability of food items.
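Effective temperature is commonly computed with Bailey’s formula, which combines the mean temperatures of the warmest and coldest months. A sketch, assuming that standard formulation (the example climates are hypothetical):

```python
def effective_temperature(warmest_c, coldest_c):
    """Bailey's effective temperature (ET), a formulation widely used
    in hunter-gatherer ecology:

        ET = (18W - 10C) / (W - C + 8)

    where W and C are the mean temperatures (degrees C) of the warmest
    and coldest months. Lower ET indicates a shorter growing season
    and less abundant plant resources.
    """
    return (18.0 * warmest_c - 10.0 * coldest_c) / (warmest_c - coldest_c + 8.0)

# A strongly seasonal temperate climate vs. a tropical one.
et_temperate = effective_temperature(22.0, -5.0)
et_tropical = effective_temperature(28.0, 24.0)
```

Under this formulation, the temperate example falls below the 13-degree ET cutoff the researchers used to define “warm” habitats, while the tropical example falls well above it.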

As you might expect, groups living in warmer climates experienced significantly fewer famines–and less persistent ones–than those living in colder areas. Because of this, the researchers decided that their subsequent two analyses would only include data from groups living in habitats with an ET at or above 13 degrees C; this ensured that the analyses were comparing apples with apples, so to speak. Their comparison of agriculturalists and hunter-gatherers revealed that food conditions were more favorable for the latter than for the former. Hunter-gatherers living in warm climates were particularly well off, though both types of society were equally likely to face periods of short-term and seasonal starvation (neither of which was very common). Similar patterns were found when this analysis was re-run with both ET and NPP included in order to control for overall habitat quality. Taken together, these results indicate that–contrary to popular belief–hunter-gatherers are actually less likely than agriculturalists to experience famine.

The Paleo diet aims to emulate the eating practices of our ancient predecessors, both in terms of quantity and type of food items eaten. However, hunter-gatherers likely had access to more foods, and to a wider variety of foods, than we give them credit for–though that is no reason to avoid a diet heavy in vitamin-containing ingredients (image courtesy of Activ8).

In other words, regularly skipping meals and eating extremely low-calorie diets may not faithfully mimic an actual Paleo diet; rather, this way of eating may more accurately reflect conditions in agricultural societies that periodically experience food shortages associated with drought and flooding. Faced with these sorts of environmental instabilities, hunter-gatherers tend to have more freedom to relocate to more productive habitats. This sort of cultural adaptability was also highlighted by the comparison between warm-climate and cold-climate hunter-gatherers, the latter of which engaged in a variety of behaviors (e.g., development of specialized hunting technology, food storage, infanticide, and trade) that maximized survival during times of want.

Given the great importance of culture in mediating the relationships between humans and food, the authors point out that there is no reason for other researchers to assert that our modern food-related health crises are necessarily related to genes. This is the idea behind the “thrifty genotype” theory, which suggests that a subset of our hunter-gatherer ancestors possessed a certain set of genes that conferred better survival and reproduction rates in the face of unpredictable feast-famine cycles.

There may, indeed, be such a genotype, but it is not necessarily an ancient development (it could have evolved since the dawn of the modern agricultural age) and it need not be the only explanation for the prevalence of diabetes and obesity in western societies. Equally–if not more–logical is the idea that some cultures began to favor both “immediate return behavior” (use of resources as soon as they become available, as generally observed in hunter-gatherers living in warmer climates) and a taste for high-calorie foods.

It may be the case that our dietary and lifestyle choices–rather than our genes–should get most of the blame for our current struggles with obesity and weight-related health problems (image courtesy of Dolores Smith).

Centuries ago, when humans were more active and had fewer resources right at their fingertips, this pairing of behaviors would have been adaptive; today, say the authors, it is a liability. They don’t go on to make any recommendations about modern weight loss regimes (after all, they aren’t those types of doctor), but it isn’t hard to read between the lines: Though the intermittent fasting and Paleo diets have undoubtedly worked for some people, they probably aren’t as scientifically grounded as some of their advocates might like you to believe.

Berbesque, J.C., Marlowe, F.W., Shaw, P., and Thompson, P. 2014. Hunter-gatherers have less famine than agriculturalists. Biology Letters 10:20130853 (online advance publication).