A magic conservation number

15 12 2009

Although I’ve already blogged about our recent paper in Biological Conservation on minimum viable population sizes, American Scientist just did a great little article on the paper and concept that I’ll share with you here:

Imagine how useful it would be if someone calculated the minimum population needed to preserve each threatened organism on Earth, especially in this age of accelerated extinctions.

A group of Australian researchers say they have nailed the best figure achievable with the available data: 5,000 adults. That’s right, that many, for mammals, amphibians, insects, plants and the rest.

Their goal wasn’t a target for temporary survival. Instead they set the bar much higher, aiming for a census that would allow a species to pursue a standard evolutionary lifespan, which can vary from one to 10 million years.

That sort of longevity requires abundance sufficient for a species to thrive despite significant obstacles, including random variation in sex ratios or birth and death rates, natural catastrophes and habitat decline. It also requires enough genetic variation to allow adequate amounts of beneficial mutations to emerge and spread within a populace.

“We have suggested that a major rethink is required on how we assign relative risk to a species,” says conservation biologist Lochran Traill of the University of Adelaide, lead author of a Biological Conservation paper describing the projection.

Conservation biologists already have plenty on their minds these days. Many have concluded that if current rates of species loss continue worldwide, Earth will face a mass extinction comparable to the five big extinction events documented in the past. This one would differ, however, because it would be driven by the destructive growth of one species: us.

More than 17,000 of the 47,677 species assessed for vulnerability to extinction are threatened, according to the latest Red List of Threatened Species prepared by the International Union for Conservation of Nature. That includes 21 percent of known mammals, 30 percent of known amphibians, 12 percent of known birds and 70 percent of assessed plants. The populations of some critically endangered species number in the hundreds, not thousands.

In an effort to help guide rescue efforts, Traill and colleagues, who include conservation biologists and a geneticist, have been exploring minimum viable population size over the past few years. Previously they completed a meta-analysis of hundreds of studies considering such estimates and concluded that a minimum head count of more than a few thousand individuals would be needed to achieve a viable population.

“We don’t have the time and resources to attend to finding thresholds for all threatened species, thus the need for a generalization that can be implemented across taxa to prevent extinction,” Traill says.

In their most recent research they used computer models to simulate the population sizes required for long-term persistence in 1,198 different species. A minimum population of 500 could guard against inbreeding, they conclude. But for a shot at truly long-term, evolutionary success, 5,000 is the most parsimonious number, with some species likely to hit the sweet spot with slightly fewer or slightly more.

“The practical implications are simply that we’re not doing enough, and that many existing targets will not suffice,” Traill says, noting that many conservation programs may inadvertently be managing protected populations for extinction by settling for lower population goals.

The prospect that one number, give or take a few, would equal the minimum viable population across taxa doesn’t seem likely to Steven Beissinger, a conservation biologist at the University of California at Berkeley.

“I can’t imagine 5,000 being a meaningful number for both Alabama beach mice and the California condors. They are such different organisms,” Beissinger says.

Many variables must be considered when assessing the population needs of a given threatened species, he says. “This issue really has to do with threats more than stochastic demography. Take the same rates of reproduction and survival and put them in a healthy environment and your minimum population would be different than in an environment of excess predation, loss of habitat or effects from invasive species.”

But, Beissinger says, Traill’s group is correct for thinking that conservation biologists don’t always have enough empirically based standards to guide conservation efforts or to obtain support for those efforts from policy makers.

“One of the positive things here is that we do need some clear standards. It might not be establishing a required number of individuals. But it could be clearer policy guidelines for acceptable risks and for how many years into the future can we accept a level of risk,” Beissinger says. “Policy people do want that kind of guidance.”

Traill sees policy implications in his group’s conclusions. Having a numerical threshold could add more precision to specific conservation efforts, he says, including stabs at reversing the habitat decline or human harvesting that threaten a given species.

“We need to restore once-abundant populations to the minimum threshold,” Traill says. “In many cases it will make more economic and conservation sense to abandon hopeless-case species in favour of greater returns elsewhere.”
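To get a feel for why persistence probability climbs so steeply with population size, here is a toy simulation in the spirit of (but far simpler than) the stochastic models Traill and colleagues used – the growth function, noise levels and every parameter value below are my own inventions, for illustration only:

```python
import math
import random

def survives(n0, generations=500, r_sd=0.15, k=10000, rng=None):
    """One trajectory: a ceiling at k, environmental noise on the growth
    rate, and Poisson-like demographic noise (variance ~ mean, here
    approximated as Gaussian). Returns True if the population persists."""
    rng = rng or random.Random()
    n = float(n0)
    for _ in range(generations):
        r = rng.gauss(0.0, r_sd)                  # good years and bad years
        mean_next = min(n * math.exp(r), k)       # 'carrying capacity' ceiling
        n = max(0.0, rng.gauss(mean_next, math.sqrt(mean_next)))  # demographic wobble
        if n < 1.0:                               # last individual gone
            return False
    return True

def persistence_probability(n0, trials=400, seed=1):
    """Fraction of simulated trajectories still extant after 500 generations."""
    rng = random.Random(seed)
    return sum(survives(n0, rng=rng) for _ in range(trials)) / trials
```

With these invented numbers, a population starting at 10 adults persists far less often than one starting at 5,000 – it is the qualitative point, not any particular threshold, that the sketch illustrates.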





Value of a good enemy

25 10 2009

I love these sorts of experiments. Ecology (with conservation ecology as a special subset of the larger discipline) is a messy business, mainly because ecosystems are complex, non-linear, emergent, interactive, stochastic and meta-stable entities that are just plain difficult to manipulate experimentally. Therefore, inference about complex ecological processes tends to be enhanced when the simplest components are isolated.

Enter the ‘mini-ecosystem-in-a-box’ approach to ecological research. I’ve blogged before about some clever experiments to examine the role of connectivity among populations in mitigating (or failing to mitigate) extinction risk, and alluded to others indicating how harvest reserves work to maximise population persistence. This latest microcosm experiment is another little gem and has huge implications for conservation.

A fairly long-standing controversy in conservation biology, and in invasive species biology in particular, is whether intact ecosystems are in any way more ‘resilient’ to invasion by alien species (the latter most often deliberately or inadvertently introduced by humans – think of Australia’s appalling feral species problems; e.g., buffalo, foxes, cats and weeds). Many believe by default that more ‘pristine’ (i.e., less disturbed by humans) communities will naturally provide more ecological checks against invaders because there are more competitors, more specialists and more predators. However, considering the ubiquity of invasive species around the world, this assumption has been challenged vehemently.

The paper I’m highlighting today uses the microcosm experimental approach to show how native predators, when abundant, can reduce the severity of an invasion. Using a system of two mosquito species (one ‘native’ – what’s ‘native’ in a microcosm? [another subject] – and one ‘invasive’) and a native midge predator, Juliano and colleagues demonstrate in their paper Your worst enemy could be your best friend: predator contributions to invasion resistance and persistence of natives that predators are something you want to keep around.

In short, they found little evidence of direct competition between the two mosquitoes in terms of abundance when placed together without predators, but when the midges were added, the persistence of the invasive mosquito was reduced substantially. Of course, the midge predators did do their share of damage to the native mosquitoes by reducing the latter’s abundance, but through a type of competitive release, the midges’ suppression of the invasive species left the native mosquito free to develop faster (i.e., more per capita resources).
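The competitive-release logic is easy to caricature in code. The toy two-prey/one-predator model below has nothing to do with the actual mosquito-midge system – the functional forms and every parameter are invented purely to illustrate the mechanism of a shared predator hitting the invader harder:

```python
def simulate(pred=True, steps=20000, dt=0.01):
    """Euler integration of a toy two-prey/one-predator system:
    n = 'native' prey, i = 'invader' prey (the faster grower),
    p = a predator that attacks the invader harder than the native.
    All coefficients are illustrative only."""
    n, i, p = 1.0, 1.0, (0.5 if pred else 0.0)
    for _ in range(steps):
        comp = (n + i) / 10.0                        # shared-resource competition
        dn = n * (1.0 - comp) - 0.2 * n * p          # predator eats both prey...
        di = 1.2 * i * (1.0 - comp) - 0.6 * i * p    # ...but the invader more so
        dp = p * (0.1 * (0.2 * n + 0.6 * i) - 0.1)   # predator tracks its prey
        n = max(n + dn * dt, 0.0)
        i = max(i + di * dt, 0.0)
        p = max(p + dp * dt, 0.0)
    return n, i
```

Without the predator, the faster-growing invader ends up dominant; with the predator present it is driven out while the native persists – the predator’s heavier toll on the invader more than compensates for its attacks on the native, just as the midges freed the native mosquito.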

Such a seemingly academic result has huge conservation implications. In most systems, predators are some of the largest and slowest-reproducing species, so they are characteristically the first to feel the hammer of human damage. From bears to sharks, and tigers to wolves, big, charismatic predators are on the wane worldwide. Juliano and colleagues’ nice experimental work with insects reminds us that keeping functioning native ecosystems intact from all trophic perspectives is imperative.

CJA Bradshaw


This post was chosen as an Editor's Selection for ResearchBlogging.org

Juliano, S., Lounibos, L., Nishimura, N., & Greene, K. (2009). Your worst enemy could be your best friend: predator contributions to invasion resistance and persistence of natives. Oecologia. DOI: 10.1007/s00442-009-1475-x





Life and death on Earth: the Cronus hypothesis

13 10 2009
Cronus

Bit of a strange one for you today, but here’s a post I hope you’ll enjoy.

My colleague, Barry Brook, and I recently published a paper in the very new and perhaps controversial online journal, the Journal of Cosmology. Cosmology? According to the journal, ‘cosmology’ is:

“the study and understanding of existence in its totality, encompassing the infinite and eternal, and the origins and evolution of the cosmos, galaxies, stars, planets, earth, life, woman and man”.

The journal publishes papers dealing with ‘cosmology’ and is a vehicle for those who wish to publish on subjects devoted to the study of existence in its totality.

Ok. Quite an aim.

Our paper is part of the November (second ever) issue of the journal entitled Asteroids, Meteors, Comets, Climate and Mass Extinctions, and because we were the first to submit, we managed to secure the first paper in the issue.

Our paper, entitled The Cronus hypothesis – extinction as a necessary and dynamic balance to evolutionary diversification, introduces a new idea in the quest to find that perfect analogy for understanding the mechanisms dictating how life on our planet has waxed and waned over the billions of years since it first appeared.

Gaia

In the 1960s, James Lovelock conceived the novel idea of Gaia – that the Earth functions like a single, self-regulating organism where life itself interacts with the physical environment to maintain conditions favourable for life (Gaia was the ancient Greeks’ Earth mother goddess). Embraced, contested, denounced and recently re-invigorated, the idea has evolved substantially since it first appeared. More recently (this year, in fact), Peter Ward countered the Gaia hypothesis with his own Greek metaphor – the Medea hypothesis. Essentially this view holds that life instead ‘seeks’ to destroy itself in an anti-Gaia manner (Medea was the siblicidal wife of Jason of the Argonauts). Ward described his Medea hypothesis as “Gaia’s evil twin”.

One can marvel at the incredible diversity of life on Earth (e.g., conservatively, > 4 million protists, 16,600 protozoa, 75,000-300,000 helminth parasites, 1.5 million fungi, 320,000 plants, 4-6 million arthropods, > 6,500 amphibians, 10,000 birds and > 5,000 mammals) and wonder that there might be something in the ‘life makes it easier for life’ idea underlying Gaia. However, when one considers that over 99% of all species that have ever existed are today extinct, then a Medea perspective might dominate.

Medea

Enter Cronus. Here we posit a new way of looking at the tumultuous history of life and death on Earth that effectively relegates Gaia and Medea to opposite ends of a spectrum. Cronus (patricidal son of Gaia overthrown by his own son, Zeus, and banished to Hades) treats speciation and extinction as birth and death in a ‘metapopulation’ of species assemblages split into biogeographic realms. Catastrophic extinction events can be brought about via species engineering their surroundings by passively modifying the delicate balance of oxygen, carbon dioxide and methane – indeed, humans might be the next species to fall victim to our own Medean tendencies. But extinction opens up new niches that eventually elicit speciation, and under conditions of relative environmental stability, specialists evolve because they are (at least temporarily) competitive under those conditions. When conditions change again, extinction ensues because not all can adapt quickly enough. Just as all individuals born in a population must eventually die, extinction is a necessary termination.

We think the Cronus metaphor has a lot of advantages over Gaia and Medea. The notion of a community of species as a population of selfish individuals retains the Darwinian view of contestation; self-regulation in Cronus occurs naturally as a result of extinction modifying the course of future evolution. Cronus also makes existing mathematical tools developed for metapopulation theory amenable to broader lines of inquiry.

For example, species as individuals with particular ‘mortality’ (extinction) rates, and lineages with particular ‘birth’ (speciation) rates, could interact and disperse among ‘habitats’ (biogeographical realms). ‘Density’ feedback could be represented as competitive exclusion or symbioses. As species dwindle, feedbacks such as reduced community resilience further exacerbate extinction risk (the Medea-like phase), whereas stochastic fluctuation around a ‘carrying capacity’ (niche saturation) when environmental conditions are relatively stable corresponds to the Gaia-like phase. Our Cronus framework is also scale-invariant – it could be applied to microbial diversity on another organism right up to the inter-planetary exchange of life (panspermia).
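To show the sort of bookkeeping the metaphor invites, here is a deliberately crude sketch of my own (not anything from our paper – every rate below is invented): species richness in each biogeographic ‘realm’ rises through density-dependent speciation, falls through extinction, and is topped up by immigrant lineages from a neighbouring realm:

```python
import random

def cronus_toy(realms=4, k=100, steps=2000, seed=0):
    """Species richness per 'realm' treated as a birth-death metapopulation:
    speciation = density-dependent 'births' that slow near niche saturation k,
    extinction = per-species 'deaths', dispersal = lineages arriving from a
    neighbouring realm, plus a little environmental wobble."""
    rng = random.Random(seed)
    s = [20.0] * realms                           # starting richness per realm
    for _ in range(steps):
        for j in range(realms):
            spec = 0.05 * s[j] * (1 - s[j] / k)   # speciation slows near saturation
            ext = 0.02 * s[j]                     # constant per-species extinction
            disp = 0.01 * s[(j + 1) % realms]     # immigrant lineages from a neighbour
            s[j] = max(0.0, s[j] + spec - ext + disp
                       + rng.gauss(0, 0.5))       # environmental wobble
    return s
```

With these made-up rates, every realm settles into noisy fluctuation around a richness well below the niche ceiling – extinction and speciation balancing like deaths and births in a population.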

What’s the relevance to conservation? We’re struggling to prevent extinction, so understanding how it works is an essential first step. Without the realisation that extinction is necessary (albeit, at rates preferably slower than they are currently), we cannot properly implement conservation triage, i.e., where do we invest in conservation and why?

We had fun with this, and I hope you enjoy it too.

CJA Bradshaw

Bradshaw, C.J.A., & Brook, B.W. (2009). The Cronus hypothesis – extinction as a necessary and dynamic balance to evolutionary diversification. Journal of Cosmology, 2, 201-209. http://journalofcosmology.com/Extinction100.html






Connectivity paradigm in extinction biology

6 10 2009

I’m going to do a double review here of two papers currently online in Proceedings of the Royal Society B: Biological Sciences. I’m lumping them together because they both more or less challenge the pervasive conservation/restoration paradigm that connectivity is the key to reducing extinction risk. It’s just interesting (and slightly amusing) that the two were published in the same journal and at about the same time, but by two different groups.

From our own work looking at the correlates of extinction risk (measured mainly by proxy as threat risk), the range of a population (i.e., the amount of area and number of habitats it covers) is the principal determinant of risk – the smaller your range, the greater your chance of shuffling off this mortal coil (see also here). This is, of course, because a large range usually means that you have some phenotypic plasticity in your habitat requirements, you can probably disperse well, and you’re not going to succumb to localised ‘catastrophes’ as often. It also probably means (but not always) that your population size increases as your range size increases; as we all know, populations must be beyond their minimum viable population size to have a good chance of persisting despite random demographic and environmental vagaries.

Well, the two papers in question, ‘Both population size and patch quality affect local extinctions and colonizations’ by Franzén & Nilsson and ‘Environment, but not migration rate, influences extinction risk in experimental metapopulations’ by Griffen & Drake, show that connectivity (i.e., the probability that populations are connected via migration) is probably the least important component in the extinction-persistence game.

Using a solitary bee (Andrena hattorfiana) metapopulation in Sweden, Franzén & Nilsson show that population size and food patch quality (measured by the number of pollen-producing plants) were directly (but independently) correlated with extinction risk. Bigger populations in stable, high-quality patches persisted more readily. However, connectivity between patches was uncorrelated with risk.

Griffen & Drake took quite a different approach and stacked experimental aquaria full of daphnia (Daphnia magna) on top of one another to influence the amount of light (and hence, the amount of food from algal growth) to which the populations had access (it’s interesting to note here that this was unplanned in the experiment – the different algal growth rates related to the changing exposure to light were a serendipitous discovery that allowed them to test the ‘food’ hypothesis!). They also controlled the migration rate between populations by varying the size of holes connecting the aquaria. In short, they found that environmentally influenced (i.e., food-influenced) variation was far more important in dictating population size and fluctuation than migration, showing again that conditions promoting large population size and reducing temporal variability are essential for reducing extinction risk.
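You can reproduce the flavour of this result with a scrap of code. The sketch below is an invented two-patch model (nothing to do with the authors’ daphnia tanks; every parameter value is my own): patches grow Ricker-style with patch-specific environmental noise, and a fixed fraction of each patch migrates to the other every generation:

```python
import math
import random

def extinction_probability(env_sd, migration, trials=300, gens=100, seed=7):
    """Fraction of simulated two-patch 'metapopulations' that lose both
    patches within 'gens' generations. Patches fall to zero once below one
    individual (quasi-extinction); all numbers are illustrative."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = [10.0, 10.0]
        for _ in range(gens):
            # Ricker growth with independent environmental noise per patch
            grown = [x * math.exp(0.1 + rng.gauss(0, env_sd) - x / 100.0)
                     for x in n]
            # a fixed fraction of each patch migrates to the other
            n = [grown[0] * (1 - migration) + grown[1] * migration,
                 grown[1] * (1 - migration) + grown[0] * migration]
            n = [x if x >= 1.0 else 0.0 for x in n]  # quasi-extinction per patch
            if max(n) == 0.0:
                break
        if max(n) == 0.0:
            extinct += 1
    return extinct / trials
```

Try varying the two arguments: in this sketch the environmental knob is what the fate of the metapopulation hinges on, echoing the experimental result that environment, not migration rate, drives extinction risk.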

So what’s the upshot for conservation? Well, many depressed populations are thought to be recoverable by making existing, fragmented habitat patches more connected via ‘corridors’ of suitable habitat. The research highlighted here suggests that more emphasis should be placed on building up existing population sizes and ensuring that food availability remains relatively constant than on worrying about how many trickling migrants might be moving back and forth. This essentially means that a few skinny corridors connecting population fragments will probably be insufficient to save our imperilled species.

CJA Bradshaw



This post was chosen as an Editor's Selection for ResearchBlogging.org

Franzen, M., & Nilsson, S. (2009). Both population size and patch quality affect local extinctions and colonizations Proceedings of the Royal Society B: Biological Sciences DOI: 10.1098/rspb.2009.1584

Griffen, B., & Drake, J. (2009). Environment, but not migration rate, influences extinction risk in experimental metapopulations Proceedings of the Royal Society B: Biological Sciences DOI: 10.1098/rspb.2009.1153





Wobbling to extinction

31 08 2009

I’ve been meaning to highlight for a while a paper that I’m finding more and more pertinent as a citation in my own work. The general theme concerns estimating the extinction risk of a particular population, species (or even ecosystem), and more and more we’re finding that different drivers of population decline and eventual extinction often act synergistically to drive populations to that point of no return.

In other words, the whole is greater than the sum of its parts.

In other, other words, extinction risk is usually much higher than we generally appreciate.

This might seem at odds with my previous post about the tendency of the stochastic exponential growth model to over-estimate extinction risk using abundance time series, but it’s really more of a reflection of our under-appreciation of the complexity of the extinction process.

In the early days of ConservationBytes.com I highlighted a paper by Fagan & Holmes that described some of the few time series of population abundances right up until the point of extinction – the reason these datasets are so rare is because it gets bloody hard to find the last few individuals before extinction can be confirmed. Most recently, Melbourne & Hastings described in a paper entitled Extinction risk depends strongly on factors contributing to stochasticity published in Nature last year how an under-appreciated component of variation in abundance leads to under-estimation of extinction risk.

‘Demographic stochasticity’ is a fancy term for variation in the probability of births and deaths at the individual level. Basically this means that there will be all sorts of complicating factors that move any individual in a population away from its expected (mean) probability of dying or reproducing. When averaged over a lot of individuals, it has generally been assumed that demographic stochasticity is washed out by other forms of variation in mean (population-level) birth and death probability resulting from vagaries of the environmental context (e.g., droughts, fires, floods, etc.).

‘No, no, no’, say Melbourne & Hastings. Using some relatively simple laboratory experiments in which environmental stochasticity was tightly controlled, they showed that demographic stochasticity dominated the overall variance and that environmental variation took a back seat. The upshot of all these experiments and mathematical models is that for most species of conservation concern (i.e., populations already reduced below their minimum viable population size), not factoring in the appropriate measures of demographic wobble means that most people are under-estimating extinction risk.
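A quick way to convince yourself of the logic (this is my own coin-flip caricature, not Melbourne & Hastings’ models): give every individual an independent chance of recruiting (demographic stochasticity), or give the whole cohort one shared good-year/bad-year multiplier (environmental stochasticity), and compare the resulting variability at small versus large population sizes:

```python
import random
import statistics

def offspring_cv(n, p_survive=0.5, env_sd=0.05, trials=2000, seed=11):
    """Coefficient of variation of next-generation size under two *pure*
    noise sources with the same expected size (n):
    demographic = an independent coin-flip for each of 2n potential recruits;
    environmental = one shared good-/bad-year multiplier for the whole cohort.
    All numbers are illustrative."""
    rng = random.Random(seed)
    demo, env = [], []
    for _ in range(trials):
        demo.append(sum(rng.random() < p_survive for _ in range(2 * n)))
        env.append(n * (1.0 + rng.gauss(0, env_sd)))
    cv = lambda xs: statistics.pstdev(xs) / statistics.mean(xs)
    return cv(demo), cv(env)
```

With 20 individuals, the coin-flip (demographic) variability is roughly three times the environmental sort; by 500 individuals the ranking has flipped – which is exactly why the wobble matters most for the small populations we worry about.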

Bloody hell – we’ve been saying this for years; a few hundred individuals in any population is a ridiculous conservation target. People must instead focus on getting their favourite endangered species to number at least in the several thousands if the species is to have any hope of persisting (this is foreshadowing a paper we have coming out shortly in Biological Conservation – stay tuned for a post thereupon).

Melbourne & Hastings have done a grand job in reminding us how truly susceptible small populations are to wobbling over the line and disappearing forever.

CJA Bradshaw






Classics: the Allee effect

22 12 2008

As humanity plunders its only home and continues destroying the very life that sustains our ‘success’, certain concepts in ecology, evolution and conservation biology are being examined in greater detail in an attempt to apply them to restoring at least some elements of our ravaged biodiversity.

One of these concepts has been largely overlooked in the last 30 years, but is making a conceptual comeback as the processes of extinction become better quantified. The so-called Allee effect can be broadly defined as a “…positive relationship between any component of individual fitness and either numbers or density of conspecifics” (Stephens et al. 1999, Oikos 87:185-190) and is attributed to Warder Clyde Allee, an American ecologist from the first half of the 20th century, although he himself did not coin the term. Odum referred to it as “Allee’s principle”, and over time, the concept morphed into what we now generally call ‘Allee effects’.

Nonetheless, I’m using Allee’s original 1931 book Animal Aggregations: A Study in General Sociology (University of Chicago Press) as the Classics citation here. In his book, Allee discussed the evidence for the effects of crowding on demographic and life history traits of populations, which he subsequently redefined as “inverse density dependence” (Allee 1941, American Naturalist 75:473-487).

What does all this have to do with conservation biology? Well, broadly speaking, when populations become small, many different processes may operate to make an individual’s average ‘fitness’ (measured in many ways, such as survival probability, reproductive rate, growth rate, et cetera) decline. The many and varied types of Allee effects can work together to drive populations even faster toward extinction than expected by chance alone because of self-reinforcing feedbacks (see also previous post on the small population paradigm). Thus, ignorance of potential Allee effects can bias everything from minimum viable population size estimates to restoration attempts and predictions of extinction risk.

A recent paper in the journal Trends in Ecology and Evolution by Berec and colleagues entitled Multiple Allee effects and population management gives a more specific breakdown of Allee effects in a series of definitions I reproduce here for your convenience:

Allee threshold: critical population size or density below which the per capita population growth rate becomes negative.

Anthropogenic Allee effect: mechanism relying on human activity, by which exploitation rates increase with decreasing population size or density: values associated with rarity of the exploited species exceed the costs of exploitation at small population sizes or low densities (see related post).

Component Allee effect: positive relationship between any measurable component of individual fitness and population size or density.

Demographic Allee effect: positive relationship between total individual fitness, usually quantified by the per capita population growth rate, and population size or density.

Dormant Allee effect: component Allee effect that either does not result in a demographic Allee effect or results in a weak Allee effect and which, if interacting with a strong Allee effect, causes the overall Allee threshold to be higher than the Allee threshold of the strong Allee effect alone.

Double dormancy: two component Allee effects, neither of which singly results in a demographic Allee effect (or results only in a weak Allee effect), but which jointly produce an Allee threshold (i.e. the double Allee effect becomes strong).

Genetic Allee effect: genetic-level mechanism resulting in a positive relationship between any measurable fitness component and population size or density.

Human-induced Allee effect: any component Allee effect induced by a human activity.

Multiple Allee effects: any situation in which two or more component Allee effects work simultaneously in the same population.

Nonadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold greater or smaller than the algebraic sum of Allee thresholds owing to single Allee effects.

Predation-driven Allee effect: a general term for any component Allee effect in survival caused by one or multiple predators whereby the per capita predation-driven mortality rate of prey increases as prey numbers or density decline.

Strong Allee effect: demographic Allee effect with an Allee threshold.

Subadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold smaller than the algebraic sum of Allee thresholds owing to single Allee effects.

Superadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold greater than the algebraic sum of Allee thresholds owing to single Allee effects.

Weak Allee effect: demographic Allee effect without an Allee threshold.
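For the quantitatively minded, the strong-versus-weak distinction is easiest to see in the standard phenomenological model of a strong Allee effect (a textbook form, not from Berec and colleagues’ paper; the parameter values below are arbitrary defaults): per capita growth g(N) = r(1 − N/K)(N/A − 1) is negative below the Allee threshold A, positive between A and the carrying capacity K, and negative again above K. A weak Allee effect depresses g(N) at low N without ever driving it negative.

```python
def per_capita_growth(n, r=0.5, k=1000.0, a=50.0):
    """Strong-Allee per capita growth rate, g(N) = r (1 - N/K)(N/A - 1):
    below the Allee threshold 'a' the population declines on average,
    between 'a' and the carrying capacity 'k' it grows."""
    return r * (1.0 - n / k) * (n / a - 1.0)
```

Evaluating the sign of this function at a few abundances is enough to locate the Allee threshold, which is exactly the quantity the definitions above revolve around.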

For even more detail, I suggest you obtain the 2008 book by Courchamp and colleagues entitled Allee Effects in Ecology and Conservation (Oxford University Press).

CJA Bradshaw


(Many thanks to Salvador Herrando-Pérez for his insight on terminology)





International Conspiracy to Catch All Tunas

2 11 2008

Otherwise known as the International Commission for the Conservation of Atlantic Tunas (ICCAT) based in Madrid, ICCAT is charged with “the conservation of tunas and tuna-like species in the Atlantic Ocean and its adjacent seas”. However, according to a paper entitled Impending collapse of bluefin tuna in the northeast Atlantic and Mediterranean, forthcoming in Conservation Letters (read post about the journal here) by Brian MacKenzie of the Technical University of Denmark, they don’t seem to be doing their job very well.

In perhaps the best example of the plundering of the seas for overt profit instead of food provision per se, the north-east Atlantic and Mediterranean population of bluefin tuna (Thunnus thynnus) has been overfished and will continue to decline to near extinction if the harvest isn’t stopped immediately and for several years to come.


The demand (and money) associated with tuna harvest appears to negate all scientific evidence that the population is in serious trouble – because of us. The Economist recently featured the paper’s results and therein quoted the opinion of independent ICCAT reviewers who described the situation as “an international disgrace” (read full article here).

I want to list MacKenzie et al.’s paper forthcoming in Conservation Letters as a ‘Potential‘ here at ConservationBytes.com, but I doubt it will change the tuna’s situation that much, and it may only ruffle a few European (and Japanese) feathers (scales?). Who knows? Perhaps the paper will result in a massive down-scaling of the harvest and some serious commitment to REAL tuna conservation.

CJA Bradshaw






Classics: Biodiversity Hotspots

25 08 2008

‘Classics’ is a category of posts highlighting research that has made a real difference to biodiversity conservation. All posts in this category will be permanently displayed on the Classics page of ConservationBytes.com

Myers, N., Mittermeier, R.A., Mittermeier, C.G., da Fonseca, G.A.B. & Kent, J. (2000). Biodiversity hotspots for conservation priorities. Nature, 403, 853-858

According to Google Scholar, this paper has over 2500 citations. Even though it was published less than a decade ago, already Myers and colleagues’ ‘hotspots’ concept has become the classic lexicon for, as they defined it, areas with high species endemism and degradation by humans. In other words, these are places on the planet (originally only terrestrial, but the concept has been extended to the marine realm) where at the current rates of habitat loss, exploitation, etc., we stand to lose far more irreplaceable species. The concept has been criticised for various incapacities to account for all types of threats – indeed, many other prioritisation criteria have been proposed (assessed nicely by Brooks et al. 2006 and Orme et al. 2005), but it’s the general idea proposed by Myers and colleagues that has set the conservation policy stage for most countries. One little gripe here – although the concept ostensibly means areas of high endemic species richness AND associated threat, people often take the term ‘hotspot’ to mean just a place with lots of species. Not so. Ah, the intangible concept of biodiversity!

CJA Bradshaw






The extinction vortex

25 08 2008

One for the Potential list:

First coined by Gilpin & Soulé in 1986, the extinction vortex is the term used to describe the process that declining populations undergo when “a mutual reinforcement occurs among biotic and abiotic processes that drives population size downward to extinction” (Brook, Sodhi & Bradshaw 2008).

Although several types of ‘vortex’ were described by Gilpin & Soulé, the concept was subsequently simplified by Caughley (1994) in his famous paper on the declining and small population paradigms, and first truly quantified by Fagan & Holmes (2006) in their Ecology Letters paper entitled Quantifying the extinction vortex.

Fagan and Holmes compiled a small time-series database of ten vertebrate species (two mammals, five birds, two reptiles and a fish) whose final extinction was witnessed via monitoring. They confirmed that time to extinction scales with the logarithm of population size – in other words, as populations shrink, the time remaining before extinction contracts disproportionately fast. They also found greater rates of population decline nearer to the time of extinction than earlier in a population’s history, consistent with the expectation that genetic deterioration contributes to a general erosion of individual performance (fitness). Finally, they found that variability in abundance was highest as populations approached extinction, irrespective of population size, demonstrating indirectly that random environmental fluctuations deliver the final blow regardless of what caused the population to decline in the first place.
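The log-scaling result is easy to illustrate with a toy model (this is my own sketch, not the authors’ analysis, and all parameter values are invented): a population declining at a constant mean rate with lognormal environmental noise goes extinct after a time roughly proportional to the logarithm of its starting size.

```python
import math
import random

def time_to_extinction(n0, r=-0.1, sd=0.1, seed=None):
    """Years until a population starting at n0, declining at mean rate r
    with environmental noise (sd), falls below one individual."""
    rng = random.Random(seed)
    n, t = float(n0), 0
    while n >= 1 and t < 10_000:
        n *= math.exp(rng.gauss(r, sd))  # stochastic exponential change
        t += 1
    return t

# Mean time to extinction grows with log(n0), not with n0 itself:
for n0 in (10, 100, 1_000, 10_000):
    mean_t = sum(time_to_extinction(n0, seed=s) for s in range(200)) / 200
    print(f"n0 = {n0:>6}: roughly {mean_t:.0f} years to extinction")
```

Under these assumed parameters, each tenfold increase in starting size buys only a roughly constant increment of extra time (about ln(10)/|r| years), which is one intuition for why small populations sit so close to the edge.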

What does this mean for conservation efforts? It was essentially the first empirical demonstration that extinction proneness accelerates as populations decline, meaning that every attempt must be made to maintain large population sizes if there is to be any chance of long-term persistence. This relates directly to the minimum viable population size concept that should underscore each and every recovery target set for any population in trouble or under conservation scrutiny.

CJA Bradshaw






Classics: Declining and small population paradigms

23 08 2008


Caughley, G. (1994). Directions in conservation biology. Journal of Animal Ecology, 63, 215-244.

Cited around 800 times according to Google Scholar, this classic paper demonstrated the essential difference between the two major paradigms dominating the discipline of conservation biology: (1) the ‘declining’ population paradigm and (2) the ‘small’ population paradigm. The declining population paradigm concerns the identification and management of the processes that depress the demographic rate of a species and cause its populations to decline deterministically, whereas the small population paradigm concerns the dynamics of small populations that have already declined owing to some (deterministic) perturbation and that are consequently more susceptible to extinction via chance (stochastic) events. Put simply, the forces that drive populations into decline aren’t necessarily those that drive the final nail into a species’ coffin – we must manage for both types of processes simultaneously, and for the synergies between them, if we want to reduce the likelihood of species going extinct.

CJA Bradshaw






Classics: Minimum Viable Population size

21 08 2008


Shaffer, M.L. (1981). Minimum population sizes for species conservation. BioScience, 31, 131-134.

Small and isolated populations are particularly vulnerable to extinction through random variation in birth and death rates, variation in resource or habitat availability, predation, competitive interactions and single-event catastrophes, and inbreeding. Enter the concept of the Minimum Viable Population (MVP) size, which was originally defined as the smallest number of individuals required for an isolated population to persist (at some predefined ‘high’ probability) for some ‘long’ time into the future. In other words, the MVP size is the number of individuals in the population that is needed to withstand normal (expected) variation in all the things that affect individual persistence through time. Drop below your MVP size, and suddenly your population’s risk of extinction sky-rockets. In some ways, MVP size can be considered the threshold dividing the ‘small’ and ‘declining’ population paradigms (see Caughley 1994), so that different management strategies can be applied to populations depending on their relative distance to (population-specific) MVP size.
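Shaffer’s definition translates directly into a simulation recipe, the core of what later became population viability analysis: project a population forward many times under random variation and rare catastrophes, and find the smallest starting size whose persistence probability exceeds a chosen threshold. A minimal sketch, with every parameter value invented purely for illustration (no real species is implied):

```python
import math
import random

def persists(n0, years=100, r=0.0, sd=0.2, cat_prob=0.02, cat_kill=0.5, rng=None):
    """One stochastic projection: good and bad years plus rare catastrophes.
    Returns True if the population stays above quasi-extinction (2 individuals)."""
    rng = rng or random.Random()
    n = float(n0)
    for _ in range(years):
        n *= math.exp(rng.gauss(r, sd))   # environmental variation
        if rng.random() < cat_prob:       # single-event catastrophe
            n *= 1 - cat_kill
        if n < 2:
            return False
    return True

def mvp(target=0.95, years=100, reps=500, seed=1):
    """Smallest size on a coarse grid whose persistence probability >= target."""
    rng = random.Random(seed)
    for n0 in (10, 50, 100, 500, 1_000, 5_000, 10_000):
        p = sum(persists(n0, years, rng=rng) for _ in range(reps)) / reps
        if p >= target:
            return n0, p
    return None
```

The numbers this toy produces mean nothing for any real population; the point is the shape of the calculation – persistence probability rises steeply with starting size and then saturates, and the MVP is simply where that curve crosses the chosen target.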

This wonderfully simple, yet fundamental, concept of extinction dynamics provides, if calculated correctly, the target for species recovery, minimum reserve size and sustainable harvest. Indeed, it is a concept underlying threatened species lists worldwide, including the best known of them all, the IUCN Red List of Threatened Species. While there remain a host of methodological issues, genetic considerations and policy implementation problems, Shaffer’s original paper spawned an entire generation of research and mathematical techniques in conservation biology, and set the stage for tangible, mathematically based conservation targets.

Want more information? We have published some papers and articles on the subject that elaborate more on the methods, expected ranges, subtleties and implications of the MVP concept that you can access below.

CJA Bradshaw






Conservation that bites

2 07 2008

This new website will post examples of conservation science with real-world policy impacts that improve biodiversity outcomes. Stay tuned.

CJA Bradshaw