De-extinction is about as sensible as de-death

15 03 2013

Published simultaneously in The Conversation.


On Friday, March 15 in Washington DC, National Geographic and TEDx are hosting a day-long conference on species-revival science and ethics. In other words, they will be debating whether we can, and should, attempt to bring extinct animals back to life – a concept some call “de-extinction”.

The debate has an interesting line-up of ecologists, geneticists, palaeontologists (including Australia’s own Mike Archer), developmental biologists, journalists, lawyers, ethicists and even artists. I have no doubt it will be very entertaining.

But let’s not mistake entertainment for reality. It disappoints me, a conservation scientist, that this tired fantasy still manages to generate serious interest. I have little doubt what the ecologists at the debate will conclude.

Once again, it’s important to discuss the principal flaws in such proposals.

Put aside for the moment the astounding inefficiency, the lack of success to date and the welfare issues of bringing something into existence only to suffer a short and likely painful life. The principal reason we should not even consider the technology from a conservation perspective is that it does not address the real problem – namely, the reason for the extinction in the first place.

Even if we could solve all the other problems, if there is no place to put these new individuals, the effort and money expended are a complete waste. Habitat loss is the principal driver of species extinction and endangerment. If we don’t stop and reverse this now, all other avenues are effectively closed. Cloning will not create new forests or coral reefs, for example.





Rocking the scientific boat

14 12 2012
© C. Simpson


One thing that has simultaneously amused, disheartened, angered and outraged me over the past decade or so is how anyone in their right mind could even suggest that scientists band together into some sort of conspiracy to dupe the masses. While this tired accusation is most commonly made about climate scientists, it applies across nearly every facet of the environmental sciences whenever someone doesn’t like what one of us says.

First, it is essential to recognise that we’re just not that organised. While I have yet to forget to wear my trousers to work (I’m inclined to think that it will happen eventually), I’m still far, far away from anything that could be described as ‘efficient’ and ‘organised’. I can barely keep it together as it is. Such is the life of the academic.

More importantly, the idea that a conspiracy could form among scientists ignores one of the most fundamental components of scientific progress – dissension. And hell, can we dissent!

Yes, the scientific approach is one where successive lines of evidence testing hypotheses are eventually amassed into a concept, then perhaps a rule of thumb. If the rule of thumb stands against the scrutiny of countless studies (i.e., ‘challenges’ in the form of poison-tipped, flaming literary arrows), then it might eventually become a ‘theory’. Some theories are even elevated to the hallowed status of ‘law’, but that is very rare indeed. In the environmental sciences (and I include ecology here), one could argue that there is no such thing as a ‘law’.

Well-informed non-scientists might understand, or at least, appreciate that process. But few people outside the sciences have even the remotest clue about what a real pack of bastards we can be to each other. Use any cliché or descriptor you want – it applies: dog-eat-dog, survival of the fittest, jugular-slicing ninjas, or brain-eating zombies in lab coats.






Conservation catastrophes

22 02 2012

David Reed

The title of this post serves two functions: (1) to introduce the concept of ecological catastrophes in population viability modelling, and (2) to acknowledge the passing of the bloke who came up with a clever way of dealing with that uncertainty.

I’ll start with the latter. It came to my attention late last year that a fellow conservation biologist, Dr. David Reed, died unexpectedly from congestive heart failure. I did not really mourn his passing, for I had never met him in person (I believe it is disingenuous, discourteous and slightly egocentric to mourn someone you do not really know personally – but that’s just my opinion), but I did think at the time that the conservation community had lost another clever progenitor of good conservation science. As many CB readers already know, we lost a great conservation thinker and doer last year, Professor Navjot Sodhi (and that, I did take personally). Coincidentally, both Navjot and David died at about the same age (49 and 48, respectively). I hope that being in one’s late 40s isn’t a particular hazard for people in my line of business!

My friend, colleague and lab co-director, Professor Barry Brook, did, however, work a little with David, and together they published some pretty cool stuff (see References below). David was particularly good at looking for cross-taxa generalities in conservation phenomena, such as minimum viable population sizes, effects of inbreeding depression, applications of population viability analysis and extinction risk. But more on some of that below.





Better SAFE than sorry

30 11 2011

Last day of November already – I am now convinced that my suspicions are correct: time is not constant and in fact accelerates as you age (in mathematical terms, a unit of time becomes a progressively smaller proportion of the time elapsed since your birth, so this makes sense). But, I digress…
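That parenthetical can be made concrete: if each year registers relative to all the time you have already lived, a year at 50 ‘feels’ one-fifth as long as a year at 10. A throwaway sketch of the arithmetic (the ages are arbitrary):

```python
def year_as_life_fraction(age):
    """One year as a proportion of the time elapsed since birth."""
    return 1.0 / age

# The proportion shrinks hyperbolically with age.
for age in (10, 25, 50):
    print(age, year_as_life_fraction(age))
```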

This short post will act mostly as a spruik for my upcoming talk at the International Congress for Conservation Biology next week in Auckland (10.30 in New Zealand Room 2 on Friday, 9 December) entitled: Species Ability to Forestall Extinction (SAFE) index for IUCN Red Listed species. The post also sets a bit of the backdrop to this paper and why I think people might be interested in attending.

As regular readers of CB will know, we published a paper this year in Frontiers in Ecology and the Environment describing a relatively simple metric we called SAFE (Species Ability to Forestall Extinction) that could enhance the information provided by the IUCN Red List of Threatened Species for assessing relative extinction threat. I won’t go into all the detail here (you can read more about it in this previous post), but I do want to point out that it ended up being rather controversial.

The journal delayed final publication because 3 groups opposed the metric rather vehemently, including people who are very much in the conservation decision-making space and/or involved directly with the IUCN Red List. It ended up publishing our original paper, the 3 critiques and our collective response in the same issue (you can read these here if you’re subscribed, or email me for a PDF reprint). Again, I won’t go into any detail here because our arguments are clearly outlined in the response.

What I do want to highlight is that even beyond the normal in-print tête-à-tête the original paper elicited, we were emailed by several people behind the critiques who were apparently unsatisfied with our response. We found this slightly odd, because many of the objections just kept getting re-raised.





Not magic, but necessary

18 10 2011

In April this year, some American colleagues of ours wrote a rather detailed, 10-page article in Trends in Ecology and Evolution that attacked our concept of generalizing minimum viable population (MVP) size estimates among species. Steve Beissinger of the University of California at Berkeley, one of the paper’s co-authors, has been a particularly vocal adversary of some of the applications of population viability analysis and its child, MVP size, for many years. While they raised some interesting points in their review, their arguments largely lacked any real punch, and they essentially ended up agreeing with us.

Let me explain. Today, our response to that critique was published online in the same journal: Minimum viable population size: not magic, but necessary. I want to take some time here to summarise the main points of contention and our rebuttal.

But first, let’s recap what we have been arguing all along in several papers over the last few years (i.e., Brook et al. 2006; Traill et al. 2007, 2010; Clements et al. 2011) – a minimum viable population size is the point at which a declining population becomes a small population (sensu Caughley 1994). In other words, it’s the point at which a population becomes susceptible to random (stochastic) events that wouldn’t otherwise matter for a large population.

Consider the great auk (Pinguinus impennis), a formerly widespread and abundant North Atlantic species that was reduced by intensive hunting throughout its range. How did it eventually go extinct? The last remaining population blew up in a volcanic explosion off the coast of Iceland (Halliday 1978). Had the population been large, the small dent in the population due to the loss of those individuals would have been irrelevant.

But what is ‘large’? The empirical evidence, as we’ve pointed out time and time again, is that large = thousands, not hundreds, of individuals.

So this is why we advocate that conservation targets should aim to keep populations at, or recover them to, the thousands mark. Fewer than that, and you’re playing Russian roulette with a species’ existence.





Classics: Effective population size ratio

27 04 2011

Here’s another concise Conservation Classic highlighted in our upcoming book chapter (see previous entries on this book). Today’s entry comes from a colleague of mine, Dick Frankham, who has literally written the book on conservation genetics. I’ve published with Dick a few times – absolutely lovely chap who really knows his field more than almost any other. It is a great pleasure to include one of his seminal works as a Conservation Classic.

This entry is highly related to our work on minimum viable population size, and the controversial SAFE index (more on that later).

Although it had long been recognized that inbreeding and loss of genetic diversity were accentuated in small, isolated populations (Charlesworth & Charlesworth, 1987), genetic hazards were generally considered to be of less consequence to extinction risk than demographic and environmental stochasticity. Frankham (1995) helped overturn this viewpoint, using a meta-analysis to draw together comprehensive evidence on the ratio of genetically effective to actual population size (Ne:N).
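The Ne:N ratio compares the genetically effective population size to the census size. As one textbook illustration of why Ne falls below N (this is a generic sketch, not part of Frankham’s meta-analysis itself), Wright’s correction for an unequal breeding sex ratio can be coded in a couple of lines; the numbers are invented:

```python
def effective_size_sex_ratio(n_males, n_females):
    """Wright's effective population size under an unequal breeding
    sex ratio: Ne = 4 * Nm * Nf / (Nm + Nf)."""
    return 4.0 * n_males * n_females / (n_males + n_females)

# A harem-breeding population of 10 breeding males and 90 breeding
# females has a census N of 100, but an Ne of only 36 - one of several
# reasons empirical Ne:N ratios sit well below 1.
print(effective_size_sex_ratio(10, 90))  # 36.0
```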





Species’ Ability to Forestall Extinction – AudioBoo

8 04 2011

Here’s a little interview I just did on the SAFE index with ABC AM:


Not a bad job, really.

And here’s another one from Radio New Zealand:


CJA Bradshaw





Classics: demography versus genetics

16 03 2011

Here’s another short, but sweet Conservation Classic highlighted in our upcoming book chapter (see previous entries on this book). Today’s entry comes from long-time quantitative ecology guru, Russ Lande, who is now based at the Silwood Park Campus (Imperial College London).

© IBL

In an influential review, Lande (1988) argued that

“…demography may usually be of more immediate importance than population genetics in determining the minimum viable size of wild populations”.

It was a well-reasoned case, and was widely interpreted to mean that demographic and ecological threats would provide the ‘killer blow’ to threatened species before genetic factors such as inbreeding and fitness effects of loss of genetic diversity had time to exert a major influence on small population dynamics.






S.A.F.E. = Species Ability to Forestall Extinction

8 01 2011

Note: I’ve just rehashed this post (30/03/2011) because the paper is now available online (see comment stream). Stay tuned for the media release next week. – CJAB

I’ve been more or less underground for the last 3 weeks. It has been a wonderful break (mostly) from the normally hectic pace of academic life. Thanks to all of you who stayed despite the recent silence.

© Ezprezzo.com

But I’m back now with a post about a paper we’ve just had accepted in Frontiers in Ecology and Environment. In my opinion it’s a leap forward in how we measure relative threat risk among species, despite some criticism.

I’ve written in past posts about the ‘magic’ minimum number of individuals that should be in a population to reduce the chance of extinction from random events. The so-called ‘minimum viable population (MVP) size’ is basically the abundance of a (connected) population below which random events take over from factors causing sustained declines (Caughley’s distinction between the ‘declining’ and ‘small’ population paradigms).

Up until the last few years, the MVP size was considered to be a population- or species-specific value, and it required very detailed demographic, genetic and biogeographical data to estimate – not something that biologists tend to have at their fingertips for most high-risk species. However, several papers published by our group (Minimum viable population size and global extinction risk are unrelated, Minimum viable population size: a meta-analysis of 30 years of published estimates and Pragmatic population viability targets in a rapidly changing world) have shown that there is in fact little variation in this number among the best-studied species; both demographic and genetic data support a number of around 5000 to avoid crossing the deadly threshold.

Now the fourth paper in this series has just been accepted (sorry, no link yet, but I’ll let you all know as soon as it is available), and it was organised and led by Reuben Clements, and co-written by me, Barry Brook and Bill Laurance.

The idea is fairly simple, and it somewhat amazes me that it hasn’t been implemented before. The SAFE (Species Ability to Forestall Extinction) index is simply the distance a population is (in terms of abundance) from its MVP. In the absence of a species-specific value, we used the 5000-individual threshold.
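In code, the idea reduces to almost a one-liner. This is a sketch of the concept as described above (distance from the generalised 5000-individual MVP, on a log-abundance scale), not necessarily the exact formulation in the paper:

```python
import math

MVP_THRESHOLD = 5000  # generalised minimum viable population size

def safe_index(abundance, mvp=MVP_THRESHOLD):
    """Distance of a population from its MVP, in orders of magnitude
    of abundance; negative values mean the population sits below
    the threshold."""
    return math.log10(abundance) - math.log10(mvp)

print(safe_index(5000))  # 0.0: sitting exactly at the threshold
print(safe_index(500))   # about -1: an order of magnitude below it
```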





Faraway fettered fish fluctuate frequently

27 06 2010

Hello! I am Little Fish

Swimming in the Sea.

I have lots of fishy friends.

Come along with me.

(apologies to Lucy Cousins and Walker Books)

I have to thank my 3-year-old daughter and one of her favourite books for that intro. Now to the serious stuff.

I am very proud to announce a new Report in Ecology we’ve just had published online early about a new way of looking at the stability of coral reef fish populations. Driven by one of the hottest young up-and-coming researchers in coral reef ecology, Dr. Camille Mellin (employed through the CERF Marine Biodiversity Hub and co-supervised by me at the University of Adelaide and Julian Caley and Mark Meekan of the Australian Institute of Marine Science), this paper adds a new tool in the design of marine protected areas.

Entitled Reef size and isolation determine the temporal stability of coral reef fish populations, the paper applies a well-known, but little-used mathematical relationship between the logarithms of population abundance and its variance (spatial or temporal) – Taylor’s power law.

Taylor’s power law is pretty straightforward itself – as you raise the abundance of a population by 1 unit on the logarithmic scale, you can expect its associated variance (think variance over time in a fluctuating population to make it easier) to rise by 2 logarithmic units (thus, the slope = 2). Why does this happen? Because a log-log (power) relationship between a vector and its square (remember: variance = standard deviation²) will give a multiplier of 2 (i.e., if x ∝ y², then log₁₀(x) ≈ 2 log₁₀(y)).
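That expectation is easy to check with a toy simulation (fabricated numbers, not the reef-fish data): give several populations of very different mean sizes the same coefficient of variation, so the standard deviation scales with the mean, and the fitted log-log slope of variance on mean abundance comes out near 2:

```python
import math
import random

random.seed(42)

# Four populations spanning three orders of magnitude in mean abundance,
# each fluctuating with a coefficient of variation of ~10 %.
log_means, log_vars = [], []
for mean_n in (100, 1_000, 10_000, 100_000):
    series = [random.gauss(mean_n, 0.1 * mean_n) for _ in range(500)]
    m = sum(series) / len(series)
    v = sum((x - m) ** 2 for x in series) / (len(series) - 1)
    log_means.append(math.log10(m))
    log_vars.append(math.log10(v))

# Least-squares slope of log(variance) on log(mean): expect ~2.
n = len(log_means)
mean_x = sum(log_means) / n
mean_y = sum(log_vars) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(log_means, log_vars))
         / sum((x - mean_x) ** 2 for x in log_means))
print(round(slope, 2))
```

Species interactions of the kind Kilpatrick & Ives modelled would drag this fitted slope below 2; the simulation above has none, so it recovers the null expectation.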

Well, thanks for the maths lesson, but what’s the application? It turns out that deviations from the mathematical expectation of a power-law slope = 2 reveal some very interesting ecological dynamics. Famously, Kilpatrick & Ives published a Letter in Nature in 2003 (Species interactions can explain Taylor’s power law for ecological time series) trying to explain why so many real populations have Taylor’s power law slopes < 2. As it turns out, the amount of competition occurring between species reduces the expected fluctuations for a given population size because of a kind of suppression by predators and competitors. Cool.

But that application was more a community-based examination and still largely theoretical. We decided to turn the power law a little on its ear and apply it to a different question – conservation biogeography.





Fanciful mathematics and ecological fantasy

3 05 2010

© flickr/themadlolscientist

Bear with me here, dear reader – this one’s a bit of a stretch for conservation relevance at first glance, but it is important. Also, it’s one of my own papers so I have the prerogative :-)

As some of you probably know, I dabble quite a bit in population dynamics theory, which basically means examining the mathematics people use to decipher ecological patterns. Why is this important? Well, most models predicting extinction risk, estimating optimal harvest rates, determining minimum viable population size and metapopulation dynamics for species’ persistence rely on good mathematical abstraction to be realistic. Get the maths wrong, and you could end up overharvesting a species (e.g., 99.99 % of fisheries management), underestimating extinction risk from habitat degradation, and getting your predictions wrong about the effects of invasive species. Expressed as an equation itself, (conservation) ecology = mathematics.

A long-standing family of models known as ‘phenomenological’ models (i.e., because they deal with the phenomenon of population size, which is an emergent property of the mechanisms of birth, death and immigration) has been used to estimate everything from maximum sustainable yield targets, temporal abundance patterns, wildlife management interventions and extinction risk to epidemiological patterns. The basic form of the model describes the growth response, or the relationship between the population’s rate of change (growth) and its size. The simplest form (known as the Ricker) assumes a linear decline in population growth rate (r) as the number of individuals increases, which basically means that populations can’t grow indefinitely (i.e., they fluctuate around some carrying capacity if unperturbed).
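The Ricker form described above can be sketched in a few lines; the parameter values here are invented for illustration only. The realised (log) growth rate declines linearly from its maximum near zero abundance down to zero at the carrying capacity, so an unperturbed population settles at that capacity rather than growing forever:

```python
import math

def ricker_step(n, r_max, k):
    """One generation of Ricker dynamics:
    N[t+1] = N[t] * exp(r_max * (1 - N[t]/k)).
    The realised growth rate declines linearly with abundance,
    reaching zero at the carrying capacity k."""
    return n * math.exp(r_max * (1.0 - n / k))

# Invented parameters: start well below carrying capacity and iterate.
n, r_max, k = 50.0, 0.5, 1000.0
for _ in range(60):
    n = ricker_step(n, r_max, k)
print(round(n))  # 1000: the population has equilibrated at k
```

For modest r_max (below 2) the equilibrium at k is stable; larger values produce the cycles and chaos that make getting the mathematical abstraction right so important.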





A magic conservation number

15 12 2009

Although I’ve already blogged about our recent paper in Biological Conservation on minimum viable population sizes, American Scientist just did a great little article on the paper and concept that I’ll share with you here:

Imagine how useful it would be if someone calculated the minimum population needed to preserve each threatened organism on Earth, especially in this age of accelerated extinctions.

A group of Australian researchers say they have nailed the best figure achievable with the available data: 5,000 adults. That’s right, that many, for mammals, amphibians, insects, plants and the rest.

Their goal wasn’t a target for temporary survival. Instead they set the bar much higher, aiming for a census that would allow a species to pursue a standard evolutionary lifespan, which can vary from one to 10 million years.

That sort of longevity requires abundance sufficient for a species to thrive despite significant obstacles, including random variation in sex ratios or birth and death rates, natural catastrophes and habitat decline. It also requires enough genetic variation to allow adequate amounts of beneficial mutations to emerge and spread within a populace.

“We have suggested that a major rethink is required on how we assign relative risk to a species,” says conservation biologist Lochran Traill of the University of Adelaide, lead author of a Biological Conservation paper describing the projection.

Conservation biologists already have plenty on their minds these days. Many have concluded that if current rates of species loss continue worldwide, Earth will face a mass extinction comparable to the five big extinction events documented in the past. This one would differ, however, because it would be driven by the destructive growth of one species: us.

More than 17,000 of the 47,677 species assessed for vulnerability to extinction are threatened, according to the latest Red List of Threatened Species prepared by the International Union for Conservation of Nature. That includes 21 percent of known mammals, 30 percent of known amphibians, 12 percent of known birds and 70 percent of known plants. The populations of some critically endangered species number in the hundreds, not thousands.

In an effort to help guide rescue efforts, Traill and colleagues, who include conservation biologists and a geneticist, have been exploring minimum viable population size over the past few years. Previously they completed a meta-analysis of hundreds of studies considering such estimates and concluded that a minimum head count of more than a few thousand individuals would be needed to achieve a viable population.

“We don’t have the time and resources to attend to finding thresholds for all threatened species, thus the need for a generalization that can be implemented across taxa to prevent extinction,” Traill says.

In their most recent research they used computer models to simulate what population numbers would be required to achieve long-term persistence for 1,198 different species. A minimum population of 500 could guard against inbreeding, they conclude. But for a shot at truly long-term, evolutionary success, 5,000 is the most parsimonious number, with some species likely to hit the sweet spot with slightly less or slightly more.

“The practical implications are simply that we’re not doing enough, and that many existing targets will not suffice,” Traill says, noting that many conservation programs may inadvertently be managing protected populations for extinction by settling for lower population goals.

The prospect that one number, give or take a few, would equal the minimum viable population across taxa doesn’t seem likely to Steven Beissinger, a conservation biologist at the University of California at Berkeley.

“I can’t imagine 5,000 being a meaningful number for both Alabama beach mice and the California condors. They are such different organisms,” Beissinger says.

Many variables must be considered when assessing the population needs of a given threatened species, he says. “This issue really has to do with threats more than stochastic demography. Take the same rates of reproduction and survival and put them in a healthy environment and your minimum population would be different than in an environment of excess predation, loss of habitat or effects from invasive species.”

But, Beissinger says, Traill’s group is correct for thinking that conservation biologists don’t always have enough empirically based standards to guide conservation efforts or to obtain support for those efforts from policy makers.

“One of the positive things here is that we do need some clear standards. It might not be establishing a required number of individuals. But it could be clearer policy guidelines for acceptable risks and for how many years into the future can we accept a level of risk,” Beissinger says. “Policy people do want that kind of guidance.”

Traill sees policy implications in his group’s conclusions. Having a numerical threshold could add more precision to specific conservation efforts, he says, including stabs at reversing the habitat decline or human harvesting that threaten a given species.

“We need to restore once-abundant populations to the minimum threshold,” Traill says. “In many cases it will make more economic and conservation sense to abandon hopeless-case species in favor of greater returns elsewhere.”
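The simulations described in the article above are far richer than this, but the skeleton of a stochastic population viability analysis fits in a few lines. Everything here (noise level, catastrophe rate, quasi-extinction threshold) is an invented placeholder, not a parameter from Traill and colleagues’ models:

```python
import random

random.seed(1)

def extinction_probability(n0, years=200, trials=500):
    """Fraction of simulated trajectories that fall below a
    quasi-extinction threshold of 2 individuals within `years`."""
    extinct = 0
    for _ in range(trials):
        n = float(n0)
        for _ in range(years):
            n *= random.lognormvariate(0.0, 0.15)  # annual environmental noise
            if random.random() < 0.02:             # rare catastrophe halves the population
                n *= 0.5
            if n < 2.0:                            # quasi-extinct
                extinct += 1
                break
    return extinct / trials

# Risk drops steeply with starting abundance.
print(extinction_probability(50), extinction_probability(5000))
```

Even this caricature reproduces the qualitative point of the post: with identical vital rates and identical catastrophes, a population starting in the thousands rides out shocks that routinely finish off one starting in the tens.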





Raise targets to prevent extinction

12 11 2009

I know I’ve blogged recently about this, but The Adelaidean did a nice little article that I thought I’d reproduce here. The source can be found here.

Adelaidean story Nov 2009





Susceptibility of sharks, rays and chimaeras to global extinction

10 11 2009
tiger shark

© R. Harcourt

Quite some time ago my colleague and (now former) postdoctoral fellow, Iain Field, and I sat down to examine in gory detail the extent of the threat to global populations of sharks, rays and chimaeras (chondrichthyans). I don’t think we quite realised the mammoth task we had set ourselves. Several years and nearly a hundred pages later, we have finally achieved our goal.

Introducing the new paper in Advances in Marine Biology entitled Susceptibility of sharks, rays and chimaeras to global extinction by Iain Field, Mark Meekan, Rik Buckworth and Corey Bradshaw.

The paper covers the following topics:

  • Chondrichthyan Life History
  • Niche breadth
  • Age and growth
  • Reproduction and survival
  • Past and Present Threats
  • Fishing
  • Beach meshing
  • Habitat loss
  • Pollution and non-indigenous species
  • Chondrichthyan Extinction Risk
  • Drivers of threat risk in chondrichthyans and teleosts
  • Global distribution of threatened chondrichthyan taxa
  • Ecological, life history and human-relationship attributes
  • Threat risk analysis
  • Relative threat risk of chondrichthyans and teleosts
  • Implications of Chondrichthyan Species Loss on Ecosystem Structure, Function and Stability
  • Ecosystem roles of predators
  • Predator loss in the marine realm
  • Ecosystem roles of chondrichthyans
  • Synthesis and Knowledge Gaps
  • Role of fisheries in future chondrichthyan extinctions
  • Climate change
  • Extinction synergies
  • Research needs

As mentioned, it’s quite a long analysis of the state of sharks worldwide. Bottom line? Well, as most of you might already know, sharks aren’t doing too well worldwide, with around 52 % listed on the IUCN’s Red List of Threatened Species. This compares interestingly with bony fishes (teleosts), which, although having only 8 % of all species Red-Listed, are generally in higher-threat Red List categories. We found that body size (positively) and geographic range (negatively) correlated with threat risk in both groups, but Red-Listed bony fishes were still more likely to be categorised as threatened after controlling for these effects.

In some ways, this goes against the notion that sharks are inherently more extinction-prone than other fish – a common motherhood statement seen at the beginning of almost all papers dealing with shark threats. What it does say, though, is that because sharks are on average larger and less fecund than the average bony fish, they bounce back from declines more slowly, making them more susceptible to rapid environmental change. Guess what? We’re changing the environment pretty rapidly.

We also determined the spatial distribution of threat, and found that Red-Listed species are clustered mainly in (1) south-eastern South America; (2) western Europe and the Mediterranean; (3) western Africa; (4) South China Sea and South East Asia and (5) south-eastern Australia.

shark market, Indonesia

© W. White

Now, what are the implications for the loss of these species? As I’ve blogged recently, the reduction in predators in general can be a bad thing for ecosystems, and sharks are probably some of the best examples of ecosystem structural engineers we know (i.e., eating herbivores; ‘controlling’ prey densities, etc.). So, we should be worried when sharks start to disappear. One thing we also discovered is that we still have a rudimentary understanding of how climate change will affect sharks, the ways in which they structure ecosystems, and how they respond to coastal development. Suffice it to say though that generally speaking, things are not rosy if you’re a shark.

We end off with a recommendation we’ve been promoting elsewhere – we should be managing populations using the minimum viable population (MVP) size concept. Making sure that there are a lot of large, well-connected populations around will be the best insurance against extinction.

CJA Bradshaw


I.C. Field, M.G. Meekan, R.C. Buckworth & C.J.A. Bradshaw (2009). Susceptibility of sharks, rays and chimaeras to global extinction. Advances in Marine Biology 56: 275-363. doi:10.1016/S0065-2881(09)56004-X





Not so ‘looming’ – Anthropocene extinctions

4 11 2009


© ABC 2009

Yesterday I was asked to do a quick interview on ABC television (Midday Report) about the release of the 2009 IUCN Red List of Threatened Species. I’ve blogged about the importance of the Red List before, but believe we have a lot more to do with species assessments and getting prioritisation right with respect to minimum viable population size. Have a listen to the interview itself, and read the IUCN’s media release reproduced below.

My basic stance is that we’ve only just started to assess the number of species on the planet (under 50000), yet there are many millions of species still largely under-studied and/or under-described (e.g., extant species richness = > 4 million protists, 16600 protozoa, 75000-300000 helminth parasites, 1.5 million fungi, 320000 plants, 4-6 million arthropods, > 6500 amphibians, 10000 birds and > 5000 mammals – see Bradshaw & Brook 2009 J Cosmol for references). What we’re looking at here is a refinement of knowledge (albeit a small one). We are indeed in the midst of the Anthropocene mass extinction event – there is nothing ‘looming’ about it. We are essentially losing species faster than we can assess them. I believe it’s important to make this clearer to those not working directly in the field of biodiversity conservation.

CJA Bradshaw

Extinction crisis continues apace – IUCN

Gland, Switzerland, 3 November, 2009 (IUCN) – The latest update of the IUCN Red List of Threatened Species™ shows that 17,291 species out of the 47,677 assessed species are threatened with extinction.

The results reveal 21 percent of all known mammals, 30 percent of all known amphibians, 12 percent of all known birds, and 28 percent of reptiles, 37 percent of freshwater fishes, 70 percent of plants, 35 percent of invertebrates assessed so far are under threat.

“The scientific evidence of a serious extinction crisis is mounting,” says Jane Smart, Director of IUCN’s Biodiversity Conservation Group. “January sees the launch of the International Year of Biodiversity. The latest analysis of the IUCN Red List shows the 2010 target to reduce biodiversity loss will not be met. It’s time for governments to start getting serious about saving species and make sure it’s high on their agendas for next year, as we’re rapidly running out of time.”

Of the world’s 5,490 mammals, 79 are Extinct or Extinct in the Wild, with 188 Critically Endangered, 449 Endangered and 505 Vulnerable. The Eastern Voalavo (Voalavo antsahabensis) appears on the IUCN Red List for the first time in the Endangered category. This rodent, endemic to Madagascar, is confined to montane tropical forest and is under threat from slash-and-burn farming.

There are now 1,677 reptiles on the IUCN Red List, with 293 added this year. In total, 469 are threatened with extinction and 22 are already Extinct or Extinct in the Wild. The 165 endemic Philippine species new to the IUCN Red List include the Panay Monitor Lizard (Varanus mabitang), which is Endangered. This highly-specialized monitor lizard is threatened by habitat loss due to agriculture and logging and is hunted by humans for food. The Sail-fin Water Lizard (Hydrosaurus pustulatus) enters in the Vulnerable category and is also threatened by habitat loss. Hatchlings are heavily collected both for the pet trade and for local consumption.

“The world’s reptiles are undoubtedly suffering, but the picture may be much worse than it currently looks,” says Simon Stuart, Chair of IUCN’s Species Survival Commission. “We need an assessment of all reptiles to understand the severity of the situation but we don’t have the $2-3 million to carry it out.”

The IUCN Red List shows that 1,895 of the planet’s 6,285 amphibians are in danger of extinction, making them the most threatened group of species known to date. Of these, 39 are already Extinct or Extinct in the Wild, 484 are Critically Endangered, 754 are Endangered and 657 are Vulnerable.

The Kihansi Spray Toad (Nectophrynoides asperginis) has moved from Critically Endangered to Extinct in the Wild. The species was only known from the Kihansi Falls in Tanzania, where it was formerly abundant with a population of at least 17,000. Its decline is due to the construction of a dam upstream of the Kihansi Falls that removed 90 percent of the original water flow to the gorge. The fungal disease chytridiomycosis was probably responsible for the toad’s final population crash.

The fungus also affected the Rabb’s Fringe-limbed Treefrog (Ecnomiohyla rabborum), which enters the Red List as Critically Endangered. It is known only from central Panama. In 2006, the chytrid fungus (Batrachochytrium dendrobatidis) was reported in its habitat and only a single male has been heard calling since. This species has been collected for captive breeding efforts but all attempts have so far failed.

Of the 12,151 plants on the IUCN Red List, 8,500 are threatened with extinction, with 114 already Extinct or Extinct in the Wild. The Queen of the Andes (Puya raimondii) has been reassessed and remains in the Endangered category. Found in the Andes of Peru and Bolivia, it only produces seeds once in 80 years before dying. Climate change may already be impairing its ability to flower and cattle roam freely among many colonies, trampling or eating young plants.

There are now 7,615 invertebrates on the IUCN Red List this year, 2,639 of which are threatened with extinction. Scientists added 1,360 dragonflies and damselflies, bringing the total to 1,989, of which 261 are threatened. The Giant Jewel (Chlorocypha centripunctata), classed as Vulnerable, is found in southeast Nigeria and southwest Cameroon and is threatened by forest destruction.

Scientists also added 94 molluscs, bringing the total number assessed to 2,306, of which 1,036 are threatened. Seven freshwater snails from Lake Dianchi in Yunnan Province, China, are new to the IUCN Red List and all are threatened. These join 13 freshwater fishes from the same area, 12 of which are threatened. The main threats are pollution, introduced fish species and overharvesting.

There are now 3,120 freshwater fishes on the IUCN Red List, up 510 species from last year. Although there is still a long way to go before the status of all the world’s freshwater fishes is known, 1,147 of those assessed so far are threatened with extinction. The Brown Mudfish (Neochanna apoda), found only in New Zealand, has been moved from Near Threatened to Vulnerable as it has disappeared from many areas in its range. Approximately 85-90 percent of New Zealand’s wetlands have been lost or degraded through drainage schemes, irrigation and land development.

“Creatures living in freshwater have long been neglected. This year we have again added a large number of them to the IUCN Red List and are confirming the high levels of threat to many freshwater animals and plants. This reflects the state of our precious water resources. There is now an urgency to pursue our effort but most importantly to start using this information to move towards a wise use of water resources,” says Jean-Christophe Vié, Deputy Head of the IUCN Species Programme.

“This year’s IUCN Red List makes for sobering reading,” says Craig Hilton-Taylor, Manager of the IUCN Red List Unit. “These results are just the tip of the iceberg. We have only managed to assess 47,663 species so far; there are many more millions out there which could be under serious threat. We do, however, know from experience that conservation action works so let’s not wait until it’s too late and start saving our species now.”

The status of the Australian Grayling (Prototroctes maraena), a freshwater fish, has improved as a result of conservation efforts. Now classed as Near Threatened as opposed to Vulnerable, the population has recovered thanks to fish ladders which have been constructed over dams to allow migration, enhanced riverside vegetation and the education of fishermen, who now face heavy penalties if found with this species.





Managing for extinction

9 10 2009

Ah, it doesn’t go away, does it? Or at least, we won’t let it.

That concept of ‘how many is enough?’ in conservation biology, the so-called ‘minimum viable population size’, is enough to drive some conservation practitioners batty.

How many times have we heard the (para-) phrase: “It’s simply impractical to bring populations of critically endangered species up into the thousands”?

Well, my friends, if you’re not talking thousands, you’re wasting everyone’s time and money. You are essentially managing for extinction.

Our new paper out online in Biological Conservation entitled Pragmatic population viability targets in a rapidly changing world (Traill et al.) shows that populations of endangered species are unlikely to persist in the face of global climate change and habitat loss unless they number around 5000 mature individuals or more.

With several meta-analytic, time series-based and genetic estimates of the magic minimum number all in agreement, we can now be fairly certain that if a population numbers much fewer than several thousand individuals (median = 5000), its likelihood of persisting in the long run in the face of normal random variation is pretty small.

We conclude essentially that many conservation biologists routinely underestimate or ignore the number of animals or plants required to prevent extinction. In fact, aiming to maintain tens or hundreds of individuals, when thousands are actually needed, simply wastes precious and finite conservation resources. Thus, if it is deemed unrealistic to attain such numbers, we essentially advise that in most cases conservation triage should be invoked and the species in question abandoned for better prospects.

A long-standing idea in species restoration programs is the so-called ‘50/500’ rule; this states that at least 50 adults are required to avoid the damaging effects of inbreeding, and 500 to avoid extinctions due to the inability to evolve to cope with environmental change. Our research suggests that the 50/500 rule is at least an order of magnitude too small to stave off extinction.

This does not necessarily imply that populations smaller than 5000 are doomed. But it does highlight the challenge that small populations face in adapting to a rapidly changing world.
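To make the size-persistence link concrete, here is a deliberately crude simulation – not the analysis from our paper – of a stochastic birth-death process started at different abundances. All parameter values are invented for illustration, and the model includes demographic stochasticity only; real assessments also factor in environmental variation and catastrophes, which push the required numbers higher still.

```python
import numpy as np

rng = np.random.default_rng(7)

def persistence_prob(n0, s=0.8, b=0.25, years=100, n_sims=1_000):
    """Fraction of replicates still extant after `years`."""
    extant = 0
    for _ in range(n_sims):
        n = n0
        for _ in range(years):
            n = rng.binomial(n, s)   # each individual's survival is a coin flip
            n += rng.poisson(b * n)  # births vary around the mean too
            if n == 0:
                break
        extant += n > 0
    return extant / n_sims

# Expected growth is s * (1 + b) = 1.0, i.e., exactly at replacement,
# so any losses below are driven purely by chance
results = {}
for n0 in (50, 500, 5000):
    results[n0] = persistence_prob(n0)
    print(f"N0 = {n0:4d}: P(persist 100 years) ≈ {results[n0]:.2f}")
```

Even with mean growth exactly at replacement, the smallest population loses an appreciable fraction of replicates to chance alone over a century, while the larger ones rarely do.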

We are battling to prevent a mass extinction event in the face of a growing human population and its associated impact on the planet, but the bar needs to be a lot higher. However, we shouldn’t necessarily give up on critically endangered species numbering a few hundred of individuals in the wild. Acceptance that more needs to be done if we are to stop ‘managing for extinction’ should force decision makers to be more explicit about what they are aiming for, and what they are willing to trade off, when allocating conservation funds.

CJA Bradshaw

(with thanks to Lochran Traill, Barry Brook and Dick Frankham)



Traill, L.W., Brook, B.W., Frankham, R. & Bradshaw, C.J.A. (2009). Pragmatic population viability targets in a rapidly changing world. Biological Conservation. DOI: 10.1016/j.biocon.2009.09.001





Wobbling to extinction

31 08 2009

I’ve been meaning to highlight for a while a paper that I’m finding more and more pertinent as a citation in my own work. The general theme is concerned with estimating extinction risk of a particular population, species (or even ecosystem), and more and more we’re finding that different drivers of population decline and eventual extinction often act synergistically to drive populations to that point of no return.

In other words, the whole is greater than the sum of its parts.

In other, other words, extinction risk is usually much higher than we generally appreciate.

This might seem at odds with my previous post about the tendency of the stochastic exponential growth model to over-estimate extinction risk using abundance time series, but it’s really more of a reflection of our under-appreciation of the complexity of the extinction process.

In the early days of ConservationBytes.com I highlighted a paper by Fagan & Holmes that described some of the few time series of population abundances right up until the point of extinction – such datasets are rare because it gets bloody hard to find the last few individuals before extinction can be confirmed. Most recently, in a paper entitled Extinction risk depends strongly on factors contributing to stochasticity published in Nature last year, Melbourne & Hastings described how an under-appreciated component of variation in abundance leads to under-estimation of extinction risk.

‘Demographic stochasticity’ is a fancy term for variation in the probability of births and deaths at the individual level. Basically, this means that all sorts of complicating factors move any individual in a population away from its expected (mean) probability of dying or reproducing. When taken as a mean over many individuals, it has generally been assumed that demographic stochasticity is washed out by other forms of variation in mean (population-level) birth and death probability resulting from vagaries of the environmental context (e.g., droughts, fires, floods, etc.).

‘No, no, no’, say Melbourne & Hastings. Using some relatively simple laboratory experiments in which environmental stochasticity was tightly controlled, they showed that demographic stochasticity dominated the overall variance and that environmental variation took a back seat. The upshot of all these experiments and mathematical models is that for most species of conservation concern (i.e., populations already reduced below their minimum viable population size), failing to factor in the appropriate measures of demographic wobble means that most people are under-estimating extinction risk.
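A toy calculation – not Melbourne & Hastings’ actual model – helps show why demographic wobble dominates in small populations: the variance it contributes to the realised growth rate shrinks roughly as 1/N, whereas an environmental component of variance does not. All parameter values below (including the environmental benchmark) are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
s, b = 0.8, 0.25        # survival probability and mean births per survivor
env_sd = 0.05           # SD of a hypothetical environmental effect on growth
n_reps = 20_000

demo_vars = {}
for n in (10, 100, 1000):
    # Demographic noise only: binomial survival plus Poisson births
    survivors = rng.binomial(n, s, n_reps)
    next_n = survivors + rng.poisson(b * survivors)
    demo_vars[n] = np.var(next_n / n)   # variance of the realised growth rate
    print(f"N = {n:4d}: demographic variance = {demo_vars[n]:.4f} "
          f"(environmental variance = {env_sd**2:.4f} at any N)")
```

At N = 10 the demographic component swamps the fixed environmental one; by N = 1000 it has all but vanished – which is exactly why it was so easy to assume away for large populations, and so dangerous to ignore for small ones.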

Bloody hell – we’ve been saying this for years; a few hundred individuals in any population is a ridiculous conservation target. People must instead focus on getting their favourite endangered species to number at least in the several thousands if the species is to have any hope of persisting (this is foreshadowing a paper we have coming out shortly in Biological Conservation – stay tuned for a post thereupon).

Melbourne & Hastings have done a grand job in reminding us how truly susceptible small populations are to wobbling over the line and disappearing forever.

CJA Bradshaw






Not-so-scary maths and extinction risk

27 08 2009
© P. Horn

Population viability analysis (PVA) and its cousin, minimum viable population (MVP) size estimation, are two generic categories for mathematically assessing a population’s risk of extinction under particular environmental scenarios (e.g., harvest regimes, habitat loss, etc.) (a personal plug here, for a good overview of general techniques in mathematical conservation ecology, check out our new chapter entitled ‘The Conservation Biologist’s Toolbox…’ in Sodhi & Ehrlich‘s edited book Conservation Biology for All by Oxford University Press [due out later this year]). A long-standing technique used to estimate extinction risk when the only available data for a population are in the form of population counts (abundance estimates) is the stochastic exponential growth model (SEG). Surprisingly, this little beauty is relatively good at predicting risk even though it doesn’t account for density feedback, age structure, spatial complexity or demographic stochasticity.

So, how does it work? Well, it essentially calculates the mean and variance of the population growth rate, which is just the logarithm of the ratio of an abundance estimate in one year to the abundance estimate in the previous year. These two parameters are then resampled many times to estimate the probability that abundance drops below a certain small threshold (often set arbitrarily low to something like < 50 females, etc.).
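As a sketch of that recipe: the abundance series and quasi-extinction threshold below are hypothetical illustration values, not any published dataset, but the steps are exactly as described – compute the log growth rates, take their mean and variance, then resample to project forward.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual abundance estimates (illustration only)
counts = np.array([480, 455, 470, 430, 410, 395, 400, 370, 355, 340])

# Log growth rates: r_t = log(N_{t+1} / N_t)
r = np.log(counts[1:] / counts[:-1])
mu, sigma = r.mean(), r.std(ddof=1)   # mean and SD of the log growth rate

# Resample the growth rate many times to project abundance forward and
# estimate the probability of falling below a small quasi-extinction threshold
n_sims, horizon, threshold = 5_000, 50, 50
crossed = 0
for _ in range(n_sims):
    log_n = np.log(counts[-1])
    for _ in range(horizon):
        log_n += rng.normal(mu, sigma)
        if log_n < np.log(threshold):
            crossed += 1
            break

print(f"mean log growth = {mu:.3f}, SD = {sigma:.3f}")
print(f"P(quasi-extinction within {horizon} years) ≈ {crossed / n_sims:.2f}")
```

The whole procedure needs nothing more than a column of annual counts, which is exactly why the method is used so widely.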

It is simple (funny how maths can become so straightforward to some people when you couch them in words rather than mathematical symbols), and rather effective. This is why a lot of people use it to prescribe conservation management interventions. You don’t have to be a modeller to use it (check out Morris & Doak’s book Quantitative Conservation Biology for a good recipe-like description).

But (there’s always a but), a new paper just published online in Conservation Letters by Bruce Kendall, entitled The diffusion approximation overestimates extinction risk for count-based PVA, questions its robustness when the species of interest breeds seasonally. You see, the diffusion approximation (the method used to estimate extinction risk described above) generally assumes continuous breeding (i.e., there are always some females producing offspring). Using some very clever mathematics, simulation and a bloody good presentation, Kendall shows quite clearly that the diffusion-approximation SEG over-estimates extinction risk when breeding is seasonal (and it is, frequently, in nature). He also offers a new simulation method to get around the problem.

Who cares, apart from some geeky maths types (I include myself in that group)? Well, considering it’s used so frequently, is easy to apply and it has major implications for species threat listings (e.g., IUCN Red List), it’s important we estimate these things as correctly as we can. Kendall shows how several species have already been misclassified for threat risk based on the old technique.

So, once again mathematics has the spotlight. Thanks, Bruce, for demonstrating how sound mathematical science can pave the way for better conservation management.

CJA Bradshaw






Hot inbreeding

22 07 2009

© R. Ballen

Sounds really disgusting and a little rude, doesn’t it? Well, if you think losing species because of successive bottlenecks from harvesting, habitat loss and genetic deterioration is rude, then the title of this post is appropriate.

I’m highlighting today a paper recently published in Conservation Biology by Kristensen and colleagues entitled Linking inbreeding effects in captive populations with fitness in the wild: release of replicated Drosophila melanogaster lines under different temperatures.

The debate has been around for years – do inbred populations have lower fitness (e.g., reproductive success, survival, dispersal, etc.) than their ‘outbred’ counterparts? Is one of the reasons small populations (below their minimum viable population size) have a high risk of extinction because genetic deterioration erodes fitness?

While there are many species that seem to defy this assumption, the increasing prevalence of Allee effects, and the demonstration that threatened species have lower genetic diversity than non-threatened species, all seem to support the idea. Kristensen & colleagues’ paper uses that cornerstone of genetic guinea pigs, the Drosophila fruit fly, not only to demonstrate inbreeding depression in the lab, but also the subsequent fate of inbred individuals released into the wild.

What they found was quite amazing. Released inbred flies only did poorly relative to outbred flies when the temperature was warm (daytime) – that is, they weren’t recaptured as frequently, meaning they were probably less successful at finding food and perished. Cold (i.e., night-time) releases showed no difference between inbred and outbred flies.

Basically, this means the environment interacts strongly with the genetic code underpinning particular aspects of performance. When the going is tough (and if you’re an ectothermic fly, extreme heat can be a killer), genetically compromised individuals do badly. Another reason to be worried about runaway global climate warming.

Another important point was that the indices of performance didn’t translate universally to field conditions, so lab-only results might well give us incorrect predictions of animal performance when populations reach small sizes and become inbred.

CJA Bradshaw





Classics: Ecological Triage

27 03 2009

It is a truism that when times are tough, only the strongest pull through. This isn’t a happy concept, but in our age of burgeoning biodiversity loss (and economic belt-tightening), we have to make some difficult decisions. In this regard, I suggest Brian Walker’s 1992 paper Biodiversity and ecological redundancy makes the Classics list.

Ecological triage is, of course, taken from the medical term triage used in emergency or wartime situations. Ecological triage refers to the conservation prioritisation of species that provide unique or necessary functions to ecosystems, and the abandonment of those that do not have unique ecosystem roles or that face almost certain extinction given they fall well below their minimum viable population size (Walker 1992). Financial resources such as investment in recovery programmes, purchase of remaining habitats for preservation, habitat restoration, etc. are allocated accordingly; the species that contribute the most to ecosystem function and have the highest probability of persisting are earmarked for conservation, and others are left to their own devices (Hobbs & Kristjanson 2003).

This emotionally empty, accounting-type conservation can be controversial because public favourites like pandas, kakapo and some dolphin species just don’t make the list in many circumstances. As I’ve stated before, it makes no long-term conservation or economic sense to waste money on the doomed and ecologically redundant. Many in the conservation business apply ecological triage without being fully aware of it. Finite pools of money (generally the paltry left-overs from some green-guilty corporation or under-funded government initiative) for conservation mean that we have to set priorities – this is an entire discipline in its own right in conservation biology. Reserve design is just one example of this sacrifice-the-doomed-for-the-good-of-the-ecosystem approach.

Walker (1992) advocated that we should endeavour to maintain ecosystem function first, and recommended that we abandon programmes to restore functionally ‘redundant’ species (i.e., some species are more ecologically important than others, e.g., pollinators, prey). But how do you make the choice? The wrong selection might mean an extinction cascade (Noss 1990; Walker 1992) whereby tightly linked species (e.g., parasites-hosts, pollinators-plants, predators-prey) will necessarily go extinct if one partner in the mutualism disappears (see Koh et al. 2004 on co-extinctions). Ecological redundancy is a terribly difficult thing to determine, especially given that we still understand relatively little about how complex ecological systems really work (Marris 2007).

The more common (and easier, if not theoretically weaker) approach is to prioritise areas and not species (e.g., biodiversity hotspots), but even the criteria used for area prioritisation can be somewhat arbitrary and may not necessarily guarantee the most important functional groups are maintained (Orme et al. 2005; Brooks et al. 2006). There are many different ways of establishing ‘priority’, and it depends partially on your predilections.

More recent mathematical approaches such as cost-benefit analyses (Possingham et al. 2002; Murdoch et al. 2007) advocate conservation like a CEO would run a profitable business. In this case the ‘currency’ is biodiversity, and so a fixed financial investment must maximise long-term biodiversity gains (Possingham et al. 2002). This essentially estimates the potential biodiversity saved per dollar invested, and allocates funds accordingly (Wilson et al. 2007). Where the costs outweigh the benefits, conservationists move on to more beneficial goals. Perhaps the biggest drawback with this approach is that it’s particularly data-hungry. When ecosystems are poorly measured, then the investment curve is unlikely to be very realistic.
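In the spirit of that approach, here is a stylised sketch – not Possingham and colleagues’ actual algorithm – of ranking candidate projects by expected benefit per dollar and funding them greedily until the budget is exhausted. All project names and figures are invented.

```python
# Hypothetical candidate projects: (name, cost in $M, expected benefit in
# units of species-persistence gain). These numbers are pure illustration.
projects = [
    ("wetland purchase",   4.0, 12.0),
    ("island rat control", 1.5,  9.0),
    ("captive breeding",   6.0,  3.0),
    ("corridor restore",   2.5,  6.0),
]

budget = 8.0
funded, total_gain = [], 0.0

# Rank by benefit-cost ratio and fund while money remains
for name, cost, gain in sorted(projects, key=lambda p: p[2] / p[1], reverse=True):
    if cost <= budget:
        budget -= cost
        funded.append(name)
        total_gain += gain

print(f"Funded: {funded}")
print(f"Total expected gain: {total_gain}")
# → the expensive, low-return captive-breeding programme misses out
```

Greedy ranking by benefit-cost ratio is only an approximation to the underlying knapsack problem, but it conveys the ‘biodiversity saved per dollar invested’ logic – including its uncomfortable habit of leaving charismatic but costly projects unfunded.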

CJA Bradshaw


(Many thanks to Lochran Traill and Barry Brook for co-developing these ideas with me)







