What is a ‘mass extinction’ and are we in one now?

13 11 2019

(reproduced from The Conversation)

For more than 3.5 billion years, living organisms have thrived, multiplied and diversified to occupy every ecosystem on Earth. The flip side to this explosion of new species is that species extinctions have also always been part of the evolutionary life cycle.

But these two processes are not always in step. When the loss of species rapidly outpaces the formation of new species, this balance can be tipped enough to elicit what are known as “mass extinction” events.


Read more: Climate change is killing off Earth’s little creatures


A mass extinction is usually defined as a loss of about three quarters of all species in existence across the entire Earth over a “short” geological period of time. Given the vast amount of time since life first evolved on the planet, “short” is defined as anything less than 2.8 million years.

Since at least the Cambrian period, which began around 540 million years ago when the diversity of life first exploded into a vast array of forms, only five extinction events have definitively met these mass-extinction criteria.

These so-called “Big Five” have become part of the scientific benchmark to determine whether human beings have today created the conditions for a sixth mass extinction.

An ammonite fossil found on the Jurassic Coast in Devon. The fossil record can help us estimate prehistoric extinction rates. Corey Bradshaw, Author provided






Global warming causes the worst kind of extinction domino effect

25 11 2018

Just under two weeks ago, Giovanni Strona and I published a paper in Scientific Reports on measuring the co-extinction effect from climate change. What we found even made me — an acknowledged pessimist — stumble in shock and incredulity.

But a bit of back story is necessary before I launch into describing what we discovered.

Last year, some Oxbridge astrophysicists (David Sloan and colleagues) published a rather sensational paper in Scientific Reports claiming that life on Earth would likely survive in the face of cataclysmic astrophysical events, such as asteroid impacts, supernovae, or gamma-ray bursts. This rather extraordinary conclusion was based primarily on the remarkable physiological adaptations and tolerances to extreme conditions displayed by tardigrades — those gloriously cute, but tiny (most are around 0.5 mm long as adults) ‘water bears’ or ‘moss piglets’ — could you get any cuter names?


Found almost everywhere and always (the first fossils of them date back to the early Cambrian over half a billion years ago), these wonderful little creatures are some of the toughest metazoans (multicellular animals) on the planet. Only a few types of extremophile bacteria are tougher.

So, boil, fry or freeze the Earth, and you’ll still have tardigrades around, concluded Sloan and colleagues.

When Giovanni first read this, and then passed the paper along to me for comment, our knee-jerk reaction as ecologists was a resounding ‘bullshit!’. Even neophyte ecologists know intuitively that because species are all interconnected in vast networks linked by trophic (who eats whom), competitive, and other ecological functions (known collectively as ‘multiplex networks’), they cannot be singled out using mere thermal tolerances to predict the probability of annihilation.





Why populations can’t be saved by a single breeding pair

3 04 2018


© Reuters/Thomas Mukoya

I published this last week on The Conversation, and now reproducing it here for CB.com readers.

 

Two days ago, the last male northern white rhino (Ceratotherium simum cottoni) died. His passing leaves two surviving members of his subspecies: both females, neither of which can bear calves.

Even though it might not be quite the end of the northern white rhino because of the possibility of implanting frozen embryos in their southern cousins (C. simum simum), in practical terms, it nevertheless represents the end of a long decline for the subspecies. It also raises the question: how many individuals does a species need to persist?

Fiction writers have enthusiastically embraced this question, most often in the post-apocalypse genre. It’s a notion with a long past; the Adam and Eve myth is of course based on a single breeding pair populating the entire world, as is the case described in Ragnarök, the final battle of the gods in Norse mythology.

This idea dovetails neatly with the image of Noah’s animals marching “two by two” into the Ark. But the science of “minimum viable populations” tells us a different story.

No inbreeding, please

The global gold standard used to assess the extinction risk of any species is the International Union for Conservation of Nature (IUCN) Red List of Threatened Species.





50/500 or 100/1000 debate not about time frame

26 06 2014

As you might recall, Dick Frankham, Barry Brook and I recently wrote a review in Biological Conservation challenging the status quo regarding the famous 50/500 ‘rule’ in conservation management (effective population size [Ne] = 50 to avoid inbreeding depression in the short-term, and Ne = 500 to retain the ability to evolve in perpetuity). Well, it inevitably led to some comments arising in the same journal, but we were only permitted by Biological Conservation to respond to one of them. In our opinion, the other comment was just as problematic, and only further muddied the waters, so it too required a response. In a first for me, we have therefore decided to publish our response on the arXiv pre-print server as well as here on ConservationBytes.com.
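For readers unfamiliar with effective population size (Ne), here is a quick sketch of my own (illustrative numbers only; not taken from the review) showing why the short-term threshold matters, using the standard Wright expectation for inbreeding accumulation, F_t = 1 − (1 − 1/(2Ne))^t:

```python
# Expected inbreeding coefficient after t generations in an idealised
# population of effective size Ne (standard Wright expectation).
def inbreeding_coefficient(ne, t):
    return 1 - (1 - 1 / (2 * ne)) ** t

# Inbreeding accumulates roughly ten times faster at Ne = 50 than at Ne = 500.
for ne in (50, 500):
    print(f"Ne = {ne}: F after 10 generations = {inbreeding_coefficient(ne, 10):.3f}")
```

With these numbers, F reaches about 0.096 after 10 generations at Ne = 50, versus about 0.010 at Ne = 500 — which is the crux of why the first number in the 50/500 rule is about short-term inbreeding depression.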

50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld

cite as: Frankham, R, Bradshaw CJA, Brook BW. 2014. 50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld. arXiv: 1406.6424 [q-bio.PE] 25 June 2014.

The Letter from Rosenfeld (2014) in response to Jamieson and Allendorf (2012) and Frankham et al. (2014) and related papers is misleading in places and requires clarification and correction, as follows:





Too small to avoid catastrophic biodiversity meltdown

27 09 2013

Chiew Larn Reservoir is surrounded by Khlong Saeng Wildlife Sanctuary and Khao Sok National Park, which together make up part of the largest block of rainforest habitat in southern Thailand (> 3500 km2). Photo: Antony Lynam

One of the perennial and probably most controversial topics in conservation ecology is when something is ‘too small’. By ‘something’ I mean many things, including population abundance and patch size. We’ve certainly written about the former on many occasions (see here, here, here and here for our work on minimum viable population size), with the associated controversy it elicited.

Now I (sadly) report on the tragedy of the second issue – when is a habitat fragment too small to be of much good to biodiversity?

Published today in the journal Science, Luke Gibson (of No substitute for primary forest fame) and a group of us report disturbing results about the ecological meltdown that has occurred on islands created when the Chiew Larn Reservoir of southern Thailand was flooded nearly 30 years ago by a hydroelectric dam.

As is the case in many parts of the world (e.g., Three Gorges Dam, China), hydroelectric dams can cause major ecological problems merely by flooding vast areas. In the case of Chiew Larn, co-author Tony Lynam of the Wildlife Conservation Society passed along to me a bit of poignant and emotive history about the local struggle to prevent the disaster.

“As the waters behind the dam were rising in 1987, Seub Nakasathien, the Superintendent of the Khlong Saeng Wildlife Sanctuary, his staff and conservationist friends, mounted an operation to capture and release animals that were caught in the flood waters.

It turned out to be a distressing experience for all involved, as you can see from the clips here, with the rescuers having only nets and longtail boats, and many animals dying. Ultimately, most of the larger mammals disappeared quickly from the islands, leaving just the smaller fauna.

Later Seub moved to Huai Kha Khaeng Wildlife Sanctuary and fought an unsuccessful battle with poachers and loggers, which ended in him taking his own life in despair in 1990. A sad story, and his friend, a famous folk singer called Aed Carabao, wrote a song about Seub, the music of which plays in the video.”





Science immortalised in cartoon

1 02 2013

Well, this is a first for me (us).

I’ve never had a paper of ours turned into a cartoon. The illustrious and brilliant ‘First Dog on the Moon’ (a.k.a. Andrew Marlton), chief cartoonist for Australia’s irreverent ‘Crikey’ online news magazine, just parodied our Journal of Animal Ecology paper No need for disease: testing extinction hypotheses for the thylacine using multispecies metamodels that I wrote about last month here on ConservationBytes.com.

Needless to say, I’m chuffed as a chuffed thing.

Enjoy!

Stripey





Ecology is a Tower of Babel

17 09 2012

The term ‘ecology’ in 16 different languages, overlaid on ‘The Tower of Babel’ (oil on board) by Flemish Renaissance painter Pieter Bruegel the Elder (1563).

In his song ‘Balada de Babel’, the Spanish artist Luis Eduardo Aute sings several lyrics in unison with the same melody. The effect is a wonderful mess. This is what the scientific literature sounds like when authors generate synonymies (equivalent meaning) and polysemies (multiple meanings), or coin terms to show a point of view. In our recent paper published in Oecologia, we illustrate this problem with regard to ‘density dependence’: a key ecological concept. While the biblical reference is somewhat galling to our atheist dispositions, the analogy is certainly appropriate.

A giant shoal of herring zigzagging in response to a predator; a swarm of social bees tending the multitudinous offspring of their queen; a dense pine forest depriving its own seedlings of light; an over-harvested population of lobsters where individuals can hardly find reproductive mates; pioneering strands of a seaweed colonising a foreign sea after a transoceanic trip attached to the hull of a boat; respiratory parasites spreading in a herd of caribou; or malaria protozoans making their way between mosquitoes and humans – these are all examples of population processes that operate under a density check. The number of individuals within those groups of organisms determines their chances for reproduction, survival or dispersal, which we (ecologists) measure as ‘demographic rates’ (e.g., number of births per mother, number of deaths between consecutive years, or number of immigrants per hectare).

In ecology, the causal relationship between the size of a population and a demographic rate is known as ‘density dependence’ (DD hereafter). This relationship captures the pace at which a demographic rate changes as population size varies in time and/or space. We use DD measurements to infer the operation of social and trophic interactions (cannibalism, competition, cooperation, disease, herbivory, mutualism, parasitism, parasitoidism, predation, reproductive behaviour and the like) between individuals within a population, because the intensity of these interactions varies with population size. Thus, as a population of caribou expands, respiratory parasites will have an easier job dispersing from one animal to another. As the booming parasites breed, increased infestations will kill the weakest caribou or reduce the fertility of females investing too much energy to counteract the infection (yes, immunity is energetically costly, which is why you get sick when you are run down). In turn, as the caribou population decreases, so does the population of parasites. In cybernetics, such a toing-and-froing is known as ‘feedback’ (a system that controls itself, like a thermostat controls the temperature of a room) – a ‘density feedback’ (Figure 1) is the kind we are highlighting here.
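For the quantitatively inclined, the thermostat analogy can be sketched with the simplest textbook density-feedback model, the Ricker equation (my own illustration with arbitrary parameter values, not a model from our Oecologia paper):

```python
import math

# One generation of Ricker growth: N_{t+1} = N_t * exp(r * (1 - N_t / K)).
# Per-capita growth shrinks as N approaches the carrying capacity K:
# this is the negative density feedback ('thermostat') described above.
def ricker_step(n, r=0.5, k=1000.0):
    return n * math.exp(r * (1 - n / k))

n = 10.0
for _ in range(50):
    n = ricker_step(n)
print(f"population after 50 generations: {n:.0f}")
```

Start the population above K instead, and the same feedback pushes it back down; either way the trajectory settles near the carrying capacity, which is exactly the self-regulation the caribou-and-parasites example describes.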





Conservation catastrophes

22 02 2012

David Reed

The title of this post serves two functions: (1) to introduce the concept of ecological catastrophes in population viability modelling, and (2) to acknowledge the passing of the bloke who came up with a clever way of dealing with that uncertainty.

I’ll start with the latter. It came to my attention late last year that a fellow conservation biologist colleague, Dr. David Reed, died unexpectedly from congestive heart failure. I did not really mourn his passing, for I had never met him in person (I believe it is disingenuous, discourteous, and slightly egocentric to mourn someone you do not really know personally – but that’s just my opinion), but I did think at the time that the conservation community had lost another clever progenitor of good conservation science. As many CB readers already know, we lost a great conservation thinker and doer last year, Professor Navjot Sodhi (and that, I did take personally). Coincidentally, both Navjot and David died at about the same age (49 and 48, respectively). I hope that being in one’s late 40s isn’t particularly perilous for people in my line of business!

My friend, colleague and lab co-director, Professor Barry Brook, did, however, work a little with David, and together they published some pretty cool stuff (see References below). David was particularly good at looking for cross-taxa generalities in conservation phenomena, such as minimum viable population sizes, effects of inbreeding depression, applications of population viability analysis and extinction risk. But more on some of that below.





Not magic, but necessary

18 10 2011

In April this year, some American colleagues of ours wrote a rather detailed, 10-page article in Trends in Ecology and Evolution that attacked our concept of generalizing minimum viable population (MVP) size estimates among species. Steve Beissinger of the University of California at Berkeley, one of the paper’s co-authors, has been a particularly vocal adversary of some of the applications of population viability analysis and its child, MVP size, for many years. While there were some interesting points raised in their review, their arguments largely lacked any real punch, and they essentially ended up agreeing with us.

Let me explain. Today, our response to that critique was published online in the same journal: Minimum viable population size: not magic, but necessary. I want to take some time here to summarise the main points of contention and our rebuttal.

But first, let’s recap what we have been arguing all along in several papers over the last few years (i.e., Brook et al. 2006; Traill et al. 2007, 2010; Clements et al. 2011) – a minimum viable population size is the point at which a declining population becomes a small population (sensu Caughley 1994). In other words, it’s the point at which a population becomes susceptible to random (stochastic) events that wouldn’t otherwise matter for a large population.

Consider the great auk (Pinguinus impennis), a formerly widespread and abundant North Atlantic species that was reduced by intensive hunting throughout its range. How did it eventually go extinct? The last remaining population blew up in a volcanic explosion off the coast of Iceland (Halliday 1978). Had the population been large, the small dent in the population due to the loss of those individuals would have been irrelevant.

But what is ‘large’? The empirical evidence, as we’ve pointed out time and time again, is that large = thousands, not hundreds, of individuals.
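To make the role of that randomness concrete, here is a toy simulation of my own (not one of the models from our papers; the noise magnitude is arbitrary): a population with no deterministic trend, buffeted only by good and bad years, goes extinct far more often when it starts small:

```python
import math
import random

# Fraction of simulated populations that fall below one individual within
# 100 generations, under lognormal 'good year / bad year' environmental
# noise with no deterministic trend. sd = 0.3 is purely illustrative.
def extinction_probability(n0, generations=100, trials=2000, sd=0.3, seed=42):
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = float(n0)
        for _ in range(generations):
            n *= math.exp(rng.gauss(0.0, sd))  # environmental stochasticity
            if n < 1.0:  # the last individual is gone
                extinct += 1
                break
    return extinct / trials

for n0 in (50, 500, 5000):
    print(f"N0 = {n0:5d}: P(extinct within 100 generations) = {extinction_probability(n0):.3f}")
```

Even without hunting, habitat loss or any other deterministic pressure, the population starting at 50 individuals goes under far more often than the one starting at 5000 — the stochastic logic behind "thousands, not hundreds".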

So this is why we advocate that conservation targets should aim to keep populations at, or recover them to, the thousands mark. Less than that, and you’re playing Russian roulette with a species’ existence.





Species’ Ability to Forestall Extinction – AudioBoo

8 04 2011

Here’s a little interview I just did on the SAFE index with ABC AM:

Not a bad job, really.

And here’s another one from Radio New Zealand:

CJA Bradshaw





Inbreeding does matter

29 03 2010

I’ve been busy with Bill Laurance visiting the University of Adelaide over the last few days, and will be so over the next few as well (and Bill has promised us a guest post shortly), but I wanted to get a post in before the week got away on me.

I’ve come across what is probably the most succinct description of why inbreeding depression is an important aspect of extinctions in free-ranging species (see also previous posts here and here) by Mr. Conservation Genetics himself, Professor Richard Frankham.

Way back in the 1980s (oh, so long ago), Russ Lande produced a landmark paper in Science arguing that population demography was a far more important driver of extinctions than reduced genetic diversity per se. He stated:

“…demography may usually be of more immediate importance than population genetics in determining the minimum viable size of wild populations”

We now know, however, that genetics in fact DO matter, and no one could put it better than Dick Frankham in his latest commentary in Heredity.

I paraphrase some of his main points below:

  • Controversy broke out in the 1970s when it was suggested that inbreeding was deleterious for captive wildlife, but Ralls and Ballou (1983) reported that 41/44 mammal populations had higher juvenile mortality among inbred than outbred individuals.
  • Crnokrak and Roff (1999) established that inbreeding depression occurred in 90% of the datasets they examined, and was similarly deleterious across major plant and animal taxa.
  • They estimated that inbreeding depression in the wild has approximately seven times greater impact than in captivity.
  • It is unrealistic to omit inbreeding depression from population viability analysis models.
  • Lande’s contention was rejected when Spielman et al. (2004) found that genetic diversity in 170 threatened taxa was lower than in related non-threatened taxa.

Lande might have been incorrect, but his contention spawned the entire modern discipline of conservation genetics. Dick sums up all this so much more eloquently than I’ve done here, so I encourage you to read his article.

CJA Bradshaw

Frankham, R. (2009). Inbreeding in the wild really does matter. Heredity 104(2): 124. DOI: 10.1038/hdy.2009.155

Lande, R. (1988). Genetics and demography in biological conservation. Science 241(4872): 1455-1460. DOI: 10.1126/science.3420403






The elusive Allee effect

8 01 2010

© D. Bishop, Getty Images

In keeping with the theme of extinctions from my last post, I want to highlight a paper we’ve recently had published online early in Ecology entitled Limited evidence for the demographic Allee effect from numerous species across taxa by Stephen Gregory and colleagues. This one is all about Allee effects – well, it’s all about how difficult it is to find them!

If you recall, an Allee effect is a “…positive relationship between any component of individual fitness and either numbers or density of conspecifics” (Stephens et al. 1999, Oikos 87:185-190) and the name itself is attributed to Warder Clyde Allee. There are many different kinds of Allee effects (see previous Allee effects post for Berec and colleagues’ full list of types and definitions), but the two I want to focus on here are component and demographic Allee effects.

Now, the evidence for component Allee effects abounds, but finding real instances of reduced population growth rate at low population sizes is difficult. And this is really what we should be focussing on in conservation biology – a lower-than-expected growth rate at low population sizes means that recovery efforts for rare and endangered species must be stepped up considerably because their rebound potential is lower than it should be.

We therefore queried over 1000 time series of abundance from many different species and, lo and behold, the evidence for that little dip in population growth rate at low densities was indeed rare – about 1% of all time series examined!

I suppose this isn’t that surprising, but what was interesting was that this didn’t depend on sample size (time series where Allee models had highest support were in fact shorter) or variability (they were also less variable). All this seems a little counter-intuitive, but it gels with what’s been assumed or hypothesised before. Measurement error, climate variability and the sheer paucity of low-abundance time series make their detection difficult. Nonetheless, for those series showing demographic Allee effects, their relative model support was around 12%, suggesting that such density feedback might influence the population growth rate of just over 1 in 10 natural populations. In fact, the many problems with detecting density feedback in time series that load toward negative feedback (sometimes spuriously) suggest that even our small sample of Allee time series is probably a vast underestimate. We have pretty firm evidence that inbreeding is prevalent in threatened species, and demographic Allee effects are the mechanism by which such depression can lead a population down the extinction vortex.
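To visualise what that ‘little dip’ looks like, here is a minimal sketch of my own (illustrative parameter values, not the models fitted in the paper) contrasting ordinary negative density feedback with a strong demographic Allee effect:

```python
# Per-capita growth under a plain logistic model versus a strong Allee
# effect with critical threshold a: below N = a, per-capita growth turns
# negative and the population slides toward extinction.
def logistic_percapita(n, r=0.5, k=1000.0):
    return r * (1 - n / k)

def allee_percapita(n, r=0.5, k=1000.0, a=50.0):
    return r * (n / a - 1) * (1 - n / k)

# At N = 10 the logistic population still grows, but the Allee
# population declines; N = a is the tipping point between the two fates.
for n in (10.0, 50.0, 500.0):
    print(f"N = {n:5.0f}: logistic = {logistic_percapita(n):+.3f}, Allee = {allee_percapita(n):+.3f}")
```

The dip is exactly why detection is hard: the two models only diverge at the low abundances where data are scarcest and noisiest.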

CJA Bradshaw


Gregory, S., Bradshaw, C.J.A., Brook, B.W. & Courchamp, F. (2009). Limited evidence for the demographic Allee effect from numerous species across taxa. Ecology. DOI: 10.1890/09-1128





The biodiversity extinction numbers game

4 01 2010

© Ferahgo the Assassin

Not an easy task, measuring extinction. For the most part, we must use techniques to estimate extinction rates because, well, it’s just bloody difficult to observe when (and where) the last few individuals in a population finally kark it. Even Fagan & Holmes’ exhaustive search of extinction time series only came up with 12 populations – not really a lot to go on. It’s also nearly impossible to observe species going extinct if they haven’t even been identified yet (and yes, probably still the majority of the world’s species – mainly small, microscopic or subsurface species – have yet to be identified).

So conservation biologists do other things to get a handle on the rates, relying mainly on the species-area relationship (SAR), projecting from threatened species lists, modelling co-extinctions (if a ‘host’ species goes extinct, then its obligate symbiont must also) or projecting declining species distributions from climate envelope models.
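As a back-of-envelope illustration of the first of those methods (my own sketch; the exponent z = 0.25 is a commonly quoted ballpark value, not one taken from Stork’s review):

```python
# Species-area relationship (SAR): S = c * A^z. If only a fraction f of
# the original habitat area remains, the fraction of species expected to
# persist is f**z, so the fraction predicted lost is 1 - f**z.
def sar_fraction_lost(area_fraction_remaining, z=0.25):
    return 1.0 - area_fraction_remaining ** z

# Losing 90% of habitat predicts roughly 44% of species eventually lost.
print(f"{sar_fraction_lost(0.10):.1%}")
```

Note the non-linearity: because z is small, the first chunks of habitat loss commit relatively few species to extinction, while the last remnants are disproportionately costly — one reason SAR projections and observed extinctions can look so different over short time frames.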

But of course, these are all estimates and difficult to validate. Enter a nice little review article recently published online in Biodiversity and Conservation by Nigel Stork entitled Re-assessing current extinction rates which looks at the state of the art and how the predictions mesh with the empirical data. Suffice it to say, there is a mismatch.

Stork writes that the ‘average’ estimate of losing about 100 species per day has hardly any empirical support (not surprising); only about 1200 extinctions have been recorded in the last 400 years. So why is this the case?

As mentioned above, it’s difficult to observe true extinction because of the sampling issue (the rarer the individuals, the more difficult it is to find them). He does cite some other problems too – the ‘living dead‘ concept where species linger on for decades, perhaps longer, even though their essential habitat has been destroyed, forest regrowth buffering some species that would have otherwise been predicted to go extinct under SAR models, and differing extinction proneness among species (I’ve blogged on this before).

Of course, we could just all be just a pack of doomsday wankers vainly predicting the end of the world ;-)

Well, I think not – if anything, Stork concludes that it’s all probably worse than we currently predict because of extinction synergies (see previous post about this concept) and the mounting impact of rapid global climate change. If anything, the “100 species/day” estimate could look like a utopian ideal in a few hundred years. I do disagree with Stork on one issue though – he claims that deforestation probably isn’t as bad as we make it out to be. I’d say the opposite (see here, here & here) – we know so little of how tropical forests in particular function that I dare say we’ve only just started measuring the tip of the iceberg.

CJA Bradshaw


This post was chosen as an Editor's Selection for ResearchBlogging.org

Stork, N. (2009). Re-assessing current extinction rates. Biodiversity and Conservation. DOI: 10.1007/s10531-009-9761-9





Not so ‘looming’ – Anthropocene extinctions

4 11 2009


© ABC 2009

Yesterday I was asked to do a quick interview on ABC television (Midday Report) about the release of the 2009 IUCN Red List of Threatened Species. I’ve blogged about the importance of the Red List before, but believe we have a lot more to do with species assessments and getting prioritisation right with respect to minimum viable population size. Have a listen to the interview itself, and read the IUCN’s media release reproduced below.

My basic stance is that we’ve only just started to assess the number of species on the planet (fewer than 50,000 so far), yet there are many millions of species still largely under-studied and/or under-described (e.g., extant species richness = > 4 million protists, 16,600 protozoa, 75,000-300,000 helminth parasites, 1.5 million fungi, 320,000 plants, 4-6 million arthropods, > 6,500 amphibians, 10,000 birds and > 5,000 mammals – see Bradshaw & Brook 2009 J Cosmol for references). What we’re looking at here is a refinement of knowledge (albeit a small one). We are indeed in the midst of the Anthropocene mass extinction event – there is nothing ‘looming’ about it. We are essentially losing species faster than we can assess them. I believe it’s important to make this clearer to those not working directly in the field of biodiversity conservation.

CJA Bradshaw

Extinction crisis continues apace – IUCN

Gland, Switzerland, 3 November, 2009 (IUCN) – The latest update of the IUCN Red List of Threatened Species™ shows that 17,291 species out of the 47,677 assessed species are threatened with extinction.

The results reveal that 21 percent of all known mammals, 30 percent of all known amphibians, 12 percent of all known birds, 28 percent of reptiles, 37 percent of freshwater fishes, 70 percent of plants and 35 percent of invertebrates assessed so far are under threat.

“The scientific evidence of a serious extinction crisis is mounting,” says Jane Smart, Director of IUCN’s Biodiversity Conservation Group. “January sees the launch of the International Year of Biodiversity. The latest analysis of the IUCN Red List shows the 2010 target to reduce biodiversity loss will not be met. It’s time for governments to start getting serious about saving species and make sure it’s high on their agendas for next year, as we’re rapidly running out of time.”

Of the world’s 5,490 mammals, 79 are Extinct or Extinct in the Wild, with 188 Critically Endangered, 449 Endangered and 505 Vulnerable. The Eastern Voalavo (Voalavo antsahabensis) appears on the IUCN Red List for the first time in the Endangered category. This rodent, endemic to Madagascar, is confined to montane tropical forest and is under threat from slash-and-burn farming.

There are now 1,677 reptiles on the IUCN Red List, with 293 added this year. In total, 469 are threatened with extinction and 22 are already Extinct or Extinct in the Wild. The 165 endemic Philippine species new to the IUCN Red List include the Panay Monitor Lizard (Varanus mabitang), which is Endangered. This highly-specialized monitor lizard is threatened by habitat loss due to agriculture and logging and is hunted by humans for food. The Sail-fin Water Lizard (Hydrosaurus pustulatus) enters in the Vulnerable category and is also threatened by habitat loss. Hatchlings are heavily collected both for the pet trade and for local consumption.

“The world’s reptiles are undoubtedly suffering, but the picture may be much worse than it currently looks,” says Simon Stuart, Chair of IUCN’s Species Survival Commission. “We need an assessment of all reptiles to understand the severity of the situation but we don’t have the $2-3 million to carry it out.”

The IUCN Red List shows that 1,895 of the planet’s 6,285 amphibians are in danger of extinction, making them the most threatened group of species known to date. Of these, 39 are already Extinct or Extinct in the Wild, 484 are Critically Endangered, 754 are Endangered and 657 are Vulnerable.

The Kihansi Spray Toad (Nectophrynoides asperginis) has moved from Critically Endangered to Extinct in the Wild. The species was only known from the Kihansi Falls in Tanzania, where it was formerly abundant with a population of at least 17,000. Its decline is due to the construction of a dam upstream of the Kihansi Falls that removed 90 percent of the original water flow to the gorge. The fungal disease chytridiomycosis was probably responsible for the toad’s final population crash.

The fungus also affected the Rabb’s Fringe-limbed Treefrog (Ecnomiohyla rabborum), which enters the Red List as Critically Endangered. It is known only from central Panama. In 2006, the chytrid fungus (Batrachochytrium dendrobatidis) was reported in its habitat and only a single male has been heard calling since. This species has been collected for captive breeding efforts but all attempts have so far failed.

Of the 12,151 plants on the IUCN Red List, 8,500 are threatened with extinction, with 114 already Extinct or Extinct in the Wild. The Queen of the Andes (Puya raimondii) has been reassessed and remains in the Endangered category. Found in the Andes of Peru and Bolivia, it only produces seeds once in 80 years before dying. Climate change may already be impairing its ability to flower and cattle roam freely among many colonies, trampling or eating young plants.

There are now 7,615 invertebrates on the IUCN Red List this year, 2,639 of which are threatened with extinction. Scientists added 1,360 dragonflies and damselflies, bringing the total to 1,989, of which 261 are threatened. The Giant Jewel (Chlorocypha centripunctata), classed as Vulnerable, is found in southeast Nigeria and southwest Cameroon and is threatened by forest destruction.

Scientists also added 94 molluscs, bringing the total number assessed to 2,306, of which 1,036 are threatened. Seven freshwater snails from Lake Dianchi in Yunnan Province, China, are new to the IUCN Red List and all are threatened. These join 13 freshwater fishes from the same area, 12 of which are threatened. The main threats are pollution, introduced fish species and overharvesting.

There are now 3,120 freshwater fishes on the IUCN Red List, up 510 species from last year. Although there is still a long way to go before the status of all the world’s freshwater fishes is known, 1,147 of those assessed so far are threatened with extinction. The Brown Mudfish (Neochanna apoda), found only in New Zealand, has been moved from Near Threatened to Vulnerable as it has disappeared from many areas in its range. Approximately 85-90 percent of New Zealand’s wetlands have been lost or degraded through drainage schemes, irrigation and land development.

“Creatures living in freshwater have long been neglected. This year we have again added a large number of them to the IUCN Red List and are confirming the high levels of threat to many freshwater animals and plants. This reflects the state of our precious water resources. There is now an urgency to pursue our effort but most importantly to start using this information to move towards a wise use of water resources,” says Jean-Christophe Vié, Deputy Head of the IUCN Species Programme.

“This year’s IUCN Red List makes for sobering reading,” says Craig Hilton-Taylor, Manager of the IUCN Red List Unit. “These results are just the tip of the iceberg. We have only managed to assess 47,663 species so far; there are many more millions out there which could be under serious threat. We do, however, know from experience that conservation action works so let’s not wait until it’s too late and start saving our species now.”

The status of the Australian Grayling (Prototroctes maraena), a freshwater fish, has improved as a result of conservation efforts. Now classed as Near Threatened as opposed to Vulnerable, the population has recovered thanks to fish ladders which have been constructed over dams to allow migration, enhanced riverside vegetation and the education of fishermen, who now face heavy penalties if found with this species.





Wobbling to extinction

31 08 2009

I’ve been meaning to highlight for a while a paper that I’m finding more and more pertinent as a citation in my own work. The general theme concerns estimating the extinction risk of a particular population, species (or even ecosystem), and more and more we’re finding that different drivers of population decline and eventual extinction often act synergistically to drive populations to that point of no return.

In other words, the whole is greater than the sum of its parts.

In other, other words, extinction risk is usually much higher than we generally appreciate.

This might seem at odds with my previous post about the tendency of the stochastic exponential growth model to over-estimate extinction risk using abundance time series, but it’s really more of a reflection of our under-appreciation of the complexity of the extinction process.

In the early days of ConservationBytes.com I highlighted a paper by Fagan & Holmes that described some of the few time series of population abundances right up until the point of extinction – the reason these datasets are so rare is because it gets bloody hard to find the last few individuals before extinction can be confirmed. Most recently, Melbourne & Hastings described in a paper entitled Extinction risk depends strongly on factors contributing to stochasticity published in Nature last year how an under-appreciated component of variation in abundance leads to under-estimation of extinction risk.

‘Demographic stochasticity’ is a fancy term for variation in the probability of births and deaths at the individual level. Basically, this means that all sorts of complicating factors move any individual in a population away from its expected (mean) probability of dying or reproducing. When averaged over a lot of individuals, it has generally been assumed that demographic stochasticity is washed out by other forms of variation in mean (population-level) birth and death probability resulting from vagaries of the environmental context (e.g., droughts, fires, floods, etc.).

‘No, no, no’, say Melbourne & Hastings. Using some relatively simple laboratory experiments in which environmental stochasticity was tightly controlled, they showed that demographic stochasticity dominated the overall variance and that environmental variation took a back seat. The upshot of all these experiments and mathematical models is that for most species of conservation concern (i.e., populations already reduced below their minimum viable population size), failing to factor in the appropriate measures of demographic wobble means that most people are under-estimating extinction risk.
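To see why demographic wobble matters so much at small sizes, here is an illustrative sketch (my own toy model, not Melbourne & Hastings’ experimental system or their mathematics): each individual independently reproduces or dies with fixed probabilities, so expected growth is exactly zero, yet small populations drift to extinction far more often than large ones.

```python
import random

def simulate(n0, p_birth=0.5, p_death=0.5, steps=50, rng=None):
    """One realisation of a population in which each individual
    independently gives birth with probability p_birth and dies with
    probability p_death per time step: demographic stochasticity
    only, no environmental variation. Returns the final size."""
    rng = rng or random.Random()
    n = n0
    for _ in range(steps):
        births = sum(rng.random() < p_birth for _ in range(n))
        deaths = sum(rng.random() < p_death for _ in range(n))
        n += births - deaths
        if n <= 0:
            return 0
    return n

def extinction_frequency(n0, trials=300, seed=1):
    """Proportion of replicate populations that hit zero."""
    rng = random.Random(seed)
    return sum(simulate(n0, rng=rng) == 0 for _ in range(trials)) / trials

# Expected growth is exactly zero in both cases, yet the small
# population wobbles over the line far more often than the large one.
small = extinction_frequency(8)
large = extinction_frequency(100)
```

The per-step variance in abundance scales with population size, but the absolute distance to zero shrinks much faster, which is why individual-level randomness alone can doom small populations.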

Bloody hell – we’ve been saying this for years; a few hundred individuals in any population is a ridiculous conservation target. People must instead focus on getting their favourite endangered species to number at least in the several thousands if the species is to have any hope of persisting (this is foreshadowing a paper we have coming out shortly in Biological Conservation; stay tuned for a post thereupon).

Melbourne & Hastings have done a grand job in reminding us how truly susceptible small populations are to wobbling over the line and disappearing forever.

CJA Bradshaw






Rare just tastes better

11 02 2009

I had written this a while ago for publication, but my timing was out and no one had room to publish it. So, I’m reproducing it here as an extension to a previous post (That looks rare – I’ll kill that one).

As the international market for luxury goods expands in value, extent and diversity of items (Nueno & Quelch 1998), the plight of the world’s burgeoning pool of already threatened species stands to worsen. Economic theory predicts that harvested species should eventually find refuge from over-exploitation because it simply becomes too costly to find the last remaining wild individuals (Koford & Tschoegl 1998). However, the self-reinforcing cycle of human greed (Brook & Sodhi 2006) can make rare species increasingly valuable to a few select consumers such that mounting financial incentives drive species to extinction (Courchamp et al. 2006). The economic and ecological arguments are compelling, but to date there has been little emphasis on how the phenomenon arises in the human thought process, nor how apparently irrational behaviour can persist. Gault and colleagues (2008) have addressed this gap in a paper published recently in Conservation Letters by examining consumer preferences for arguably one of the most stereotypical luxury food items, caviar from the 200-million-year-old sturgeon (Acipenser spp.).

Sturgeon (6 genera) populations worldwide are in trouble, with all but two of the 27 known species threatened with extinction (either Near Threatened, Vulnerable, Endangered or Critically Endangered) according to the International Union for Conservation of Nature and Natural Resources’ (IUCN) Red List of Threatened Species. Despite all 27 species also having strict international trade restrictions imposed by the Convention on International Trade in Endangered Species (CITES) (Gault et al. 2008), intense commercial pressure persists for 15 of these at an estimated global value exceeding US$200 million annually (Pikitch et al. 2005). The very existence of the industry itself and the luxury good it produces are therefore, at least for some regions, unlikely to endure over the next decade (Pala 2007). What drives such irrational behaviour and why can we not seem to prevent such coveted species from spiralling down the extinction vortex?

Gault and colleagues addressed this question specifically in an elegantly simple set of preference experiments targeting the very end-consumers of the caviar production line – French connoisseurs. Some particularly remarkable results were derived from presentations of identical caviar; 86 % of attendees of luxury receptions not only preferred falsely labelled ‘rarer’ Siberian caviar (A. baeri) in blind tasting experiments, but also scored what they believed was caviar from the rarer species as having a higher ‘gustative quality’. These high-brow results were compared with those from more modest consumers in French supermarkets, with similar conclusions. Not only were unsuspecting gourmands fooled by the experimental propaganda, subjects in both cases stated a preference for the seemingly rarer caviar even prior to tasting.

The psycho-sociological implications of perceived rarity are disturbing in themselves; but Gault and colleagues extended their results with a mathematical game theory model demonstrating how irrational choices drive just such a harvested species to extinction. When the irrationality of perceived rarity is taken into consideration, the economic implications of attempting to curb exploitation as species become rarer are telling – there is no payoff in delaying exploitation as more and more consumers become capable of entering the market. In other words, the assumption that consumers apply a positive temporal discount rate to their payoff (Olson & Bailey 1981) is wrong, with the demographic corollary that total depletion of the resource ensues. The authors contend that such artificial value may drive the entire luxury goods market, based mainly on the self-consciousness and social status of consumers able to afford these symbols of affluence.
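The economic logic of why perceived rarity removes the classical refuge from over-exploitation can be sketched in a few lines (toy numbers and a toy pricing rule of my own, not Gault and colleagues’ game theory model):

```python
def profit_per_unit(n, price=1.0, search_cost=100.0):
    """With a fixed price, the cost of finding one individual
    (search_cost / n) rises as the stock n shrinks and eventually
    exceeds the price, so harvest stops and the species gains an
    economic refuge. If consumers instead pay a rarity premium that
    grows as the stock shrinks, profit never turns negative and
    exploitation can continue to the last individual."""
    cost = search_cost / n
    fixed_price = price - cost                       # refuge once n < search_cost / price
    rarity_price = (price + search_cost / n) - cost  # premium offsets the rising cost
    return fixed_price, rarity_price
```

Under the fixed price, harvesting a stock of 50 individuals loses money and the harvester quits; under rarity pricing it remains profitable at any stock size, which is the extinction-vortex outcome the authors describe.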

The poor record of species over-exploitation by humans arising from the Tragedy of the Commons (Hardin 1968) is compounded by this new information. This anthropogenic Allee effect (Courchamp et al. 2006) provides a novel example of a mechanism by which small populations are driven ever downward, because low densities ensure declining fitness. Many species may follow the same general rules, from bluefin tuna, Napoleon wrasse lips and shark fins, to reptile skins and Tibetan antelope woollen shawls. Gault and colleagues warn that as the human population continues to expand and more people enter the luxury-goods market, more wildlife species will succumb to this Allee effect-driven extinction vortex.

The authors suggest that a combination of consumer education and the encouragement of farmed substitute caviar will be more effective than potentially counter-productive trading bans that ultimately encourage illegal trade. However, the preference results suggest that education might not promote positive action, given the reluctance of affluent consumers to self-limit. I believe that the way forward instead requires a combination of international trade bans, certification schemes for ‘sustainable’ goods that flood markets to increase supply and reduce price, better controls on point-of-origin labelling, and even state-controlled ‘warning’ systems to alert prospective consumers that they are enhancing the extinction risk of the very products they enjoy. A better architecture for trading schemes and market systems that embrace long-term persistence can surely counteract the irrationality of the human-induced destruction of global ecosystem services. We just need to put our minds and pocketbooks to the task.

CJA Bradshaw






Classics: the Allee effect

22 12 2008

As humanity plunders its only home and continues destroying the very life that sustains our ‘success’, certain concepts in ecology, evolution and conservation biology are being examined in greater detail in an attempt to apply them to restoring at least some elements of our ravaged biodiversity.

One of these concepts has been largely overlooked in the last 30 years, but is making a conceptual comeback as the processes of extinction become better quantified. The so-called Allee effect can be broadly defined as a “…positive relationship between any component of individual fitness and either numbers or density of conspecifics” (Stephens et al. 1999, Oikos 87:185-190) and is attributed to Warder Clyde Allee, an American ecologist from the first half of the 20th century, although he himself did not coin the term. Odum referred to it as “Allee’s principle”, and over time, the concept morphed into what we now generally call ‘Allee effects’.

Nonetheless, I’m using Allee’s original 1931 book Animal Aggregations: A Study in General Sociology (University of Chicago Press) as the Classics citation here. In his book, Allee discussed the evidence for the effects of crowding on demographic and life history traits of populations, which he subsequently redefined as “inverse density dependence” (Allee 1941, American Naturalist 75:473-487).

What does all this have to do with conservation biology? Well, broadly speaking, when populations become small, many different processes may operate to make an individual’s average ‘fitness’ (measured in many ways, such as survival probability, reproductive rate, growth rate, et cetera) decline. The many and varied types of Allee effects can work together to drive populations even faster toward extinction than expected by chance alone because of self-reinforcing feedbacks (see also previous post on the small population paradigm). Thus, ignorance of potential Allee effects can bias everything from estimates of minimum viable population size and restoration attempts to predictions of extinction risk.

A recent paper in the journal Trends in Ecology and Evolution by Berec and colleagues entitled Multiple Allee effects and population management gives a more specific breakdown of Allee effects in a series of definitions I reproduce here for your convenience:

Allee threshold: critical population size or density below which the per capita population growth rate becomes negative.

Anthropogenic Allee effect: mechanism relying on human activity, by which exploitation rates increase with decreasing population size or density: values associated with rarity of the exploited species exceed the costs of exploitation at small population sizes or low densities (see related post).

Component Allee effect: positive relationship between any measurable component of individual fitness and population size or density.

Demographic Allee effect: positive relationship between total individual fitness, usually quantified by the per capita population growth rate, and population size or density.

Dormant Allee effect: component Allee effect that either does not result in a demographic Allee effect or results in a weak Allee effect and which, if interacting with a strong Allee effect, causes the overall Allee threshold to be higher than the Allee threshold of the strong Allee effect alone.

Double dormancy: two component Allee effects, neither of which singly result in a demographic Allee effect, or result only in a weak Allee effect, which jointly produce an Allee threshold (i.e. the double Allee effect becomes strong).

Genetic Allee effect: genetic-level mechanism resulting in a positive relationship between any measurable fitness component and population size or density.

Human-induced Allee effect: any component Allee effect induced by a human activity.

Multiple Allee effects: any situation in which two or more component Allee effects work simultaneously in the same population.

Nonadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold greater or smaller than the algebraic sum of Allee thresholds owing to single Allee effects.

Predation-driven Allee effect: a general term for any component Allee effect in survival caused by one or multiple predators whereby the per capita predation-driven mortality rate of prey increases as prey numbers or density decline.

Strong Allee effect: demographic Allee effect with an Allee threshold.

Subadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold smaller than the algebraic sum of Allee thresholds owing to single Allee effects.

Superadditive Allee effects: multiple Allee effects that give rise to a demographic Allee effect with an Allee threshold greater than the algebraic sum of Allee thresholds owing to single Allee effects.

Weak Allee effect: demographic Allee effect without an Allee threshold.
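The strong/weak distinction above can be made concrete with the textbook cubic growth model (an illustrative sketch of mine, not from Berec and colleagues’ paper):

```python
def per_capita_growth(n, r=0.1, K=1000.0, A=50.0):
    """Per-capita growth rate under the textbook cubic model of a
    strong demographic Allee effect, dN/dt = r*N*(1 - N/K)*(N/A - 1):
    negative below the Allee threshold A, positive between A and the
    carrying capacity K, and exactly zero at the threshold itself."""
    return r * (1.0 - n / K) * (n / A - 1.0)

# Below the threshold the population declines toward extinction even
# though it sits well under carrying capacity:
declining = per_capita_growth(30)   # n < A, so rate < 0
growing = per_capita_growth(300)    # A < n < K, so rate > 0
```

A weak Allee effect would correspond to a growth rate that is depressed at low density but never dips below zero, i.e., no threshold A at which the sign flips.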

For even more detail, I suggest you obtain the 2008 book by Courchamp and colleagues entitled Allee Effects in Ecology and Conservation (Oxford University Press).

CJA Bradshaw


(Many thanks to Salvador Herrando-Pérez for his insight on terminology)





That looks rare – I’ll kill that one

12 12 2008

Here’s an interesting (and disturbing) one from Conservation Letters by Gault and colleagues entitled Consumers’ taste for rarity drives sturgeons to extinction.

I like caviar, I have to admit. I enjoy the salty fishy-ness and the contrast it makes with the appropriate selection of wine (bubbly or otherwise). I guess a lot of other people like it too, to the extent that worldwide sturgeon populations have been hammered (all 27 species are listed in CITES Appendix I or II, and 15 species are still heavily exploited). Indeed, in the Caspian Sea, from where 90 % of caviar comes, sturgeon populations have declined by 90 % since the late 1980s. Admittedly, I haven’t had sturgeon caviar very often, and I doubt I’ll ever eat it again.

Using a set of simple ‘preference’ experiments on epicurean (French) human subjects, Gault and colleagues found that when told that a particular type of caviar was rarer than the other (when in reality the two choices were identical), these refined gourmets generally tended to claim that the rarer one tasted better.

This means that humans have a tendency to place exaggerated value on harvested species when they think they’re rare (in most instances, rarity is itself the result of over-exploitation by humans). This so-called ‘anthropogenic Allee effect‘ (see Courchamp et al. 2006) basically means that at least for the wildlife-based luxury market, there’s little chance that calls for reduced harvest will be heard because people continually adjust their willingness to pay more. This turns into a spiralling extinction vortex for the species concerned.

What to do? Ban all trade of caviar? This might do it, but with the reluctance to reduce highly profitable industries like this (see previous post on tuna over-exploitation here), there’s a strong incentive even for the harvesters to drive themselves out of a job. Consumer education (and a good dose of guilt) might help too, but I have my doubts.

CJA Bradshaw

© S. Crownover courtesy of Caviar Emptor







Classics: The Living Dead

30 08 2008

‘Classics’ is a category of posts highlighting research that has made a real difference to biodiversity conservation. All posts in this category will be permanently displayed on the Classics page of ConservationBytes.com

Tilman, D., May, R.M., Lehman, C.L., Nowak, M.A. (1994) Habitat destruction and the extinction debt. Nature 371, 65-66

In my opinion, this is truly a conservation classic because it shatters optimistic notions that extinction is only rarely the consequence of human activities (see relevant post here). The concept of ‘extinction debt‘ is pretty simple – as habitats become increasingly fragmented, long-lived species that are reproductively isolated from conspecifics may take generations to die off (e.g., large trees in forest fragments). This gives rise to a higher number of species than would otherwise be expected for the size of the fragment, and the false impression that many species can persist in habitat patches that are too small to sustain minimum viable populations.

These ‘living dead‘ or ‘zombie‘ species are therefore committed to extinction regardless of whether habitat loss is arrested or reversed. Only by assisted dispersal and/or reproduction can such species survive (an extremely rare event).

Why has this been important? Well, neglecting the extinction debt is one reason why some people have over-estimated the value of fragmented and secondary forests in guarding species against extinction (see relevant example here for the tropics and Brook et al. 2006). It basically means that biological communities are much less resilient to fragmentation than would otherwise be expected given data on species presence collected shortly after the main habitat degradation or destruction event. The full extent of expected extinctions may take generations (e.g., hundreds of years) to come to light, giving us yet another tool in the quest to minimise habitat loss and fragmentation.

CJA Bradshaw






The extinction vortex

25 08 2008

One for the Potential list:

First coined by Gilpin & Soulé in 1986, the extinction vortex is the term used to describe the process that declining populations undergo when “a mutual reinforcement occurs among biotic and abiotic processes that drives population size downward to extinction” (Brook, Sodhi & Bradshaw 2008).

Although several types of ‘vortices’ were labelled by Gilpin & Soulé, the concept was subsequently simplified by Caughley (1994) in his famous paper on the declining and small population paradigms, but only truly quantified for the first time by Fagan & Holmes (2006) in their Ecology Letters paper entitled Quantifying the extinction vortex.

Fagan and Holmes compiled a small time-series database of ten vertebrate species (two mammals, five birds, two reptiles and a fish) whose final extinction was witnessed via monitoring. They confirmed that the time to extinction scales with the logarithm of population size. In other words, as populations decline, the time elapsing before extinction occurs becomes rapidly smaller and smaller. They also found greater rates of population decline nearer to the time of extinction than earlier in the population’s history, confirming the expectation that genetic deterioration contributes to a general corrosion of individual performance (fitness). Finally, they found that variability in abundance was highest as populations approached extinction, irrespective of population size, thus demonstrating indirectly that random environmental fluctuations take over to cause the final extinction regardless of what caused the population to decline in the first place.
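What logarithmic scaling of extinction time implies is easiest to see in a deterministic caricature (my own sketch, not Fagan & Holmes’ stochastic analysis): a population declining at a constant per-capita rate buys only a fixed increment of time for every ten-fold increase in starting size.

```python
import math

def time_to_one(n0, r=-0.1):
    """A population declining at constant per-capita rate r < 0
    follows N(t) = n0 * exp(r * t) and reaches a single individual
    at t = ln(n0) / |r|, so time to extinction grows only with the
    logarithm of initial population size."""
    return math.log(n0) / abs(r)

# A ten-fold head start in numbers buys the same fixed increment of
# time no matter how large the population already is:
gain_small = time_to_one(1000) - time_to_one(100)
gain_large = time_to_one(100000) - time_to_one(10000)
```

In other words, even very large populations do not buy proportionately more time once a steady decline sets in, which is why early intervention matters so much.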

What does this mean for conservation efforts? It was fundamentally the first empirical demonstration that extinction proneness accelerates as populations decline, meaning that all attempts must be made to ensure large population sizes if there is any chance of maintaining long-term persistence. This relates to the minimum viable population size concept that should underscore each and every recovery target set for any population in trouble or under conservation scrutiny.

CJA Bradshaw
