It’s not all about cats

20 10 2014

If you follow any of the environment news in Australia, you will most certainly have seen a lot about feral cats in the last few weeks. I’ve come across dozens of articles in the last week alone talking about the horrendous toll feral cats have had on Australian wildlife since European arrival. In principle, this is a good thing because finally Australians are groggily waking to the fact that our house moggies and their descendants have royally buggered our biodiversity. As a result, we have the highest mammal extinction rate of any country.

But I argue that the newfound enthusiasm for killing anything feline is being peddled mainly as a distraction from bigger environmental issues and to camouflage the complete incompetence of the current government and their all-out war on the environment.

Call me cynical, but when I read headlines like “Australia aims to end extinction of native wildlife by 2020” and Environment Minister Hunt’s recent speech claiming that he has “… set a goal of ending the loss of mammal species by 2020”, I get more than just a little sick to the stomach.

What a preposterous load of shite. Moreover, what a blatant wool-pulling-over-the-eyes public stunt.

Biodiversity Hotspots have nearly burnt out

10 07 2014

I recently came across a really important paper that might have flown under the radar for many people. For this reason, I’m highlighting it here and will soon write up an F1000 Recommendation. This is news that needs to be heard, understood and appreciated by conservation scientists and environmental policy makers everywhere.

Sean Sloan and colleagues (including conservation guru, Bill Laurance) have just published a paper entitled Remaining natural vegetation in the global biodiversity hotspots in Biological Conservation, and in it we are presented with some rather depressing and utterly sobering data.

Unless you’ve been living under a rock for the past 20 years, you’ll have at least heard of the global Biodiversity Hotspots (you can even download GIS layers for them here). From an initial 10, the list grew to 25, and then to 34; most recently, with the addition of the Forests of East Australia, we now have 35 Biodiversity Hotspots across the globe. The idea behind these is to focus conservation attention, investment and intervention in the areas with the most unique species assemblages that are simultaneously experiencing the most human-caused disturbances.

Indeed, today’s 35 Biodiversity Hotspots include 77 % of all mammal, bird, reptile and amphibian species (holy shit!). They also harbour about half of all plant species, and 42 % of endemic (not found anywhere else) terrestrial vertebrates. They also have the dubious honour of hosting 75 % of all endangered terrestrial vertebrates (holy, holy shit!). Interestingly, it’s not just amazing biological diversity that typifies the Hotspots – human cultural diversity is also high within them, with about half of the world’s indigenous languages found therein.

Of course, to qualify as a Biodiversity Hotspot, an area needs to be under threat – and under threat they are. There are now over 2 billion people living within Biodiversity Hotspots, so it comes as no surprise that about 85 % of their area is modified by humans in some way.

A key component of the original delimitation of the Hotspots was the amount of ‘natural intact vegetation’ (mainly undisturbed by humans) within an area. While revolutionary 30 years ago, these estimates were based to a large extent on expert opinions, undocumented assessments and poor satellite data. Other independent estimates have been applied to the Hotspots to estimate their natural intact vegetation, but these have rarely been made specifically for Hotspots, and they have tended to discount non-forest or open-forest vegetation formations (e.g., savannas & shrublands).

So with horribly out-of-date vegetation assessments fraught with error and uncertainty, Sloan and colleagues set out to estimate what’s really going on vegetation-wise in the world’s 35 Biodiversity Hotspots. What they found is frightening, to say the least.


50/500 or 100/1000 debate not about time frame

26 06 2014

As you might recall, Dick Frankham, Barry Brook and I recently wrote a review in Biological Conservation challenging the status quo regarding the famous 50/500 ‘rule’ in conservation management (effective population size [Ne] = 50 to avoid inbreeding depression in the short term, and Ne = 500 to retain the ability to evolve in perpetuity). Well, it inevitably led to some comments arising in the same journal, but we were only permitted by Biological Conservation to respond to one of them. In our opinion, the other comment was just as problematic, and only further muddied the waters, so it too required a response. In a first for me, we have therefore decided to publish our response on the arXiv pre-print server as well as here on the blog:

50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld

cite as: Frankham R, Bradshaw CJA, Brook BW. 2014. 50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld. arXiv: 1406.6424 [q-bio.PE] 25 June 2014.

The Letter from Rosenfeld (2014) in response to Jamieson and Allendorf (2012) and Frankham et al. (2014) and related papers is misleading in places and requires clarification and correction, as follows.

If biodiversity is so important, why is Europe not languishing?

17 03 2014

I don’t often respond to many comments on this blog unless they are really, really good questions (and if I think I have the answers). Even rarer is devoting an entire post to answering a question. The other day, I received a real cracker, and so I think it deserves a highlighted response.

Two days ago, a certain ‘P. Basu’ asked this in response to my last blog post (Lose biodiversity and you’ll get sick):

I am an Indian who lived in Germany for quite a long period. Now, if I am not grossly mistaken, once upon a time Germany and other west european countries had large tracts of “real” forests with bears, wolves, foxes and other animals (both carnivore and herbivore). Bear has completely disappeared from these countries with the advent of industrialization. A few wolves have been kept in more or less artificially created forests. Foxes, deer and hares, fortunately, do still exist. My question is, how come these countries are still so well off – not only from the point of view of economy but also from the angle of public health despite the loss of large tracts of natural forests? Or is it that modern science and a health conscious society can compensate the loss of biodiversity.

“Well”, I thought to myself, “Bloody good question”.

I have come across this genre of question before, but usually under more hostile circumstances when an overtly right-wing respondent (hell, let’s call a spade a spade – a ‘completely selfish arsehole’) has challenged me on the ‘value of nature’ logic (I’m not for a moment suggesting that P. Basu is this sort of person; on the contrary, he politely asked an extremely important question that requires an answer). The comeback generally goes something like this: “If biodiversity is so important, why aren’t super-developed countries wallowing in economic and social ruin because they’ve degraded their own life-support systems? Clearly you must be wrong, Sir.”

There have been discussions in the ecological and sustainability literature that have attempted to answer this, but I’ll give it a shot here for the benefit of readers.

Biowealth: all creatures great and small

4 12 2013

“So consider the crocodiles, sharks and snakes, the small and the squirmy, the smelly, slimy and scaly. Consider the fanged and the hairy, the ugly and the cute alike. The more we degrade this astonishing diversity of evolved life and all its interactions on our only home, the more we expose ourselves to the ravages of a universe that is inherently hostile to life.”

– excerpt from ‘Biowealth: all creatures great and small’, in The Curious Country (C.J.A. Bradshaw 2013).

I’ve spent the last few days on the east coast with my science partner-in-crime, Barry Brook, and one of our newest research associates (Marta Rodrigues-Rey Gomez). We first flew into Sydney at sparrow’s on Monday, then drove a hire car down to The ‘Gong to follow up on some Australian megafauna databasing & writing with Bert Roberts & Zenobia Jacobs. On Tuesday morning we then flitted over to Canberra where we had the opportunity to attend the official launch of a new book that Barry and I had co-authored.

The book, The Curious Country, is an interesting experiment in science communication and teaching dreamed up by Australia’s Chief Scientist, Professor Ian Chubb. Realising that the average Aussie has quite a few questions about ‘how stuff works’, but has little idea how to answer those questions, Ian engaged former Quantum star and science editor, Leigh Dayton, to put together a short, punchy, topical and easily understood book about why science is good for the country.

Yes, intuitive for most of you out there reading this, but science appreciation isn’t always as high as it should be amongst the so-called ‘general public’. Ian thought this might be one way to get more people engaged.

When honoured with the request to write an interesting chapter on biodiversity for the book, I naturally accepted. It turns out Barry was asked to do one on energy provision at the same time (though neither of us knew the other had been asked). Our former lab head, Professor David Bowman, was also asked to write a chapter about fire risk, so it was like a mini-reunion yesterday for the three of us.


Quantity, but not quality – slow recovery of disturbed tropical forests

8 11 2013

It is a sobering statistic that most of the world’s tropical forests are not ‘primary’ – that is, forests that have not suffered some alteration or disturbance from humans (previously logged, cleared for agriculture, burned, etc.).

Today I highlight a really cool paper that confirms this, plus adds some juicy (and disturbing – pun intended) detail. The paper by Phil Martin and colleagues just published in Proceedings of the Royal Society B came to my attention through various channels – not least of which was their citation of one of our previous papers ;-), as well as a blog post by Phil himself. I was so impressed with it that I made my first Faculty of 1000 Prime recommendation of the paper (which should appear shortly).

As we did in 2011 (to which Phil refers as our “soon-to-be-classic work” – thanks!), Martin and colleagues amassed a stunning number of papers investigating the species composition of disturbed and primary forests from around the tropics. Using meta-analysis, they matched disturbed and undisturbed sites and compared a series of biodiversity statistics between them.

Too small to avoid catastrophic biodiversity meltdown

27 09 2013
Chiew Larn

Chiew Larn Reservoir is surrounded by Khlong Saeng Wildlife Sanctuary and Khao Sok National Park, which together make up part of the largest block of rainforest habitat in southern Thailand (> 3500 km2). Photo: Antony Lynam

One of the perennial and probably most controversial topics in conservation ecology is deciding when something is ‘too small’. By ‘something’ I mean many things, including population abundance and patch size. We’ve certainly written about the former on many occasions (see here, here, here and here for our work on minimum viable population size), with the associated controversy it elicited.

Now I (sadly) report on the tragedy of the second issue – when is a habitat fragment too small to be of much good to biodiversity?

Published today in the journal Science, Luke Gibson (of No substitute for primary forest fame) and a group of us report disturbing results about the ecological meltdown that has occurred on islands created when the Chiew Larn Reservoir of southern Thailand was flooded nearly 30 years ago by a hydroelectric dam.

As is the case in many parts of the world (e.g., Three Gorges Dam, China), hydroelectric dams can cause major ecological problems merely by flooding vast areas. In the case of Chiew Larn, co-author Tony Lynam of Wildlife Conservation Society passed along to me a bit of poignant and emotive history about the local struggle to prevent the disaster.

“As the waters behind the dam were rising in 1987, Seub Nakasathien, the Superintendent of the Khlong Saeng Wildlife Sanctuary, his staff and conservationist friends, mounted an operation to capture and release animals that were caught in the flood waters.

It turned out to be a distressing experience for all involved, as you can see from the clips here, with the rescuers having only nets and longtail boats, and many animals dying. Ultimately most of the larger mammals disappeared quickly from the islands, leaving just the smaller fauna.

Later Seub moved to Huai Kha Khaeng Wildlife Sanctuary and fought an unsuccessful battle with poachers and loggers, which ended in him taking his own life in despair in 1990. A sad story, and his friend, a famous folk singer called Aed Carabao, wrote a song about Seub, the music of which plays in the video.”

Biogeography comes of age

22 08 2013

This week has been all about biogeography for me. While I wouldn’t call myself a ‘biogeographer’, I certainly do apply a lot of the discipline’s techniques.

This week I’m attending the 2013 joint Congress of the International Association for Ecology (INTECOL) and the British Ecological Society in London, and I have purposefully sought out more of the biogeographical talks than pretty much anything else because the speakers were engaging and the topics fascinating. As it happens, even my own presentation had a strong biogeographical flavour this year.

Although the species-area relationship (SAR) is only one small aspect of biogeography, I’ve been slightly amazed that, more than 50 years since MacArthur & Wilson’s famous book, our discipline is still obsessed with the SAR.

I’ve blogged about SAR issues before – what makes it so engaging and controversial is that the SAR is the principal tool used to estimate overall extinction rates, even though it is perhaps one of the bluntest tools in the ecological toolbox. I suppose its popularity stems from its superficial simplicity – as the area of a (classically oceanic) island increases, so too does the total number of species it can hold. The controversies surrounding such a basic relationship centre on describing the rate at which species richness increases with area – in other words, just how nonlinear the SAR itself is.

Even a cursory understanding of maths reveals the importance of estimating this curve correctly. As the area of an ‘island’ (habitat fragment) decreases due to human disturbance, estimating how many species end up going extinct as a result depends entirely on the shape of the SAR. Get the SAR wrong, and you can over- or under-estimate the extinction rate. This was the crux of the palaver over Fangliang He (not attending INTECOL) & Stephen Hubbell’s (attending INTECOL) paper in Nature in 2011.
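For the quantitatively inclined, here’s a minimal sketch (my own toy numbers, assuming the classical power-law SAR, S = cA^z) of just how sensitive a back-cast extinction estimate is to the exponent z:

```python
# Back-cast SAR sketch (invented numbers only). Under the power-law SAR,
# S = c * A**z, the fraction of species expected to survive when habitat
# shrinks from A0 to A1 is (A1/A0)**z, so the estimated extinction toll
# hinges entirely on the exponent z.

def fraction_lost(area_remaining: float, z: float) -> float:
    """Estimated fraction of species lost when `area_remaining`
    (proportion of original habitat, 0-1) is all that is left,
    under a power-law SAR with exponent z."""
    return 1.0 - area_remaining ** z

if __name__ == "__main__":
    remaining = 0.5  # half the habitat destroyed
    for z in (0.15, 0.25, 0.35):  # a plausible range of published z values
        print(f"z = {z:.2f}: ~{100 * fraction_lost(remaining, z):.0f}% of species estimated lost")
```

With half the habitat gone, the estimate ranges from roughly 10 % to over 20 % of species lost across that range of z – which is precisely why getting the curve’s shape wrong matters so much.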

The first real engagement with the SAR came in John Harte’s maximum entropy talk in the process macroecology session on Tuesday. What was notable to me was his adamant claim that the power-law form of the SAR should never be used, despite its commonness in the literature. I took this with a grain of salt because I know all about how messy area-richness data can be, and why one needs to consider alternative models (see an example here). But then yesterday I listened to one of the greats of biogeography – Robert Whittaker – who said pretty much the complete opposite of Harte’s contention. Whittaker showed results from one of his papers from last year indicating that the power law was in fact the most commonly supported SAR among many datasets (granted, there was substantial variability in overall model performance). My conclusion remains firm – make sure you fit multiple models to each individual dataset and try to infer the SAR from model-averaging.
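To illustrate that last recommendation, here’s a toy sketch (fabricated data, and just two of the many candidate SAR models) showing that both the power law and the semilog (Gleason-type) model reduce to simple linear regressions, so fitting several candidates costs almost nothing:

```python
import math

# Toy SAR model comparison (fabricated data). Two common forms both
# reduce to simple linear regressions:
#   power law: S = c * A**z      ->  log S linear in log A
#   semilog:   S = c + z * ln A  ->  S linear in ln A

def linfit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

AREAS = [1, 10, 100, 1000, 10000]             # island areas, arbitrary units
RICHNESS = [12.0 * a ** 0.25 for a in AREAS]  # invented, exactly power-law

if __name__ == "__main__":
    log_a = [math.log(a) for a in AREAS]
    _, z_pow, sse_pow = linfit(log_a, [math.log(s) for s in RICHNESS])
    _, z_semi, sse_semi = linfit(log_a, RICHNESS)
    print(f"power law: z = {z_pow:.3f} (residual SS in log S: {sse_pow:.1e})")
    print(f"semilog:   slope = {z_semi:.2f} (residual SS in S: {sse_semi:.1f})")
```

Because the fabricated data are exactly power-law, the log-log regression recovers z = 0.25 while the semilog model cannot fit the curvature; with real data you would compare candidates on a common likelihood (e.g., information criteria on the same response) and model-average, as above.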

No-extinction targets are destined to fail

21 09 2012

I’ve been meaning to write about this for a while, and now finally I have been given the opportunity to put my ideas ‘down on paper’ (seems like a bit of an old-fashioned expression these days). Now this post might strike some as overly parochial because it concerns the state in which I live, but the concept applies to every jurisdiction that passes laws designed to protect biodiversity. So please look beyond my navel and place the example within your own specific context.

As CB readers will appreciate, I am firmly in support of the application of conservation triage – that is, the intelligent, objective and realistic way of attributing finite resources to minimise extinctions for the greatest number of (‘important’) species. Note that deciding which species are ‘important’ is the only fly in the unguent here, with ‘importance’ being defined inter alia as having a large range (to encompass many other species simultaneously), having an important ecological function or ecosystem service, representing rare genotypes, or being iconic (such that people become interested in investing to offset extinction).

But without getting into the specifics of triage per se, a related issue is how we set environmental policy targets. While it’s a lovely, utopian pipe dream that our consumptive 7-billion-and-growing human population will somehow retract its massive ecological footprint and be able to save all species from extinction, we all know that this is irrevocably fantastical.

So when legislation is passed that is clearly unattainable, why do we accept it as realistic? My case in point is South Australia’s ‘No Species Loss Strategy‘ (you can download the entire 7.3 Mb document here) that aims to

“…lose no more species in South Australia, whether they be on land, in rivers, creeks, lakes and estuaries or in the sea.”

When I first learned of the Strategy, I instantly thought to myself that while the aims are laudable, and many of the actions proposed are good ones, the entire policy is rendered toothless by the small issue of being impossible.

Conservation catastrophes

22 02 2012

David Reed

The title of this post serves two functions: (1) to introduce the concept of ecological catastrophes in population viability modelling, and (2) to acknowledge the passing of the bloke who came up with a clever way of dealing with that uncertainty.

I’ll start with the latter. It came to my attention late last year that a fellow conservation biologist, Dr. David Reed, died unexpectedly from congestive heart failure. I did not really mourn his passing, for I had never met him in person (I believe it is disingenuous, discourteous, and slightly egocentric to mourn someone whom you do not really know personally – but that’s just my opinion), but I did think at the time that the conservation community had lost another clever progenitor of good conservation science. As many CB readers already know, we lost a great conservation thinker and doer last year, Professor Navjot Sodhi (and that, I did take personally). Coincidentally, both Navjot and David died at about the same age (49 and 48, respectively). I hope that being in one’s late 40s isn’t a particularly ominous time for people in my line of business!

My friend, colleague and lab co-director, Professor Barry Brook, did, however, work a little with David, and together they published some pretty cool stuff. David was particularly good at looking for cross-taxa generalities in conservation phenomena, such as minimum viable population sizes, effects of inbreeding depression, applications of population viability analysis and extinction risk. But more on some of that below.

Not magic, but necessary

18 10 2011

In April this year, some American colleagues of ours wrote a rather detailed, 10-page article in Trends in Ecology and Evolution attacking our concept of generalising minimum viable population (MVP) size estimates among species. Steve Beissinger of the University of California at Berkeley, one of the paper’s co-authors, has been a particularly vocal adversary of some of the applications of population viability analysis and its child, MVP size, for many years. While their review raised some interesting points, their arguments largely lacked any real punch, and they essentially ended up agreeing with us.

Let me explain. Today, our response to that critique was published online in the same journal: Minimum viable population size: not magic, but necessary. I want to take some time here to summarise the main points of contention and our rebuttal.

But first, let’s recap what we have been arguing all along in several papers over the last few years (i.e., Brook et al. 2006; Traill et al. 2007, 2010; Clements et al. 2011) – a minimum viable population size is the point at which a declining population becomes a small population (sensu Caughley 1994). In other words, it’s the point at which a population becomes susceptible to random (stochastic) events that wouldn’t much matter for a large population.

Consider the great auk (Pinguinus impennis), a formerly widespread and abundant North Atlantic species that was reduced by intensive hunting throughout its range. How did it eventually go extinct? The last remaining population blew up in a volcanic explosion off the coast of Iceland (Halliday 1978). Had the population been large, the small dent in the population due to the loss of those individuals would have been irrelevant.

But what is ‘large’? The empirical evidence, as we’ve pointed out time and time again, is that large = thousands, not hundreds, of individuals.
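To see why sheer numbers buffer a population against bad luck, here’s a toy branching-process simulation (entirely my own illustration with invented parameters, and including demographic stochasticity only – environmental variation and catastrophes like the auk’s volcano push the real threshold far higher):

```python
import random

# Toy branching-process simulation (invented parameters). Each year every
# individual survives with probability 0.8 and independently produces one
# offspring with probability 0.2, so the population neither grows nor
# shrinks on average. Only demographic stochasticity is modelled; real
# MVP estimates also include environmental variation and catastrophes,
# which raise the viability threshold considerably.

def is_extinct(n0: int, years: int, rng: random.Random) -> bool:
    """Simulate one population; True if it hits zero within `years`."""
    n = n0
    for _ in range(years):
        survivors = sum(rng.random() < 0.8 for _ in range(n))
        births = sum(rng.random() < 0.2 for _ in range(n))
        n = survivors + births
        if n == 0:
            return True
    return False

def extinction_probability(n0: int, years: int = 50, trials: int = 300) -> float:
    rng = random.Random(0)  # fixed seed so the sketch is repeatable
    return sum(is_extinct(n0, years, rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    for n0 in (5, 50):
        print(f"N0 = {n0:2d}: P(extinct within 50 years) ≈ {extinction_probability(n0):.2f}")
```

Even in this gentle scenario, a handful of individuals faces roughly a coin-flip chance of vanishing within 50 years, while a population an order of magnitude larger almost never does.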

So this is why we advocate that conservation targets should aim to keep populations at, or recover them to, the thousands mark. Less than that, and you’re playing Russian roulette with a species’ existence.

Little left to lose: deforestation history of Australia

6 10 2011

© donkeycart

I don’t usually do this, but I’m going to blog about a paper I’ve just had accepted in the Journal of Plant Ecology that isn’t yet out online. The reason for the early post is that the paper itself won’t appear until 2012 in a special issue of the journal, and I think the information needs to get out there.

First, a little history – In May this year I blogged about a workshop that I attended at Sun Yat-Sen University in Guangzhou, China at the behest of Fangliang He. The workshop (International Symposium for Biodiversity and Theoretical Ecology) was attended by big-wig overseas ecologists and local talent, and was not only informative, but a lot of fun (apart from the slight headache on the way home from a little too much báijiǔ the night before). More importantly, we lǎo wài (老外) were paired with various students to assist with publications in progress, and I’m happy to say that for me, two of those have already borne fruit (one paper in review, another about to be submitted).

But the real reason for this post was the special issue of papers written by the invitees – I haven’t published in the journal before, and understand that it is a Chinese journal that has gone mainstream internationally now. I’m only too happy to contribute to lifting its profile.

Given that I’m not a plant ecologist per se (although I’ve dabbled), I decided to write a review-like paper that I’ve been meaning to put together for some time now examining the state of Australia’s forests and the history of her deforestation and forest degradation. The reason I thought this was needed is that there is no single peer-reviewed resource one can turn to for a concise synopsis of the history of our country’s forest destruction. The stats are out there, but they’re buried in books, government reports and local-scale scientific papers. My hope is that my paper will be used as a general reference point for people wishing to get up to speed with Australia’s deforestation history.

The paper is entitled Little left to lose: deforestation and forest degradation in Australia since European colonisation, and it describes the general trends in forest loss and degradation Australia-wide, followed by state- and territory-level assessments. I’ve also included sections on plantations, biodiversity loss from deforestation and fragmentation, the feedback loop between climate change and deforestation, the history of forest protection legislation, and finally, a discussion of the necessary general policy directions needed for the country’s forests.

I’ve given a few titbits of the stats in a previous post, but let me just summarise some of the salient features here.

Life, death and Linnaeus

9 07 2011

Barry Brook (left) and Lian Pin Koh (right) attacking Fangliang He (centre). © CJA Bradshaw

I’m sitting in the Brisbane airport contemplating how best to describe the last week. If you’ve been following my tweets, you’ll know that I’ve been sequestered in a room with 8 other academics trying to figure out the best ways to estimate the severity of the Anthropocene extinction crisis. Seems like a pretty straightforward task. We know biodiversity in general isn’t doing so well thanks to the 7 billion Homo sapiens on the planet (hence the ‘Anthropo’ prefix) – the question though is: how bad?

I blogged back in March that a group of us were awarded a fully funded series of workshops to address that question by the Australian Centre for Ecological Synthesis and Analysis (a Terrestrial Ecosystem Research Network facility based at the University of Queensland), and so I am essentially updating you on the progress of the first workshop.

Before I summarise our achievements (and achieve, we did), I just want to describe the venue. Instead of our standard, boring, windowless room in some nondescript building on campus, ACEAS Director, Associate Professor Alison Specht, had the brilliant idea of putting us out away from it all on a beautiful nature-conservation estate on the north coast of New South Wales.

What a beautiful place – Linnaeus Estate is a 111-ha property just a few kilometres north of Lennox Head (about 30 minutes by car south of Byron Bay) whose mission is to provide a sustainable living area (for a very lucky few) while protecting and restoring some pretty amazing coastal habitat along an otherwise well-developed bit of Australian coastline. And yes, it’s named after Carl Linnaeus.

Over-estimating extinction rates

19 05 2011

I meant to get this out yesterday, but was too hamstrung with other commitments. Now the media circus has beaten me to the punch. Despite the lateness (in news-time) of my post, my familiarity with the analysis and the people involved gives me a unique insight, I believe.

So a couple of months ago, Fangliang He and I were talking about some new analysis he was working on, testing the assumption that back-cast species-area relationships (SAR) give reasonable estimates of inferred extinction rates. Well, that paper has just been published in today’s issue of Nature by Fangliang He and Stephen Hubbell, entitled Species–area relationships always overestimate extinction rates from habitat loss (see also the News & Views piece by Carsten Rahbek and Rob Colwell).

The paper has already stirred up something of a controversy before the ink has barely had time to dry. Predictably, noted conservation biologists like Stuart Pimm and Michael Rosenzweig have already jumped down Fangliang’s throat.

Extinction rates of modern biota in the current biodiversity crisis (Ehrlich & Pringle 2008) are wildly imprecise. Indeed, it has been proposed that extinction rates exceed the deep-time average background rate by 100- to 10000-fold (Lawton & May 2008; May et al. 1995; Pimm & Raven 2000), and no rigorous quantification of these rates has ever been accomplished globally (although there are several taxon- and region-specific estimates of localised extinction rates; Brook et al. 2003; Regan et al. 2001; Hambler et al. 2011; Shaw 2005).

Much of the information used to infer past extinction rates is based on the species-area relationship (e.g., Brook et al. 2003); this method estimates extinction rates by reversing the species-area accumulation curve, extrapolating backward to smaller areas to calculate expected species loss. The concept is relatively simple, even though the underlying mathematics might not be.

Species’ Ability to Forestall Extinction – AudioBoo

8 04 2011

Here’s a little interview I just did on the SAFE index with ABC AM:

Not a bad job, really.

And here’s another one from Radio New Zealand:

CJA Bradshaw

How fast are we losing species anyway?

28 03 2011

© W. Laurance

I’ve indicated over the last few weeks on Twitter that a group of us were recently awarded funding from the Australian Centre for Ecological Synthesis and Analysis – ACEAS – (much like the US version of the same thing – NCEAS) to run a series of analytical workshops to estimate, with a little more precision and less bias than has been done previously, the extinction rates of today’s biota relative to deep-time extinctions.

So what’s the issue? The Earth’s impressive diversity of life has experienced at least five mass extinction events over geological time. Species’ extinctions have kept pace with evolution, with more than 99 % of all species that have ever existed now gone (Bradshaw & Brook 2009). Despite general consensus that biodiversity has entered the sixth mass extinction event because of human-driven degradation of the planet, estimated extinction rates remain highly imprecise (from 100s to 10000s times background rates). This arises partly because the total number of species is unknown for many groups, and most extinctions go unnoticed.

So how are we going to improve on these highly imprecise estimates? One way is to look at the species-area relationship (SAR), which requires one to extrapolate taxon- and region-specific SARs back toward the origin to estimate extinctions (e.g., with a time series of deforestation, one can estimate how many species would have been lost if we know how species diversity changes in relation to habitat area).

Classics: demography versus genetics

16 03 2011

Here’s another short, but sweet Conservation Classic highlighted in our upcoming book chapter (see previous entries on this book). Today’s entry comes from long-time quantitative ecology guru, Russ Lande, who is now based at the Silwood Park Campus (Imperial College London).


In an influential review, Lande (1988) argued that

“…demography may usually be of more immediate importance than population genetics in determining the minimum viable size of wild populations”.

It was a well-reasoned case, and was widely interpreted to mean that demographic and ecological threats would provide the ‘killer blow’ to threatened species before genetic factors such as inbreeding and fitness effects of loss of genetic diversity had time to exert a major influence on small population dynamics.


S.A.F.E. = Species Ability to Forestall Extinction

8 01 2011

Note: I’ve just rehashed this post (30/03/2011) because the paper is now available online (see comment stream). Stay tuned for the media release next week. – CJAB

I’ve been more or less underground for the last 3 weeks. It has been a wonderful break (mostly) from the normally hectic pace of academic life. Thanks for all those who remain despite the recent silence.


But I’m back now with a post about a paper we’ve just had accepted in Frontiers in Ecology and the Environment. In my opinion it’s a leap forward in how we measure relative threat risk among species, despite some criticism.

I’ve written in past posts about the ‘magic’ minimum number of individuals that should be in a population to reduce the chance of extinction from random events. The so-called ‘minimum viable population (MVP) size’ is basically the abundance of a (connected) population below which random events take over from factors causing sustained declines (Caughley’s distinction between the ‘declining’ and ‘small’ population paradigms).

Up until the last few years, the MVP size was considered to be a population- or species-specific value, and it required very detailed demographic, genetic and biogeographical data to estimate – not something that biologists tend to have at their fingertips for most high-risk species. However, several papers published by our group (Minimum viable population size and global extinction risk are unrelated, Minimum viable population size: a meta-analysis of 30 years of published estimates and Pragmatic population viability targets in a rapidly changing world) have shown that there is in fact little variation in this number among the best-studied species; both demographic and genetic data support a number of around 5000 to avoid crossing the deadly threshold.

Now the fourth paper in this series has just been accepted (sorry, no link yet, but I’ll let you all know as soon as it is available), and it was organised and led by Reuben Clements, and co-written by me, Barry Brook and Bill Laurance.

The idea is fairly simple and it somewhat amazes me that it hasn’t been implemented before. The SAFE (Species Ability to Forestall Extinction) index is simply the distance a population is (in terms of abundance) from its MVP. In the absence of a species-specific value, we used the 5000-individual threshold. Thus, Read the rest of this entry »
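The formal definition is in the paper itself, but a minimal sketch of the idea might look like the following. Measuring the distance on a log10 scale, and the generic 5000-individual MVP threshold, are my assumptions here for illustration:

```python
import math

GENERIC_MVP = 5000  # generic minimum viable population size from the meta-analyses

def safe_index(abundance, mvp=GENERIC_MVP):
    """Distance of a population's abundance from its MVP, on a log10 scale
    (an assumed formulation for illustration).
    Negative values: below the MVP threshold; positive values: above it."""
    return math.log10(abundance) - math.log10(mvp)

safe_index(500)    # -1.0: an order of magnitude below the generic MVP
safe_index(50000)  # +1.0: an order of magnitude above it
```

A log scale has the convenient property that a population of 500 and one of 50,000 sit symmetrically around the threshold, so species can be ranked by how many orders of magnitude of abundance separate them from likely viability.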

Why and how did Pleistocene megafauna go extinct?

27 05 2010

Just a quick post to say that I’m currently at Duke University in the USA attending a special National Evolutionary Synthesis Centre ‘Catalysis Meeting’ entitled: Integrating datasets to investigate megafaunal extinction in the Late Quaternary.

The meeting is basically about nailing down some of the remaining mysteries and controversies surrounding the extinction of many species during periods of rapid climate change 11-60 thousand years ago.

It’s been fun so far, and a lot of exciting analysis will ensue, but for the meantime I’ll just summarise what we’re trying to do. Read the rest of this entry »

The biodiversity extinction numbers game

4 01 2010

© Ferahgo the Assassin

Not an easy task, measuring extinction. For the most part, we must use techniques to estimate extinction rates because, well, it’s just bloody difficult to observe when (and where) the last few individuals in a population finally kark it. Even Fagan & Holmes’ exhaustive search of extinction time series only came up with 12 populations – not really a lot to go on. It’s also nearly impossible to observe species going extinct if they haven’t even been identified yet (and yes, probably still the majority of the world’s species – mainly small, microscopic or subsurface species – have yet to be identified).

So conservation biologists do other things to get a handle on the rates, relying mainly on the species-area relationship (SAR), projecting from threatened species lists, modelling co-extinctions (if a ‘host’ species goes extinct, then its obligate symbiont must also) or projecting declining species distributions from climate envelope models.

But of course, these are all estimates and difficult to validate. Enter a nice little review article recently published online in Biodiversity and Conservation by Nigel Stork entitled Re-assessing current extinction rates which looks at the state of the art and how the predictions mesh with the empirical data. Suffice it to say, there is a mismatch.

Stork writes that the ‘average’ estimate of losing about 100 species per day has hardly any empirical support (not surprising); only about 1200 extinctions have been recorded in the last 400 years. So why is this the case?
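Before asking why, it’s worth quantifying the gap those two numbers imply with some back-of-the-envelope arithmetic:

```python
# Predicted vs. recorded extinction rates, using the figures quoted above
predicted_per_year = 100 * 365   # the '100 species/day' estimate -> 36,500 per year
recorded_per_year = 1200 / 400   # ~1200 recorded extinctions over 400 years -> 3 per year

ratio = predicted_per_year / recorded_per_year
print(f"Predicted rate is ~{ratio:,.0f} times the recorded rate")
```

The predicted rate exceeds the recorded one by roughly four orders of magnitude, which is exactly the kind of mismatch Stork set out to explain.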

As mentioned above, it’s difficult to observe true extinction because of the sampling issue (the rarer the individuals, the more difficult it is to find them). He does cite some other problems too – the ‘living dead‘ concept where species linger on for decades, perhaps longer, even though their essential habitat has been destroyed, forest regrowth buffering some species that would have otherwise been predicted to go extinct under SAR models, and differing extinction proneness among species (I’ve blogged on this before).

Of course, we could all just be a pack of doomsday wankers vainly predicting the end of the world ;-)

Well, I think not – if anything, Stork concludes that it’s all probably worse than we currently predict because of extinction synergies (see previous post about this concept) and the mounting impact of rapid global climate change. If anything, the “100 species/day” estimate could look like a utopian ideal in a few hundred years. I do disagree with Stork on one issue though – he claims that deforestation probably isn’t as bad as we make it out to be. I’d say the opposite (see here, here & here) – we know so little about how tropical forests in particular function that I dare say we’ve only just started measuring the tip of the iceberg.

CJA Bradshaw


This post was chosen as an Editor's Selection for ResearchBlogging.org

Stork, N. (2009). Re-assessing current extinction rates. Biodiversity and Conservation. DOI: 10.1007/s10531-009-9761-9
