Biogeography comes of age

22 08 2013

This week has been all about biogeography for me. While I wouldn’t call myself a ‘biogeographer’, I certainly do apply a lot of the discipline’s techniques.

This week I’m attending the 2013 joint Congress of Ecology of the International Association for Ecology (INTECOL) and the British Ecological Society in London, and I have purposefully sought out more of the biogeographical talks than pretty much anything else because the speakers were engaging and the topics fascinating. As it happens, even my own presentation had a strong biogeographical flavour this year.

Although the species-area relationship (SAR) is only one small aspect of biogeography, I’ve been slightly amazed that, nearly 50 years after MacArthur & Wilson’s famous book, our discipline is still obsessed with the SAR.

I’ve blogged about SAR issues before – what makes the SAR so engaging and controversial is that it is the principal tool for estimating overall extinction rates, even though it is perhaps one of the bluntest tools in the ecological toolbox. I suppose its popularity stems from its superficial simplicity – as the area of a (classically oceanic) island increases, so too does the total number of species it can hold. The controversies surrounding such a basic relationship centre on describing the rate at which species richness increases with area – in other words, just how nonlinear the SAR itself is.

Even a cursory understanding of maths reveals the importance of estimating this curve correctly. As the area of an ‘island’ (habitat fragment) decreases due to human disturbance, estimating how many species end up going extinct as a result depends entirely on the shape of the SAR. Get the SAR wrong, and you can over- or under-estimate the extinction rate. This was the crux of the palaver over the 2011 Nature paper by Fangliang He (not attending INTECOL) & Stephen Hubbell (attending INTECOL).
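To make the stakes concrete, here is a minimal sketch (with entirely hypothetical numbers) of how the assumed shape of a power-law SAR, S = cAᶻ, changes the predicted extinction toll when habitat shrinks:

```python
# Hedged illustration: under a power-law SAR (S = c * A^z), the fraction of
# species predicted to go extinct when area shrinks from a0 to a1 is
# 1 - (a1/a0)^z. The exponents below are hypothetical, chosen only to show
# how sensitive the estimate is to the curve's shape.

def fraction_lost(a0, a1, z):
    """Proportion of species lost when area shrinks from a0 to a1."""
    return 1.0 - (a1 / a0) ** z

# Losing 90% of habitat area, under two plausible exponents:
shallow = fraction_lost(100.0, 10.0, z=0.15)  # ~29% of species lost
steep = fraction_lost(100.0, 10.0, z=0.35)    # ~55% of species lost
print(round(shallow, 2), round(steep, 2))
```

The same 90 % loss of area predicts nearly double the extinctions under the steeper exponent, which is exactly why getting the SAR’s shape wrong matters so much.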

The first real engagement with the SAR happened during John Harte’s maximum-entropy talk in the process macroecology session on Tuesday. What was notable to me was his adamant claim that the power-law form of the SAR should never be used, despite its commonness in the literature. I took this with a grain of salt because I know all about how messy area-richness data can be, and why one needs to consider alternative models (see an example here). But then yesterday I listened to one of the greats of biogeography – Robert Whittaker – who said pretty much the complete opposite of Harte’s contention. Whittaker showed results from one of his papers from last year indicating that the power law was in fact the most commonly supported SAR among many datasets (granted, there was substantial variability in overall model performance). My conclusion remains firm – fit multiple models to each individual dataset and infer the SAR from model-averaging.
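That “fit several candidate models and model-average” recommendation can be sketched as follows. This is a simplified illustration on synthetic data: the two-model candidate set, the fitting of each model on its own transformed scale, and the crude least-squares AIC comparison on the richness scale are all my simplifying assumptions, not a prescription from the talks.

```python
import numpy as np

# Hypothetical island areas (km^2) and species counts (synthetic data).
area = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 500.0, 1000.0])
rich = np.array([4.0, 9.0, 12.0, 22.0, 28.0, 45.0, 55.0])

# Two candidate SARs: power law S = c*A^z (linear in log-log space) and
# semilog S = a + b*ln(A) (linear in semilog space).
z, logc = np.polyfit(np.log(area), np.log(rich), 1)
b, a = np.polyfit(np.log(area), rich, 1)

def power(A):
    return np.exp(logc) * A ** z

def semilog(A):
    return a + b * np.log(A)

# Compare both on the richness scale with a least-squares AIC
# (a simplification, since each model was fitted on its own scale).
n, k = len(area), 3  # k = 2 coefficients + error variance
aic = []
for model in (power, semilog):
    rss = np.sum((rich - model(area)) ** 2)
    aic.append(n * np.log(rss / n) + 2 * k)
aic = np.array(aic)
w = np.exp(-0.5 * (aic - aic.min()))
w /= w.sum()  # Akaike weights

# Model-averaged richness prediction for a new island of 200 km^2.
pred = w[0] * power(200.0) + w[1] * semilog(200.0)
print(dict(zip(("power", "semilog"), w.round(3))), round(pred, 1))
```

The averaged prediction is a weighted compromise between the candidate curves, so no single functional form (power law or otherwise) gets to dictate the inferred SAR on its own.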

The rarity of commonness

18 08 2009

I’m attending the 10th International Congress of Ecology (INTECOL) in Brisbane this week and I have just managed to find (a) an internet connection and (b) a small window to write this post.

I have to say I haven’t been to a good plenary talk for some time – maybe it’s just bad luck, but plenary talks can often be less than inspiring.

Not so for INTECOL this year. I was very pleased to have the opportunity to listen to biodiversity guru Professor Kevin Gaston of the University of Sheffield give a fantastic talk on… common species (?!).


If you have followed any of Kevin’s work, you’ll know he literally wrote the book on rarity – what species rarity is, how to measure it and what it means for preserving biodiversity as a whole.

Now he’s championing (in a very loose sense) the importance of common species because it is these taxa, he argues, that provide the backbone to the persistence of all biodiversity.

Yes, we conservation biologists have tended to focus on the rare and endemic species to make certain we have as much diversity in species (and genetic material) as possible when conserving habitats.

There are a lot more rare species than common ones, and the most common species (i.e., the ones you most often see) tend to have the broadest distributions. We know from much previous work that having a broad distribution reduces extinction risk, so why should we be concerned about common species?

Kevin made a very good point – if you turn the relationship on its head somewhat, you can state that the state of ‘commonness’ is itself ‘rare’. In fact, the most common ~25 % of species account for about 90-95 % of ALL individuals. He used an interesting (and scary) example to show what this can mean from an extinction perspective. Although very back-of-the-envelope, there are about 2000 individual birds in a km² of tropical forest; we are losing between 50,000 and 120,000 km² of tropical forest per year, so this translates into the loss of about 100 to 240 million individual birds per year – roughly the sum total of all birds in Great Britain (a bird-mad country). Yet where do we have the best information about birds? The UK.
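The back-of-the-envelope arithmetic above checks out, using only the density and deforestation figures quoted in the talk:

```python
# Rough arithmetic from the talk: ~2000 birds per km^2 of tropical forest,
# and 50,000-120,000 km^2 of tropical forest lost per year.
birds_per_km2 = 2_000
loss_low_km2, loss_high_km2 = 50_000, 120_000

birds_lost_low = birds_per_km2 * loss_low_km2    # 100 million birds/year
birds_lost_high = birds_per_km2 * loss_high_km2  # 240 million birds/year
print(f"{birds_lost_low:,} to {birds_lost_high:,} birds lost per year")
```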

Commonness is also geologically transient, meaning that just because you are a common species at some point in your evolutionary history doesn’t mean you have always been or always will be. In fact, most species never do become common.

But it is just these ‘rare’ common species that drive the principal patterns we see globally in community structure. The truly rare species are, in fact, pretty crap predictors of biodiversity patterns. Kevin made a good point – when you look at a satellite image of a forest, it’s not all the little rare species you see; it’s the 2 or 3 most common tree species that make up the forest. Lose those, and you lose everything else.

Indeed, common species also form most of the trophic structure (the flow of energy through biological communities). Take these away, and ecosystem function degrades. They also tend to have the highest biomass and provide the structure that supports all those millions of rare species. Being common is quite an important job.

Kevin stated that the world is now in a state where many of the so-called common species are, in fact, “artificially” common because of how much we’ve changed the planet. It is these beneficiaries of our world-destroying machinations that are now in decline themselves, and it is for this reason we should be worried.

When you start to see these bastions of ecosystems drop off (and the drop is usually precipitous because we don’t tend to notice their loss until they suddenly disappear), then you know we’re in trouble. And yet few, if any, once-common species have come back after a big decline.

So what does this mean for the way we do biodiversity research? Kevin proposes that we need a lot more good monitoring of these essential common species so that we can understand their structuring roles in the community and, more importantly, be able to track their change before ecosystem collapse occurs. The monitoring is crucial – it wasn’t the demise of small companies that signalled the 2007 stock-market crash responsible for the Global Financial Crisis in which we now find ourselves; the signal came from stock data on just a few large (i.e., ‘common’) companies. All the small (‘rare’) companies then followed suit.

A very inspiring, worrying and somewhat controversial talk. Watch out for more things ‘Gaston’ in the near future.

CJA Bradshaw
