Parochial conservation

30 01 2010
© cagiecartoons.com

A little bit of conservation wisdom for you this weekend.

In last week’s issue of Nature, well-known conservation planner and all-round smart bloke, Reed Noss (who just happens to be an editor for Conservation Letters and Conservation Biology), provided some words of extreme wisdom. Not pulling any punches in his Correspondence piece entitled Local priorities can be too parochial for biodiversity, Noss essentially says ‘don’t leave the important biodiversity decisions to the locals’.

He argues rather strongly in his response to Smith and colleagues’ opinion piece (Let the locals lead) that local administrators just can’t be trusted to make good conservation decisions given their focus on local economic development and other political imperatives. He basically says that the big planning decisions should be made at grander scales that over-ride local concerns because, well, the big fish in their little ponds can’t be trusted (nor do they have the training) to do what’s best for regional biodiversity conservation.

I couldn’t agree more – he states:

“Academic researchers, conservation non-governmental organizations and other ‘foreign’ interests tend to be better informed, less subject to local political influence and more experienced in conservation planning than local agencies.”

Of course, being part of the first group, I'm probably a little biased, but I dare say we've got a far better handle on the science behind saving biodiversity, as well as a better understanding of why it matters, than your average regional representative, village council, chief, Lord Mayor or state member. Sure, 'engage your stakeholders' (I have images of shooting missiles at people holding star pickets with this gem of business jargon wankery, but there you go), but please base the decisions on science first. I think Smith and colleagues make some good points, but I'm more in favour of a broad-scale benevolent dictatorship in conservation planning than fine-scale democracy. Granted, the best formula is likely to be very context-specific, and of course, you need some people with local implementation power to make it happen.

Dear Honourable Minister, you may sign on the dotted line to make policy real, but please, please listen to us before you do. Your very life and those of your children depend on it.

CJA Bradshaw

Noss, R. (2010). Local priorities can be too parochial for biodiversity. Nature, 463 (7280), 424. DOI: 10.1038/463424a

Smith, R., Veríssimo, D., Leader-Williams, N., Cowling, R., & Knight, A. (2009). Let the locals lead. Nature, 462 (7271), 280-281. DOI: 10.1038/462280a






Cartoon guide to biodiversity loss VI

26 01 2010

The continuing saga of laughing at our own lunacy (see previous cartoon entries here).

© WWF

© C. Madden

CJA Bradshaw






Avoiding the REDD monster

22 01 2010

© Floog

A short post about a small letter that recently appeared in the latest issue of Conservation Biology – the dangers of REDD.

REDD. What is it? The acronym for 'Reduced Emissions from Deforestation and Degradation', it is the idea of providing financial incentives to developing countries to reduce forest clearance by paying them to keep their forests standing. It should work because of the carbon emissions avoided by keeping forests intact. Hell, we certainly need it given the biodiversity crisis arising mainly from deforestation occurring in much of the (largely tropical) developing world. The idea is that someone pollutes, buys carbon credits that are then paid to some developing nation to prevent more forest clearance, and then biodiversity gets a helping hand in the process. It's essentially carbon trading with an added bonus. Nice idea, but difficult to implement for a host of reasons that I won't go into here (but see Miles & Kapos Science 2008 & Busch et al. 2009 Environ Res Lett).

Venter and colleagues in their letter entitled Avoiding Unintended Outcomes from REDD now warn us about another potential hazard of REDD that needs some pretty quick thinking and clever political manoeuvring to avoid.

While REDD is a good idea and I support it fully with carefully designed implementation, Venter and colleagues say that without good monitoring data and some well-planned immediate policy implementation, there could be a rush to clear even more forest area in the short term.

Essentially they argue that when the Kyoto Protocol expires in 2012, there could be a 2-year gap when forest loss would not be counted against carbon payments, and it's in this window that countries might fell forests and expand agriculture before REDD takes effect (i.e., clear now and avoid later penalties).

How do we avoid this? The authors suggest that policies rewarding early efforts to reduce forest clearance, and penalising those who rush to clear early, need to be put in place NOW. Rewards could take the form of credits, and penalties could be something like the annulment of future REDD discounts. Of course, to achieve any of this you have to know who's doing well and who's playing silly buggers, which means good forest monitoring. Satellite imagery analysis is probably key here.

CJA Bradshaw
Venter, O., Watson, J.E.M., Meijaard, E., Laurance, W.F., & Possingham, H.P. (2010). Avoiding unintended outcomes from REDD. Conservation Biology, 24 (1), 5-6. DOI: 10.1111/j.1523-1739.2009.01391.x






No chance Europe will recover fish stocks

19 01 2010

Alternate title: When pigs fly and fish say ‘hi’.

I’m covering a quick little review of a paper just published online in Fish and Fisheries about the two chances Europe has of meeting its legal obligations of rebuilding its North East Atlantic fish stocks by 2015 (i.e., Buckley’s and none).

The paper entitled Rebuilding fish stocks no later than 2015: will Europe meet the deadline? by Froese & Proelß describes briefly the likelihood Europe will meet the obligations set out under the United Nations’ Law of the Sea (UNCLOS) of “maintaining or restoring fish stocks at levels that are capable of producing maximum sustainable yield” by 2015 as set out in the Johannesburg Plan of Implementation of 2002.

Using fish stock assessment data and several criteria (3 methods for estimating maximum sustainable yield [MSY], 3 methods for estimating fishing mortality [Fmsy] & 2 methods for estimating spawning biomass [Bmsy]), they conclude that 49 (91 %) of the examined European stocks will fail to meet the goal under a ‘business as usual’ scenario.
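For readers unfamiliar with those reference points, here's a minimal sketch of the classic Schaefer (logistic surplus-production) relationships among them. This is just one of several estimation methods (the paper compares three for each quantity), and the parameter values below are entirely hypothetical:

```python
# Schaefer surplus-production reference points: surplus = r*B*(1 - B/K),
# which is maximised at B = K/2. This is a sketch for intuition only --
# not the methods used by Froese & Proelß, and r/K values are made up.

def schaefer_reference_points(r, k):
    """Return (MSY, Bmsy, Fmsy) under logistic surplus production."""
    bmsy = k / 2.0        # biomass giving maximum surplus production
    msy = r * k / 4.0     # maximum sustainable yield per year
    fmsy = r / 2.0        # fishing mortality rate that yields MSY at Bmsy
    return msy, bmsy, fmsy

# hypothetical stock: intrinsic rate r = 0.4, carrying capacity K = 100,000 t
msy, bmsy, fmsy = schaefer_reference_points(r=0.4, k=100_000)
print(msy, bmsy, fmsy)  # 10000.0 50000.0 0.2
```

The point of the exercise: a TAC persistently set above MSY (or F above Fmsy) guarantees the stock is driven below Bmsy, which is exactly the 'business as usual' trajectory the authors describe.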

The upshot is that European fisheries authorities have been and continue to set their total allowable catches (TACs) too high. We’ve seen this before with Atlantic bluefin tuna and the International Conspiracy to Catch All Tunas. Seems like most populations of exploited fishes are in fact in the same boat (quite literally!).

It’s amazing, really, the lack of ‘political will’ in fisheries – driving your source of income into oblivion doesn’t seem to register in the short-sighted vision of those earning their associated living or those supposedly looking out for their long-term interests.

CJA Bradshaw

Froese, R., & Proelß, A. (2010). Rebuilding fish stocks no later than 2015: will Europe meet the deadline? Fish and Fisheries. DOI: 10.1111/j.1467-2979.2009.00349.x

Pitcher, T., Kalikoski, D., Pramod, G., & Short, K. (2009). Not honouring the code. Nature, 457 (7230), 658-659. DOI: 10.1038/457658a






Society for Conservation Biology’s 24th International Congress

15 01 2010

I’m off for a long weekend at the beach, so I decided to keep this short. My post concerns the upcoming (well, July 2010) 24th International Congress for Conservation Biology (Society for Conservation Biology – SCB) to be held in Edmonton, Canada from 3-7 July 2010. I hadn’t originally planned on attending, but I’ve changed my mind and will most certainly be giving a few talks there.

There’s not much to report yet, apart from the abstract submission deadline looming next week (20 January). If you plan on submitting an abstract, get it in now (I’m rushing too). Actual registration opens online on 15 February.

The conference’s theme is “Conservation for a Changing Planet” – well, you can’t get much more topical (and general) than that! The conference website states:

Humans are causing large changes to the ecology of the earth. Industrial development and agriculture are changing landscapes. Carbon emissions to the atmosphere are changing climates. Nowhere on earth are changes to climate having more drastic effects on ecosystems and human cultures than in the north. Circumpolar caribou and reindeer populations are declining with huge consequences for indigenous peoples of the north, motivating our use of caribou in the conference logo. Developing conservation strategies to cope with our changing planet is arguably the greatest challenge facing today’s world and its biodiversity.

Sort of hits home in a personal way for me – I did my MSc on caribou populations in northern Canada a long time before getting into conservation biology proper (see example papers: Woodland caribou relative to landscape patterns in northeastern Alberta, Effects of petroleum exploration on woodland caribou in Northeastern Alberta & Winter peatland habitat selection by woodland caribou in northeastern Alberta), and we’ve recently published a major review on the boreal ecosystem.

Only 3 plenary speakers listed so far: David Schindler, Shane Mahoney and Georgina Mace (the latter being a featured Conservation Scholar here on ConservationBytes.com). I’m particularly looking forward to Georgina’s presentation. I’ll hopefully be able to blog some of the presentations while there. If you plan on attending, please come up and say hello!

CJA Bradshaw





Computer-assisted killing for conservation

12 01 2010

Many non-Australians might not know it, but Australia is overrun with feral vertebrates (not to mention weeds and invertebrates). We have millions of pigs, dogs, camels, goats, buffalo, deer, rabbits, cats, foxes and toads (to name a few). Because the continent separated from Gondwana about 80 million years ago, a highly distinctive biota evolved here, so when Aboriginals and, later, Europeans started introducing all these non-native species, it quickly became an ecological disaster. One of my first posts here on ConservationBytes.com was in fact about feral animals. Since then, I've written quite a bit on invasive species, especially with respect to mammal declines (see Few people, many threats – Australia's biodiversity shame, Shocking continued loss of Australian mammals, Can we solve Australia's mammal extinction crisis?).

So you can imagine that we do try to find the best ways to reduce the damage these species cause; unfortunately, we tend to waste a lot of money because density reduction culling programmes aren’t usually done with much forethought, organisation or associated research. A case in point – swamp buffalo were killed in vast numbers in northern Australia in the 1980s and 1990s, but now they’re back with a vengeance.

Enter S.T.A.R. – the clumsily named ‘Spatio-Temporal Animal Reduction’ [model] that we’ve just published in Methods in Ecology and Evolution (title: Spatially explicit spreadsheet modelling for optimising the efficiency of reducing invasive animal density by CR McMahon and colleagues).

This little Excel-based spreadsheet model is designed specifically to optimise the culling strategies for feral pigs, buffalo and horses in Kakadu National Park (northern Australia), but our aim was to make it easy enough to use and modify so that it could be applied to any invasive species anywhere (ok, admittedly it would work best for macro-vertebrates).

The application works on a grid of habitat types, each with its own carrying capacity for each species. We then assume some fairly basic density-feedback population models and allow animals to move among cells. We then hit them virtually with a proportional culling rate (which includes a hunting-efficiency feedback), and estimate the costs associated with each level of kill. The final outputs give density maps and graphs of the population trajectory.

We’ve added a lot of little features to maximise flexibility, including adjusting carrying capacities, movement rates, operating costs and overheads, and proportional harvest rates. The user can also get some basic sensitivity analyses done, or do district-specific culls. Finally, we’ve included three optimisation routines that estimate the best allocation of killing effort, for both maximising density reduction or working to a specific budget, and within a spatial or non-spatial context.
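To give a flavour of the core logic, here's a toy single-cell analogue of the density-feedback-plus-culling projection described above. This is NOT the S.T.A.R. spreadsheet itself (which adds movement among cells, hunting-efficiency feedbacks and optimisation routines); all parameter values are hypothetical:

```python
# Toy sketch: logistic (density-feedback) growth with a proportional
# annual cull and an associated cost tally. A single-cell caricature of
# the spreadsheet model's engine; r, K, cull rate and costs are made up.

def project_cull(n0, r, k, cull_rate, cost_per_kill, years):
    """Project abundance under logistic growth with a proportional cull.
    Returns the abundance trajectory and the total culling cost."""
    n, total_cost = float(n0), 0.0
    trajectory = [n]
    for _ in range(years):
        n += r * n * (1.0 - n / k)   # density-feedback growth
        kills = cull_rate * n        # proportional harvest
        total_cost += kills * cost_per_kill
        n -= kills
        trajectory.append(n)
    return trajectory, total_cost

# hypothetical buffalo-like scenario: 5000 animals, 20% culled each year
traj, cost = project_cull(n0=5000, r=0.3, k=20_000, cull_rate=0.2,
                          cost_per_kill=40.0, years=10)
```

Run that with different cull rates and you quickly see the swamp-buffalo lesson above: a cull rate that only just outpaces the density-feedback growth gives you a slow decline that rebounds the moment you stop.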

Our hope is that wildlife managers responsible for safeguarding the biodiversity of places like Kakadu National Park actually use this tool to maximise their efficiency. Kakadu has a particularly nasty set of invasive species, so it’s important those in charge get it right. So far, they haven’t been doing too well.

You can download the Excel program itself here (click here for the raw VBA code), and the User Manual is available here. Happy virtual killing!

CJA Bradshaw

P.S. If you’re concerned about animal welfare issues associated with all this, I invite you to read one of our recent papers on the subject: Convergence of culture, ecology and ethics: management of feral swamp buffalo in northern Australia.


McMahon, C.R., Brook, B.W., Collier, N., & Bradshaw, C.J.A. (2010). Spatially explicit spreadsheet modelling for optimising the efficiency of reducing invasive animal density. Methods in Ecology and Evolution. DOI: 10.1111/j.2041-210X.2009.00002.x

Albrecht, G., McMahon, C., Bowman, D., & Bradshaw, C. (2009). Convergence of culture, ecology, and ethics: management of feral swamp buffalo in northern Australia. Journal of Agricultural and Environmental Ethics, 22 (4), 361-378. DOI: 10.1007/s10806-009-9158-5

Bradshaw, C., Field, I., Bowman, D., Haynes, C., & Brook, B. (2007). Current and future threats from non-indigenous animal species in northern Australia: a spotlight on World Heritage Area Kakadu National Park. Wildlife Research, 34 (6). DOI: 10.1071/WR06056





The elusive Allee effect

8 01 2010

© D. Bishop, Getty Images

In keeping with the theme of extinctions from my last post, I want to highlight a paper we’ve recently had published online early in Ecology entitled Limited evidence for the demographic Allee effect from numerous species across taxa by Stephen Gregory and colleagues. This one is all about Allee effects – well, it’s all about how difficult it is to find them!

If you recall, an Allee effect is a “…positive relationship between any component of individual fitness and either numbers or density of conspecifics” (Stephens et al. 1999, Oikos 87:185-190) and the name itself is attributed to Warder Clyde Allee. There are many different kinds of Allee effects (see previous Allee effects post for Berec and colleagues’ full list of types and definitions), but the two I want to focus on here are component and demographic Allee effects.

Now, the evidence for component Allee effects abounds, but finding real instances of reduced population growth rate at low population sizes is difficult. And this is really what we should be focussing on in conservation biology – a lower-than-expected growth rate at low population sizes means that recovery efforts for rare and endangered species must be stepped up considerably because their rebound potential is lower than it should be.
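To make that distinction concrete, here's a minimal sketch contrasting per-capita growth under ordinary logistic density feedback with a simple strong-Allee variant. This is a textbook parameterisation for intuition, not the models fitted in the paper, and all values (r, K, the Allee threshold A) are hypothetical:

```python
# Per-capita growth rate: logistic vs. a strong Allee effect.
# Under the Allee model, per-capita growth turns NEGATIVE below the
# threshold a -- the 'dip at low density' the paper went looking for.
# All parameter values are illustrative.

def percap_logistic(n, r=0.5, k=1000.0):
    """Logistic: per-capita growth declines monotonically with density."""
    return r * (1.0 - n / k)

def percap_allee(n, r=0.5, k=1000.0, a=50.0):
    """Strong Allee: growth is negative below threshold a, positive between
    a and K, and negative again above K."""
    return r * (1.0 - n / k) * (n / a - 1.0)

print(percap_logistic(20))  # positive even at very low density
print(percap_allee(20))     # negative below the Allee threshold
print(percap_allee(100))    # positive between threshold and K
```

The conservation punchline is in the sign: under pure logistic feedback a tiny population has its *highest* per-capita growth, whereas under an Allee effect it's actively shrinking, which is why detecting that dip matters so much for recovery planning.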

We therefore queried over 1000 time series of abundance from many different species and lo and behold, the evidence for that little dip in population growth rate at low densities was indeed rare – about 1 % of all time series examined!

I suppose this isn't that surprising, but what was interesting was that this didn't depend on sample size (time series where Allee models had highest support were in fact shorter) or variability (they were also less variable). All this seems a little counter-intuitive, but it gels with what's been assumed or hypothesised before. Measurement error, climate variability and the sheer paucity of low-abundance time series make their detection difficult. Nonetheless, for those series showing demographic Allee effects, their relative model support was around 12%, suggesting that such density feedback might influence the population growth rate of just over 1 in 10 natural populations. In fact, the many problems with density-feedback detection in time series that load toward negative feedback (sometimes spuriously) suggest that even our small sample of Allee time series is probably a vast underestimate. We have pretty firm evidence that inbreeding is prevalent in threatened species, and demographic Allee effects are the mechanism by which such depression can lead a population down the extinction vortex.

CJA Bradshaw


Gregory, S., Bradshaw, C.J.A., Brook, B.W., & Courchamp, F. (2009). Limited evidence for the demographic Allee effect from numerous species across taxa. Ecology. DOI: 10.1890/09-1128





The biodiversity extinction numbers game

4 01 2010

© Ferahgo the Assassin

Not an easy task, measuring extinction. For the most part, we must use techniques to estimate extinction rates because, well, it's just bloody difficult to observe when (and where) the last few individuals in a population finally cark it. Even Fagan & Holmes' exhaustive search of extinction time series only came up with 12 populations – not really a lot to go on. It's also nearly impossible to observe species going extinct if they haven't even been identified yet (and yes, probably still the majority of the world's species – mainly small, microscopic or subsurface species – have yet to be identified).

So conservation biologists do other things to get a handle on the rates, relying mainly on the species-area relationship (SAR), projecting from threatened species lists, modelling co-extinctions (if a ‘host’ species goes extinct, then its obligate symbiont must also) or projecting declining species distributions from climate envelope models.
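For the uninitiated, the SAR back-of-envelope works like this: species richness scales with area as S = cA^z, so the fraction of species expected to survive a loss of habitat area is (A_new/A_old)^z. A quick sketch (z = 0.25 is a commonly used exponent; the 90% habitat-loss figure below is purely illustrative):

```python
# Species-area relationship sketch: S = c * A**z, so the constant c
# cancels when comparing before and after habitat loss. The z value and
# the area-loss scenario here are illustrative, not from Stork's paper.

def sar_fraction_remaining(area_remaining_frac, z=0.25):
    """Fraction of species predicted to persist after habitat loss."""
    return area_remaining_frac ** z

# e.g. a region losing 90% of its habitat area
frac = sar_fraction_remaining(0.1)
print(f"~{1 - frac:.0%} of species predicted lost")  # ~44% of species predicted lost
```

Note the non-linearity: losing 90% of the area is predicted to cost 'only' ~44% of the species, which is partly why SAR-based projections and the observed extinction record can disagree so sharply, as the next paragraphs discuss.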

But of course, these are all estimates and difficult to validate. Enter a nice little review article recently published online in Biodiversity and Conservation by Nigel Stork, entitled Re-assessing current extinction rates, which looks at the state of the art and how the predictions mesh with the empirical data. Suffice it to say, there is a mismatch.

Stork writes that the ‘average’ estimate of losing about 100 species per day has hardly any empirical support (not surprising); only about 1200 extinctions have been recorded in the last 400 years. So why is this the case?

As mentioned above, it’s difficult to observe true extinction because of the sampling issue (the rarer the individuals, the more difficult it is to find them). He does cite some other problems too – the ‘living dead‘ concept where species linger on for decades, perhaps longer, even though their essential habitat has been destroyed, forest regrowth buffering some species that would have otherwise been predicted to go extinct under SAR models, and differing extinction proneness among species (I’ve blogged on this before).

Of course, we could just all be just a pack of doomsday wankers vainly predicting the end of the world ;-)

Well, I think not – if anything, Stork concludes that it's all probably worse than we currently predict because of extinction synergies (see previous post about this concept) and the mounting impact of rapid global climate change. If anything, the "100 species/day" estimate could look like a utopian ideal in a few hundred years. I do disagree with Stork on one issue though – he claims that deforestation probably isn't as bad as we make it out to be. I'd say the opposite (see here, here & here) – we know so little of how tropical forests in particular function that I dare say we've only just started measuring the tip of the iceberg.

CJA Bradshaw


This post was chosen as an Editor's Selection for ResearchBlogging.org

Stork, N. (2009). Re-assessing current extinction rates. Biodiversity and Conservation. DOI: 10.1007/s10531-009-9761-9