Computer-assisted killing for conservation

12 01 2010

Many non-Australians might not know it, but Australia is overrun with feral vertebrates (not to mention weeds and invertebrates). We have millions of pigs, dogs, camels, goats, buffalo, deer, rabbits, cats, foxes and toads (to name a few). Because the continent separated from Gondwana about 80 million years ago, a highly distinctive biota evolved here, so when Aboriginal people and, later, Europeans started introducing all these non-native species, the result was an ecological disaster. One of my first posts here on ConservationBytes.com was in fact about feral animals. Since then, I've written quite a bit on invasive species, especially with respect to mammal declines (see Few people, many threats – Australia's biodiversity shame, Shocking continued loss of Australian mammals, Can we solve Australia's mammal extinction crisis?).

So you can imagine that we do try to find the best ways to reduce the damage these species cause; unfortunately, we tend to waste a lot of money because density reduction culling programmes aren’t usually done with much forethought, organisation or associated research. A case in point – swamp buffalo were killed in vast numbers in northern Australia in the 1980s and 1990s, but now they’re back with a vengeance.

Enter S.T.A.R. – the clumsily named ‘Spatio-Temporal Animal Reduction’ [model] that we’ve just published in Methods in Ecology and Evolution (title: Spatially explicit spreadsheet modelling for optimising the efficiency of reducing invasive animal density by CR McMahon and colleagues).

This little Excel-based spreadsheet model is designed specifically to optimise the culling strategies for feral pigs, buffalo and horses in Kakadu National Park (northern Australia), but our aim was to make it easy enough to use and modify so that it could be applied to any invasive species anywhere (ok, admittedly it would work best for macro-vertebrates).

The application works on a grid of habitat types, each with its own carrying capacity for each species. We assume some fairly basic density-feedback population models and allow animals to move among cells. We then hit them virtually with a proportional culling rate (which includes a hunting-efficiency feedback) and estimate the costs associated with each level of kill. The final outputs are density maps and graphs of the population trajectory.
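
To make the mechanics a bit more concrete, here is a minimal sketch of that kind of grid-based, density-feedback culling model – written in Python rather than the Excel/VBA the actual model is built in, and with grid size, growth rates, costs and every other number invented for illustration. It is not the STAR code itself.

```python
# Minimal sketch (not the STAR spreadsheet): a grid of cells with their own
# carrying capacities, density-feedback growth, simple dispersal, a proportional
# cull with a crude hunting-efficiency feedback, and a running cost tally.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

ROWS, COLS = 10, 10
K = rng.uniform(5, 50, size=(ROWS, COLS))   # carrying capacity per cell (animals)
N = 0.8 * K                                 # starting densities near K
r_max = 0.3                                 # maximum annual growth rate
move_frac = 0.1                             # fraction of each cell dispersing per year
cull_rate = 0.4                             # target proportional cull
cost_per_hour = 200.0                       # operating cost ($/hunting hour), assumed
base_hours_per_kill = 2.0                   # hours per removal when a cell is near K

def grow(N, K, r_max):
    """Discrete logistic (Ricker-type density feedback) growth in each cell."""
    return N * np.exp(r_max * (1.0 - N / K))

def disperse(N, move_frac):
    """Send a fraction of each cell's animals to its four neighbours;
    animals dispersing off the grid edge are lost (absorbing boundary)."""
    out = N * move_frac
    stay = N - out
    inflow = np.zeros_like(N)
    inflow[1:, :] += out[:-1, :] / 4.0   # received from the cell to the north
    inflow[:-1, :] += out[1:, :] / 4.0   # received from the cell to the south
    inflow[:, 1:] += out[:, :-1] / 4.0   # received from the cell to the west
    inflow[:, :-1] += out[:, 1:] / 4.0   # received from the cell to the east
    return stay + inflow

def cull(N, K, cull_rate):
    """Proportional cull with a hunting-efficiency feedback:
    the sparser a cell (low N/K), the more hours each removal takes."""
    kills = N * cull_rate
    rel_density = np.clip(N / K, 1e-6, None)
    effort_hours = kills * base_hours_per_kill / rel_density
    cost = float(effort_hours.sum()) * cost_per_hour
    return N - kills, cost

total_cost = 0.0
for year in range(1, 11):
    N = grow(N, K, r_max)
    N = disperse(N, move_frac)
    N, cost = cull(N, K, cull_rate)
    total_cost += cost
    print(f"year {year:2d}: animals = {N.sum():8.1f}, cumulative cost = ${total_cost:,.0f}")
```

The key behaviour to notice is the efficiency feedback: as densities fall, each additional kill costs more effort, which is exactly why naive 'cull hard everywhere' strategies blow budgets.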

We've added a lot of little features to maximise flexibility, including adjustable carrying capacities, movement rates, operating costs and overheads, and proportional harvest rates. The user can also run some basic sensitivity analyses or do district-specific culls. Finally, we've included three optimisation routines that estimate the best allocation of killing effort, either to maximise density reduction or to work within a specific budget, in a spatial or non-spatial context.
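
To give a flavour of the budget-constrained version of that allocation problem, here is a hedged sketch (again, not the routines in the spreadsheet): a greedy rule that repeatedly spends the next slice of budget in whichever district currently yields the most kills per dollar, with the cost per kill rising as a district is depleted. District names, costs and the budget are all invented.

```python
# Greedy, budget-constrained allocation of culling effort across districts.
# Cost per kill rises as relative density (N/K) falls, so spending shifts
# between districts as they are depleted. All numbers are illustrative.
districts = {
    "floodplain": {"base_cost": 40.0,  "N": 5000.0, "K": 6000.0},
    "woodland":   {"base_cost": 90.0,  "N": 3000.0, "K": 4000.0},
    "escarpment": {"base_cost": 160.0, "N": 800.0,  "K": 1500.0},
}

def cost_per_kill(d):
    """Cost of removing one more animal: rises as relative density N/K falls."""
    dens = max(d["N"] / d["K"], 1e-6)
    return d["base_cost"] / dens

budget, step = 150_000.0, 1_000.0
spend = {name: 0.0 for name in districts}

while budget >= step:
    live = [n for n in districts if districts[n]["N"] > 1.0]
    if not live:
        break
    # spend the next $step where it currently removes the most animals
    name = min(live, key=lambda n: cost_per_kill(districts[n]))
    d = districts[name]
    kills = min(step / cost_per_kill(d), d["N"])
    d["N"] -= kills
    spend[name] += step
    budget -= step

for name, s in spend.items():
    print(f"{name:>11}: spent ${s:>9,.0f}, animals left = {districts[name]['N']:7.1f}")
```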

Our hope is that wildlife managers responsible for safeguarding the biodiversity of places like Kakadu National Park actually use this tool to maximise their efficiency. Kakadu has a particularly nasty set of invasive species, so it’s important those in charge get it right. So far, they haven’t been doing too well.

You can download the Excel program itself here (click here for the raw VBA code), and the User Manual is available here. Happy virtual killing!

CJA Bradshaw

P.S. If you’re concerned about animal welfare issues associated with all this, I invite you to read one of our recent papers on the subject: Convergence of culture, ecology and ethics: management of feral swamp buffalo in northern Australia.


McMahon, C.R., Brook, B.W., Collier, N., & Bradshaw, C.J.A. (2010). Spatially explicit spreadsheet modelling for optimising the efficiency of reducing invasive animal density. Methods in Ecology and Evolution. DOI: 10.1111/j.2041-210X.2009.00002.x

Albrecht, G., McMahon, C., Bowman, D., & Bradshaw, C. (2009). Convergence of culture, ecology, and ethics: management of feral swamp buffalo in northern Australia. Journal of Agricultural and Environmental Ethics, 22(4), 361-378. DOI: 10.1007/s10806-009-9158-5

Bradshaw, C., Field, I., Bowman, D., Haynes, C., & Brook, B. (2007). Current and future threats from non-indigenous animal species in northern Australia: a spotlight on World Heritage Area Kakadu National Park. Wildlife Research, 34(6). DOI: 10.1071/WR06056





The biodiversity extinction numbers game

4 01 2010


Not an easy task, measuring extinction. For the most part, we must use techniques to estimate extinction rates because, well, it’s just bloody difficult to observe when (and where) the last few individuals in a population finally kark it. Even Fagan & Holmes’ exhaustive search of extinction time series only came up with 12 populations – not really a lot to go on. It’s also nearly impossible to observe species going extinct if they haven’t even been identified yet (and yes, probably still the majority of the world’s species – mainly small, microscopic or subsurface species – have yet to be identified).

So conservation biologists do other things to get a handle on the rates, relying mainly on the species-area relationship (SAR), projecting from threatened species lists, modelling co-extinctions (if a ‘host’ species goes extinct, then its obligate symbiont must also) or projecting declining species distributions from climate envelope models.
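
For readers unfamiliar with how the species-area relationship is used to project extinctions, the back-of-the-envelope version goes like this (the exponent z ≈ 0.25 is a commonly used ballpark, not a value taken from Stork's paper):

```latex
% Species-area relationship: S = c A^z, with z typically ~0.15-0.35.
% Fraction of species expected to persist when habitat area shrinks from A_0 to A_1:
\[
  \frac{S_1}{S_0} = \left(\frac{A_1}{A_0}\right)^{z}
\]
% Example: losing 90% of habitat area with z = 0.25 gives (0.1)^{0.25} \approx 0.56,
% i.e., roughly 44% of species eventually committed to extinction in that habitat.
```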

But of course, these are all estimates and difficult to validate. Enter a nice little review article recently published online in Biodiversity and Conservation by Nigel Stork entitled Re-assessing current extinction rates which looks at the state of the art and how the predictions mesh with the empirical data. Suffice it to say, there is a mismatch.

Stork writes that the ‘average’ estimate of losing about 100 species per day has hardly any empirical support (not surprising); only about 1200 extinctions have been recorded in the last 400 years. So why is this the case?
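
The arithmetic behind that mismatch is worth spelling out, using the round numbers quoted above:

```latex
% The oft-quoted estimate versus the observed record:
\[
  100~\text{species/day} \times 365~\text{days/year} \approx 3.7 \times 10^{4}~\text{species/year}
\]
\[
  \frac{1200~\text{recorded extinctions}}{400~\text{years}} = 3~\text{species/year}
\]
% A gap of roughly four orders of magnitude; although, as discussed below,
% the recorded tally is certainly a severe undercount.
```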

As mentioned above, it’s difficult to observe true extinction because of the sampling issue (the rarer the individuals, the more difficult it is to find them). He does cite some other problems too – the ‘living dead‘ concept where species linger on for decades, perhaps longer, even though their essential habitat has been destroyed, forest regrowth buffering some species that would have otherwise been predicted to go extinct under SAR models, and differing extinction proneness among species (I’ve blogged on this before).

Of course, we could all just be a pack of doomsday wankers vainly predicting the end of the world ;-)

Well, I think not – if anything, Stork concludes that it's all probably worse than we currently predict because of extinction synergies (see previous post about this concept) and the mounting impact of rapid global climate change. Indeed, the "100 species/day" estimate could look like a utopian ideal in a few hundred years. I do disagree with Stork on one issue, though – he claims that deforestation probably isn't as bad as we make it out to be. I'd say the opposite (see here, here & here) – we know so little about how tropical forests in particular function that I dare say we've only just started measuring the tip of the iceberg.

CJA Bradshaw



Stork, N. (2009). Re-assessing current extinction rates. Biodiversity and Conservation. DOI: 10.1007/s10531-009-9761-9





Conservation Biology for All

26 12 2009

A new book that I’m proud to have had a hand in writing is just about to come out with Oxford University Press called Conservation Biology for All. Edited by the venerable Conservation Scholars, Professors Navjot Sodhi (National University of Singapore) and Paul Ehrlich (Stanford University), it’s a powerhouse of some of the world’s leaders in conservation science and application.

The book strives to “…provide cutting-edge but basic conservation science to a global readership”. In short, it’s written to bring the forefront of conservation science to the general public, with OUP promising to make it freely available online within about a year from its release in early 2010 (or so the rumour goes). The main idea here is that those in most need of such a book – the conservationists in developing nations – can access the wealth of information therein without having to sacrifice the village cow to buy it.

I won't go into any great detail about the book's contents (mainly because I have yet to receive my own copy and read most of the chapters!), but I have perused early versions of Kevin Gaston's excellent chapter on biodiversity and Tom Brooks' overview of conservation planning and prioritisation. Our chapter (Chapter 16, by Barry Brook and me) is an overview of statistical and modelling philosophy and application, with emphasis on conservation mathematics. It's by no means a complete treatment, but it's something we want to develop further down the track. I do hope many people find it useful.

I’ve reproduced the chapter title line-up below, with links to each of the authors websites.

  1. Conservation Biology: Past and Present (C. Meine)
  2. Biodiversity (K. Gaston)
  3. Ecosystem Functions and Services (C. Sekercioglu)
  4. Habitat Destruction: Death of a Thousand Cuts (W. Laurance)
  5. Habitat Fragmentation and Landscape Change (A. Bennett & D. Saunders)
  6. Overharvesting (C. Peres)
  7. Invasive Species (D. Simberloff)
  8. Climate Change (T. Lovejoy)
  9. Fire and Biodiversity (D. Bowman & B. Murphy)
  10. Extinctions and the Practice of Preventing Them (S. Pimm & C. Jenkins)
  11. Conservation Planning and Priorities (T. Brooks)
  12. Endangered Species Management: The US Experience (D. Wilcove)
  13. Conservation in Human-Modified Landscapes (L.P. Koh & T. Gardner)
  14. The Roles of People in Conservation (A. Claus, K. Chan & T. Satterfield)
  15. From Conservation Theory to Practice: Crossing the Divide (M. Rao & J. Ginsberg)
  16. The Conservation Biologist’s Toolbox – Principles for the Design and Analysis of Conservation Studies (C. Bradshaw & B. Brook)

As you can see, it’s a pretty impressive collection of conservation stars and hard-hitting topics. Can’t wait to get my own copy! I will probably blog individual chapters down the track, so stay tuned.

CJA Bradshaw






Carbon = biodiversity

21 12 2009

I’ve decided to blog this a little earlier than I would usually simply because the COP15 is still fresh in everyone’s minds and the paper is now online as an ‘Accepted Article’, so it is fully citable.

The paper by Strassburg and colleagues published in Conservation Letters, entitled Global congruence of carbon storage and biodiversity in terrestrial ecosystems, is noteworthy because it provides a very useful answer to a very basic question. If one were to protect natural habitats based on their carbon storage potential, would one also be protecting the most biodiversity (and of course, vice versa)?

Turns out, one would.

Using a global dataset of ~20,000 species of mammals, birds and amphibians, they compared three indices of biodiversity distribution (species richness, species threat and range-size rarity) to a new global above- and below-ground carbon biomass dataset. At least for species richness, the correlations were fairly strong (~0.8, some of which was due to spatial autocorrelation); for the threat and rarity indices, the correlations were rather weaker (~0.3).
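
Mechanically, this kind of congruence test boils down to a cell-by-cell correlation between carbon and a biodiversity index. Here's a toy version with simulated data; the real analysis used the global datasets described above and corrected for spatial autocorrelation (which this sketch does not), and the choice of a rank correlation below is mine, not necessarily the authors'.

```python
# Toy illustration of a carbon-biodiversity congruence test across grid cells.
# The 'data' are simulated purely to show the mechanics of the comparison.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_cells = 10_000
carbon = rng.lognormal(mean=4.0, sigma=1.0, size=n_cells)             # t C per cell (simulated)
richness = np.round(5.0 * carbon ** 0.5 + rng.normal(0, 20, n_cells))  # species per cell (simulated)
richness = np.clip(richness, 0, None)

rho, p = spearmanr(carbon, richness)
print(f"Spearman rho between carbon and species richness: {rho:.2f} (p = {p:.2g})")
```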

So what does this all mean for policy? Biodiversity hotspots – those areas around the globe with the highest biodiversity and greatest threats – have some of the greatest potential to store carbon as well as guard against massive extinctions if we prioritise them for conservation. Places such as the Amazon, Borneo, Sumatra and New Guinea definitely fall within this category.

However, not all biodiversity hotspots are created equal; areas such as Brazil’s Cerrado or the savannas of the Rift Valley in East Africa have relatively lower carbon storage, and so carbon-trading schemes wouldn’t necessarily do much for biodiversity in these areas.

The overall upshot is that we should continue to pursue carbon-trading schemes such as REDD (Reduced Emissions from Deforestation and forest Degradation) because they will benefit biodiversity (contrary to what certain 'green' organisations say about it), but we can't sit back and hope that REDD will solve all of biodiversity's problems worldwide.

CJAB


Strassburg, B., Kelly, A., Balmford, A., Davies, R., Gibbs, H., Lovett, A., Miles, L., Orme, C., Price, J., Turner, R., & Rodrigues, A. (2009). Global congruence of carbon storage and biodiversity in terrestrial ecosystems. Conservation Letters. DOI: 10.1111/j.1755-263X.2009.00092.x





A magic conservation number

15 12 2009

Although I’ve already blogged about our recent paper in Biological Conservation on minimum viable population sizes, American Scientist just did a great little article on the paper and concept that I’ll share with you here:

Imagine how useful it would be if someone calculated the minimum population needed to preserve each threatened organism on Earth, especially in this age of accelerated extinctions.

A group of Australian researchers say they have nailed the best figure achievable with the available data: 5,000 adults. That’s right, that many, for mammals, amphibians, insects, plants and the rest.

Their goal wasn’t a target for temporary survival. Instead they set the bar much higher, aiming for a census that would allow a species to pursue a standard evolutionary lifespan, which can vary from one to 10 million years.

That sort of longevity requires abundance sufficient for a species to thrive despite significant obstacles, including random variation in sex ratios or birth and death rates, natural catastrophes and habitat decline. It also requires enough genetic variation to allow adequate amounts of beneficial mutations to emerge and spread within a populace.

“We have suggested that a major rethink is required on how we assign relative risk to a species,” says conservation biologist Lochran Traill of the University of Adelaide, lead author of a Biological Conservation paper describing the projection.

Conservation biologists already have plenty on their minds these days. Many have concluded that if current rates of species loss continue worldwide, Earth will face a mass extinction comparable to the five big extinction events documented in the past. This one would differ, however, because it would be driven by the destructive growth of one species: us.

More than 17,000 of the 47,677 species assessed for vulnerability of extinction are threatened, according to the latest Red List of Threatened Species prepared by the International Union for Conservation of Nature. That includes 21 percent of known mammals, 30 percent of known amphibians, 12 percent of known birds and 70 percent of known plants. The populations of some critically endangered species number in the hundreds, not thousands.

In an effort to help guide rescue efforts, Traill and colleagues, who include conservation biologists and a geneticist, have been exploring minimum viable population size over the past few years. Previously they completed a meta-analysis of hundreds of studies considering such estimates and concluded that a minimum head count of more than a few thousand individuals would be needed to achieve a viable population.

“We don’t have the time and resources to attend to finding thresholds for all threatened species, thus the need for a generalization that can be implemented across taxa to prevent extinction,” Traill says.

In their most recent research they used computer models to simulate what population numbers would be required to achieve long-term persistence for 1,198 different species. A minimum population of 500 could guard against inbreeding, they conclude. But for a shot at truly long-term, evolutionary success, 5,000 is the most parsimonious number, with some species likely to hit the sweet spot with slightly less or slightly more.

“The practical implications are simply that we’re not doing enough, and that many existing targets will not suffice,” Traill says, noting that many conservation programs may inadvertently be managing protected populations for extinction by settling for lower population goals.

The prospect that one number, give or take a few, would equal the minimum viable population across taxa doesn’t seem likely to Steven Beissinger, a conservation biologist at the University of California at Berkeley.

“I can’t imagine 5,000 being a meaningful number for both Alabama beach mice and the California condors. They are such different organisms,” Beissinger says.

Many variables must be considered when assessing the population needs of a given threatened species, he says. “This issue really has to do with threats more than stochastic demography. Take the same rates of reproduction and survival and put them in a healthy environment and your minimum population would be different than in an environment of excess predation, loss of habitat or effects from invasive species.”

But, Beissinger says, Traill’s group is correct for thinking that conservation biologists don’t always have enough empirically based standards to guide conservation efforts or to obtain support for those efforts from policy makers.

“One of the positive things here is that we do need some clear standards. It might not be establishing a required number of individuals. But it could be clearer policy guidelines for acceptable risks and for how many years into the future can we accept a level of risk,” Beissinger says. “Policy people do want that kind of guidance.”

Traill sees policy implications in his group’s conclusions. Having a numerical threshold could add more precision to specific conservation efforts, he says, including stabs at reversing the habitat decline or human harvesting that threaten a given species.

"We need to restore once-abundant populations to the minimum threshold," Traill says. "In many cases it will make more economic and conservation sense to abandon hopeless-case species in favor of greater returns elsewhere."
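
If you're curious what the simulations behind numbers like 500 and 5,000 actually look like, here is a heavily simplified, purely illustrative sketch of a stochastic population model used to ask how starting size affects persistence. The published analyses used species-specific, genetically explicit models; every parameter below is an invented placeholder.

```python
# Illustrative (not the published) stochastic population viability sketch:
# estimate the probability of falling below a quasi-extinction threshold
# within a fixed horizon, for different starting population sizes.
import numpy as np

rng = np.random.default_rng(7)

def quasi_extinction_prob(N0, years=100, reps=1000, r_mean=0.1, r_sd=0.3,
                          K=20_000, threshold=50):
    """Proportion of simulated populations that fall below `threshold`
    within `years`, starting from N0 adults."""
    extinct = 0
    for _ in range(reps):
        N = float(N0)
        for _ in range(years):
            r = rng.normal(r_mean, r_sd)        # environmental stochasticity
            N = N * np.exp(r * (1.0 - N / K))   # density-feedback (Ricker) growth
            N = float(rng.poisson(N))           # demographic stochasticity
            if N <= threshold:
                extinct += 1
                break
    return extinct / reps

for N0 in (100, 500, 1000, 5000):
    p = quasi_extinction_prob(N0)
    print(f"N0 = {N0:5d}: P(quasi-extinction within 100 years) ≈ {p:.2f}")
```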





December Issue of Conservation Letters

11 12 2009

Gemsbok (Oryx gazella) in Namibia

Another great line-up in Conservation Letters‘ last issue for 2009. For full access, click here.





Breaking the waves – conservation conundrum of bioshields

9 12 2009

Today’s post covers a neat little review just published online in Conservation Letters by Feagin and colleagues entitled Shelter from the storm? Use and misuse of coastal vegetation bioshields for managing natural disasters. I’m covering this for three reasons: (1) it’s a great summary and wake-up call for those contemplating changing coastal ecosystems in the name of disaster management, (2) I have a professional interest in the ecosystem integrity-disaster interface and (3) I had the pleasure of editing this article.

I've blogged about quite a few papers on ecosystem services (including some of my own) because I think making the link between ecosystem integrity and human health, wealth and well-being is one of the best ways to convince Joe Bloggs that saving species he'll probably never see is in his and his family's best (and selfish) interests. Convincing the poverty-stricken, the greedy and the downright stupid of biodiversity's inherent value will never, ever work (at least, it hasn't worked yet).

Today’s feature paper discusses an increasingly relevant policy conundrum in conservation – altering coastal ecosystems such that planted/restored/conserved vegetation minimises the negative impacts of extreme weather events (e.g., tsunamis, cyclones, typhoons and hurricanes): the so-called ‘bioshield’ effect. The idea is attractive – coastal vegetation acts to buffer human development and other land features from intense wave action, so maintain/restore it at all costs.

The problem is, as Feagin and colleagues point out in their poignant review, 'bioshields' don't really seem to have much effect in attenuating the big waves resulting from the extreme events – the very reason they were planted in the first place. Don't misunderstand them – keeping ecosystems like mangroves and other coastal communities intact has enormous benefits in terms of biodiversity conservation, minimised coastal erosion and human livelihoods. However, with massive coastal development in many parts of the world, the knee-jerk reaction has been to plant up coasts with any sort of tree or shrub going, without heeding these species' real effects. Indeed, many countries now have active policies to plant invasive species along coastal margins, which not only displace native species, but can also displace humans, and likely play little part in any wave attenuation.

This sleeping giant of a conservation issue needs some serious re-thinking, argue the authors, especially in light of predicted increases in extreme storm events resulting from climate change. I hope policy makers listen to that plea. I highly recommend the read.

CJA Bradshaw


Feagin, R., Mukherjee, N., Shanker, K., Baird, A., Cinner, J., Kerr, A., Koedam, N., Sridhar, A., Arthur, R., Jayatissa, L., Lo Seen, D., Menon, M., Rodriguez, S., Shamsuddoha, M., & Dahdouh-Guebas, F. (2009). Shelter from the storm? Use and misuse of coastal vegetation bioshields for managing natural disasters. Conservation Letters. DOI: 10.1111/j.1755-263X.2009.00087.x





Scoping the future threats and solutions to biodiversity conservation

4 12 2009

Way back in 1989, Jared Diamond defined the 'evil quartet' of habitat destruction, over-exploitation, introduced species and extinction cascades as the principal drivers of modern extinctions. I think we could easily update this to the 'evil quintet' that includes climate change, and I would even go so far as to add extinction synergies as the sixth member of an 'evil sextet'.

But the future could hold quite a few more latent threats to biodiversity, and a corresponding number of potential solutions to its degradation. That’s why Bill Sutherland of Cambridge University recently got together with some other well-known scientists and technology leaders to do a ‘horizon scanning’ exercise to define what these threats and solutions might be in the immediate future. It’s an interesting, eclectic and somewhat enigmatic list, so I thought I’d summarise it here. The paper is entitled A horizon scan of global conservation issues for 2010 and was recently published online in Trends in Ecology and Evolution.

In no particular order or relative rank, Sutherland and colleagues list the following 15 ‘issues’ that I’ve broadly divided into ‘Emerging Threats’ and ‘Potential Solutions’:

Emerging Threats

  1. Microplastic pollution – The massive increase in plastics found in the world’s waterways and oceans really doesn’t have much focus right now in conservation research, but it should. We really don’t know how much we’re potentially threatening species with this source of pollution.
  2. Nanosilver in wastewater – The ubiquity of antimicrobial silver oxide or ions in products these days needs careful consideration for what the waste might be doing to our microbial communities that keep ecosystems alive and functioning.
  3. Stratospheric aerosols – A simultaneous solution and threat. Creating what would in effect be an artificial global cooling by injecting particles like sulphate aerosols into the stratosphere might work to cool the planet down somewhat. However, it would not reduce carbon dioxide, ocean acidification or other greenhouse gas-related changes. This strikes me as a potential for serious mucking up of the global climate and only a band-aid solution to the real problem.
  4. Deoxygenation of the oceans – Very scary. Ironically today I was listening to a talk by Martin Kennedy on the deep-time past of ocean hypoxia and he suggests we’re well on our way to a situation where our shelf waters could essentially become too anoxic for marine life to persist. It’s happened before, and rapid climate change makes the prospect plausible within less than a century. And you thought acidification was scary.
  5. Changes in denitrifying bacteria – Just like we’re changing the carbon cycle, we’re buggering up the nitrogen cycle as well. Changing our water bodies to nitrogen sources rather than sinks could fundamentally change marine ecosystems for the worse.
  6. High-latitude volcanism – One of these horrible positive feedback ideas. Reducing high-latitude ice cover exposes all these slumbering volcanoes that once ‘released’, start increasing atmospheric gas concentrations and contributing to faster ice melt and sea level rise.
  7. Trans-Arctic dispersal and colonisation – Warming polar seas and less ice mean fewer barriers to species movements. Expect Arctic ecosystems to be a hotbed of invasion, regime shifts and community reshuffling as a result.
  8. Invasive Indo-Pacific lionfish – Not one I would have focussed on, but interesting. These spiny, venomous fish like to eat a lot of other species, and so represent a potentially important invasive species in the marine realm.
  9. REDD and non-forested ecosystems – Heralded as a great potential coup for forest preservation and climate change mitigation, focussing on maintaining forests for their carbon sequestration value might divert pressure toward non-forested habitats and ironically, threaten a whole new sphere of species.
  10. International land acquisition – Global financial crises and dwindling food supplies mean that governments are acquiring more and more huge tracts of land for agricultural development. While this might solve some immediate issues, it could potentially threaten a lot more undeveloped land in the long run, putting even more pressure on habitats.

Potential Solutions

  1. Synthetic meat – Ever thought about eating a sausage grown in a vat rather than cut from a dead pig? It could become the norm and a way of reducing the huge pressure on terrestrial and aquatic systems for the production of livestock and fish for human protein provision.
  2. Artificial life – Both a risk and a potential solution. While I’ve commented before on the pointlessness of cloning technology for conservation, the ability to create genomes and reinvigorate species on the brink is an exciting prospect. It’s also frightening as hell because we don’t know how all these custom-made genomes might react and transform naturally evolved ones.
  3. Biochar – Burn organic material (e.g., plant matter) in the absence of oxygen and you get biochar. This essentially sequesters a lot of carbon that can then be put underground. The upshot is that agricultural yields can also increase. Would there be a trade-off, though, between land available for biochar sequestration and natural habitats?
  4. Mobile-sensing technology – Not so much a solution per se, but the rapid acceleration of remote technology will make our ability to measure and predict the subtleties of ecosystem and climate change much more precise. A lot more work and application required here.
  5. Assisted colonisation – I've blogged about this before. With such rapid shifts in climate, we might be obliged to move species around so that they can keep up with rapidly changing conditions. Many pros and cons here, not least of which is exacerbating the invasive species problems around the globe.

Certainly some interesting ideas here and worth a thought or two. I wonder if the discipline of ‘conservation biology’ might even exist in 50-100 years – we might all end up being climate or agricultural engineers with a focus on biodiversity-friendly technology. Who knows?

CJA Bradshaw


Sutherland, W., Clout, M., Côté, I., Daszak, P., Depledge, M., Fellman, L., Fleishman, E., Garthwaite, R., Gibbons, D., & De Lurio, J. (2009). A horizon scan of global conservation issues for 2010. Trends in Ecology & Evolution. DOI: 10.1016/j.tree.2009.10.003





Greenwash, blackwash: two faces of conservation evil

21 11 2009

Beware false prophets, and especially those masquerading as conservationists (or at least 'green') when they are not, in fact, doing anything for conservation at all. But this blog site isn't about typical greenie evil-corporation-making-a-mess-of-the-Earth sermons (there are plenty of those); it's instead about real conservation science that has/should/could have real biodiversity benefits. This is why I highlight the bitey and the toothless together.

With the slow (painfully, inadequately, insufficiently slow) maturation of environmental awareness and the rising plight of biodiversity in general (including our own health and prosperity), it has become almost chic to embrace a so-called 'green' perspective. This approach has blown out into a full-scale business model where, in many wealthier nations especially, it's just plain good business to attract the green-conscious consumer to buy more 'environmentally friendly' products. Problem is, so many of these products are the farthest thing from green you can imagine (see examples here, here & here). This stimulated the environmentalist Jay Westerveld to coin the term greenwashing in 1986. Greenwashing is basically defined as activities that misleadingly give the impression of environmentally sound management, thereby deflecting attention away from the continued pursuit of environmentally destructive activities.

Well, it's not that the problem has disappeared, or even dissipated (if anything, it's growing), but I don't want to focus on that here. Instead, I want to highlight a recent paper in which I was involved that outlines how environmental groups, too, can be guilty of almost the same sin – claiming that businesses, practices, individuals, corporations, etc. are far more environmentally destructive than they really are. This, we termed blackwashing.

The paper by Koh and colleagues entitled Wash and spin cycle threats to tropical biodiversity just came out online in the journal Biotropica, and therein we describe the greenwashing-blackwashing twin conservation evils using the oil palm controversy as an excellent example case. Just in case you didn’t know, much of the tropical world (especially South East Asia) is undergoing massive conversion of native forests to oil palm plantations, to the overwhelming detriment of biodiversity. I’ve covered the issue in several posts on ConservationBytes.com before (see for example Tropical forests worth more standing, Indonesia’s precious peatlands under oil palm fire & More greenwashing from the Malaysian oil palm industry).

Briefly, we demonstrate how the palm oil industry is guilty of the following greenwashes:

On the other side, various environmental groups, such as Greenpeace, have promoted the following blackwashes:

  • Orang-utan will be extinct imminently – A gross exaggeration, although something we believe is eventually possible.
  • Avoided deforestation schemes (e.g., REDD) will crash carbon-trading – Again, even economists don’t believe this.

For details, see the paper online.

Now, I’d probably tend to believe some of the less outrageous claims made by some environmental groups because if anything, the state of biodiversity is probably overall worse than what most people realise. However, when environmental groups are exposed for exaggerations, or worse, lies, then their credibility goes out the window and even those essentially promoting their cause (e.g., conservation biologists like myself) will have nothing to do with them. The quasi-religious zealotry of anti-whaling campaigns is an example of a terrible waste of funds, goodwill and conservation resources that could be otherwise spent on real conservation gains. Instead, political stunts simply alienate people who would otherwise reasonably contribute to improving the state of biodiversity. Incidentally, an environmental advocacy group in Australia emailed me to support their campaign to highlight the plight of sharks. I am a firm supporter of better conservation of sharks (see recent paper and post about this here). However, when I read their campaign propaganda, the first sentence read:

Almost 90 % of sharks have been wiped out

I immediately distanced myself from them. This is a blatant lie and a terrible exaggeration. Ninety per cent of sharks HAVE NOT been wiped out. Some localised depletions have occurred, and not one single shark species has been recorded going extinct since records began. While I agree the world has a serious shark problem, saying outrageous things like this will only serve to weaken your cause. My advice to any green group is to get your facts straight and avoid the sensationalist game – you won't win it, and you probably won't be successful in doing anything beneficial for the species you purport to save.
CJA Bradshaw


Koh, L., Ghazoul, J., Butler, R., Laurance, W., Sodhi, N., Mateo-Vega, J., & Bradshaw, C. (2009). Wash and spin cycle threats to tropical biodiversity. Biotropica. DOI: 10.1111/j.1744-7429.2009.00588.x





Crap environmental reporting

13 11 2009

We do a lot in our lab to get our research results out to a wider community than just scientists – this blog is just one example of how we do that. But of course, we rely on the regular media (television, newspaper, radio) heavily to pick up our media releases (see a list here). I firmly believe it goes well beyond shameless self-promotion – I think it's a duty of every scientist to tell the world (i.e., more than just our colleagues) about what we're being paid to do. And the masses are hungry for it.

However, the demise of the true 'journalist' (one who investigates a story – i.e., does 'research') in favour of the automaton 'reporter' (one who merely regurgitates, and then sensationalises, what he/she is told or reads) worldwide (and oh, how we are plagued with reporters and deeply in need of journalists in Australia!) means that there are some horrendous stories out there, especially on scientific issues. This is mainly because most reporters have neither the training nor the capacity to understand what they're writing about.

This issue is also particularly poignant for the state of the environment, climate change and biodiversity loss – I've blogged about this before (see Poor media coverage promotes environmental apathy and untruths).

But after a 30-minute telephone interview with a very friendly American food journalist yesterday, I expected a reasonable report on the issue of frog consumption because, well, I explained many things to her as best I could. What was eventually published was appalling.

Now, in all fairness, I think she was trying to do well, but it’s as though she didn’t even listen to me. The warning bells should have rung loudly when she admitted she hadn’t read my blog “in detail” (i.e., not at all?). You can read the full article here, but let me just point out some of the inconsistencies:

  • She wrote: “That’s a problem, Bradshaw adds, because nearly one half of frog species are facing extinction.”

Ah, no. I told her that between 30 and 50 % of frogs could be threatened with extinction (~30 % officially from the IUCN Red List). It could be as much as half given the paucity of information on so many species. A great example of reporter cherry-picking to add sensationalism.

  • She wrote: “Bradshaw attributes the drop-off to global warming and over-harvesting.”

Again, no, I didn’t. I clearly told her that the number one, way-out-in-front cause of frog declines worldwide is habitat loss. I mentioned chytrid fungus as another major contributor, and that climate change exacerbates the lot. Harvesting pressure is a big unknown in terms of relative impact, but I suspect it’s large.

  • She continued: “Bradshaw has embarked on a one-man campaign to educate eaters about the frog leg industry”

Hmmm. One man? I had a great team of colleagues co-write the original paper in Conservation Biology. I wasn’t even the lead author! Funny how suddenly I’m a lone wolf on a ‘campaign’. Bloody hell.

“Aghast”, was I? I don’t recall being particularly emotional when I told her that I found a photo of Barack Obama eating frog legs during his election campaign. I merely pointed this out to show that the product is readily available in the USA. I also mentioned absolutely nothing about whales or their loins.

So, enough of my little humorous whinge. My point is really that there are plenty of bad journalists out there with little interest in reporting the truth on environmental issues (tell us something we don’t know, Bradshaw). If you want to read a good story about the frog consumption issue, check out a real journalist’s perspective here.

CJA Bradshaw





Raise targets to prevent extinction

12 11 2009

I know I’ve blogged recently about this, but The Adelaidean did a nice little article that I thought I’d reproduce here. The source can be found here.

[Image: The Adelaidean article, November 2009]





How to restore a tropical rain forest

6 11 2009

Here's a little story for you about how a casual chat over a glass of wine (or many) can lead to great scientific endeavours.

A few years ago I was sitting in the living room of my good friends Noel Preece and Penny van Oosterzee in Darwin chatting about life, the universe, and everything. They rather casually mentioned that they would be selling their environmental consulting company and their house and moving to the Queensland rain forest. Ok – sounded like a pretty hippy thing to do when you’re thinking about ‘retiring’ (only from the normal grindstone, at least). But it wasn’t about the easy life away from it all (ok, partially, perhaps) – they wanted to do something with their reasonably large (181 ha), partially deforested (51-ha paddock) property investment. By ‘something’, I mean science.

So they asked me – how would we go about getting money to investigate the best way to reforest a tropical rain forest? I had no idea. As it turns out, no one really knows how to restore rain forests properly. Sure, planting trees happens a lot, but the random, willy-nilly, unquantified ways in which it is done mean that no one can tell you what the biggest biodiversity bang for your buck is, or even whether it can compete on the carbon sequestration front.

Why carbon sequestration? Well, in case you’ve had your head up your bum for the last decade, one of the major carbon mitigating schemes going is the offset idea – for every tonne of carbon you emit as a consumer, you (or more commonly, someone else you pay) plant a certain number of trees (because trees need carbon to grow and so suck it out of the atmosphere). Nice idea, but if you deforest native ecosystems just to bash up quick-growing monoculture plantations of (usually) exotic species with little benefit to native biota, biodiversity continues to spiral down the extinction vortex. So, there has to be a happy medium, and there has to be a way to measure it.

So I said to Penny and Noel “Why don’t we bash together a proposal and get some experts in the field involved and submit it to the Australian Research Council (ARC) for funding?” They thought that was a smashing idea, and so we did.

Fast forward a few years and … success! The Thiaki Project was born ('Thiaki' is the name of the creek flowing through the property north of Atherton – it seems to be of Greek origin). We were extremely lucky to find a new recruit to the University of Queensland, Dr. Margie Mayfield (who worked previously with Paul Ehrlich), who is not only an expert in tropical reforestation for biodiversity, but also had the time and energy to lead the project. We garnered several other academic and industry partners and came up with a pretty sexy experiment that is just now getting underway thanks to good old Mr. ARC.

The project is fairly ambitious, even though the experiments per se are fairly straightforward. We're using a randomised block design in which we are testing three tree-diversity treatments (monoculture, 1 species each from 6 families, and 5 species each from those same 6 families) and two planting densities (high and low). The major objective is to see what combination of planting density and native tree species provides the most habitat for the most species. We're starting small, looking mainly at various insects as they start to use the newly planted blocks, but we might expand the assessments (before planting and after) to reptiles, amphibians and possibly birds later on.
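
For the curious, here is roughly how a randomised block layout of that kind can be generated. This is only an illustration of the design logic; the number of blocks and the plot labels are invented and are not the actual Thiaki layout.

```python
# Generate a simple randomised block layout: every block contains all
# treatment combinations (3 diversity levels x 2 planting densities),
# with plot positions shuffled independently within each block.
import itertools
import random

random.seed(2009)

diversity_treatments = [
    "monoculture",
    "6 species (1 per family)",
    "30 species (5 per family)",
]
planting_densities = ["high", "low"]
treatments = list(itertools.product(diversity_treatments, planting_densities))  # 3 x 2 = 6

n_blocks = 4  # assumed number of replicate blocks, not the actual Thiaki number
layout = {}
for block in range(1, n_blocks + 1):
    plots = treatments[:]       # every block gets all six treatment combinations
    random.shuffle(plots)       # randomise plot positions within the block
    layout[f"block {block}"] = plots

for block, plots in layout.items():
    print(block)
    for i, (diversity, density) in enumerate(plots, start=1):
        print(f"  plot {i}: {diversity}, {density} planting density")
```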

But we're not stopping there – we were fortunate enough to get a clever soil scientist, Dr. David Chittleborough of the University of Adelaide, involved so we could map the change in soil carbon during the experiment. Our major challenge is to find the right combination of tree species and planting techniques that restores native biodiversity most effectively while maximising carbon sequestration from the growing forest. And of course, we're trying to do this as cost-effectively as we can – measuring the relative costs will give landowners contemplating reforestation the scale of expenditures expected.

I’m pretty proud of what Margie, Noel, Penny and the rest of the team have accomplished so far, and what’s planned. Certainly the really exciting results are years away yet, but stay tuned – Thiaki could become the model for tropical reforestation worldwide. Follow the Thiaki Project website for regular updates.

I’d also love to recreate the Thiaki Project in southern Australia because as it turns out, no one knows how to maximise biodiversity and carbon sequestration for the lowest cost in temperate reforestation projects either. All we need is a few hundred hectares of deforested land (shouldn’t be hard to find), about $1 million to start, and a bit of time. Any takers?

CJA Bradshaw







Not so ‘looming’ – Anthropocene extinctions

4 11 2009


Yesterday I was asked to do a quick interview on ABC television (Midday Report) about the release of the 2009 IUCN Red List of Threatened Species. I’ve blogged about the importance of the Red List before, but believe we have a lot more to do with species assessments and getting prioritisation right with respect to minimum viable population size. Have a listen to the interview itself, and read the IUCN’s media release reproduced below.

My basic stance is that we've only just started to assess the species on the planet (fewer than 50,000 assessed so far), yet there are many millions of species still largely under-studied and/or under-described (e.g., estimates of extant species richness include > 4 million protists, 16,600 protozoa, 75,000-300,000 helminth parasites, 1.5 million fungi, 320,000 plants, 4-6 million arthropods, > 6,500 amphibians, 10,000 birds and > 5,000 mammals – see Bradshaw & Brook 2009 J Cosmol for references). What we're looking at here is a refinement of knowledge (albeit a small one). We are indeed in the midst of the Anthropocene mass extinction event – there is nothing 'looming' about it. We are essentially losing species faster than we can assess them. I believe it's important to make this clearer to those not working directly in the field of biodiversity conservation.

CJA Bradshaw

Extinction crisis continues apace – IUCN

Gland, Switzerland, 3 November, 2009 (IUCN) – The latest update of the IUCN Red List of Threatened Species™ shows that 17,291 species out of the 47,677 assessed species are threatened with extinction.

The results reveal 21 percent of all known mammals, 30 percent of all known amphibians, 12 percent of all known birds, and 28 percent of reptiles, 37 percent of freshwater fishes, 70 percent of plants, 35 percent of invertebrates assessed so far are under threat.

“The scientific evidence of a serious extinction crisis is mounting,” says Jane Smart, Director of IUCN’s Biodiversity Conservation Group. “January sees the launch of the International Year of Biodiversity. The latest analysis of the IUCN Red List shows the 2010 target to reduce biodiversity loss will not be met. It’s time for governments to start getting serious about saving species and make sure it’s high on their agendas for next year, as we’re rapidly running out of time.”

Of the world’s 5,490 mammals, 79 are Extinct or Extinct in the Wild, with 188 Critically Endangered, 449 Endangered and 505 Vulnerable. The Eastern Voalavo (Voalavo antsahabensis) appears on the IUCN Red List for the first time in the Endangered category. This rodent, endemic to Madagascar, is confined to montane tropical forest and is under threat from slash-and-burn farming.

There are now 1,677 reptiles on the IUCN Red List, with 293 added this year. In total, 469 are threatened with extinction and 22 are already Extinct or Extinct in the Wild. The 165 endemic Philippine species new to the IUCN Red List include the Panay Monitor Lizard (Varanus mabitang), which is Endangered. This highly-specialized monitor lizard is threatened by habitat loss due to agriculture and logging and is hunted by humans for food. The Sail-fin Water Lizard (Hydrosaurus pustulatus) enters in the Vulnerable category and is also threatened by habitat loss. Hatchlings are heavily collected both for the pet trade and for local consumption.

“The world’s reptiles are undoubtedly suffering, but the picture may be much worse than it currently looks,” says Simon Stuart, Chair of IUCN’s Species Survival Commission. “We need an assessment of all reptiles to understand the severity of the situation but we don’t have the $2-3 million to carry it out.”

The IUCN Red List shows that 1,895 of the planet’s 6,285 amphibians are in danger of extinction, making them the most threatened group of species known to date. Of these, 39 are already Extinct or Extinct in the Wild, 484 are Critically Endangered, 754 are Endangered and 657 are Vulnerable.

The Kihansi Spray Toad (Nectophrynoides asperginis) has moved from Critically Endangered to Extinct in the Wild. The species was only known from the Kihansi Falls in Tanzania, where it was formerly abundant with a population of at least 17,000. Its decline is due to the construction of a dam upstream of the Kihansi Falls that removed 90 percent of the original water flow to the gorge. The fungal disease chytridiomycosis was probably responsible for the toad’s final population crash.

The fungus also affected the Rabb’s Fringe-limbed Treefrog (Ecnomiohyla rabborum), which enters the Red List as Critically Endangered. It is known only from central Panama. In 2006, the chytrid fungus (Batrachochytrium dendrobatidis) was reported in its habitat and only a single male has been heard calling since. This species has been collected for captive breeding efforts but all attempts have so far failed.

Of the 12,151 plants on the IUCN Red List, 8,500 are threatened with extinction, with 114 already Extinct or Extinct in the Wild. The Queen of the Andes (Puya raimondii) has been reassessed and remains in the Endangered category. Found in the Andes of Peru and Bolivia, it only produces seeds once in 80 years before dying. Climate change may already be impairing its ability to flower and cattle roam freely among many colonies, trampling or eating young plants.

There are now 7,615 invertebrates on the IUCN Red List this year, 2,639 of which are threatened with extinction. Scientists added 1,360 dragonflies and damselflies, bringing the total to 1,989, of which 261 are threatened. The Giant Jewel (Chlorocypha centripunctata), classed as Vulnerable, is found in southeast Nigeria and southwest Cameroon and is threatened by forest destruction.

Scientists also added 94 molluscs, bringing the total number assessed to 2,306, of which 1,036 are threatened. Seven freshwater snails from Lake Dianchi in Yunnan Province, China, are new to the IUCN Red List and all are threatened. These join 13 freshwater fishes from the same area, 12 of which are threatened. The main threats are pollution, introduced fish species and overharvesting.

There are now 3,120 freshwater fishes on the IUCN Red List, up 510 species from last year. Although there is still a long way to go before the status of all the world's freshwater fishes is known, 1,147 of those assessed so far are threatened with extinction. The Brown Mudfish (Neochanna apoda), found only in New Zealand, has been moved from Near Threatened to Vulnerable as it has disappeared from many areas in its range. Approximately 85-90 percent of New Zealand's wetlands have been lost or degraded through drainage schemes, irrigation and land development.

“Creatures living in freshwater have long been neglected. This year we have again added a large number of them to the IUCN Red List and are confirming the high levels of threat to many freshwater animals and plants. This reflects the state of our precious water resources. There is now an urgency to pursue our effort but most importantly to start using this information to move towards a wise use of water resources,” says Jean-Christophe Vié, Deputy Head of the IUCN Species Programme.

“This year’s IUCN Red List makes for sobering reading,” says Craig Hilton-Taylor, Manager of the IUCN Red List Unit. “These results are just the tip of the iceberg. We have only managed to assess 47,663 species so far; there are many more millions out there which could be under serious threat. We do, however, know from experience that conservation action works so let’s not wait until it’s too late and start saving our species now.”

The status of the Australian Grayling (Prototroctes maraena), a freshwater fish, has improved as a result of conservation efforts. Now classed as Near Threatened as opposed to Vulnerable, the population has recovered thanks to fish ladders which have been constructed over dams to allow migration, enhanced riverside vegetation and the education of fishermen, who now face heavy penalties if found with this species.





Sick environment, sick people

30 10 2009

A quick post to talk about a subject I'm more and more interested in – the direct link between environmental degradation (including biodiversity loss) and human health.

To many conservationists, people are the problem, and so they focus naturally on trying to maintain biodiversity in spite of human development and spread. Well, it's 60+ years since we started doing 'conservation biology', and biodiversity hasn't been this badly off since the mass extinction event at the end of the Cretaceous about 65 million years ago. We now sit squarely within the geological era more and more commonly known as the 'Anthropocene', so if we don't consider people as an integral part of any ecosystem, then we are guaranteed to fail biodiversity.

I haven't posted in a week because I was in Shanghai attending the rather clumsily entitled 'Thematic Reference Group (TRG) on Environment, Agriculture and Infectious Disease', which is a part of the UNICEF/UNDP/World Bank/World Health Organization Special Programme for Research and Training in Tropical Diseases (TDR) (what a mouthful that is). What's this all about and why is a conservation ecologist (i.e., me) taking part in the group?

It’s taken humanity a while to realise that what we do to the planet, we eventually end up doing to ourselves. The concept of ecosystem services1 demonstrates this rather well – our food, weather, wealth and well-being are all derived from healthy, functioning ecosystems. When we start to bugger up the inter-species relationships that define one element of an ecosystem, then we hurt ourselves. I’ve blogged about this topic a few times before with respect to flooding, pollination, disease emergence and carbon sequestration.

Our specific task though on the TRG is to define the links between environmental degradation, agriculture, poverty and infectious disease in humans. Turns out, there are quite a few examples of how we’re rapidly making ourselves more susceptible to killer infectious diseases simply by our modification of the landscape and seascape.

Some examples are required to illustrate the point. Schistosomiasis, caused by a snail-borne fluke, infects millions worldwide, and it is on the rise again as the habitat of its host snails expands due to poor agricultural practices, bad hygiene, the damming of large river systems and climate warming. Malaria, too, is on the rise, with greater and greater risk in the endemic areas of its mosquito hosts. Chagas disease (caused by a trypanosome transmitted by triatomine bugs) is also increasing in extent and risk. Some work I'm currently doing under the auspices of the TRG is also showing some rather frightening correlations between the degree of environmental degradation within a country and the incidence of infectious disease (e.g., HIV, malaria, TB), non-infectious disease (e.g., cancer, cardiovascular disease) and indices of life expectancy and child mortality.

I won’t bore you with more details of the group because we are still drafting a major World Health Organization report on the issues and research priorities. Suffice it to say that if we want to convince policy makers that resilient functioning ecosystems with healthy biodiversity are worth saving, we have to show them the link to infectious disease in humans, and how this perpetuates poverty, rights injustices, gender imbalances and ultimately, major conflicts. An absolute pragmatist would say that the value of keeping ecosystems intact for this reason alone makes good economic sense (treating disease is expensive, to say the least). A humanitarian would argue that saving human lives by keeping our ecosystems intact is a moral obligation. As a conservation biologist, I argue that biodiversity, human well-being and economies will all benefit if we get this right. But of course, we have a lot of work to do.

CJA Bradshaw


1Although Bruce Wilcox (another of the TRG expert members), who I will be highlighting soon as a Conservation Scholar, challenges the notion of ecosystem services as a tradeable commodity and ‘service’ as defined. More on that topic soon.





Value of a good enemy

25 10 2009

I love these sorts of experiments. Ecology (and I consider conservation ecology a special subset of the larger discipline) is a messy business, mainly because ecosystems are complex, non-linear, emergent, interactive, stochastic and meta-stable entities that are just plain difficult to manipulate experimentally. Therefore, inference about complex ecological processes tends to be enhanced when the simplest components are isolated.

Enter the ‘mini-ecosystem-in-a-box’ approach to ecological research. I’ve blogged before about some clever experiments to examine the role of connectivity among populations in mitigating (or failing to mitigate) extinction risk, and alluded to others indicating how harvest reserves work to maximise population persistence. This latest microcosm experiment is another little gem and has huge implications for conservation.

A fairly long-standing controversy in conservation biology, and in invasive species biology in particular, is whether intact ecosystems are in any way more ‘resilient’ to invasion by alien species (the latter most often being deliberately or inadvertently introduced by humans – think of Australia’s appalling feral species problems; e.g., buffalo, foxes and cats, weeds). Many believe by default that more ‘pristine’ (i.e., less disturbed by humans) communities will naturally provide more ecological checks against invasives because there are more competitors, more specialists and more predators. However, considering the ubiquity of invasives around the world, this assumption has been challenged vehemently.

The paper I’m highlighting today uses the microcosm experimental approach to show how native predators, when abundant, can reduce the severity of an invasion. Using a system of two mosquito species (one ‘native’ – what’s ‘native’ in a microcosm? [another subject] – and one ‘invasive’) and a native midge predator, Juliano and colleagues demonstrate in their paper Your worst enemy could be your best friend: predator contributions to invasion resistance and persistence of natives that predators are something you want to keep around.

In short, they found little evidence of direct competition between the two mosquitoes in terms of abundance when placed together without predators, but when the midges were added, the persistence of the invasive mosquito was reduced substantially. Of course, the midge predators did their share of damage to the native mosquitoes by reducing their abundance too, but by suppressing the invasive species, the midges effectively released the native mosquito from competition, leaving it free to develop faster (i.e., more per capita resources).
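For readers who like to tinker, here is a minimal stochastic sketch of the idea in Python (my own toy model, not the authors’ analysis): a ‘native’ and an ‘invasive’ prey share a container with a common density feedback, and an optional predator imposes slightly heavier mortality on the invader. The growth rate, carrying capacity, survival values and starting numbers are all invented purely for illustration.

# Toy sketch only (not the authors' model): two prey share a container; an
# optional predator imposes slightly heavier mortality on the invader.
import numpy as np

rng = np.random.default_rng(42)
r, K, steps, reps = 0.3, 100, 60, 500        # growth rate, shared carrying capacity (invented)

def replicate(with_predator):
    native, invader = 10, 10
    surv_native, surv_invader = (0.95, 0.85) if with_predator else (1.0, 1.0)
    for _ in range(steps):
        total = native + invader
        growth = np.exp(r * (1 - total / K))  # shared density feedback
        native = rng.poisson(native * growth * surv_native)
        invader = rng.poisson(invader * growth * surv_invader)
    return native, invader

for with_predator in (False, True):
    outcomes = np.array([replicate(with_predator) for _ in range(reps)])
    print(f"predator = {with_predator}: invader persisted in "
          f"{np.mean(outcomes[:, 1] > 0):.0%} of runs; "
          f"mean native abundance = {outcomes[:, 0].mean():.0f}")

With these made-up numbers the invader rarely persists once the predator is present, while the native ends up more abundant than it does without the predator – the competitive-release flavour of the experimental result.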

Such a seemingly academic result has huge conservation implications. In most systems, predators are some of the largest and slowest-reproducing species, so they are characteristically the first to feel the hammer of human damage. From bears to sharks, and tigers to wolves, big, charismatic predators are on the wane worldwide. Juliano and colleagues’ nice experimental work with insects reminds us that keeping functioning native ecosystems intact from all trophic perspectives is imperative.

CJA Bradshaw


Juliano, S., Lounibos, L., Nishimura, N., & Greene, K. (2009). Your worst enemy could be your best friend: predator contributions to invasion resistance and persistence of natives. Oecologia. DOI: 10.1007/s00442-009-1475-x





Sleuthing the Chinese green slime monster

21 10 2009

I just returned from a week-long scientific mission in China sponsored by the Australian Academy of Science, the Australian Academy of Technological Sciences and Engineering and the Chinese Academy of Sciences. I was invited to attend a special symposium on Marine and Deltaic Systems where research synergies between Australian and Chinese scientists were to be explored. The respective academies really rolled out the red carpet for the 30 or so Australian scientists on board, so I feel very honoured to have been invited.

During our marine workshop, one of my Chinese counterparts, Dongyan Liu from the Yantai Institute for Coastal Zone Research, presented a brilliant piece of ecological sleuthing that I must share with readers of ConservationBytes.com.

The first time you go to China the thing that strikes you is that everything is big – big population, big cities, big buildings, big projects, big budgets and big, big, big environmental problems. After many years of overt environmental destruction in the name of development, the Chinese government (aided by some very capable scientists) is now starting to address the sins of the past.

Liu and colleagues published their work earlier this year in Marine Pollution Bulletin in a paper entitled World’s largest macroalgal bloom caused by expansion of seaweed aquaculture in China, which describes a bloody massive outbreak of a particularly nasty ‘green tide’.

What’s a ‘green tide’? In late June 2008 in the coastal city of Qingdao not far from Beijing (and just before the 2008 Olympics), a whopping 1 million tonnes of green muck washed up along approximately 400 km² of coastline. It took 10,000 volunteers 2 weeks to clean up the mess. At the time, many blamed the rising eutrophication of coastal China as the root cause, and a lot of people got their arse kicked over it. However, the reality was that it wasn’t so simple.

The Yellow Sea abutting this part of the Chinese coast is so named because of its relatively high productivity. Warm waters combined with good mixing mean that there are plenty of essential nutrients for green things to grow. So, adding thousands of tonnes of fertilisers from Chinese agricultural run-off seems like a logical explanation for the bloom.


Qingdao green tide 2008 © Elsevier

However, it turns out that the bulk of the green slime was comprised of a species called Enteromorpha prolifera, and it just so happens that this particularly unsavoury seaweed loves to grow on the infrastructure used for the aquaculture of nori (a.k.a. amanori or zicai) seaweed (mainly, Porphyra yezoensis). Problem is, P. yezoensis is grown mainly on the coast hundreds of kilometres to the south.

Liu and colleagues examined both satellite imagery and detailed oceanographic data from the period prior to the green tide, and not only spotted green splotches many kilometres long, but also determined that the current flow and wind direction placed the trajectory of any green slime mats straight for Qingdao.

So, how does it happen? Biofouling by E. prolifera on P. yezoensis aquaculture frames is dealt with mainly by manual cleaning and then dumping the unwanted muck on the tidal flats. When the tide comes back in, it washes many thousands of kilos of this stuff back out to sea, which then accumulates in rafts and continues to grow in the warm, rich seas. Subsequent genetic work also confirmed that the muck at sea was the same stock as the stuff growing on the aquaculture frames.

Apart from some lovely sleuthing work, the implications are pretty important from a biodiversity perspective. Massive eutrophication coupled with aquaculture that inadvertently spawns a particularly nasty biofouling species is a good recipe for oxygen depletion in areas where the eventual slime monster starts to decay. This can lead to so-called ‘dead’ zones that can kill off huge numbers of marine species. So, the proper management of aquaculture in the hungry Goliath that is China becomes essential to reduce the incidence of dead zones.

Fortunately, it looks like Liu and colleagues’ work is being taken seriously by the Chinese government, which is now contemplating financial support for aquaculturists to clean their infrastructure properly without dumping the sludge at sea. A simple policy shift could save a lot of species, a lot of money, and a lot of embarrassment (not to mention prevent a lot of bad smells).

CJA Bradshaw


Liu, D., Keesing, J., Xing, Q., & Shi, P. (2009). World’s largest macroalgal bloom caused by expansion of seaweed aquaculture in China. Marine Pollution Bulletin, 58 (6), 888-895. DOI: 10.1016/j.marpolbul.2009.01.013





October Issue of Conservation Letters

18 10 2009

The second-to-last issue in 2009 (October) of Conservation Letters is now out. Click here for full access.


Household goods made of non-timber forest products. © N. Sasaki

Papers in this issue:





Life and death on Earth: the Cronus hypothesis

13 10 2009
Cronus


Bit of a strange one for you today, but here’s a post I hope you’ll enjoy.

My colleague, Barry Brook, and I recently published a paper in the very new and perhaps controversial online journal, the Journal of Cosmology. Cosmology? According to the journal, ‘cosmology’ is:

“the study and understanding of existence in its totality, encompassing the infinite and eternal, and the origins and evolution of the cosmos, galaxies, stars, planets, earth, life, woman and man”.

The journal publishes papers dealing with ‘cosmology’ and is a vehicle for those who wish to publish on subjects devoted to the study of existence in its totality.

Ok. Quite an aim.

Our paper is part of the November (second ever) issue of the journal entitled Asteroids, Meteors, Comets, Climate and Mass Extinctions, and because we were the first to submit, we managed to secure the first paper in the issue.

Our paper, entitled The Cronus hypothesis – extinction as a necessary and dynamic balance to evolutionary diversification, introduces a new idea in the quest to find that perfect analogy for understanding the mechanisms dictating how life on our planet has waxed and waned over the billions of years since it first appeared.

Gaia


In the 1960s, James Lovelock conceived the novel idea of Gaia – that the Earth functions like a single, self-regulating organism where life itself interacts with the physical environment to maintain conditions favourable for life (Gaia was the ancient Greeks’ Earth mother goddess). Embraced, contested, denounced and recently re-invigorated, the idea has evolved substantially since it first appeared. More recently (this year, in fact), Peter Ward countered the Gaia hypothesis with his own Greek metaphor – the Medea hypothesis. Essentially this view holds that life instead ‘seeks’ to destroy itself in an anti-Gaia manner (Medea was the siblicidal wife of Jason of the Argonauts). Ward described his Medea hypothesis as “Gaia’s evil twin”.

One can marvel at the incredible diversity of life on Earth (e.g., conservatively, > 4 million protists, 16,600 protozoa, 75,000-300,000 helminth parasites, 1.5 million fungi, 320,000 plants, 4-6 million arthropods, > 6,500 amphibians, 10,000 birds and > 5,000 mammals) and wonder that there might be something in the ‘life makes it easier for life’ idea underlying Gaia. However, when one considers that over 99% of all species that have ever existed are today extinct, then a Medea perspective might dominate.

Medea


Enter Cronus. Here we posit a new way of looking at the tumultuous history of life and death on Earth that effectively relegates Gaia and Medea to opposite ends of a spectrum. Cronus (patricidal son of Gaia overthrown by his own son, Zeus, and banished to Hades) treats speciation and extinction as birth and death in a ‘metapopulation’ of species assemblages split into biogeographic realms. Catastrophic extinction events can be brought about via species engineering their surroundings by passively modifying the delicate balance of oxygen, carbon dioxide and methane – indeed, humans might be the next species to fall victim to our own Medean tendencies. But extinction opens up new niches that eventually elicit speciation, and under conditions of relative environmental stability, specialists evolve because they are (at least temporarily) competitive under those conditions. When conditions change again, extinction ensues because not all can adapt quickly enough. Just as all individuals born in a population must eventually die, extinction is a necessary termination.

We think the Cronus metaphor has a lot of advantages over Gaia and Medea. The notion of a community of species as a population of selfish individuals retains the Darwinian view of contestation; self-regulation in Cronus occurs naturally as a result of extinction modifying the course of future evolution. Cronus also makes existing mathematical tools developed for metapopulation theory amenable to broader lines of inquiry.

For example, species as individuals with particular ‘mortality’ (extinction) rates, and lineages with particular ‘birth’ (speciation) rates, could interact and disperse among ‘habitats’ (biogeographical realms). ‘Density’ feedback could be represented as competitive exclusion or symbioses. As species dwindle, feedbacks such as reduced community resilience further exacerbate extinction risk (the Medea-like phase), whereas stochastic fluctuation around a ‘carrying capacity’ (niche saturation) under relatively stable environmental conditions represents the Gaia-like phase. Our Cronus framework is also scale-invariant – it could be applied to microbial diversity on another organism right up to inter-planetary exchange of life (panspermia).
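To make the metapopulation analogy concrete, here is a toy numerical sketch in Python (my own illustration, not the model in the paper): species richness in each biogeographic realm behaves like a population with ‘births’ (speciation plus immigrating lineages), ‘deaths’ (extinction), a richness ceiling standing in for niche saturation, and rare catastrophic pulses. The number of realms and every rate are invented for demonstration only.

# Toy sketch only: richness per realm as a 'population' of species with
# speciation ('births'), extinction ('deaths'), dispersal of lineages among
# realms and occasional catastrophes. All rates are invented.
import numpy as np

rng = np.random.default_rng(0)
realms, K = 4, 1000                     # realms, niche saturation per realm
speciation, extinction = 0.02, 0.01     # per-species per-step rates
dispersal, catastrophe_p = 0.001, 0.002

richness = np.full(realms, 200.0)
for _ in range(5000):
    births = speciation * richness * (1 - richness / K)    # speciation slows near saturation
    deaths = extinction * richness
    exchange = dispersal * (richness.mean() - richness)     # lineages spreading among realms
    richness = np.maximum(richness + births - deaths + exchange, 0.0)
    if rng.random() < catastrophe_p:                         # Medea-like catastrophic pulse
        richness[rng.integers(realms)] *= rng.uniform(0.1, 0.5)

print(richness.round())

Left to run, richness in each realm fluctuates around a quasi-equilibrium (the Gaia-like phase) and rebuilds after each simulated catastrophe (the Medea-like phase) – the qualitative behaviour the Cronus metaphor is meant to capture.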

What’s the relevance to conservation? We’re struggling to prevent extinction, so understanding how it works is an essential first step. Without the realisation that extinction is necessary (albeit, at rates preferably slower than they are currently), we cannot properly implement conservation triage, i.e., where do we invest in conservation and why?

We had fun with this, and I hope you enjoy it too.

CJA Bradshaw

Bradshaw, C.J.A., & Brook, B.W. (2009). The Cronus hypothesis – extinction as a necessary and dynamic balance to evolutionary diversification. Journal of Cosmology, 2, 201-209. http://journalofcosmology.com/Extinction100.html






Managing for extinction

9 10 2009

Ah, it doesn’t go away, does it? Or at least, we won’t let it.

That concept of ‘how many is enough?’ in conservation biology, the so-called ‘minimum viable population size‘, is enough to drive some conservation practitioners batty.

How many times have we heard the (para-) phrase: “It’s simply impractical to bring populations of critically endangered species up into the thousands”?

Well, my friends, if you’re not talking thousands, you’re wasting everyone’s time and money. You are essentially managing for extinction.

Our new paper out online in Biological Conservation entitled Pragmatic population viability targets in a rapidly changing world (Traill et al.) shows that populations of endangered species are unlikely to persist in the face of global climate change and habitat loss unless they number around 5000 mature individuals or more.

With several meta-analytic, time series-based and genetic estimates of the magic minimum number all in agreement, we can now be fairly certain that if a population is much smaller than several thousand individuals (median = 5000), its likelihood of persisting in the long run in the face of normal random variation is pretty small.

We conclude essentially that many conservation biologists routinely underestimate or ignore the number of animals or plants required to prevent extinction. In fact, aims to maintain tens or hundreds of individuals, when thousands are actually needed, simply waste precious and finite conservation resources. Thus, if it is deemed unrealistic to attain such numbers, we essentially advise that in most cases conservation triage should be invoked and the species in question be abandoned for better prospects.

A long-standing idea in species restoration programs is the so-called ‘50/500’ rule; this states that at least 50 adults are required to avoid the damaging effects of inbreeding, and 500 to avoid extinctions due to the inability to evolve to cope with environmental change. Our research suggests that the 50/500 rule is at least an order of magnitude too small to stave off extinction.
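To see why tens or hundreds rarely cut it, here is a bare-bones population viability sketch in Python (mine, not the analysis in our paper): Ricker growth with demographic noise, environmental noise and occasional catastrophes, for populations starting at, and limited by, different carrying capacities. All parameters are invented; only the qualitative ranking matters.

# Toy sketch only: extinction probability over 100 generations for populations
# limited to different carrying capacities. All parameters are invented.
import numpy as np

rng = np.random.default_rng(7)
r, sigma, cat_p = 0.2, 0.3, 0.05      # growth rate, environmental noise, catastrophe probability
gens, reps = 100, 500

def extinct_fraction(K):
    extinctions = 0
    for _ in range(reps):
        n = float(K)
        for _ in range(gens):
            eps = rng.normal(0.0, sigma)                         # environmental variation
            n = rng.poisson(n * np.exp(r * (1 - n / K) + eps))   # demographic variation
            if rng.random() < cat_p:                             # rare catastrophe
                n = np.floor(n * rng.uniform(0.25, 0.75))
            if n < 1:
                extinctions += 1
                break
    return extinctions / reps

for K in (50, 500, 5000):
    print(f"K = {K}: extinction probability over {gens} generations = {extinct_fraction(K):.2f}")

With these arbitrary parameters the risk of extinction drops steeply as the carrying capacity climbs from tens to thousands – the general pattern behind the several-thousand target.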

This does not necessarily imply that populations smaller than 5000 are doomed. But it does highlight the challenge that small populations face in adapting to a rapidly changing world.

We are battling to prevent a mass extinction event in the face of a growing human population and its associated impact on the planet, but the bar needs to be set a lot higher. However, we shouldn’t necessarily give up on critically endangered species numbering a few hundred individuals in the wild. Acceptance that more needs to be done if we are to stop ‘managing for extinction’ should force decision makers to be more explicit about what they are aiming for, and what they are willing to trade off, when allocating conservation funds.

CJA Bradshaw

(with thanks to Lochran Traill, Barry Brook and Dick Frankham)


Traill, L.W., Brook, B.W., Frankham, R.R., & Bradshaw, C.J.A. (2009). Pragmatic population viability targets in a rapidly changing world. Biological Conservation. DOI: 10.1016/j.biocon.2009.09.001





Connectivity paradigm in extinction biology

6 10 2009

I’m going to do a double review here of two papers currently online in Proceedings of the Royal Society B: Biological Sciences. I’m lumping them together because they both more or less challenge the pervasive conservation/restoration paradigm that connectivity is the key to reducing extinction risk. It’s just interesting (and slightly amusing) that the two were published in the same journal and at about the same time, but by two different groups.

From our own work looking at the correlates of extinction risk (measured mainly by proxy as threat risk), the range of a population (i.e., the amount of area and number of habitats it covers) is the principal determinant of risk – the smaller your range, the greater your chance of shuffling off this mortal coil (see also here). This is, of course, because a large range usually means that you have some phenotypic plasticity in your habitat requirements, you can probably disperse well, and you’re not going to succumb to localised ‘catastrophes’ as often. It also probably means (but not always) that your population size increases as your range size increases; as we all know, populations must be beyond their minimum viable population size to have a good chance of persisting in the face of random demographic and environmental vagaries.

Well, the two papers in question, ‘Both population size and patch quality affect local extinctions and colonizations‘ by Franzén & Nilsson and ‘Environment, but not migration rate, influences extinction risk in experimental metapopulations‘ by Griffen & Drake, show that connectivity (i.e., the probability that populations are connected via migration) is probably the least important component in the extinction-persistence game.

Using a solitary bee (Andrena hattorfiana) metapopulation in Sweden, Franzén & Nilsson show that population size and food patch quality (measured by the number of pollen-producing plants) were directly (but independently) correlated with extinction risk. Bigger populations in stable, high-quality patches persisted more readily. However, connectivity between patches was uncorrelated with risk.

Griffen & Drake took quite a different approach and stacked experimental aquaria full of daphnia (Daphnia magna) on top of one another to influence the amount of light (and hence, amount of food from algal growth) to which the populations had access (it’s interesting to note here that this was unplanned in the experiment – the different algal growth rates related to the changing exposure to light was a serendipitous discovery that allowed them to test the ‘food’ hypothesis!). They also controlled the migration rate between populations by varying the size of holes connecting the aquaria. In short, they found that environmentally influenced (i.e., food-influenced) variation was far more important at dictating population size and fluctuation than migration, showing again that conditions promoting large population size and reducing temporal variability are essential for reducing extinction risk.
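A rough way to visualise that contrast in Python (my own toy model, not either study’s design) is to cross food supply – represented crudely as carrying capacity – with migration rate in a two-patch simulation, and record how often the whole system goes extinct. All parameters are invented.

# Toy sketch only: food level (carrying capacity) crossed with migration rate
# in a two-patch metapopulation. All parameters are invented.
import numpy as np

rng = np.random.default_rng(3)
r, sigma, gens, reps = 0.5, 0.5, 100, 500

def metapop_extinction(K, migration):
    extinctions = 0
    for _ in range(reps):
        n = np.array([K, K], dtype=float)
        for _ in range(gens):
            eps = rng.normal(0.0, sigma, 2)                           # patch-specific environmental noise
            n = rng.poisson(n * np.exp(r * (1 - n / K) + eps)).astype(float)
            moved = migration * n
            n = n - moved + moved[::-1]                               # exchange migrants between patches
            if n.sum() < 1:
                extinctions += 1
                break
    return extinctions / reps

for K in (10, 100):                  # poor vs rich food supply
    for migration in (0.01, 0.2):
        print(f"K = {K}, migration = {migration}: "
              f"metapopulation extinct in {metapop_extinction(K, migration):.0%} of runs")

With these arbitrary numbers, shifting from a poor to a rich food supply changes the outcome far more than a twenty-fold change in migration – echoing the experimental message that conditions supporting large, stable populations trump connectivity.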

So what’s the upshot for conservation? Well, many depressed populations are thought to be recoverable by making existing and fragmented habitat patches more connected via ‘corridors’ of suitable habitat. The research highlighted here suggests that more emphasis should be placed instead on building up existing population sizes and ensuring food availability is relatively constant instead of worrying about how many trickling migrants might be moving back and forth. This essentially means that a few skinny corridors connecting population fragments will probably be insufficient to save our imperilled species.

CJA Bradshaw


Franzén, M., & Nilsson, S. (2009). Both population size and patch quality affect local extinctions and colonizations. Proceedings of the Royal Society B: Biological Sciences. DOI: 10.1098/rspb.2009.1584

Griffen, B., & Drake, J. (2009). Environment, but not migration rate, influences extinction risk in experimental metapopulations. Proceedings of the Royal Society B: Biological Sciences. DOI: 10.1098/rspb.2009.1153