We generally ignore the big issues

11 08 2014

I’ve had a good week at Stanford University with Paul Ehrlich where we’ve been putting the final touches on our book. It’s been taking a while to put together, but we’re both pretty happy with the result, which should be published by The University of Chicago Press within the first quarter of 2015.

It has indeed been a pleasure and a privilege to work with one of the greatest thinkers of our age, and let me tell you that at 82, he’s still a force with which to be reckoned. While I won’t divulge much of our discussions here given they’ll appear soon-ish in the book, I did want to raise one subject that I think we all need to think about a little more.

The issue is what we, as ecologists (I’m including conservation scientists here), choose to study and contemplate in our professional life.

I’m just as guilty as most of the rest of you, but I argue that our discipline is caught in a rut of irrelevancy on the grander scale. We spend a lot of time refining the basics of what we essentially already know pretty well. While there will be an eternity of processes to understand, species to describe, and relationships to measure, can our discipline really afford to avoid the biggest issues while biodiversity (our own society included) is flushed down the drain?






50/500 or 100/1000 debate not about time frame

26 06 2014

As you might recall, Dick Frankham, Barry Brook and I recently wrote a review in Biological Conservation challenging the status quo regarding the famous 50/500 ‘rule’ in conservation management (effective population size [Ne] = 50 to avoid inbreeding depression in the short-term, and Ne = 500 to retain the ability to evolve in perpetuity). Well, it inevitably led to some comments arising in the same journal, but we were only permitted by Biological Conservation to respond to one of them. In our opinion, the other comment was just as problematic, and only further muddied the waters, so it too required a response. In a first for me, we have therefore decided to publish our response on the arXiv pre-print server as well as here on ConservationBytes.com.
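For readers new to the concept, effective population size (Ne) is usually much smaller than the raw head count, which is why the 50/500 thresholds bite harder than they might appear to. A minimal sketch using Wright’s classic sex-ratio formula (a textbook simplification – the function name and numbers below are illustrative, not taken from our review) shows how a skewed breeding sex ratio shrinks Ne:

```python
def effective_size_sex_ratio(n_males, n_females):
    """Wright's formula for Ne under an unequal breeding sex ratio:
    Ne = 4 * Nm * Nf / (Nm + Nf)."""
    return 4 * n_males * n_females / (n_males + n_females)

# A census of 100 breeders with a strongly skewed sex ratio falls well
# short of even the Ne = 50 inbreeding-avoidance threshold:
print(effective_size_sex_ratio(10, 90))  # 36.0
print(effective_size_sex_ratio(50, 50))  # 100.0 (the equal-ratio maximum)
```

Unequal family sizes and fluctuating population size reduce Ne further still, which is part of why the census numbers needed to meet even the old 50/500 targets are so large.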

50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld

cite as: Frankham, R, Bradshaw CJA, Brook BW. 2014. 50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld. arXiv: 1406.6424 [q-bio.PE] 25 June 2014.

The Letter from Rosenfeld (2014), in response to Jamieson and Allendorf (2012), Frankham et al. (2014) and related papers, is misleading in places and requires clarification and correction.





We’re sorry, but 50/500 is still too few

28 01 2014

Some of you who are familiar with my colleagues’ and my work will know that we have been investigating the minimum viable population size concept for years (see references at the end of this post). Little did I know when I started this line of scientific inquiry that it would end up creating more than a few adversaries.

It might be a philosophical perspective that people adopt when refusing to believe that there is any such thing as a ‘minimum’ number of individuals in a population required to guarantee a high (i.e., almost assured) probability of persistence. I’m not sure. For whatever reason though, there have been some fierce opponents to the concept, or any application of it.

Yet a sizeable chunk of quantitative conservation ecology develops – in various forms – population viability analyses to estimate the probability that a population (or entire species) will go extinct. When the probability is unacceptably high, then various management approaches can be employed (and modelled) to improve the population’s fate. The flip side of such an analysis is, of course, seeing at what population size the probability of extinction becomes negligible.
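As a rough illustration of how such an analysis works (a minimal stochastic sketch with made-up parameters, not any published PVA – real packages like VORTEX add age structure, density feedback, catastrophes and much more), one can project a population forward many times with random growth and count the fraction of replicates that fall below a quasi-extinction threshold:

```python
import math
import random

def extinction_probability(n0, generations, runs=5000, mean_r=0.0,
                           sd_r=0.15, quasi_extinct=10, seed=1):
    """Monte Carlo estimate of quasi-extinction probability for a
    population undergoing stochastic exponential growth."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(runs):
        n = n0
        for _ in range(generations):
            n *= math.exp(rng.gauss(mean_r, sd_r))  # random growth rate
            if n < quasi_extinct:  # quasi-extinction threshold crossed
                extinct += 1
                break
    return extinct / runs

# Larger starting populations are far less likely to wink out:
print(extinction_probability(25, 40))    # substantial extinction risk
print(extinction_probability(2500, 40))  # negligible extinction risk
```

Running the size upward until the estimated probability drops below some agreed threshold is, in essence, how a minimum viable population size falls out of a PVA.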

‘Negligible’ is a subjective term in itself, just as the word ‘very’ can mean different things to different people. This is why we looked into standardising the criteria for ‘negligible’ in minimum viable population sizes, much as the near-universally accepted IUCN Red List does with its various categorical extinction-risk categories.

But most reasonable people are likely to agree that a < 1 % chance of going extinct over many generations (40, in the case of our suggestion) is an acceptable target. I’d feel pretty safe personally if my own family’s probability of surviving were > 99 % over the next 40 generations.
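As a back-of-the-envelope translation of that target (my arithmetic here, not a result from the paper): a ≥ 99 % chance of persisting across 40 generations implies a per-generation persistence probability of at least 0.99^(1/40), i.e. a per-generation extinction risk of only about 0.025 %:

```python
# Required per-generation persistence for >= 99 % survival over 40 generations
target, generations = 0.99, 40
per_gen_persist = target ** (1 / generations)
per_gen_risk = 1 - per_gen_persist
print(round(per_gen_persist, 6))  # 0.999749
print(round(per_gen_risk, 6))     # 0.000251
```

Seen that way, the target is stringent per generation even though 1 % sounds permissive in aggregate.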

Some people, however, baulk at the notion of making generalisations in ecology (funny – I was always under the impression that this was exactly what we were supposed to be doing as scientists: finding out how things work in most situations, such that the mechanisms become clearer and clearer – call me a dreamer).

So when we were attacked in several high-profile journals, it came as something of a surprise. The latest lashing came in the form of a Trends in Ecology and Evolution article. We wrote a (necessarily short) response to that article, identifying its inaccuracies and contradictions, but we were unable to cover all of its inadequacies there. I’m happy to say that we now have, having expanded our commentary on that paper into a broader review.





Cleaning up the rubbish: Australian megafauna extinctions

15 11 2013

A few weeks ago I wrote a post about how to run the perfect scientific workshop, which most of you thought was a good set of tips (bizarrely, one person was quite upset with the message; I saved him the embarrassment of looking stupid online and refrained from publishing his comment).

As I mentioned at the end of that post, the stimulus for the topic was a particularly wonderful workshop that 12 of us attended at beautiful Linnaeus Estate on the northern coast of New South Wales (see Point 5 in the ‘workshop tips’ post).

But why did a group of ecological modellers (me, Barry Brook, Salvador Herrando-Pérez, Fréd Saltré, Chris Johnson, Nick Beeton), ancient DNA specialists (Alan Cooper), palaeontologists (Gav Prideaux), fossil dating specialists (Dizzy Gillespie, Bert Roberts, Zenobia Jacobs) and palaeo-climatologists (Michael Bird, Chris Turney [in absentia]) get together in the first place? Hint: it wasn’t just for the beautiful beach and good wine.

I hate to say it – mainly because it deserves as little attention as possible – but the main reason is that we needed to clean up a bit of rubbish. The rubbish in question being the latest bit of excrescence growing on that accumulating heap produced by a certain team of palaeontologists promulgating their ‘it’s all about the climate or nothing’ broken record.






Don’t blame it on the dingo

21 08 2013

Our postdoc, Tom Prowse, has just had one of the slickest sets of reviews I’ve ever seen, followed by a quick acceptance of what I think is a pretty sexy paper. Earlier this year his paper in Journal of Animal Ecology showed that the thylacine (the badly named ‘Tasmanian tiger‘) was most likely not the victim of some unobserved mystery disease, but instead succumbed to the same agent that has felled, and will fell, so many large predators: human beings. His latest effort, now online in Ecology, shows that the thylacine and devil extinctions on the Australian mainland were similarly the result of humans, and not the scapegoat dingo. But I’ll let him explain:

‘Regime shifts’ can occur in ecosystems when even a single component is added or changed. Such additions (say, a new predator) or changes (such as a rise in temperature) can fundamentally alter core ecosystem functions and processes, causing the ecosystem to switch to some alternative stable state.

Some of the most striking examples of ecological regime shifts are the mass extinctions of large mammals (‘megafauna’) during human prehistory. In Australia, human arrival and subsequent hunting pressure are implicated in the rapid extinction of about 50 mammal species by around 45 thousand years ago. The ensuing alternative stable state comprised a reduced diversity of predators, dominated by humans and two native marsupial predators ‑ the thylacine (also known as the marsupial ‘tiger’ or ‘wolf’) and the devil (which is now restricted to Tasmania and threatened by a debilitating, infectious cancer).

Both thylacines and devils lasted on mainland Australia for over 40 thousand years following the arrival of humans. However, a second regime shift resulted in the extinction of both these predators by about 3 thousand years ago, which was coincidentally just after dingoes were introduced to Australia. Dingoes are descended from early domestic dogs and were introduced to northern Australia from Asia by ancient traders approximately 4 thousand years ago. Today, they are Australia’s only top predator remaining, other than invasive European foxes and feral cats. Since the earliest days of European settlement, dingoes have been persecuted because they prey on livestock. During the 1880s, 5614 km of ‘dingo fence’ was constructed to protect south-east Australia’s grazing rangelands from dingo incursions. The fence is maintained to this day, and dingoes are poisoned and shot both inside and outside this barrier, despite mounting evidence that these predators play a key role in maintaining native ecosystems, largely by suppressing invasive predators.

Perhaps because the public perception of dingoes as ‘sheep-killers’ is so firmly entrenched, it has been commonly assumed that dingoes killed off the thylacines and devils on mainland Australia. People who support this view also point out that thylacines and devils persisted on the island of Tasmania, which was never colonised by dingoes (although thylacines went extinct there too in the early 1900s). To date, most discussion of the mainland thylacine and devil extinctions has focused on the possibility that dingoes disrupted the system by ‘exploitation competition’ (eating the same prey), ‘interference competition’ (wasting the native predators’ precious munching time), as well as ‘direct predation’ (dingoes actually eating devils and thylacines).





Saving world’s most threatened cat requires climate adaptation

23 07 2013
© CSIC Andalusia Audiovisual Bank/H. Garrido

The Iberian lynx is the world’s most threatened cat, with recent counts estimating only 250 individuals surviving in the wild. Recent declines of the Iberian lynx have been associated with sharp regional reductions in the abundance of its main prey, the European rabbit, caused mainly by myxomatosis and rabbit haemorrhagic disease. At present, only two Iberian lynx populations persist in the wild, compared with nine in the 1990s.

Over €90 million has been spent since 1994 to mitigate the extinction risk of this charismatic animal, mainly through habitat management, reduction of human-caused mortality and, more recently, translocation. Although lynx abundance might have increased in the last ten years in response to intensive management, a new study published in Nature Climate Change warns that the ongoing conservation strategies could buy just a few decades before the species goes extinct.

The study led by Damien Fordham from The Environment Institute (The University of Adelaide) and Miguel Araújo from the Integrative Biogeography and Global Change Group (Spanish Research Council) shows that climate change could lead to a rapid and severe decrease in lynx abundance in coming decades, and probably result in its extinction in the wild within 50 years. Current management efforts could be futile if they do not take into account the combined effects of climate change, land use and prey abundance on the population dynamics of the Iberian lynx.






Software tools for conservation biologists

8 04 2013

Given the popularity of certain prescriptive posts on ConservationBytes.com, I thought it prudent to compile a list of software that my lab and I have found particularly useful over the years. This list is not meant to be comprehensive, but it will give you a taste of what’s out there. I don’t list the plethora of conservation genetics software that is available (mainly because of my lack of experience with it), but if this is your chosen area, I’d suggest starting with Dick Frankham‘s excellent book, An Introduction to Conservation Genetics.

1. R: If you haven’t yet loaded the open-source R programming language on your machine, do it now. It is the single most useful bit of statistical and programming software available to anyone anywhere in the sciences. Don’t worry if you’re not a fully fledged programmer – there are now enough people using and developing sophisticated ‘libraries’ (packages of functions) that there’s pretty much an application for everything these days. We tend to use R to the exclusion of almost any other statistical software because it makes you learn the technique rather than just blindly pressing the ‘go’ button. You could also stop right here – with R, you can do pretty much everything else that the software listed below does; however, you would have to be an exceedingly clever programmer with a lot of spare time. R can also get bogged down when it fills too much RAM, in which case other languages such as Python, or compiled languages such as C#, can be useful.

2. VORTEX/OUTBREAK/META-MODEL MANAGER, etc.: This suite of individual-based projection software was designed by Bob Lacy & Phil Miller initially to determine the viability of small (usually captive) populations. The original VORTEX has grown into a multi-purpose, powerful and sophisticated population viability analysis package that now links to its cousin applications like OUTBREAK (the only off-the-shelf epidemiological software in existence) via the ‘command centre’ META-MODEL MANAGER (see examples here and here from our lab). There are other add-ons that make almost any population projection and hindcasting application possible. And it’s all free! (warning: currently unavailable for Mac, although I’ve been pestering Bob to do a Mac version).

3. RAMAS: RAMAS is the go-to application for spatial population modelling. Developed by the extremely clever Resit Akçakaya, it is one of the few tools that incorporate spatial meta-population aspects with formal, cohort-based demographic models. It’s also very useful in a climate-change context when you have projections of changing habitat suitability as the base layer onto which meta-population dynamics can be modelled. It’s not free, but it’s worth purchasing.







