First Australians arrived in large groups using complex technologies

18 06 2019


One of the most ancient peopling events in the diaspora of anatomically modern humans out of Africa more than 50,000 years ago — human arrival in the great continent of Sahul (New Guinea, mainland Australia, and Tasmania, joined during periods of low sea level) — remains mysterious. The entry routes taken, whether migration was directed or accidental, and just how many people were needed to ensure population viability are shrouded by the mists of time. This prompted us to build stochastic, age-structured human population-dynamics models incorporating hunter-gatherer demographic rates and palaeoecological reconstructions of environmental carrying capacity to predict the founding population necessary to survive the initial peopling of late-Pleistocene Sahul.
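To give a flavour of what such a model can look like, here is a deliberately simplified sketch in R. Everything in it (the age-specific survival and fertility values, the carrying capacity, the founding sizes) is a hypothetical placeholder, not the published CABAH model; the point is only to show how a stochastic, age-structured projection with a carrying-capacity ceiling can translate founding population size into a probability of early extinction.

```r
# Illustrative sketch only (all rates and the carrying capacity are hypothetical);
# a female-only, age-structured projection with demographic stochasticity and a
# ceiling-type carrying capacity
set.seed(42)

ages   <- 0:79                                       # single-year age classes
n.ages <- length(ages)
surv   <- rep(0.96, n.ages)                          # assumed annual survival
surv[1:5] <- 0.85                                    # assumed lower early-life survival
fert   <- ifelse(ages >= 15 & ages <= 45, 0.25, 0)   # assumed daughters per female per year

# Leslie matrix built from the assumed rates
A <- matrix(0, n.ages, n.ages)
A[1, ] <- fert
A[cbind(2:n.ages, 1:(n.ages - 1))] <- surv[1:(n.ages - 1)]

K     <- 5000    # hypothetical environmental carrying capacity
years <- 200     # projection window
iters <- 100     # stochastic iterations per founding size

pr.extinct <- function(N0) {
  extinct <- logical(iters)
  for (i in seq_len(iters)) {
    n <- rep(0, n.ages)
    n[16:26] <- N0 / 11                    # founders spread across ages 15-25
    for (t in seq_len(years)) {
      n <- as.vector(A %*% n)              # deterministic projection step
      n <- rpois(n.ages, n)                # demographic stochasticity
      if (sum(n) > K) n <- n * K / sum(n)  # ceiling density feedback at K
    }
    extinct[i] <- sum(n) == 0
  }
  mean(extinct)
}

founders <- c(50, 100, 250, 500, 1000)
cbind(founders, Pr.extinct = sapply(founders, pr.extinct))
```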

As ecological modellers, we are often asked by other scientists to attempt to render the highly complex mechanisms of entire ecosystems tractable for virtual manipulation and hypothesis testing through the inevitable simplification that is ‘a model’. When we work with scientists studying long-since-disappeared ecosystems, the challenges multiply.

Add some multidisciplinary data and concepts into the mix, and the complexity can quickly escalate.

We do have, however, some powerful tools in our modelling toolbox, so as the Modelling Node for the Australian Research Council Centre of Excellence for Australian Biodiversity and Heritage (CABAH), our role is to link disparate fields like palaeontology, archaeology, geochronology, climatology, and genetics together with mathematical ‘glue’ to answer the big questions regarding Australia’s ancient past.

This is how we tackled one of these big questions: just how did the first anatomically modern Homo sapiens make it to the continent and survive?

At that time, Australia was part of the giant continent of Sahul that connected New Guinea, mainland Australia, and Tasmania at times of lower sea level. In fact, throughout most of the last ~126,000 years (the late Pleistocene and much of the Holocene), Sahul was the dominant landmass in the region (see this handy online tool for how the coastline of Sahul changed over this period).

Read the rest of this entry »





Why populations can’t be saved by a single breeding pair

3 04 2018


© Reuters/Thomas Mukoya

I published this last week on The Conversation, and I am now reproducing it here for CB.com readers.

 

Two days ago, the last male northern white rhino (Ceratotherium simum cottoni) died. His passing leaves two surviving members of his subspecies: both are females unable to bear calves.

Even though it might not be quite the end of the northern white rhino because of the possibility of implanting frozen embryos in their southern cousins (C. simum simum), in practical terms, it nevertheless represents the end of a long decline for the subspecies. It also raises the question: how many individuals does a species need to persist?

Fiction writers have enthusiastically embraced this question, most often in the post-apocalypse genre. It’s a notion with a long past; the Adam and Eve myth is of course based on a single breeding pair populating the entire world, as is the case described in the Ragnarok, the final battle of the gods in Norse mythology.

This idea dovetails neatly with the image of Noah’s animals marching “two by two” into the Ark. But the science of “minimum viable populations” tells us a different story.

No inbreeding, please

The global gold standard used to assess the extinction risk of any species is the International Union for the Conservation of Nature (IUCN) Red List of Threatened Species. Read the rest of this entry »





Predicting sustainable shark harvests when stock assessments are lacking

26 03 2018

© Andrew Fox

I love it when a good collaboration bears fruit, and our latest paper is a good demonstration of that principle.

It all started a few years ago with an ARC Linkage Project grant we received to examine how the whaler shark fishing industry in Australia might manage its stocks better.

As I’m sure many are aware, sharks around the world aren’t doing terribly well (surprise, surprise — yet another taxon suffering at the hands of humankind). And while some populations (‘stocks’, in the dissociative parlance of the fishing industry) are doing better than others, and some countries have a better track record in managing these stocks than others, the overall outlook is grim.

One of the main reasons sharks tend to fare worse than bony fishes (teleosts) for the same fishing effort is their ‘slow’ life histories. It doesn’t take an advanced quantitative ecology degree to understand that growing slowly, breeding late, and producing few offspring is a good indication that a species can’t handle too much killing before populations start to dwindle. As is the case for most large shark species, I tend to think of them in a life-history sense as similar to large terrestrial mammals.
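To make that intuition concrete, here is a small R sketch (with entirely hypothetical life-history values, not real shark or teleost data) that solves the Euler-Lotka equation for the intrinsic rate of population increase r under a ‘slow’ versus a ‘fast’ life history; the gulf between the two r values is why the same fishing mortality can be sustainable for a teleost yet disastrous for a large shark.

```r
# Hypothetical life histories (not real species data) illustrating why 'slow' species
# tolerate so little extra mortality: solve the Euler-Lotka equation
#   sum_x exp(-r * x) * l_x * m_x = 1
# for the intrinsic rate of increase r
euler.lotka.r <- function(annual.surv, age.mature, max.age, fecundity) {
  x  <- 1:max.age
  lx <- annual.surv^x                                # survivorship to age x
  mx <- ifelse(x >= age.mature, fecundity, 0)        # daughters per female per year
  f  <- function(r) sum(exp(-r * x) * lx * mx) - 1
  uniroot(f, interval = c(-1, 2))$root
}

# 'slow' shark-like life history: high adult survival, late maturity, few pups
r.slow <- euler.lotka.r(annual.surv = 0.85, age.mature = 15, max.age = 40, fecundity = 3)

# 'fast' teleost-like life history: early maturity, many surviving recruits
r.fast <- euler.lotka.r(annual.surv = 0.60, age.mature = 2, max.age = 10, fecundity = 50)

c(slow = r.slow, fast = r.fast)
# the same added fishing mortality eats up the small r of the 'slow' life history
# long before it threatens the 'fast' one
```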

Now, you’d figure that a taxon with intrinsic susceptibility to fishing would have heaps of good data with which managers could monitor catches and quotas so that declines could be avoided. However, the reality is generally the inverse, with many populations having poor information regarding vital rates (e.g., survival, fertility), age structure, density feedback characteristics, and even simple estimates of abundance. Without such key information, management tends to be ad hoc and often not very effective. Read the rest of this entry »





Four decades of fragmentation

27 09 2017


I’ve recently read perhaps the most comprehensive treatise of forest fragmentation research ever compiled, and I personally view this rather readable and succinct review by Bill Laurance and colleagues as something every ecology and conservation student should read.

The ‘Biological Dynamics of Forest Fragments Project‘ (BDFFP) is unquestionably one of the most important landscape-scale experiments ever conceived and implemented, now having run 38 years since its inception in 1979. Indeed, it was way ahead of its time.

Experimental studies in ecology are comparatively rare, mainly because it is difficult, expensive, and challenging in the extreme to manipulate entire ecosystems to test specific hypotheses relating to the response of biodiversity to environmental change. Thus, we ecologists tend to rely more on mensurative designs that use existing variation in the landscape (or over time) to infer mechanisms of community change. Of course, such experiments have to be large to be meaningful, which is one reason why the 1000 km² BDFFP has been so successful as the gold standard for determining the effects of forest fragmentation on biodiversity.

And successful it has been. A quick search for ‘BDFFP’ in the Web of Knowledge database identifies > 40 peer-reviewed articles and a slew of books and book chapters arising from the project, some of which are highly cited classics in conservation ecology (e.g., doi:10.1046/j.1523-1739.2002.01025.x cited > 900 times; doi:10.1073/pnas.2336195100 cited > 200 times; doi:10.1016/j.biocon.2010.09.021 cited > 400 times; and doi:10.1111/j.1461-0248.2009.01294.x cited nearly 600 times). In fact, if we are to claim any ecological ‘laws’ at all, our understanding of fragmentation on biodiversity could be labelled as one of the few, thanks principally to the BDFFP. Read the rest of this entry »





Two new postdoctoral positions in ecological network & vegetation modelling announced

21 07 2017


With the official start of the new ARC Centre of Excellence for Australian Biodiversity and Heritage (CABAH) in July, I am pleased to announce two new CABAH-funded postdoctoral positions (a.k.a. Research Associates) in my global ecology lab at Flinders University in Adelaide (Flinders Modelling Node).

One of these positions is a little different, and represents something of an experiment. The Research Associate in Palaeo-Vegetation Modelling is being restricted to women candidates; in other words, we’re only accepting applications from women for this one. In a quest to improve the gender balance in my lab and in universities in general, this is a step in the right direction.

The project itself is not overly prescribed, but we would like something along the following lines of inquiry: Read the rest of this entry »





Sensitive numbers

22 03 2016
toondoo.com

A sensitive parameter

You couldn’t really do ecology if you didn’t know how to construct even the most basic mathematical model — even a simple regression is a model (the non-random relationship of some variable to another). The good thing about even these simple models is that it is fairly straightforward to interpret the ‘strength’ of the relationship, in other words, how much variation in one thing can be explained by variation in another. Provided the relationship is real (not random), and provided there is at least some indirect causation implied (i.e., it is not just a spurious coincidence), then there are many simple statistics that quantify this strength — in the case of our simple regression, the coefficient of determination (R²) statistic is usually a good approximation of this.
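As a toy illustration, here is what that looks like in R with simulated data (the numbers below are made up purely for demonstration):

```r
# Simulated toy data only: a real but noisy relationship, and the R^2 that
# summarises how much of the variation it explains
set.seed(1)
x <- runif(100, 0, 10)
y <- 2 + 0.5 * x + rnorm(100, sd = 2)

fit <- lm(y ~ x)
summary(fit)$r.squared   # proportion of variation in y explained by x
```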

In the case of more complex multivariate correlation models, the coefficient of determination is sometimes insufficient, in which case you might need to rely on statistics such as the proportion of deviance explained, or the marginal and/or conditional variance explained.
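Again as a rough sketch with simulated data, the deviance explained by a generalised linear model can be pulled straight from the fitted object; for mixed-effects models, add-on packages (e.g., MuMIn) report the analogous marginal and conditional R² values.

```r
# Simulated toy counts: the proportion of deviance explained by a Poisson GLM,
# pulled straight from the fitted model object
set.seed(2)
habitat <- runif(200, 0, 1)
counts  <- rpois(200, lambda = exp(0.5 + 1.2 * habitat))

fit.glm <- glm(counts ~ habitat, family = poisson)
1 - fit.glm$deviance / fit.glm$null.deviance   # deviance explained
```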

When you go beyond this correlative model approach and start constructing more mechanistic models that emulate ecological phenomena from the bottom-up, things get a little more complicated when it comes to quantifying the strength of relationships. Perhaps the most well-known category of such mechanistic models is the humble population viability analysis, abbreviated to PVA§.

Let’s take the simple case of a four-parameter population model we could use to project population size over the next 10 years for an endangered species that we’re introducing to a new habitat. We’ll assume that we have the following information: the size of the founding (introduced) population (n), the juvenile survival rate (Sj, the proportion of juveniles surviving from birth to the first year), the adult survival rate (Sa, the annual survival rate of adults from year 1 to maximum longevity), and the fertility rate of mature females (m, the number of offspring born per female per reproductive cycle). Each one of these parameters has an associated uncertainty (ε) that combines both measurement error and environmental variation.

If we just took the mean value of each of these three demographic rates (survivals and fertility) and projected a founding population of n = 10 individuals for 10 years into the future, we would have a single, deterministic estimate of the average outcome of introducing 10 individuals. As we already know, however, the variability, or stochasticity, is more important than the average outcome, because uncertainty in the parameter values (ε) will mean that a non-negligible number of model iterations will result in the extinction of the introduced population. This is something that most conservationists will obviously want to minimise.

So each time we run an iteration of the model, and generally for each breeding interval (most often 1 year at a time), we choose (based on some random-sampling regime) a different value for each parameter. This will give us a distribution of outcomes after the 10-year projection. Let’s say we did 1000 iterations like this; the number of times that the population went extinct across those iterations provides an estimate of the population’s extinction probability over that interval. Of course, we would probably also vary the size of the founding population (say, between 10 and 100) to see at what point the extinction probability became acceptably low for managers (i.e., as close to zero as possible), without requiring so many individuals that the introduction would become too laborious or expensive. Read the rest of this entry »
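For the curious, a minimal R sketch of that kind of projection follows; the vital rates, the way the uncertainty ε is injected (beta-distributed survival, lognormal fertility), and the simple two-class structure are illustrative assumptions for this sketch, not a recipe from any particular paper.

```r
# A minimal sketch of the four-parameter projection described above; the rates and
# the way the uncertainty (epsilon) is injected (beta-distributed survival,
# lognormal fertility) are illustrative assumptions, not values from any real species
set.seed(3)

Sj    <- 0.50    # mean juvenile survival (birth to year 1)
Sa    <- 0.85    # mean annual adult survival
m     <- 1.2     # mean offspring per female per breeding cycle
eps   <- 0.10    # combined measurement error + environmental variation (as an SD)
years <- 10
iters <- 1000

# draw a survival probability with mean p and standard deviation sd from a beta
rbeta.mean <- function(p, sd) {
  v <- min(sd^2, p * (1 - p) * 0.99)       # keep the variance feasible for a beta
  a <- p * (p * (1 - p) / v - 1)
  rbeta(1, a, a * (1 - p) / p)
}

pr.extinct <- function(n0) {
  extinct <- logical(iters)
  for (i in seq_len(iters)) {
    adults <- n0
    for (t in seq_len(years)) {
      sj <- rbeta.mean(Sj, eps)                # resample each vital rate every year
      sa <- rbeta.mean(Sa, eps)
      mt <- rlnorm(1, log(m), eps)
      births    <- rpois(1, adults / 2 * mt)   # assumes a 1:1 sex ratio
      recruits  <- rbinom(1, births, sj)       # juveniles surviving to year 1
      survivors <- rbinom(1, adults, sa)       # adults surviving the year
      adults    <- survivors + recruits
      if (adults == 0) break
    }
    extinct[i] <- adults == 0
  }
  mean(extinct)                                # extinction probability over the window
}

founders <- seq(10, 100, by = 10)
cbind(founders, Pr.extinct = sapply(founders, pr.extinct))
```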





Outright bans of trophy hunting could do more harm than good

5 01 2016

In July 2015 an American dentist shot and killed a male lion called ‘Cecil’ with a hunting bow and arrow, an act that sparked a storm of social media outrage. Cecil was a favourite of tourists visiting Hwange National Park in Zimbabwe, and so the allegation that he was lured out of the Park to neighbouring farmland added considerable fuel to the flames of condemnation. Several other aspects of the hunt, such as baiting close to national park boundaries, were allegedly done illegally and against the spirit and ethical norms of a managed trophy hunt.

In May 2015, a Texan legally shot a critically endangered black rhino in Namibia, which also generated considerable online ire. The backlash ensued even though the male rhino was considered ‘surplus’ to Namibia’s black rhino populations, and the US$350,000 generated from the managed hunt was to be re-invested in conservation. Together, these two incidents have triggered vociferous appeals to ban trophy hunting throughout Africa.

These highly politicized events are but a small component of a large industry in Africa worth > US$215 million per year that ‘sells’ iconic animals to (mainly foreign) hunters as a means of generating otherwise scarce funds. While to most people this might seem like an abhorrent way to generate money, we argue in a new paper that sustainable-use activities, such as trophy hunting, can be an important tool in the conservationist’s toolbox. Conserving biodiversity can be expensive, so generating money is a central preoccupation of many environmental NGOs, conservation-minded individuals, government agencies and scientists. Making money for conservation in Africa is even more challenging, and so we argue that trophy hunting should and could fill some of that gap. Read the rest of this entry »





Avoiding genetic rescue not justified on genetic grounds

12 03 2015

Genetics to the rescue!

I had the pleasure today of reading a new paper by one of the greatest living conservation geneticists, Dick Frankham. As some of CB readers might remember, I’ve also published some papers with Dick over the last few years, with the most recent challenging the very basis for the IUCN Red List category thresholds (i.e., in general, they’re too small).

Dick’s latest paper in Molecular Ecology is a meta-analysis designed to test whether there are any genetic grounds for NOT attempting genetic rescue for inbreeding-depressed populations. I suppose a few definitions are in order here. Genetic rescue is the process, either natural or facilitated, where inbred populations (i.e., in a conservation sense, those comprising too many individuals bonking their close relatives because the population in question is small) receive genes from another population such that their overall genetic diversity increases. In the context of conservation genetics, ‘inbreeding depression‘ simply means reduced biological fitness (fertility, survival, longevity, etc.) resulting from parents being too closely related.

Seems like an important thing to avoid, so why not attempt to facilitate gene flow among populations such that those with inbreeding depression can be ‘rescued’? In applied conservation, there are many reasons given for not attempting genetic rescue: Read the rest of this entry »





We generally ignore the big issues

11 08 2014

I’ve had a good week at Stanford University with Paul Ehrlich where we’ve been putting the final touches¹ on our book. It’s been taking a while to put together, but we’re both pretty happy with the result, which should be published by The University of Chicago Press within the first quarter of 2015.

It has indeed been a pleasure and a privilege to work with one of the greatest thinkers of our age, and let me tell you that at 82, he’s still a force with which to be reckoned. While I won’t divulge much of our discussions here given they’ll appear soon-ish in the book, I did want to raise one subject that I think we all need to think about a little more.

The issue is what we, as ecologists (I’m including conservation scientists here), choose to study and contemplate in our professional life.

I’m just as guilty as most of the rest of you, but I argue that our discipline is caught in a rut of irrelevancy on the grander scale. We spend a lot of time refining the basics of what we essentially already know pretty well. While there will be an eternity of processes to understand, species to describe, and relationships to measure, can our discipline really afford to avoid the biggest issues while biodiversity (and our society included) are flushed down the drain?

Read the rest of this entry »





50/500 or 100/1000 debate not about time frame

26 06 2014

As you might recall, Dick Frankham, Barry Brook and I recently wrote a review in Biological Conservation challenging the status quo regarding the famous 50/500 ‘rule’ in conservation management (effective population size [Ne] = 50 to avoid inbreeding depression in the short-term, and Ne = 500 to retain the ability to evolve in perpetuity). Well, it inevitably led to some comments arising in the same journal, but we were only permitted by Biological Conservation to respond to one of them. In our opinion, the other comment was just as problematic, and only further muddied the waters, so it too required a response. In a first for me, we have therefore decided to publish our response on the arXiv pre-print server as well as here on ConservationBytes.com.

50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld

cite as: Frankham, R, Bradshaw CJA, Brook BW. 2014. 50/500 or 100/1000 debate is not about the time frame – Reply to Rosenfeld. arXiv: 1406.6424 [q-bio.PE] 25 June 2014.

The Letter from Rosenfeld (2014) in response to Jamieson and Allendorf (2012) and Frankham et al. (2014) and related papers is misleading in places and requires clarification and correction, as follows: Read the rest of this entry »





We’re sorry, but 50/500 is still too few

28 01 2014

Some of you who are familiar with my colleagues’ and my work will know that we have been investigating the minimum viable population size concept for years (see references at the end of this post). Little did I know when I started this line of scientific inquiry that it would end up creating more than a few adversaries.

It might be a philosophical perspective that people adopt when refusing to believe that there is any such thing as a ‘minimum’ number of individuals in a population required to guarantee a high (i.e., almost assured) probability of persistence. I’m not sure. For whatever reason though, there have been some fierce opponents to the concept, or any application of it.

Yet a sizeable chunk of quantitative conservation ecology develops – in various forms – population viability analyses to estimate the probability that a population (or entire species) will go extinct. When the probability is unacceptably high, then various management approaches can be employed (and modelled) to improve the population’s fate. The flip side of such an analysis is, of course, seeing at what population size the probability of extinction becomes negligible.

‘Negligible’ is a subjective term in itself, just like the word ‘very‘ can mean different things to different people. This is why we looked into standardising the criteria for ‘negligible’ for minimum viable population sizes, much as the near-universally accepted IUCN Red List attempts to do with its various (categorical) extinction-risk categories.

But most reasonable people are likely to agree that < 1 % chance of going extinct over many generations (40, in the case of our suggestion) is an acceptable target. I’d feel pretty safe personally if my own family’s probability of surviving was > 99 % over the next 40 generations.
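To put that target in perspective, a quick back-of-the-envelope calculation (assuming, purely for illustration, a constant and independent extinction risk in each generation) shows just how small the per-generation risk has to be:

```r
# Back-of-the-envelope only, assuming a constant and independent risk each generation:
# what per-generation extinction risk keeps the 40-generation risk below 1%?
p.total <- 0.01
gens    <- 40
1 - (1 - p.total)^(1 / gens)   # ~0.00025, i.e., less than 0.025% per generation
```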

Some people, however, baulk at the notion of making generalisations in ecology (funny – I was always under the impression that was exactly what we were supposed to be doing as scientists – finding how things worked in most situations, such that the mechanisms become clearer and clearer – call me a dreamer).

So when we were attacked in several high-profile journals, it came as something of a surprise. The latest lashing came in the form of a Trends in Ecology and Evolution article. We wrote a (necessarily short) response to that article, identifying its inaccuracies and contradictions, but we were unable to expand completely on the inadequacies of that article. However, I’m happy to say that now we have, and we have expanded our commentary on that paper into a broader review. Read the rest of this entry »





Cleaning up the rubbish: Australian megafauna extinctions

15 11 2013

A few weeks ago I wrote a post about how to run the perfect scientific workshop, which most of you thought was a good set of tips (bizarrely, one person was quite upset with the message; I saved him the embarrassment of looking stupid online and refrained from publishing his comment).

As I mentioned at the end of post, the stimulus for the topic was a particularly wonderful workshop 12 of us attended at beautiful Linnaeus Estate on the northern coast of New South Wales (see Point 5 in the ‘workshop tips’ post).

But why did a group of ecological modellers (me, Barry Brook, Salvador Herrando-Pérez, Fréd Saltré, Chris Johnson, Nick Beeton), geneticists, palaeontologists (Gav Prideaux), fossil dating specialists (Dizzy Gillespie, Bert Roberts, Zenobia Jacobs) and palaeo-climatologists (Michael Bird, Chris Turney [in absentia]) get together in the first place? Hint: it wasn’t just for the beautiful beach and good wine.

I hate to say it – mainly because it deserves as little attention as possible – but the main reason is that we needed to clean up a bit of rubbish. The rubbish in question is the latest bit of excrescence growing on that accumulating heap produced by a certain team of palaeontologists promulgating their ‘it’s all about the climate or nothing’ broken record.

Read the rest of this entry »





Don’t blame it on the dingo

21 08 2013

Our postdoc, Tom Prowse, has just had one of the slickest sets of reviews I’ve ever seen, followed by a quick acceptance of what I think is a pretty sexy paper. Earlier this year his paper in Journal of Animal Ecology showed that the thylacine (the badly named ‘Tasmanian tiger‘) was most likely not the victim of some unobserved mystery disease, but instead succumbed to what many large predators have/will: human beings. His latest effort now online in Ecology shows that the thylacine and devil extinctions on the Australian mainland were similarly the result of humans and not the scapegoat dingo. But I’ll let him explain:

‘Regime shifts’ can occur in ecosystems when sometimes even a single component is added or changed. Such additions, of say a new predator, or changes such as a rise in temperature, can fundamentally alter core ecosystem functions and processes, causing the ecosystem to switch to some alternative stable state.

Some of the most striking examples of ecological regime shifts are the mass extinctions of large mammals (‘megafauna’) during human prehistory. In Australia, human arrival and subsequent hunting pressure is implicated in the rapid extinction of about 50 mammal species by around 45 thousand years ago. The ensuing alternative stable state comprised a reduced diversity of predators, dominated by humans and two native marsupial predators – the thylacine (also known as the marsupial ‘tiger’ or ‘wolf’) and the devil (which is now restricted to Tasmania and threatened by a debilitating, infectious cancer).

Both thylacines and devils lasted on mainland Australia for over 40 thousand years following the arrival of humans. However, a second regime shift resulted in the extinction of both these predators by about 3 thousand years ago, which was coincidentally just after dingoes were introduced to Australia. Dingoes are descended from early domestic dogs and were introduced to northern Australia from Asia by ancient traders approximately 4 thousand years ago. Today, they are Australia’s only top predator remaining, other than invasive European foxes and feral cats. Since the earliest days of European settlement, dingoes have been persecuted because they prey on livestock. During the 1880s, 5614 km of ‘dingo fence’ was constructed to protect south-east Australia’s grazing rangelands from dingo incursions. The fence is maintained to this day, and dingoes are poisoned and shot both inside and outside this barrier, despite mounting evidence that these predators play a key role in maintaining native ecosystems, largely by suppressing invasive predators.

Perhaps because the public perception of dingoes as ‘sheep-killers’ is so firmly entrenched, it has been commonly assumed that dingoes killed off the thylacines and devils on mainland Australia. People who support this view also point out that thylacines and devils persisted on the island of Tasmania, which was never colonised by dingoes (although thylacines went extinct there too in the early 1900s). To date, most discussion of the mainland thylacine and devil extinctions has focused on the possibility that dingoes disrupted the system by ‘exploitation competition’ (eating the same prey), ‘interference competition’ (wasting the native predators’ precious munching time), as well as ‘direct predation’ (dingoes actually eating devils and thylacines). Read the rest of this entry »





Saving world’s most threatened cat requires climate adaptation

23 07 2013

© CSIC Andalusia Audiovisual Bank/H. Garrido

The Iberian lynx is the world’s most threatened cat, with recent counts estimating only 250 individuals surviving in the wild. Recent declines of the Iberian lynx have been associated with sharp regional reductions in the abundance of its main prey, the European rabbit, caused mainly by myxomatosis and rabbit haemorrhagic disease. At present, only two Iberian lynx populations persist in the wild compared with nine in the 1990s.

Over €90 million has been spent since 1994 to mitigate the extinction risk of this charismatic animal, mainly through habitat management, reduction of human-caused mortality and, more recently, translocation. Although lynx abundance might have increased in the last ten years in response to intensive management, a new study published in Nature Climate Change warns that the ongoing conservation strategies could buy just a few decades before the species goes extinct.

The study led by Damien Fordham from The Environment Institute (The University of Adelaide) and Miguel Araújo from the Integrative Biogeography and Global Change Group (Spanish Research Council) shows that climate change could lead to a rapid and severe decrease in lynx abundance in coming decades, and probably result in its extinction in the wild within 50 years. Current management efforts could be futile if they do not take into account the combined effects of climate change, land use and prey abundance on population dynamics of the Iberian Lynx.

Read the rest of this entry »





Software tools for conservation biologists

8 04 2013

Given the popularity of certain prescriptive posts on ConservationBytes.com, I thought it prudent to compile a list of software that my lab and I have found particularly useful over the years. This list is not meant to be comprehensive, but it will give you a taste for what’s out there. I don’t list the plethora of conservation genetics software that is available (generally given my lack of experience with it), but if this is your chosen area, I’d suggest starting with Dick Frankham‘s excellent book, An Introduction to Conservation Genetics.

1. R: If you haven’t yet loaded the open-source R programming language on your machine, do it now. It is the single most useful bit of statistical and programming software available to anyone anywhere in the sciences. Don’t worry if you’re not a fully fledged programmer – there are now enough people using and developing sophisticated ‘libraries’ (packages of functions) that there’s pretty much an application for everything these days. We tend to use R to the exclusion of almost any other statistical software because it makes you learn the technique rather than just blindly pressing the ‘go’ button. You could also stop right here – with R, you can do pretty much everything else that the software listed below does; however, you have to be an exceedingly clever programmer and have a lot of spare time. R can also sometimes get bogged down when too much RAM is filled, in which case other languages such as Python, or compiled languages such as C#, are useful.

2. VORTEX/OUTBREAK/META-MODEL MANAGER, etc.: This suite of individual-based projection software was designed by Bob Lacy & Phil Miller initially to determine the viability of small (usually captive) populations. The original VORTEX has grown into a multi-purpose, powerful and sophisticated population viability analysis package that now links to its cousin applications like OUTBREAK (the only off-the-shelf epidemiological software in existence) via the ‘command centre’ META-MODEL MANAGER (see examples here and here from our lab). There are other add-ons that make almost any population projection and hindcasting application possible. And it’s all free! (warning: currently unavailable for Mac, although I’ve been pestering Bob to do a Mac version).

3. RAMAS: RAMAS is the go-to application for spatial population modelling. Developed by the extremely clever Resit Akçakaya, this is one of the only tools that incorporates spatial meta-population aspects with formal, cohort-based demographic models. It’s also very useful in a climate-change context when you have projections of changing habitat suitability as the base layer onto which meta-population dynamics can be modelled. It’s not free, but it’s worth purchasing. Read the rest of this entry »





Want to work with us?

22 03 2013

© Beboy-Fotolia

Today we announced a HEAP of positions in our Global Ecology Lab for hot-shot, up-and-coming ecologists. If you think you’ve got what it takes, I encourage you to apply. The positions are all financed by the Australian Research Council from grants that Barry Brook, Phill Cassey, Damien Fordham and I have all been awarded in the last few years. We decided to do a bulk advertisement so that we maximise the opportunity for good science talent out there.

We’re looking for bright, mathematically adept people in palaeo-ecology, wildlife population modelling, disease modelling, climate change modelling and species distribution modelling.

The positions are self-explanatory, but if you want more information, just follow the links and contacts given below. For my own selfish interests, I provide a little more detail for two of the positions for which I’m directly responsible – but please have a look at the lot.

Good luck!

CJA Bradshaw

Job Reference Number: 17986 & 17987

The world-leading Global Ecology Group within the School of Earth and Environmental Sciences currently has multiple academic opportunities. For these two positions, we are seeking a Postdoctoral Research Associate and a Research Associate to work in palaeo-ecological modelling. Read the rest of this entry »





Science immortalised in cartoon

1 02 2013

Well, this is a first for me (us).

I’ve never had a paper of ours turned into a cartoon. The illustrious and brilliant ‘First Dog on the Moon‘ (a.k.a. Andrew Marlton), who is chief cartoonist for Australia’s irreverent ‘Crikey‘ online news magazine, just parodied our Journal of Animal Ecology paper No need for disease: testing extinction hypotheses for the thylacine using multispecies metamodels that I wrote about last month here on ConservationBytes.com.

Needless to say, I’m chuffed as a chuffed thing.

Enjoy!

Stripey





No need for disease

7 01 2013

It’s human nature to abhor admitting an error, and I’d wager that it’s even harder for the average person (psycho- and sociopaths perhaps excepted) to admit being a bastard responsible for the demise of someone, or something else. Examples abound. Think of much of society’s unwillingness to accept responsibility for global climate disruption (how could my trips to work and occasional holiday flight be killing people on the other side of the planet?). Or, how about fishers refusing to believe that they could be responsible for reductions in fish stocks? After all, killing fish couldn’t possibly …er, kill fish? Another one is that bastion of reverse racism maintaining that ancient or traditionally living peoples (‘noble savages’) could never have wiped out other species.

If you’re a rational person driven by evidence rather than hearsay, vested interest or faith, then the above examples probably sound ridiculous. But rest assured, millions of people adhere to these points of view because of the phenomenon mentioned in the first sentence above. With this background then, I introduce a paper that’s almost available online (i.e., we have the DOI, but the online version is yet to appear). Produced by our extremely clever post-doc, Tom Prowse, the paper is entitled: No need for disease: testing extinction hypotheses for the thylacine using multispecies metamodels, and will soon appear in Journal of Animal Ecology.

Of course, I am biased being a co-author, but I think this paper really demonstrates the amazing power of retrospective multi-species systems modelling to provide insight into phenomena that are impossible to test empirically – i.e., questions of prehistoric (and in some cases, even data-poor historic) ecological change. The megafauna die-off controversy is one we’ve covered before here on ConservationBytes.com, and this is a related issue with respect to a charismatic extinction in Australia’s recent history – the loss of the Tasmanian thylacine (‘tiger’, ‘wolf’ or whatever inappropriate eutherian epithet one unfortunately chooses to apply). Read the rest of this entry »





Conservation catastrophes

22 02 2012

David Reed

The title of this post serves two functions: (1) to introduce the concept of ecological catastrophes in population viability modelling, and (2) to acknowledge the passing of the bloke who came up with a clever way of dealing with that uncertainty.

I’ll start with the latter. It came to my attention late last year that a fellow conservation biologist colleague, Dr. David Reed, died unexpectedly from congestive heart failure. I did not really mourn his passing, for I had never met him in person (I believe it is disingenuous, discourteous, and slightly egocentric to mourn someone whom you do not really know personally – but that’s just my opinion), but I did think at the time that the conservation community had lost another clever progenitor of good conservation science. As many CB readers already know, we lost a great conservation thinker and doer last year, Professor Navjot Sodhi (and that, I did take personally). Coincidentally, both Navjot and David died at about the same age (49 and 48, respectively). I hope that being in one’s late 40s isn’t a particularly perilous age for people in my line of business!

My friend, colleague and lab co-director, Professor Barry Brook, did, however, work a little with David, and together they published some pretty cool stuff (see References below). David was particularly good at looking for cross-taxa generalities in conservation phenomena, such as minimum viable population sizes, effects of inbreeding depression, applications of population viability analysis and extinction risk. But more on some of that below. Read the rest of this entry »





Better SAFE than sorry

30 11 2011

Last day of November already – I am now convinced that my suspicions are correct: time is not constant and in fact accelerates as you age (in mathematical terms, a unit of time becomes a progressively smaller proportion of the time elapsed since your birth, so this makes sense). But, I digress…

This short post will act mostly as a spruik for my upcoming talk at the International Congress for Conservation Biology next week in Auckland (10.30 in New Zealand Room 2 on Friday, 9 December) entitled: Species Ability to Forestall Extinction (SAFE) index for IUCN Red Listed species. The post also sets a bit of the backdrop to this paper and why I think people might be interested in attending.

As regular readers of CB will know, we published a paper this year in Frontiers in Ecology and the Environment describing a relatively simple metric we called SAFE (Species Ability to Forestall Extinction) that could enhance the information provided by the IUCN Red List of Threatened Species for assessing relative extinction threat. I won’t go into all the detail here (you can read more about it in this previous post), but I do want to point out that it ended up being rather controversial.
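For readers who want to see the flavour of the calculation, here is a hedged sketch in R. I am assuming, for illustration only, that the index boils down to the difference in orders of magnitude between a species’ current abundance and a reference minimum viable population size (set here to 5000); see the Frontiers paper itself for the exact definition and recommended reference values.

```r
# Sketch only: assumes a SAFE-style index = log10(N) - log10(reference MVP);
# the reference value of 5000 is used purely for illustration
safe.index <- function(N, MVP = 5000) log10(N) - log10(MVP)

safe.index(c(50, 500, 5000, 50000))
# negative values sit below the reference MVP (relatively more threatened);
# positive values sit above it
```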

The journal ended up delaying final publication because there were 3 groups who opposed the metric rather vehemently, including people who are very much in the conservation decision-making space and/or involved directly with the IUCN Red List. The journal eventually published our original paper, the 3 critiques, and our collective response in the same issue (you can read these here if you’re subscribed, or email me for a PDF reprint). Again, I won’t go into any detail here because our arguments are clearly outlined in the response.

What I do want to highlight is that even beyond the normal in-print tête-à-tête the original paper elicited, we were emailed by several people behind the critiques who were apparently unsatisfied with our response. We found this slightly odd, because many of the objections just kept getting re-raised. Of particular note were the accusations that: Read the rest of this entry »