And this little piggy went extinct

24 11 2021

Back in June of this year I wrote (whinged) about the disappointment of writing a lot of ecological models that were rarely used to assist real-world wildlife management. However, I did hint that another model I wrote had assisted one government agency with pig management on Kangaroo Island.

Well, now that report has been published online and I’m permitted to talk about it. I’m also very happy to report that, in the words of the Government of South Australia’s Department of Primary Industries and Regions (PIRSA),

Modelling by the Flinders University Global Ecology Laboratory shows the likelihood and feasibility of feral pig eradication under different funding and eradication scenarios. With enough funding, feral pigs could be eradicated from Kangaroo Island in 2 years.

This basically means that because of the model, PIRSA succeeded in obtaining enough funding to make the eradication of feral pigs from Kangaroo Island feasible!

Why is it important to get rid of feral pigs? They are a major pest on the Island, causing severe economic and environmental damage to both farms and native ecosystems. On the agricultural side of things, they prey on newborn lambs, eat crops, and compete with livestock for pasture. Feral pigs damage natural habitats by uprooting vegetation and fouling waterholes. They can also spread weeds and damage infrastructure, as well as act as hosts of parasites and diseases (e.g., leptospirosis, tuberculosis, foot-and-mouth disease) that pose serious threats to industry, wildlife, and even humans.

Read the rest of this entry »




Free resources for learning (and getting better with) R

15 11 2021

While I’m currently in Github mode (see previous post), I thought I’d share a list of resources I started putting together for learning and upskilling in the R programming language.

If you don’t know what R is, this probably won’t be of much use to you. But if you are a novice user, want to improve your skills, or just have access to a kick-arse list of cheatsheets, then this Github repository should be useful.

I started putting this list together for members of the Australian Research Council Centre of Excellence for Australian Biodiversity and Heritage, but I see no reason why it should be limited to that particular group of people.

I don’t claim that this list is exhaustive, nor do I vouch for the quality of any of the listed resources. Some of them are deprecated and fairly old too, so be warned.

The first section includes online resources such as short courses, reference guides, analysis demos, tips for more-efficient programming, better plotting guidelines, as well as some R-related mini-universes like markdown, ggplot, Shiny, and tidyverse.

The section following is a list of popular online communities, list-servers, and blogs that help R users track down advice for solving niggly coding and statistical problems.

The next section is a whopping-great archive of R cheatsheets, covering everything from the basics to plotting, cartography, databasing, applications, time-series analysis, machine learning, time & date, building packages, parallel computing, resampling methods, markdown, and more.

Read the rest of this entry »




Want a permanent DOI assigned to your data and code? Follow this simple recipe

2 11 2021

These days, with data and code often required to be open-source, licenced, and fully trackable for most manuscript submissions to a peer-reviewed journal, it’s easy to get lost in the multitude of platforms and options available. In most cases, we no longer have much of a choice in the matter, even if we are reluctant (although the benefits of posting your data and code online immediately far outweigh any potential disadvantages).

But do you post your data and code on the Open Science Framework (free), Github (free), Figshare (free), Zenodo (free, but donations encouraged), Dryad ($), or Harvard Dataverse (free) (and so on, and so on, …)? Pick your favourite. Then, even once you have solved that first dilemma, another issue arises: how do you obtain a digital object identifier (DOI) for your data and/or code?

Again, there are many ways to do this, and some methods are more automated than others. That said, I do have a preference that is rather easy to implement, and I thought I’d share it with you here.

The first requirement is getting yourself a (free) Github account. What’s Github? Github is one of the world’s largest communities of developers, where code of all manner of types and uses can be developed, shared, updated, collaborated on, shipped, and maintained. It might seem a bit overwhelming for non-developers, but if you strip it down to its basics, it’s straightforward to use as a simple repository for your code and data. Of course, Github is designed for much more than just this (software-development collaboration being one of its main purposes), but you don’t need to worry about that for now.

Step 1

Once you create an account, you can start creating ‘repositories’, which are essentially just sections of your account dedicated to specific code (and data). I mostly code in R, so I upload my R code text files and associated datasets to these repositories, and spend a good deal of effort on making the Readme.md file highly explanatory and easy to follow. You can check out some of mine here.

Ok. So, you have a repository with some code and data, you’ve explained what’s going on and how the code works in the Readme file, and now you want a permanent DOI that will point to the repository (and any updates) for all time.

Github doesn’t do this by itself, but it integrates seamlessly with another platform — Zenodo — that does. Oh no! Not another platform! Yes, I’m afraid so, but it’s not as painful as you might expect.

Read the rest of this entry »




PhD opportunity in control strategies of feral deer

30 09 2021

In collaboration with Biosecurity South Australia, the Global Ecology Lab at Flinders University is happy to announce a wonderful new PhD opportunity in feral deer control strategies for South Australia.

The project is tentatively entitled: Refining models of feral deer abundance and distribution to inform culling programs in South Australia

Feral fallow deer (Dama dama) digging in a mallee fowl (Leipoa ocellata) mound © Lee Williams

The project brief follows:

South Australian legislation requires that all landholders cull feral deer on their properties. Despite this, feral deer abundance and distribution are increasing across South Australia. This is because culling by land managers and government organisations is not keeping pace with rates of population growth, some landholders are harbouring deer for hunting, and some deer escape from deer farms.

There are an estimated 40,000 feral deer in South Australia, and state government agencies are working to ramp up programs to cull feral deer before their numbers reach a point where control is no longer feasible.

Planning such large-scale and costly programs requires that government agencies engage economists to measure the economic impacts of feral deer, and to predict the value of these impacts in the future. That modelling is done regularly by governments, and in the case of pest-control programs, the modelling draws on models of feral deer population growth, farmer surveys about the economic, social, and environmental impacts of feral deer, and analyses of culling programs and trials of new culling techniques.

The economic models predict and compare both the current and future costs of:

  • deer impacts on pastures, crops, native plants, and social values (including illegal hunting)
  • culling programs that achieve different objectives (e.g., contain vs. reduce vs. eradicate)

The outputs of the models also inform whether there are sufficient public benefits from the investment of public funds into the culling of feral deer.


This PhD project will collate published and unpublished data to refine models of feral deer distribution and abundance under various culling scenarios. This project will drive both high-impact publications and, because this project builds extensive collaborations with government agencies, the results will inform the management of feral deer in South Australia.

Read the rest of this entry »




It’s a tough time for young conservation scientists

24 08 2021

Sure, it’s a tough time for everyone, isn’t it? But it’s a lot worse for the already disadvantaged, and it’s only going to go downhill from here. I suppose that most people who read this blog can certainly think of myriad ways they are, in fact, still privileged and very fortunate (I know that I am).

Nonetheless, quite a few of us, I suspect, are rather ground down by the onslaught of bad news, some of which I’ve been responsible for perpetuating myself. Add lockdowns, dwindling job security, and the prospect of dying tragically from a lung infection, and it’s no wonder many have become exasperated.

I once wrote that being a conservation scientist is a particularly depressing job, because in our case, knowledge is a source of despair. But as I’ve shifted my focus from ‘preventing disaster’ to trying to lessen the degree of future shittyness, I find it easier to get out of bed in the morning.

What can we do in addition to shifting our focus to making the future a little less shitty than it could otherwise be? I have a few tips that you might find useful:

Read the rest of this entry »




Smart genetic analysis made fast and easy

29 07 2021

If you use genetics to differentiate populations, the new package smartsnp might be your new friend. Written in R and available from GitHub and CRAN, this package does principal component analysis with control for genetic drift, projects ancient samples onto modern genetic space, and tests for population differences in genotypes. The package has been built to load big datasets and run complex stats in the blink of an eye, and is fully described in a paper published in Methods in Ecology and Evolution (1).
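To give a feel for what ‘projecting ancient samples onto modern genetic space’ means, here is a minimal, purely conceptual sketch in Python with made-up genotype data (smartsnp itself is an R package, and SMARTPCA additionally scales SNPs by an estimate of allele-frequency variance; none of that detail is reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy genotype matrix: 20 modern individuals x 100 SNPs coded 0/1/2
# (random data purely for illustration)
modern = rng.integers(0, 3, size=(20, 100)).astype(float)

# Centre each SNP on its modern-sample mean (SMARTPCA additionally
# scales by an estimate of allele-frequency variance; omitted here)
means = modern.mean(axis=0)
centred = modern - means

# Principal components of the modern samples via singular value
# decomposition; the rows of Vt are the SNP loadings
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
pcs_modern = centred @ Vt[:2].T        # coordinates on PC1 and PC2

# 'Project' an ancient sample onto the modern genetic space: centre it
# with the *modern* means and reuse the modern SNP loadings
ancient = rng.integers(0, 3, size=100).astype(float)
pcs_ancient = (ancient - means) @ Vt[:2].T
print(pcs_modern.shape, pcs_ancient.shape)
```

The key trick is that the ancient sample never influences the axes: the loadings come entirely from the modern samples.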


In the bioinformatics era, sequencing a genome has never been so straightforward. No surprise that > 20 petabytes of genomic data are expected to be generated every year by the end of this decade (2) — if 1 byte of information was 1 mm long, we could make 29,000 round trips to the moon with 20 petabytes. Data size in genetics keeps outpacing the computer power available to handle it at any given time (3). Many will be familiar with a computer freezing when it cannot load or run an analysis on a huge dataset, and with how many coffees or teas we might drink (or computer screens we might break) during the wait. The bottom line is that software advances that speed up data processing and genetic analysis are always good news.

With that idea in mind, I have just published a paper presenting the new R package smartsnp (1) to run multivariate analysis of big genotype data, with applications to studies of ancestry, evolution, forensics, lineages, and overall population genetics. I am proud to say that the development of the package has been one of the most gratifying short-term collaborations in my entire career, with my colleagues Christian Huber and Ray Tobler: a true team effort!

The package is available on GitHub and the Comprehensive R Archive Network CRAN. See downloading options here, and vignettes here with step-by-step instructions to run different functionalities of our package (summarised below).

In this blog, I use “genotype” to mean the combination of gene variants (alleles) across a predefined set of positions (loci) in the genome of a given individual animal, human, microbe, or plant. One type of those variants is the single nucleotide polymorphism (SNP), a DNA locus at which two or more alternative nucleotides occur, sometimes conditioning protein translation or gene expression. SNPs are relatively stable over time and are routinely used to identify individuals and ancestors in humans and wildlife.

What the package does

The package smartsnp is partly based on the field-standard software EIGENSOFT (4, 5), which is only available for Unix command-line environments. In fact, our driving motivation was (i) to broaden the use of EIGENSOFT tools by making them available to the rocketing community of professionals (not only academics) who employ R for their work (6), and (ii) to optimise our package to handle big datasets and complex stats efficiently. Our package mimics EIGENSOFT’s principal component analysis (SMARTPCA) (5), and also runs multivariate tests for population differences in genotypes as follows:

Read the rest of this entry »




… some (models) are useful

8 06 2021

As someone who writes a lot of models (many for applied questions in conservation management, e.g., harvest quotas, eradication targets, minimum viable population sizes) and who supervises people writing even more of them, I’ve had many different experiences with their uptake and implementation by management authorities.

Some of those experiences have involved catastrophic failures to influence any management or policy. One particularly painful memory relates to a model we wrote to assist with optimising approaches to eradicate (or at least, reduce the densities of) feral animals in Kakadu National Park. We even wrote the bloody thing in Visual Basic (a horrible coding language) so people could run the model in Excel. As far as I’m aware, no one ever used it.

Others have been accepted more readily, such as a shark-harvest model, which (I think, but have no evidence to support) has been used to justify fishing quotas, and one we’ve done recently for the eradication of feral pigs on Kangaroo Island (as yet unpublished) has led directly to increased funding to the agency responsible for the programme.

According to Altmetric (and the online tool I developed to get paper-level Altmetric information quickly), only 3 of the 16 papers I’d call my most ‘applied’ modelling papers have been cited in policy documents:

Read the rest of this entry »




Killing (feral) cats quickly (and efficiently)

20 05 2021

I’m pleased to announce the publication of a paper led by Kathryn Venning (KV) that was derived from her Honours work in the lab. Although she’s well into her PhD on an entirely different topic, I’m overjoyed that she persevered and saw this work to publication.

Here, killa, killa, killa, killa …

As you probably already know, feral cats are a huge problem in Australia. They are probably the primary reason Australia leads the world in mammal extinctions in particular, and largely the reason so many re-introduction attempts of threatened marsupials fail miserably after only a few years.

Feral cats occupy every habitat in the country, from the high tropics to the deserts, and from the mountains to the sea. They adapt to the cold just as easily as they adapt to the extreme heat, and they can eat just about anything that moves, from invertebrates to the carcases of much larger animals that they scavenge.

Cats are Australia’s bane, but you can’t help but be at least a little impressed with their resilience.

Still, we have to try our best to get rid of them where we can, or at least reduce their densities to the point where their ecological damage is limited.

Typically, the only efficient and cost-effective way to do that is lethal control, via various means. These include direct shooting, trapping, aerial poison-baiting, and a new ‘smart’ method of targeted poison delivery via a prototype device known as a Felixer™️. The latter is particularly useful for passive control in areas where ground-shooting access is difficult.

A live Felixer™️ deployed on Kangaroo Island (photo: CJA Bradshaw 2020)

A few years back the federal government committed what might seem like a sizeable amount of money to ‘eradicate’ cats from Australia. Yeah, good luck with that, although the money has been allocated to several places where cat reduction and perhaps even eradication is feasible. Namely, on islands.

Read the rest of this entry »




Mapping the ‘super-highways’ the First Australians used to cross the ancient land

4 05 2021

Author provided/The Conversation


There are many hypotheses about where the Indigenous ancestors first settled in Australia tens of thousands of years ago, but evidence is scarce.

Few archaeological sites date to these early times. Sea levels were much lower and Australia was connected to New Guinea and Tasmania in a land known as Sahul that was 30% bigger than Australia is today.

Our latest research advances our knowledge about the most likely routes those early Australians travelled as they peopled this giant continent.


Read more: The First Australians grew to a population of millions, much more than previous estimates


We are beginning to get a picture not only of where those first people landed in Sahul, but how they moved throughout the continent.

Navigating the landscape

Modelling human movement requires understanding how people navigate new terrain. Computers make building such models easier, but the models themselves are still far from simple. We reasoned we needed four pieces of information: (1) topography; (2) the visibility of tall landscape features; (3) the presence of freshwater; and (4) the demographics of the travellers.
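Under the hood, this kind of movement modelling boils down to finding cheap routes across a cost surface built from layers like those four. Here is a toy sketch (hypothetical cost values, with Dijkstra’s algorithm standing in for the study’s actual methods):

```python
import heapq

# Toy cost surface: traversal cost per grid cell. In a real analysis
# this would be built from layers such as terrain ruggedness, landmark
# visibility, and distance to freshwater (values here are made up)
cost = [
    [1, 1, 5, 5],
    [5, 1, 5, 1],
    [5, 1, 1, 1],
    [5, 5, 5, 1],
]

def least_cost(grid, start, goal):
    """Dijkstra's algorithm: cheapest cumulative cost of travelling
    from `start` to `goal` (cost of the start cell not counted)."""
    rows, cols = len(grid), len(grid[0])
    best = {start: 0}
    queue = [(0, start)]
    while queue:
        c, cell = heapq.heappop(queue)
        if cell == goal:
            return c
        if c > best.get(cell, float("inf")):
            continue                      # stale queue entry
        r, k = cell
        for nr, nk in ((r + 1, k), (r - 1, k), (r, k + 1), (r, k - 1)):
            if 0 <= nr < rows and 0 <= nk < cols:
                nc = c + grid[nr][nk]
                if nc < best.get((nr, nk), float("inf")):
                    best[(nr, nk)] = nc
                    heapq.heappush(queue, (nc, (nr, nk)))
    return None

print(least_cost(cost, (0, 0), (3, 3)))   # follows the cheap 'corridor'
```

Scaled up to a continental grid, the cheapest paths between many origin and destination cells trace out exactly the kind of ‘super-highways’ described above.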

We think people navigated in new territories — much as people do today — by focusing on prominent land features protruding above the relative flatness of the Australian continent. Read the rest of this entry »





Population of First Australians grew to millions, much more than previous estimates

30 04 2021

Shutterstock/Jason Benz Bennee


We know it is more than 60,000 years since the first people entered the continent of Sahul — the giant landmass that connected New Guinea, Australia and Tasmania when sea levels were lower than today.

But where the earliest people moved across the landscape, how fast they moved, and how many were involved, have been shrouded in mystery.

Our latest research, published today, shows that the establishment of populations in every part of this giant continent could have occurred in as little as 5,000 years. And the entire population of Sahul could have been as high as 6.4 million people.

This translates to more than 3 million people in the area that is now modern-day Australia, far more than any previous estimate.


Read more: We mapped the ‘super-highways’ the First Australians used to cross the ancient land


The first people could have entered through what is now western New Guinea or from the now-submerged Sahul Shelf off the modern-day Kimberley (or both).

But whichever the route, entire communities of people arrived, adapted to and established deep cultural connections with Country over 11 million square kilometres of land, from northwestern Sahul to Tasmania.


Map of what Australia looked like for most of the human history of the continent when sea levels were lower than today. Author provided


This equals a rate of population establishment of about 1 km per year (based on a maximum straight-line distance of about 5,000 km from the introduction point to the farthest point).

That’s doubly impressive when you consider the harshness of the Australian landscape in which people both survived and thrived.

Previous estimates of Indigenous population

Various attempts have been made to calculate the number of people living in Australia before European invasion. Estimates vary from 300,000 to more than 1,200,000 people. Read the rest of this entry »





The biggest and slowest don’t always bite it first

13 04 2021

For many years I’ve been interested in modelling the extinction dynamics of megafauna. Apart from co-authoring a few demographically simplified (or largely demographically free) models about how megafauna species could have gone extinct, I have never really tried to capture the full nuances of long-extinct species within a fully structured demographic framework.

That is, until now.

But how do you get the life-history data of an extinct animal that was never directly measured? Surely things like survival, reproductive output, longevity, and even environmental carrying capacity are impossible to discern? And aren’t these necessary for a stage-structured demographic model?

Thylacine mum & joey. Nellie Pease & CABAH

The answer to the first part of that question is ‘it’s possible’, and to the second, ‘yes’. The most important bit of information we palaeo-modellers need to construct something ecologically plausible for an extinct species is an estimate of body mass. Thankfully, palaeontologists are very good at estimating the mass of the things they dig up (with the associated caveats, of course). From such estimates, we can reconstruct everything from equilibrium densities and maximum rates of population growth to age at first breeding and longevity.
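To give a flavour of how such reconstructions work: many life-history rates scale with body mass as power laws (rate = a × M^b), with exponents near the classic quarter powers. The coefficients in the sketch below are illustrative placeholders, not the values used in our models:

```python
# Hypothetical allometric sketch: life-history rates scale with body
# mass M (kg) roughly as power laws, rate = a * M**b. The intercept
# and exponent here are illustrative placeholders only.
def max_growth_rate(mass_kg, a=0.6, b=-0.27):
    """Maximum annual rate of population increase (illustrative)."""
    return a * mass_kg ** b

# Bigger animals recover more slowly from decline
for mass in (10, 100, 1000, 2700):   # ~2,700 kg is Diprotodon-sized
    print(mass, round(max_growth_rate(mass), 3))
```

The same logic, with mass-based predictions of survival, fertility, age at first breeding, and density, is what lets us parameterise a full demographic matrix for a species no one ever measured alive.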

But it’s more complicated than that, of course. In Australia anyway, we’re largely dealing with marsupials (and some monotremes), and they have a rather different life-history mode than most placentals. We therefore have to ‘correct’ the life-history estimates derived from living placental species. Thankfully, evolutionary biologists and ecologists have ways to do that too.

The Pleistocene kangaroo Procoptodon goliah, the largest and most heavily built of the short-faced kangaroos, was the largest kangaroo known. It had an unusually short, flat face and forward-directed eyes, with a single large toe on each foot (reduced from the more typical count of four). Each forelimb had two long, clawed fingers that would have been used to bring leafy branches within reach.

So with a battery of ecological, demographic, and evolutionary tools, we can now create reasonable stochastic-demographic models for long-gone species, like wombat-like creatures as big as cars, birds more than two metres tall, and lizards more than seven metres long that once roamed the Australian continent. 

Ancient clues, in the shape of fossils and archaeological evidence of varying quality scattered across Australia, have formed the basis of several hypotheses about the fate of megafauna that vanished during an extinction peak about 42,000 years ago from the ancient continent of Sahul, comprising mainland Australia, Tasmania, New Guinea and neighbouring islands.

There is a growing consensus that multiple factors were at play, including climate change, the impact of people on the environment, and access to freshwater sources.

Just published in the open-access journal eLife, our latest CABAH paper applies these approaches to assess how susceptible different species were to extinction – and what it means for the survival of species today. 

Using various characteristics such as body size, weight, lifespan, survival rate, and fertility, we (Chris Johnson, John Llewelyn, Vera Weisbecker, Giovanni Strona, Frédérik Saltré & me) created population simulation models to predict the likelihood of these species surviving under different types of environmental disturbance.

Simulations included everything from increasing droughts to increasing hunting pressure to see which species of 13 extinct megafauna (genera: Diprotodon, Palorchestes, Zygomaturus, Phascolonus, Procoptodon, Sthenurus, Protemnodon, Simosthenurus, Metasthenurus, Genyornis, Thylacoleo, Thylacinus, Megalibgwilia), as well as 8 comparative species still alive today (Vombatus, Osphranter, Notamacropus, Dromaius, Alectura, Sarcophilus, Dasyurus, Tachyglossus), had the highest chances of surviving.
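The logic of such simulations can be illustrated with a toy scalar model (a far cry from the stage-structured, stochastic models we actually used, and with all parameter values hypothetical): project abundance forward under random environmental variation and a fixed proportional offtake, and count how often the population falls below a quasi-extinction threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def extinction_probability(r_mean, r_sd, offtake, n0=1000.0, years=500,
                           trials=200, threshold=50.0):
    """Fraction of stochastic projections that fall below a
    quasi-extinction threshold (toy scalar model, not the paper's
    stage-structured simulations)."""
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(years):
            r = rng.normal(r_mean, r_sd)       # environmental stochasticity
            n *= np.exp(r) * (1.0 - offtake)   # growth, then offtake (hunting)
            if n < threshold:
                extinct += 1
                break
    return extinct / trials

# A slow-growing, 'megafauna-like' species vs a fast-growing one under
# the same hunting pressure (all parameter values hypothetical)
print(extinction_probability(r_mean=0.01, r_sd=0.15, offtake=0.05))
print(extinction_probability(r_mean=0.10, r_sd=0.15, offtake=0.05))
```

Even this crude version shows the core mechanism: a species whose intrinsic growth rate cannot absorb the offtake almost always winks out, while a fast grower shrugs off the same pressure.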

We compared the results to what we know about the timing of extinction for different megafauna species derived from dated fossil records. We expected to confirm that the most extinction-prone species were the first species to go extinct – but that wasn’t necessarily the case.

While we did find that slower-growing species with lower fertility, like the rhino-sized wombat relative Diprotodon, were generally more susceptible to extinction than more-fecund species like the marsupial ‘tiger’ thylacine, the relative susceptibility rank across species did not match the timing of their extinctions recorded in the fossil record.

Indeed, we found no clear relationship between a species’ inherent vulnerability to extinction — such as being slower and heavier and/or slower to reproduce — and the timing of its extinction in the fossil record.

In fact, we found that most of the living species used for comparison — such as short-beaked echidnas, emus, brush turkeys, and common wombats — were more susceptible on average than their now-extinct counterparts.

Read the rest of this entry »




How to reduce the probability of being killed by a shark

31 03 2021

Easy. Don’t go swimming/surfing/snorkelling/diving in the ocean.


“Oh, shit”

Sure, that’s true, but if you’re like many Australians, the sea is not just a beautiful thing to look at from the window, it’s a way of life. Try telling a surfer not to surf, or a diver not to dive. Good luck with that.

A few years ago, I joined a team of super-cool sharkologists — led by Charlie ‘Aussie-by-way-of-Belgium shark-scientist extraordinaire’ Huveneers, and including Maddie ‘Chomp’ Thiele and Lauren ‘Acid’ Meyer — to publish the results of some of the first experimentally tested shark deterrents.

It turns out that many of the deterrents we tested failed to show any reduction in the probability of a shark biting, with only one type of electronic deterrent showing any effect at all (~ 60% reduction).

Great. But what might that mean in terms of how many people could be saved by wearing such electronic deterrents? While the probability of being bitten by a shark is low globally, even in Australia (despite public perceptions), we wondered if the number of lives saved and injuries avoided was substantial.

In a new paper just published today in Royal Society Open Science, we attempted to answer that question.

To predict how many people could avoid shark bites if they were using properly donned electronic deterrents that demonstrate some capacity to dissuade sharks from biting, we examined the century-scale time series of shark bites on humans in Australia. This database — the ‘Australian Shark Attack File’ — is one of the most comprehensive databases of its kind.
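At its simplest, the expected number of bites avoided is a product of exposure, uptake, and efficacy. A back-of-the-envelope sketch (the annual bite count and uptake fractions below are placeholders, not values from the study or the Australian Shark Attack File):

```python
# Back-of-the-envelope version of the question (all numbers are
# hypothetical placeholders, not values from the study)
def bites_avoided(annual_bites, uptake, efficacy=0.6):
    """Expected annual shark bites avoided if a fraction `uptake` of
    water users wear a deterrent with the given efficacy (~ 60% for
    the one electronic deterrent that showed an effect)."""
    return annual_bites * uptake * efficacy

# How the answer scales with how many people actually wear one
for uptake in (0.1, 0.5, 1.0):
    print(uptake, bites_avoided(annual_bites=20, uptake=uptake))
```

The real analysis is of course richer (trends over a century of data, projections of future exposure), but uptake is clearly the lever: a deterrent no one wears avoids nothing.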

Read the rest of this entry »




Need to predict population trends, but can’t code? No problem

2 12 2020

Yes, yes. I know. Another R Shiny app.

However, this time I’ve strayed from my recent bibliometric musings and developed something that’s more compatible with the core of my main research and interests.

Welcome to LeslieMatrixShiny!

Over the years I’ve taught many students the basics of population modelling, with the cohort-based approaches dominating the curriculum. Of these, the simpler ‘Leslie’ (age-classified) matrix models are the easiest to understand, and the ones for which data can most often be obtained without too many dramas.
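For the curious, the code behind a basic Leslie matrix projection really is short. Here is a minimal sketch with made-up vital rates (fertilities across the top row, survival probabilities on the sub-diagonal); I use Python here, but the R version is just as brief:

```python
import numpy as np

# A hypothetical Leslie matrix for three age classes: the top row holds
# per-capita fertilities, the sub-diagonal holds annual survival
# probabilities (all values illustrative only)
L = np.array([
    [0.0, 1.2, 1.8],   # offspring produced by age classes 1, 2, 3
    [0.5, 0.0, 0.0],   # survival from age class 1 to 2
    [0.0, 0.7, 0.0],   # survival from age class 2 to 3
])

n = np.array([100.0, 50.0, 20.0])   # starting abundance per age class

# Project the population 20 years ahead: n(t+1) = L n(t)
for _ in range(20):
    n = L @ n

# The long-term annual growth rate is the dominant eigenvalue of L
lam = max(abs(np.linalg.eigvals(L)))
print(round(float(lam), 3))
```

Add random draws on the survival and fertility terms and you have a stochastic model, which is exactly what the app wraps up behind a point-and-click interface.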

But unless you’re willing to sit down and learn the code, they can be daunting to the novice.

Sure, there are plenty of software alternatives out there, such as Bob Lacy‘s Vortex (a free individual-based model available for PCs only), Resit Akçakaya & co’s RAMAS Metapop ($; PC only), Stéphane Legendre‘s Unified Life Models (ULM; open-source; all platforms), and Charles Todd‘s Essential (open-source; PC only) to name a few. If you’re already an avid R user and already into population modelling, you might be familiar with the population-modelling packages popdemo, OptiPopd, or sPop. I’m sure there are still other good resources out there of which I’m not aware.

But even installing the relevant software or invoking particular packages in R takes a bit of time and learning. It’s probably safe to assume that many people find the prospect daunting.

It’s for this reason that I turned my newly acquired R Shiny skills to matrix population models so that even complete coding novices can run their own stochastic population models.

I call the app LeslieMatrixShiny.

Read the rest of this entry »




Grand Challenges in Global Biodiversity Threats

8 10 2020

Last week I mentioned that the new journal Frontiers in Conservation Science is now open for business. As promised, I wrote a short article outlining our vision for the Global Biodiversity Threats section of the journal. It’s open-access, of course, so I’m also copying it here on ConservationBytes.com.


Most conservation research and its applications tend to happen at reasonably fine spatial and temporal scales — for example, mesocosm experiments, single-species population viability analyses, recovery plans, patch-level restoration approaches, site-specific biodiversity surveys, et cetera. At the other end of the scale spectrum, there have been many overviews of biodiversity loss and degradation, accompanied by the development of multinational policy recommendations to encourage more sustainable decision-making at lower levels of sovereign governance (e.g., national, subnational).

Yet truly global research in conservation science is in fact comparatively rare, as poignantly demonstrated by the debates surrounding the evidence for and measurement of planetary tipping points (Barnosky et al., 2012; Brook et al., 2013; Lenton, 2013). Apart from the planetary scale of human-driven disruption to Earth’s climate system (Lenton, 2011), both scientific evidence and policy levers tend to be applied most often at finer, more tractable research and administrative scales. But as the massive ecological footprint of humanity has grown exponentially over the last century (footprintnetwork.org), robust, truly global-scale evidence of our damage to the biosphere is now starting to emerge (Díaz et al., 2019). Consequently, our responses to these planet-wide phenomena must also become more global in scope.

Conservation scientists are adept at chronicling patterns and trends — from the thousands of vertebrate surveys indicating an average reduction of 68% in the numbers of individuals in populations since the 1970s (WWF, 2020), to global estimates of modern extinction rates (Ceballos and Ehrlich, 2002; Pimm et al., 2014; Ceballos et al., 2015; Ceballos et al., 2017), future models of co-extinction cascades (Strona and Bradshaw, 2018), the negative consequences of invasive species across the planet (Simberloff et al., 2013; Diagne et al., 2020), discussions surrounding the evidence for the collapse of insect populations (Goulson, 2019; Komonen et al., 2019; Sánchez-Bayo and Wyckhuys, 2019; Cardoso et al., 2020; Crossley et al., 2020), the threats to soil biodiversity (Orgiazzi et al., 2016), and the ubiquity of plastic pollution (Beaumont et al., 2019) and other toxic substances (Cribb, 2014), to name only some of the major themes in global conservation. 

Read the rest of this entry »




History of species distribution models

21 07 2020

This little historical overview by recently completed undergraduate student, Sofie Costin (soon to join our lab!), nicely summarises the history, strengths, and limitations of species distribution modelling in ecology, conservation and restoration. I thought it would be an excellent resource for those who are just entering the world of species distribution models.

SDM

Of course, there is a strong association between a given species and its environment1. As such, climate and geographical factors have often been used to explain the distribution of plant and animal species around the world.

Predictive ecological models, otherwise known as ‘niche models’ or ‘species distribution models’ have become a widely used tool for the planning of conservation strategies such as pest management and translocations2-5. In short, species distribution models assess the relationship between environmental conditions and species’ occurrences, and then can estimate the spatial distribution of habitats suited to the study species outside of the sampling area3,6.
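The core idea (relate occurrences to environmental conditions, then predict suitability at unsampled sites) can be sketched in a few lines, here with synthetic data and a hand-rolled logistic regression; real species distribution models use far richer predictors and algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 'survey': occurrence (1/0) of a warm-adapted species as a
# function of one environmental predictor (annual mean temperature, °C)
temp = rng.uniform(5, 30, size=500)
true_prob = 1 / (1 + np.exp(-(temp - 18)))       # assumed true response
occurrence = (rng.random(500) < true_prob).astype(float)

# Fit a logistic regression by gradient descent (no external libraries;
# a rough fit is enough to illustrate the idea)
X = np.column_stack([np.ones_like(temp), temp])  # intercept + predictor
w = np.zeros(2)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.01 * X.T @ (p - occurrence) / len(temp)

# Predict habitat suitability at two unsampled sites (10 °C and 25 °C)
new_sites = np.array([[1.0, 10.0], [1.0, 25.0]])
suitability = 1 / (1 + np.exp(-new_sites @ w))
print(suitability)
```

Swap the single temperature column for a stack of climate and terrain layers, and the fitted response surface becomes a predicted distribution map.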

While species distribution models can reduce the time and cost associated with conservation research, and conservation managers are relying on them increasingly to inform their strategies4, they are by no means a one-stop solution to all conservation issues. Read the rest of this entry »





Did people or climate kill off the megafauna? Actually, it was both

10 12 2019

When freshwater dried up, so did many megafauna species.
Centre of Excellence for Australian Biodiversity and Heritage, Author provided

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Earth is now firmly in the grips of its sixth “mass extinction event”, and it’s mainly our fault. But the modern era is definitely not the first time humans have been implicated in the extinction of a wide range of species.

In fact, starting about 60,000 years ago, many of the world’s largest animals disappeared forever. These “megafauna” were first lost in Sahul, the supercontinent formed by Australia and New Guinea during periods of low sea level.

The causes of these extinctions have been debated for decades. Possible culprits include climate change, hunting or habitat modification by the ancestors of Aboriginal people, or a combination of the two.


Read more: What is a ‘mass extinction’ and are we in one now?


The main way to investigate this question is to build timelines of major events: when species went extinct, when people arrived, and when the climate changed. This approach relies on using dated fossils from extinct species to estimate when they went extinct, and archaeological evidence to determine when people arrived.


Read more: An incredible journey: the first people to arrive in Australia came in large numbers, and on purpose


Comparing these timelines allows us to deduce the likely windows of coexistence between megafauna and people.

We can also compare this window of coexistence to long-term models of climate variation, to see whether the extinctions coincided with or shortly followed abrupt climate shifts.
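To give a flavour of that timeline logic (using invented numbers, not our actual data or methods), one classical way to turn a handful of dated fossils into a confidence bound on extinction time is the Strauss & Sadler (1989) range extension, which can then be set against an archaeological estimate of human arrival:

```python
import numpy as np

# Invented fossil ages for one megafauna species (years before present)
fossil_ages = np.array([52000, 49400, 47200, 45100, 44300, 43800])

# Strauss & Sadler (1989)-style range extension: assuming the n dated
# fossils are recovered uniformly over the species' true time range R, a
# one-sided confidence bound on the true extinction date extends the
# youngest fossil age by x = R * ((1 - C)**(-1/(n - 1)) - 1)
n = len(fossil_ages)
R = float(fossil_ages.max() - fossil_ages.min())
C = 0.95  # confidence level
extension = R * ((1 - C) ** (-1 / (n - 1)) - 1)
extinction_bound = fossil_ages.min() - extension  # younger than last fossil

# Invented human-arrival estimate for the same region
human_arrival = 50000  # years before present

# Maximum window of coexistence implied by these made-up numbers
coexistence = human_arrival - extinction_bound
print(f"extinction no later than ~{extinction_bound:,.0f} years BP")
print(f"coexistence window of up to ~{coexistence:,.0f} years")
```

The published analyses use far more sophisticated estimators that also account for dating error, but the principle — extend the fossil record, then compare timelines — is the same.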

Data drought

One problem with this approach is the scarcity of reliable data due to the extreme rarity of a dead animal being fossilised, and the low probability of archaeological evidence being preserved in Australia’s harsh conditions. Read the rest of this entry »





Environmental damage kills children

1 10 2019

Yes, it’s a provocative title, I agree. But then again, it’s true.

But I don’t just mean in the most obvious ways. We already have good data showing that lack of access to clean water and sanitation kills children (especially in developing nations), that air pollution is a nasty killer of young children in particular, and now even climate change is starting to take its toll.

These aspects of child health aren’t very controversial, but when we talk about the larger suite of indicators of environmental ‘damage’, such as deforestation rates, species extinctions, and the overall reduction of ecosystem services, the empirical links to human health, and to children in particular, are far rarer.

This is why I’m proud to report the publication today of a paper on which I and a team of wonderful collaborators (Sally Otto, Zia Mehrabi, Alicia Annamalay, Sam Heft-Neal, Zach Wagner, and Peter Le Souëf) have worked for several years.

I won’t lie — the path to publishing this paper was long and hard, I think mainly because it traversed so many different disciplines. But we persevered, and today published the paper entitled ‘Testing the socioeconomic and environmental determinants of better child-health outcomes in Africa: a cross-sectional study among nations’ in the journal BMJ Open.

Read the rest of this entry »





The Great Dying

30 09 2019

Here’s a presentation I gave earlier in the year for the Flinders University BRAVE Research and Innovation series:

There is No Plan(et) B — What you can do about Earth’s extinction emergency

Earth is currently experiencing a mass extinction brought about by, … well, … us. Species are being lost at a rate similar to when the dinosaurs disappeared. But this time, it’s not due to a massive asteroid hitting the Earth; species are being removed from the planet now because of human consumption of natural resources. Is a societal collapse imminent, and do we need to prepare for a post-collapse society rather than attempt to avoid one? Or can we limit the severity and onset of a collapse by introducing a few changes, such as removing political donations, becoming vegetarian, or reducing the number of children one has?

Read the rest of this entry »





Increasing human population density drives environmental degradation in Africa

26 06 2019

 


Almost a decade ago, I (co-) wrote a paper examining the socio-economic correlates of gross, national-scale indices of environmental performance among the world’s nations. It turned out to be rather popular, and has so far garnered over 180 citations and been cited in three major policy documents.

In addition to the more pedestrian ranking itself, we also tested which of three main socio-economic indicators best explained variation in the environmental rank — a country’s gross ‘wealth’ indicator (gross national income) turned out to explain the most, and there was no evidence to support a non-linear relationship between environmental performance and per capita wealth (the so-called environmental Kuznets curve).
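For the curious, the Kuznets-curve test itself is conceptually simple: fit both a linear and a quadratic (hump-shaped) model of environmental performance against per capita wealth, and compare their information-theoretic support. Here is a toy sketch with synthetic stand-in data (not the paper’s data, variables, or full methods):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: an environmental performance index vs log
# per-capita wealth, generated here with a purely linear signal
wealth = rng.uniform(6, 11, 80)                  # log GNI per capita
perform = 5 + 3 * wealth + rng.normal(0, 2, 80)  # performance index

def aic(y, yhat, k):
    """Gaussian AIC up to an additive constant (same for both models)."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k

lin = np.polyval(np.polyfit(wealth, perform, 1), wealth)   # linear fit
quad = np.polyval(np.polyfit(wealth, perform, 2), wealth)  # quadratic fit
print(f"AIC (linear)   : {aic(perform, lin, 2):.1f}")
print(f"AIC (quadratic): {aic(perform, quad, 3):.1f}")
# A Kuznets curve would be supported only if the quadratic model's AIC
# were substantially lower than the linear model's
```

Because the quadratic model nests the linear one, its residual sum of squares is always at least as small; the AIC penalty for the extra term is what keeps us from over-fitting a hump that isn’t there.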

Well, that was then, and this is now. Something that always bothered me about that bit of research was that it probably unfairly disadvantaged countries in more recent phases of the ‘development’ pathway: environmental damage done in major development pulses many decades or even centuries ago (e.g., in much of Europe) meant that certain countries probably got a bit of an unfair advantage, while the more recently developed nations copped a lower ranking simply because their damage was fresher.

While I defend the overall conclusions of that paper, my intentions have always been since then to improve on the approach. That desire finally got the better of me, and so I (some might say unwisely) decided to focus on a particular region of the planet where some of the biggest biodiversity crunches will happen over the next few decades — Africa.

Africa is an important region to re-examine these national-scale relationships for many reasons. The first is that it’s really the only place left on the planet where there’s a semi-intact megafauna assemblage. Yes, the great Late Pleistocene megafauna extinction event did hit Africa too, but compared to all other continents, it got through that period relatively unscathed. So now we (still) have elephants, rhinos, giraffes, hippos, etc. It’s a pretty bloody special place from that perspective alone.


Elephants in the Kruger National Park, South Africa (photo: CJA Bradshaw)

Then there’s the sheer size of the continent. Unfortunately, most Mercator projections of the Earth show a rather quaint continent nuzzled comfortably in the middle of the map, when in reality it’s a real whopper. If you don’t believe me, go to truesize.com and drag any country of interest over the African continent (it turns out that Africa can more or less fit all of China, Australia, the USA, and India within its borders).

Third, most countries in Africa (barring a few rare exceptions) are still in the so-called ‘development’ phase, although some are much farther along the economic road than others. For this reason, an African nation-to-nation comparison is probably a lot fairer than comparing, say, Bolivia to Germany, or Mongolia to Canada.

Read the rest of this entry »





First Australians arrived in large groups using complex technologies

18 06 2019


One of the most ancient peopling events of the great diaspora of anatomically modern humans out of Africa more than 50,000 years ago — human arrival in the great continent of Sahul (New Guinea, mainland Australia & Tasmania joined during periods of low sea level) — remains mysterious. The entry routes taken, whether migration was directed or accidental, and just how many people were needed to ensure population viability are shrouded by the mists of time. This prompted us to build stochastic, age-structured human population-dynamics models incorporating hunter-gatherer demographic rates and palaeoecological reconstructions of environmental carrying capacity to predict the founding population necessary to survive the initial peopling of late-Pleistocene Sahul.
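For a flavour of what such a model does (the published models are fully age-structured and far more detailed), here is a toy, single-population sketch: simulate many replicate populations from a given number of founders under environmental and demographic stochasticity, and record how often they go extinct. Every parameter value below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def extinction_prob(n0, years=200, reps=500, r_mean=0.01, r_sd=0.15, K=2000):
    """Fraction of simulated populations going extinct within `years`,
    starting from `n0` founders (all parameter values are invented)."""
    extinct = 0
    for _ in range(reps):
        n = n0
        for _ in range(years):
            r = rng.normal(r_mean, r_sd)                  # environmental noise
            lam = n * np.exp(r * (1 - n / K))             # Ricker-type growth
            n = min(rng.poisson(lam), 5 * K)              # demographic noise,
            if n == 0:                                    # capped in this toy
                extinct += 1
                break
    return extinct / reps

for n0 in (10, 50, 250, 1000):
    print(f"founders = {n0:4d}: P(extinction) = {extinction_prob(n0):.2f}")
```

The minimum viable founding population is then the smallest `n0` for which the extinction probability drops below some acceptable threshold — exactly the question we asked of the far richer Sahul models.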

As ecological modellers, we are often asked by other scientists to attempt to render the highly complex mechanisms of entire ecosystems tractable for virtual manipulation and hypothesis testing through the inevitable simplification that is ‘a model’. When we work with scientists studying long-since-disappeared ecosystems, the challenges multiply.

Add some multidisciplinary data and concepts into the mix, and the complexity can quickly escalate.

We do have, however, some powerful tools in our modelling toolbox, so as the Modelling Node for the Australian Research Council Centre of Excellence for Australian Biodiversity and Heritage (CABAH), our role is to link disparate fields like palaeontology, archaeology, geochronology, climatology, and genetics together with mathematical ‘glue’ to answer the big questions regarding Australia’s ancient past.

This is how we tackled one of these big questions: just how did the first anatomically modern Homo sapiens make it to the continent and survive?

At that time, Australia was part of the giant continent of Sahul that connected New Guinea, mainland Australia, and Tasmania at times of lower sea level. In fact, throughout most of the last ~126,000 years (the late Pleistocene and much of the Holocene), Sahul was the dominant landmass in the region (see this handy online tool for how the coastline of Sahul changed over this period).

Read the rest of this entry »







