Correcting times for light exposure across spatial extents

30 03 2022

The other day I was tasked with revising a figure for a paper (that should be out soon) where I had to work out how to compare incident times in a biologically meaningful way.

Without giving away too many details, we had a long list of incidents spread right across Australia, covering all periods of the year and going back to the early 20th Century. The specifics of the ‘incidents’ aren’t important here — suffice it to say they were biological in nature, and we wanted to see if they were clustered around any particular times of the day.

Yes, we could just do a histogram of the time bins (say, every 2 hours), but this ignores a very important phenomenon — 17:00 in July in Hobart isn’t directly comparable to 17:00 in January in Darwin (and so on). What matters instead — from a biological/phenological perspective — is the period of day in terms of available light.

Fortunately, there are some clear definitions of relative light availability we can use.

‘Night’ is defined as the time between astronomical dusk and astronomical dawn, which occur when the sun is 18º below the horizon. ‘Twilight’ is the period between night and sunrise/sunset (the latter being when the sun first appears above or disappears below the horizon), and is further broken down into three periods: astronomical twilight, nautical twilight, and civil twilight. These begin (in the morning) when the sun is 18º, 12º, and 6º below the horizon, respectively.

It’s still ‘dark’ during astronomical twilight, but light starts to become discernible at the start of nautical twilight. We can therefore define four major periods of relative light availability per 24-hour period: night (from the evening onset of astronomical twilight to its morning end), dawn (from the start of nautical twilight to sunrise), day (from sunrise to sunset), and dusk (from sunset to the evening onset of astronomical twilight).

Phew!

So, after all that malarkey, we now need a way of determining when those transition periods occur on any given day in any given location. Sounds difficult, but there’s a function for that!
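In R, for instance, the suncalc package’s getSunlightTimes() function returns these transition times for any date and set of coordinates (I’m not claiming that is the exact function used for the figure). Once you know the sun’s elevation at the moment of an incident, assigning the incident to a light period is straightforward; here is a minimal sketch in Python, with thresholds taken from the definitions above and the function name my own invention:

```python
def light_period(sun_elevation_deg):
    """Assign a moment to a relative-light period from solar elevation.

    Thresholds follow the twilight definitions above: light becomes
    discernible once the sun climbs above -12 degrees (the start of
    nautical twilight), and 'day' runs from sunrise to sunset (0 degrees).
    """
    if sun_elevation_deg >= 0:
        return "day"
    elif sun_elevation_deg >= -12:
        return "dawn/dusk"  # nautical + civil twilight
    else:
        return "night"      # astronomical twilight and full darkness

# classify a few example solar elevations
print(light_period(25.0))   # sun well above the horizon
print(light_period(-8.0))   # twilight
print(light_period(-30.0))  # deep night
```

Feeding each incident’s solar elevation through a classifier like this yields categories you can histogram directly, making 17:00 in Hobart and 17:00 in Darwin comparable at last.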

Read the rest of this entry »




The integrity battlefield: where science meets policy

4 03 2022

Professor Ross Thompson, University of Canberra


On the whole, I am inclined to conclude that my experience of academia and publishing my work has been largely benign. Despite having published 120-odd peer-reviewed papers, I can count the number of major disputes on one hand. Where there have been disagreements, they have centred on issues of content, and despite the odd grumble, things have rarely escalated to the ad hominem. I have certainly never experienced concerted attacks on my work.

But that changed recently. I work in water science, participating in and leading multi-disciplinary teams that do research directly relevant to water policy and management. My colleagues and I work closely with state and federal governments and are often funded by them through a variety of mechanisms. Our teams are a complex blend of scientists from universities, state and federal research agencies, and private-sector consultancies. Water is big business in Australia, and its management is particularly pertinent as the world’s driest inhabited continent struggles to come to terms with the impacts of climate change.

In the last 10 years, Australia has undergone a AU$16 billion program of water reform that has highlighted the extreme pressure on ecosystems, rural communities, and water-dependent industries. In 2019, two documentaries (Cash Splash and Pumped) broadcast by the Australian Broadcasting Corporation were highly critical of the outcomes of water reform. A group of scientists involved in working on the Murray-Darling Basin were concerned enough about the accuracy of aspects of those stories to support Professor Rob Vertessy from the University of Melbourne in drafting an Open Letter in response. I was a co-author on that letter, and it was not something I entered into lightly. We were very concerned about being seen to advocate for any particular policy position, but were simultaneously committed to contributing to an informed public debate. A later investigation by the Australian Communications and Media Authority also highlighted concerns with the Cash Splash documentary.

Fast forward to 2021 and the publication of a paper by Colloff et al. (2021) in the Australasian Journal of Water Resources. In that paper, the authors were critical of the scientists who had contributed to the Open Letter and claimed they had been subject to “administrative capture” and “issue advocacy”. Administrative capture is defined here as:

Read the rest of this entry »




Remote areas not necessarily safe havens for biodiversity

16 12 2021

The intensity of threats to biodiversity from human endeavour weakens as the distance from their sources increases.


As you move away from the big city to enjoy the countryside, you’ll notice the obvious increase in biodiversity. Even the data support this otherwise subjective perception — the degree to which we destroy habitat, harvest species, and pollute the environment declines with increasing distance from big cities.

Remote locations are therefore usually considered safe havens and potential reservoirs for biodiversity. But our new study published recently in Nature Communications shows how this obvious pattern depicts only half of the story, and that global conservation management and actions might benefit from learning more about the missing part.

Communities are not just lists of individual species. Instead, they consist of complex networks of ecological interactions linking interdependent species. The structure of such networks is a fundamental determinant of biodiversity emergence and maintenance. However, it also plays an essential role in the processes of biodiversity loss. The decline or disappearance of some species might have detrimental — often fatal — effects on their associates. For example, a parasite cannot survive without its hosts, just as a predator will starve without prey, or a plant will not reproduce without pollinators.

Events where a species disappears following the loss of other species on which it depends are known as co-extinctions, and they are now recognised as a primary driver of the ongoing global biodiversity crisis. The potential risk stemming from ecological dependencies is a major concern for all ecological systems.

Read the rest of this entry »




Extinct megafauna prone to ancient hunger games

14 12 2021

I’m very chuffed today to signal the publication of what I think is one of the most important contributions to the persistent conundrum surrounding the downfall of Australia’s megafauna many tens of millennia ago.

Diprotodon optimum. Artwork by palaeontologist and artist Eleanor (Nellie) Pease (commissioned by the ARC Centre of Excellence for Australian Biodiversity and Heritage)

Sure, I’m obviously biased in that assessment because it’s a paper from our lab and I’m a co-author, but if readers had any inkling of the work that went into this paper, I think they might consider adopting my position. In addition, the injection of some actual ecology into the polemic should be viewed as fresh and exciting.

Having waded into the murky waters of the ‘megafauna debate’ for about a decade now, I’ve become a little sensitive to even a whiff of binary polemic surrounding their disappearance in Australia. Acolytes of the climate-change prophet still beat their drums, screaming for the smoking gun of a spear sticking out of a Diprotodon’s skull before they even entertain the notion that people might have had something to do with it — but we’ll probably never find one given the antiquity of the event (> 40,000 years ago). On the other side are the blitzkriegers who declaim that human hunting single-handedly wiped out the lot.

Well, as it is for nearly all extinctions, it’s actually much more complicated than that. In the case of Sahul’s megafauna disappearances, both drivers likely contributed, but the degree to which both components played a part depends on where and when you look — Fred Saltré demonstrated that elegantly a few years ago.

Palorchestes. Artwork by palaeontologist and artist Eleanor (Nellie) Pease (commissioned by the ARC Centre of Excellence for Australian Biodiversity and Heritage)

So, why does the polemic persist? In my view, it’s because we have largely depended on the crude comparison of relative dates to draw our conclusions. That is, we look to see if some climate-change proxy shifted in any notable way either before or after an inferred extinction date. If a particular study claims evidence that a shift happened before, then it concludes climate change was the sole driver. If a study presents evidence that a shift happened after, then humans did it. Biases in geochronological inference (e.g., spatial, contamination), incorrect application of climate proxies, poor taxonomic resolution, and not accounting for the Signor-Lipps effect all contribute unnecessarily to the debate because small errors or biases can flip relative chronologies on their head and push conclusions toward uncritical binary outcomes. The ‘debate’ has been almost entirely grounded on this simplistically silly notion.

This all means that the actual ecology has been either ignored or merely made up based on whichever pet notion of the day is being proffered. Sure, there are a few good ecological inferences out there from some damn good modellers and ecologists, but these have all been greatly simplified themselves. This is where our new paper finally takes the ecology part of the problem to the next level.

Led by Global Ecology and CABAH postdoctoral fellow John Llewelyn, and guided by modelling guru Giovanni Strona at the University of Helsinki, the paper Sahul’s megafauna were vulnerable to plant-community changes due to their position in the trophic network has just been published online in Ecography. Co-authors include Kathi Peters, Fred Saltré, and me from Flinders Global Ecology, Matt McDowell and Chris Johnson from UTAS, Daniel Stouffer from the University of Canterbury (NZ), and Sara de Visser from the University of Groningen (Netherlands).

Read the rest of this entry »




Animating models of ecological change

6 12 2021

Flinders University Global Ecology postdoc, Dr Farzin Shabani, recently created this astonishing video not only about the results of his models predicting vegetation change in northern Australia as a function of long-term (tens of thousands of years) climate change, but also about the research journey itself!

He provides a brief background to how and why he took up the challenge:


Science would be a lot harder to digest without succinct and meaningful images, graphs, and tables. So, being able to visualise both inputs and outputs of scientific models to cut through the fog of data is an essential element of all science writing and communication. Diagrams help us understand trends and patterns much more quickly than do raw data, and they assist with making comparisons.

During my academic career, I have studied many different topics, including natural hazards (susceptibility & vulnerability risks), GIS-based ensemble modelling, climate-change impacts, environmental modelling at different temporal and spatial scales, species-distribution modelling, and time-series analysis. I use a wide range of graphs, charts, plots, maps, and tables to transfer the key messages.

For my latest project, however, I was given the opportunity to make a short animation and visualise my results and the journey itself. I think that my animation inspires a sense of wonder, which is among the most important goals of science education. I also think that my animation draws connections to real-life problems (e.g., ecosystem changes as a product of climate change), and also develops an appreciation of the scientific process itself.

Take a look and let me know what you think!

Read the rest of this entry »




An eye on the past: a view to the future

29 11 2021

originally published in Brave Minds, Flinders University’s research-news publication (text by David Sly)

Clues to understanding human interactions with global ecosystems already exist. The challenge is to read them more accurately so we can design the best path forward for a world beset by species extinctions and the repercussions of global warming.


This is the puzzle being solved by Professor Corey Bradshaw, head of the Global Ecology Lab at Flinders University. By developing complex computer modelling and steering a vast international cohort of collaborators, he is developing research that can influence environmental policy — from reconstructing the past to revealing insights into the future.

As an ecologist, he aims both to reconstruct and project how ecosystems adapt, how they are maintained, and how they change. Human intervention is pivotal to this understanding, so Professor Bradshaw casts his gaze back to when humans first entered a landscape – and this has helped construct an entirely fresh view of how Aboriginal people first came to Australia, up to 75,000 years ago.

Two recent papers he co-authored — ‘Stochastic models support rapid peopling of Late Pleistocene Sahul’, published in Nature Communications, and ‘Landscape rules predict optimal super-highways for the first peopling of Sahul’, published in Nature Human Behaviour — showed where, how, and when Indigenous Australians first settled in Sahul, the combined mega-continent that joined Australia with New Guinea in the Pleistocene era, when sea levels were lower than today.

Professor Bradshaw and colleagues identified and tested more than 125 billion possible pathways using rigorous computational analysis in the largest movement-simulation project ever attempted, with the pathways compared to the oldest known archaeological sites as a means of distinguishing the most likely routes.

The study revealed that the first Indigenous people not only survived but thrived in harsh environments, providing further evidence of the capacity and resilience of the ancestors of Indigenous people, and suggested that large, well-organised groups were able to navigate tough terrain.

Read the rest of this entry »




And this little piggy went extinct

24 11 2021

Back in June of this year I wrote (whinged) about the disappointment of writing a lot of ecological models that were rarely used to assist real-world wildlife management. However, I did hint that another model I wrote had assisted one government agency with pig management on Kangaroo Island.

Well, now that report has been published online and I’m permitted to talk about it. I’m also very happy to report that, in the words of the Government of South Australia’s Department of Primary Industries and Regions (PIRSA),

Modelling by the Flinders University Global Ecology Laboratory shows the likelihood and feasibility of feral pig eradication under different funding and eradication scenarios. With enough funding, feral pigs could be eradicated from Kangaroo Island in 2 years.

This basically means that because of the model, PIRSA was successful in obtaining enough funding to pretty much ensure that the eradication of feral pigs from Kangaroo Island will be feasible!

Why is it important to get rid of feral pigs? They are a major pest on the Island, causing severe economic and environmental damage to both farms and native ecosystems. On the agricultural side of things, they prey on newborn lambs, eat crops, and compete with livestock for pasture. Feral pigs damage natural habitats by uprooting vegetation and fouling waterholes. They can also spread weeds and damage infrastructure, as well as act as hosts of parasites and diseases (e.g., leptospirosis, tuberculosis, foot-and-mouth disease) that pose serious threats to industry, wildlife, and even humans.

Read the rest of this entry »




Free resources for learning (and getting better with) R

15 11 2021

While I’m currently in Github mode (see previous post), I thought I’d share a list of resources I started putting together for learning and upskilling in the R programming language.

If you don’t know what R is, this probably won’t be of much use to you. But if you are a novice user, want to improve your skills, or just have access to a kick-arse list of cheatsheets, then this Github repository should be useful.

I started putting this list together for members of the Australian Research Council Centre of Excellence for Australian Biodiversity and Heritage, but I see no reason why it should be limited to that particular group of people.

I don’t claim that this list is exhaustive, nor do I vouch for the quality of any of the listed resources. Some of them are deprecated and fairly old too, so be warned.

The first section includes online resources such as short courses, reference guides, analysis demos, tips for more-efficient programming, better plotting guidelines, as well as some R-related mini-universes like markdown, ggplot, Shiny, and tidyverse.

The section following is a list of popular online communities, list-servers, and blogs that help R users track down advice for solving niggly coding and statistical problems.

The next section is a whopping-great archive of R cheatsheets, covering everything from the basics to plotting, cartography, databasing, applications, time-series analysis, machine learning, dates and times, package building, parallel computing, resampling methods, markdown, and more.

Read the rest of this entry »




Want a permanent DOI assigned to your data and code? Follow this simple recipe

2 11 2021

These days, with data and code often required to be open-source, licensed, and fully trackable for most manuscript submissions to a peer-reviewed journal, it’s easy to get lost in the multitude of platforms and options available. In most cases we no longer have much choice in the matter, even if we are reluctant (although the benefits of posting your data and code online immediately far outweigh any potential disadvantages).

But do you post your data and code on the Open Science Framework (free), Github (free), Figshare (free), Zenodo (free, but donations encouraged), Dryad ($), or Harvard Dataverse (free) (and so on, and so on, …)? Pick your favourite. Another issue that arises is that even if you have solved the first dilemma, how do you obtain a digital object identifier (DOI) for your data and/or code?

Again, there are many ways to do this, and some methods are more automated than others. That said, I do have a preference that is rather easy to implement, and I thought I’d share it with you here.

The first requirement is getting yourself a (free) GitHub account. What’s GitHub? GitHub is one of the world’s largest communities of developers, where code of all types and uses can be developed, shared, updated, collaborated on, shipped, and maintained. It might seem a bit overwhelming for non-developers, but if you strip it down to its basics, it’s straightforward to use as a simple repository for your code and data. Of course, GitHub is designed for much more than just this (software-development collaboration being one of the main uses), but you don’t need to worry about that for now.

Step 1

Once you create an account, you can start creating ‘repositories’, which are essentially just sections of your account dedicated to specific code (and data). I mostly code in R, so I upload my R code text files and associated datasets to these repositories, and spend a good deal of effort on making the Readme.md file highly explanatory and easy to follow. You can check out some of mine here.

Ok. So, you have a repository with some code and data, you’ve explained what’s going on and how the code works in the Readme file, and now you want a permanent DOI that will point to the repository (and any updates) for all time.

Github doesn’t do this by itself, but it integrates seamlessly with another platform — Zenodo — that does. Oh no! Not another platform! Yes, I’m afraid so, but it’s not as painful as you might expect.

Read the rest of this entry »




PhD opportunity in control strategies of feral deer

30 09 2021

In collaboration with Biosecurity South Australia, the Global Ecology Lab at Flinders University is happy to announce a wonderful new PhD opportunity in feral deer control strategies for South Australia.

The project is tentatively entitled: Refining models of feral deer abundance and distribution to inform culling programs in South Australia

Feral fallow deer (Dama dama) digging in a mallee fowl (Leipoa ocellata) mound © Lee Williams

The project brief follows:

South Australian legislation requires that all landholders cull feral deer on their properties. Despite this, feral deer abundance and distribution are increasing across South Australia. This arises because culling by land managers and government organisations is not keeping pace with rates of population growth, some landholders are harbouring deer for hunting, and some deer escape from deer farms.

There are an estimated 40,000 feral deer in South Australia, and state government agencies are working to ramp up programs to cull feral deer before their numbers reach a point where control is no longer feasible.

Planning such large-scale and costly programs requires that government agencies engage economists to measure the economic impacts of feral deer, and to predict the value of these impacts in the future. That modelling is done regularly by governments, and in the case of pest-control programs, the modelling draws on models of feral deer population growth, farmer surveys about the economic, social, and environmental impacts of feral deer, and analyses of culling programs and trials of new culling techniques.

The economic models predict and compare both the current and future costs of:

  • deer impacts on pastures, crops, native plants, and social values (including illegal hunting)
  • culling programs that achieve different objectives (e.g., contain vs. reduce vs. eradicate)

The outputs of the models also inform whether there are sufficient public benefits from the investment of public funds into the culling of feral deer.


This PhD project will collate published and unpublished data to refine models of feral deer distribution and abundance under various culling scenarios. This project will drive both high-impact publications and, because this project builds extensive collaborations with government agencies, the results will inform the management of feral deer in South Australia.

Read the rest of this entry »




It’s a tough time for young conservation scientists

24 08 2021

Sure, it’s a tough time for everyone, isn’t it? But it’s a lot worse for the already disadvantaged, and it’s only going to go downhill from here. I suppose that most people who read this blog can certainly think of myriad ways they are, in fact, still privileged and very fortunate (I know that I am).

Nonetheless, quite a few of us, I suspect, are rather ground down by the onslaught of bad news, some of which I’ve been responsible for perpetuating myself. Add lockdowns, dwindling job security, and the prospect of dying tragically of a lung infection, and many have become exasperated.

I once wrote that being a conservation scientist is a particularly depressing job, because in our case, knowledge is a source of despair. But as I’ve shifted my focus from ‘preventing disaster’ to trying to lessen the degree of future shittyness, I find it easier to get out of bed in the morning.

What can we do in addition to shifting our focus to making the future a little less shitty than it could otherwise be? I have a few tips that you might find useful:

Read the rest of this entry »




Smart genetic analysis made fast and easy

29 07 2021

If you use genetics to differentiate populations, the new package smartsnp might be your new friend. Written in R and available from GitHub and CRAN, this package does principal component analysis with control for genetic drift, projects ancient samples onto modern genetic space, and tests for population differences in genotypes. The package has been built to load big datasets and run complex stats in the blink of an eye, and is fully described in a paper published in Methods in Ecology and Evolution (1).


In the bioinformatics era, sequencing a genome has never been so straightforward. No surprise that > 20 petabytes of genomic data are expected to be generated every year by the end of this decade (2) — if 1 byte of information was 1 mm long, we could make 29,000 round trips to the moon with 20 petabytes. Data size in genetics keeps outpacing the computer power available to handle it at any given time (3). Many will be familiar with a computer freezing if unable to load or run an analysis on a huge dataset, and how many coffees or teas we might have drunk, or computer screens might have been broken, during the wait. The bottom line is that software advances that speed up data processing and genetic analysis are always good news.

With that idea in mind, I have just published a paper presenting the new R package smartsnp (1) to run multivariate analysis of big genotype data, with applications to studies of ancestry, evolution, forensics, lineages, and overall population genetics. I am proud to say that the development of the package has been one of the most gratifying short-term collaborations in my entire career, with my colleagues Christian Huber and Ray Tobler: a true team effort!

The package is available on GitHub and the Comprehensive R Archive Network CRAN. See downloading options here, and vignettes here with step-by-step instructions to run different functionalities of our package (summarised below).

In this blog, I use “genotype” to mean the combination of gene variants (alleles) across a predefined set of positions (loci) in the genome of a given individual animal, human, microbe, or plant. One type of those variants is the single nucleotide polymorphism (SNP), a DNA locus at which two or more alternative nucleotides occur, sometimes conditioning protein translation or gene expression. SNPs are relatively stable over time and are routinely used to identify individuals and ancestors in humans and wildlife.
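For readers curious about the linear algebra underneath, here is a stripped-down sketch of genotype PCA in Python with NumPy. This is emphatically not smartsnp itself (the package is written in R, and adds control for genetic drift, projection of ancient samples, and permutation tests); it only illustrates the centre-and-decompose step at the heart of SMARTPCA-style analyses, using a made-up toy genotype matrix:

```python
import numpy as np

# Toy genotype matrix: rows = individuals, columns = SNP loci,
# entries = counts of the alternative allele (0, 1, or 2).
# The values are invented purely for illustration.
geno = np.array([
    [0, 1, 2, 0, 1],
    [1, 1, 2, 0, 0],
    [2, 0, 0, 1, 2],
    [2, 0, 1, 1, 2],
], dtype=float)

# Centre each locus on its mean, then decompose with SVD:
# the principal components summarise the major axes of
# genotype variation among individuals.
centred = geno - geno.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
pcs = U * s  # each row = one individual's scores on the PCs
```

Plotting the first two columns of pcs against each other gives the classic PCA scatter used to separate populations; real analyses also scale each locus by its expected variance, which I have omitted here for brevity.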

What the package does

The package smartsnp is partly based on the field-standard software EIGENSOFT (4, 5) which is only available for Unix command-line environments. In fact, our driving motivation was (i) to broaden the use of EIGENSOFT tools by making them available to the rocketing community of professionals, not only academics who employ R for their work (6), and (ii) to optimise our package to handle big datasets and complex stats efficiently. Our package mimics EIGENSOFT’s principal component analysis (SMARTPCA) (5), and also runs multivariate tests for population differences in genotypes as follows:

Read the rest of this entry »




… some (models) are useful

8 06 2021

As someone who writes a lot of models — many for applied questions in conservation management (e.g., harvest quotas, eradication targets, minimum viable population sizes) — and who supervises people writing even more of them, I’ve had many different experiences with their uptake and implementation by management authorities.

Some of those experiences have involved catastrophic failures to influence any management or policy. One particularly painful memory relates to a model we wrote to assist with optimising approaches to eradicate (or at least, reduce the densities of) feral animals in Kakadu National Park. We even wrote the bloody thing in Visual Basic (a horrible coding language) so people could run the model in Excel. As far as I’m aware, no one ever used it.

Others have been accepted more readily, such as a shark-harvest model, which (I think, but have no evidence to support) has been used to justify fishing quotas, and one we’ve done recently for the eradication of feral pigs on Kangaroo Island (as yet unpublished) has led directly to increased funding to the agency responsible for the programme.

According to Altmetric (and the online tool I developed to get paper-level Altmetric information quickly), only 3 of the 16 papers I would call my most ‘applied’ modelling papers have been cited in policy documents:

Read the rest of this entry »




Killing (feral) cats quickly (and efficiently)

20 05 2021

I’m pleased to announce the publication of a paper led by Kathryn Venning (KV) that was derived from her Honours work in the lab. Although she’s well into her PhD on an entirely different topic, I’m overjoyed that she persevered and saw this work to publication.

Here, killa, killa, killa, killa …

As you probably already know, feral cats are a huge problem in Australia. They are probably the primary reason Australia leads the world in mammal extinctions in particular, and largely the reason so many re-introduction attempts of threatened marsupials fail miserably after only a few years.

Feral cats occupy every habitat in the country, from the high tropics to the deserts, and from the mountains to the sea. They adapt to the cold just as easily as they adapt to the extreme heat, and they can eat just about anything that moves, from invertebrates to the carcases of much larger animals that they scavenge.

Cats are Australia’s bane, but you can’t help but be at least a little impressed with their resilience.

Still, we have to try our best to get rid of them where we can, or at least reduce their densities to the point where their ecological damage is limited.

Typically, the only efficient and cost-effective way to do that is via lethal control, using various means. These can include direct shooting, trapping, aerial poison-baiting, and a new ‘smart’ method of targeted poison delivery via a prototype device known as a Felixer™️. The latter is particularly useful for passive control in areas where ground-shooting access is difficult.

A live Felixer™️ deployed on Kangaroo Island (photo: CJA Bradshaw 2020)

A few years back the federal government committed what might seem like a sizeable amount of money to ‘eradicate’ cats from Australia. Yeah, good luck with that, although the money has been allocated to several places where cat reduction and perhaps even eradication is feasible. Namely, on islands.

Read the rest of this entry »




Mapping the ‘super-highways’ the First Australians used to cross the ancient land

4 05 2021

Author provided/The Conversation


There are many hypotheses about where the Indigenous ancestors first settled in Australia tens of thousands of years ago, but evidence is scarce.

Few archaeological sites date to these early times. Sea levels were much lower and Australia was connected to New Guinea and Tasmania in a land known as Sahul that was 30% bigger than Australia is today.

Our latest research advances our knowledge about the most likely routes those early Australians travelled as they peopled this giant continent.


Read more: The First Australians grew to a population of millions, much more than previous estimates


We are beginning to get a picture not only of where those first people landed in Sahul, but how they moved throughout the continent.

Navigating the landscape

Modelling human movement requires understanding how people navigate new terrain. Computers make building such models easier, but they are still far from easy. We reasoned we needed four pieces of information: (1) topography; (2) the visibility of tall landscape features; (3) the presence of freshwater; and (4) the demographics of the travellers.
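As a flavour of how a landscape layer can be turned into candidate routes, here is a toy least-cost (Dijkstra) search over a single grid of traversal costs in Python. The real analysis weighed topography, landmark visibility, and freshwater availability across billions of pathways; this sketch, with an invented cost grid and function name, only shows the basic mechanic of preferring cheap terrain:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2-D grid of per-cell traversal costs.

    Entering a cell adds that cell's cost; the start cell is free.
    Returns the cheapest path (list of (row, col)) and its total cost.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    queue = [(0, start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(queue, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Invented cost grid: a 'mountain' (9) in the middle that routes avoid.
terrain = [[1, 1, 1],
           [1, 9, 1],
           [1, 1, 1]]
route, total = least_cost_path(terrain, (0, 0), (2, 2))
```

The cheapest route skirts the high-cost cell, which is the same logic, scaled up enormously, behind ranking ‘super-highways’ across a continent.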

We think people navigated in new territories — much as people do today — by focusing on prominent land features protruding above the relative flatness of the Australian continent.

Read the rest of this entry »





Population of First Australians grew to millions, much more than previous estimates

30 04 2021

Shutterstock/Jason Benz Bennee


We know it is more than 60,000 years since the first people entered the continent of Sahul — the giant landmass that connected New Guinea, Australia and Tasmania when sea levels were lower than today.

But where the earliest people moved across the landscape, how fast they moved, and how many were involved, have been shrouded in mystery.

Our latest research, published today, shows the establishment of populations in every part of this giant continent could have occurred in as little as 5,000 years. And the entire population of Sahul could have been as high as 6.4 million people.

This translates to more than 3 million people in the area that is now modern-day Australia, far more than any previous estimate.


Read more: We mapped the ‘super-highways’ the First Australians used to cross the ancient land


The first people could have entered through what is now western New Guinea or from the now-submerged Sahul Shelf off the modern-day Kimberley (or both).

But whichever the route, entire communities of people arrived, adapted to and established deep cultural connections with Country over 11 million square kilometres of land, from northwestern Sahul to Tasmania.


Map of what Australia looked like for most of the human history of the continent when sea levels were lower than today. Author provided


This equals a rate of population establishment of about 1 km per year (based on a maximum straight-line distance of about 5,000 km from the introduction point to the farthest point).

That’s doubly impressive when you consider the harshness of the Australian landscape in which people both survived and thrived.

Previous estimates of Indigenous population

Various attempts have been made to calculate the number of people living in Australia before European invasion. Estimates vary from 300,000 to more than 1,200,000 people.





The biggest and slowest don’t always bite it first

13 04 2021

For many years I’ve been interested in modelling the extinction dynamics of megafauna. Apart from co-authoring a few demographically simplified (or largely demographically free) models about how megafauna species could have gone extinct, I have never really tried to capture the full nuances of long-extinct species within a fully structured demographic framework.

That is, until now.

But how do you get the life-history data of an extinct animal that was never directly measured? Surely, things like survival, reproductive output, longevity, and even environmental carrying capacity are impossible to discern, and aren’t these necessary for a stage-structured demographic model?

Thylacine mum & joey. Nellie Pease & CABAH

The answer to the first part of that question is “it’s possible”, and to the second, it’s “yes”. The most important bit of information we palaeo modellers need to construct something that’s ecologically plausible for an extinct species is an estimate of body mass. Thankfully, palaeontologists are very good at estimating the mass of the things they dig up (with the associated caveats, of course). From such estimates, we can reconstruct everything from equilibrium densities and maximum rates of population growth to age at first breeding and longevity.

But it’s more complicated than that, of course. In Australia anyway, we’re largely dealing with marsupials (and some monotremes), and they have a rather different life-history mode than most placentals. We therefore have to ‘correct’ the life-history estimates derived from living placental species. Thankfully, evolutionary biologists and ecologists have ways to do that too.
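As a toy illustration of that logic, the snippet below derives life-history values from body mass using well-known allometric scaling exponents (equilibrium density roughly proportional to mass^-0.75, maximum growth rate to mass^-0.25). The coefficients and the crude marsupial ‘correction’ are placeholders for illustration only, not the fitted values used in the paper:

```python
def allometric_life_history(mass_kg, marsupial=True):
    """Illustrative reconstruction of life-history traits from body mass
    alone. Exponents follow standard allometric scaling rules; the
    coefficients and the marsupial correction factor are made up."""
    r_max = 0.6 * mass_kg ** -0.25       # maximum population growth rate (per year)
    density = 50.0 * mass_kg ** -0.75    # equilibrium density (individuals per km^2)
    longevity = 5.0 * mass_kg ** 0.2     # lifespan (years)
    if marsupial:
        r_max *= 0.7                     # marsupials reproduce more slowly than placentals
    return {"r_max": r_max, "density": density, "longevity": longevity}

# A Diprotodon-sized herbivore (~2,700 kg) versus a common wombat (~25 kg);
# only the masses are real published estimates
diprotodon = allometric_life_history(2700)
wombat = allometric_life_history(25)
```

The qualitative pattern is the useful part: bigger animals grow more slowly, live at lower densities, and live longer, which is what makes them demographically fragile.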

The Pleistocene kangaroo Procoptodon goliah, the largest and most heavily built of the short-faced kangaroos, had an unusually short, flat face and forward-directed eyes, with a single large toe on each foot (reduced from the more usual count of four). Each forelimb had two long, clawed fingers that would have been used to bring leafy branches within reach.

So with a battery of ecological, demographic, and evolutionary tools, we can now create reasonable stochastic-demographic models for long-gone species, like wombat-like creatures as big as cars, birds more than two metres tall, and lizards more than seven metres long that once roamed the Australian continent. 

Ancient clues, in the shape of fossils and archaeological evidence of varying quality scattered across Australia, have formed the basis of several hypotheses about the fate of megafauna that vanished during a peak about 42,000 years ago from the ancient continent of Sahul, comprising mainland Australia, Tasmania, New Guinea and neighbouring islands.

There is a growing consensus that multiple factors were at play, including climate change, the impact of people on the environment, and access to freshwater sources.

Just published in the open-access journal eLife, our latest CABAH paper applies these approaches to assess how susceptible different species were to extinction – and what it means for the survival of species today. 

Using various characteristics such as body size, weight, lifespan, survival rate, and fertility, we (Chris Johnson, John Llewelyn, Vera Weisbecker, Giovanni Strona, Frédérik Saltré & me) created population simulation models to predict the likelihood of these species surviving under different types of environmental disturbance.

Simulations included everything from increasing droughts to increasing hunting pressure to see which species of 13 extinct megafauna (genera: Diprotodon, Palorchestes, Zygomaturus, Phascolonus, Procoptodon, Sthenurus, Protemnodon, Simosthenurus, Metasthenurus, Genyornis, Thylacoleo, Thylacinus, Megalibgwilia), as well as 8 comparative species still alive today (Vombatus, Osphranter, Notamacropus, Dromaius, Alectura, Sarcophilus, Dasyurus, Tachyglossus), had the highest chances of surviving.
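To give a feel for what such simulations do, here is a deliberately simple stand-in: a stochastic logistic model with environmental noise and a constant proportional offtake (hunting), used to compare quasi-extinction probabilities for a slow-growing versus a fast-growing species. Every parameter value below is invented for illustration; the paper itself used full stage-structured models:

```python
import math
import random

def extinction_probability(r_max, K, harvest_rate,
                           years=500, reps=1000, noise_sd=0.15, seed=42):
    """Toy model: logistic growth on the log scale with environmental
    noise and a constant proportional offtake. Returns the fraction of
    replicates falling below a quasi-extinction threshold of 50."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(reps):
        n = float(K)                     # start at carrying capacity
        for _ in range(years):
            growth = r_max * (1 - n / K) + rng.gauss(0, noise_sd)
            n = n * math.exp(growth) * (1 - harvest_rate)
            if n < 50:
                extinct += 1
                break
    return extinct / reps

# Slow grower (Diprotodon-like) vs fast grower (thylacine-like), same offtake
slow = extinction_probability(r_max=0.05, K=5000, harvest_rate=0.06)
fast = extinction_probability(r_max=0.40, K=5000, harvest_rate=0.06)
```

Even this toy version reproduces the headline intuition: the same hunting pressure is fatal for the slow grower and barely dents the fast one.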

We compared the results to what we know about the timing of extinction for different megafauna species derived from dated fossil records. We expected to confirm that the most extinction-prone species were the first species to go extinct – but that wasn’t necessarily the case.

While we did find that slower-growing species with lower fertility, like the rhino-sized wombat relative Diprotodon, were generally more susceptible to extinction than more-fecund species like the marsupial ‘tiger’ thylacine, the relative susceptibility rank across species did not match the timing of their extinctions recorded in the fossil record.

Indeed, we found no clear relationship between a species’ inherent vulnerability to extinction — such as being slower and heavier and/or slower to reproduce — and the timing of its extinction in the fossil record.

In fact, we found that most of the living species used for comparison — such as short-beaked echidnas, emus, brush turkeys, and common wombats — were more susceptible on average than their now-extinct counterparts.





How to reduce the probability of being killed by a shark

31 03 2021

Easy. Don’t go swimming/surfing/snorkelling/diving in the ocean.


“Oh, shit”

Sure, that’s true, but if you’re like many Australians, the sea is not just a beautiful thing to look at from the window; it’s a way of life. Try telling a surfer not to surf, or a diver not to dive. Good luck with that.

A few years ago, I joined a team of super-cool sharkologists — led by Charlie ‘Aussie-by-way-of-Belgium shark-scientist extraordinaire’ Huveneers, and including Maddie ‘Chomp’ Thiele and Lauren ‘Acid’ Meyer — to publish the results of some of the first experimentally tested shark deterrents.

It turns out that many of the deterrents we tested failed to show any reduction in the probability of a shark biting, with only one type of electronic deterrent showing any effect at all (~ 60% reduction).

Great. But what might that mean in terms of how many people could be saved by wearing such electronic deterrents? While the probability of being bitten by a shark is low globally, even in Australia (despite public perceptions), we wondered if the number of lives saved and injuries avoided was substantial.

In a new paper just published today in Royal Society Open Science, we attempted to answer that question.

To predict how many people could avoid shark bites if they were using properly donned electronic deterrents that demonstrate some capacity to dissuade sharks from biting, we examined the century-scale time series of shark bites on humans in Australia. This database — the ‘Australian Shark Attack File’ — is one of the most comprehensive databases of its kind.
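The underlying arithmetic of ‘bites avoided’ is simple expectation, even if the paper’s treatment of the time series is far more involved. A back-of-envelope version, in which every number except the ~60% effectiveness figure from the trials above is invented for illustration:

```python
def expected_bites_avoided(annual_bites, uptake, effectiveness, years):
    """Expected shark bites avoided: baseline incidence, times the fraction
    of water users wearing a deterrent, times deterrent effectiveness.
    All inputs except the ~60% effectiveness figure are hypothetical."""
    return annual_bites * uptake * effectiveness * years

# e.g., 20 bites per year, half of users wearing a deterrent, over 30 years
avoided = expected_bites_avoided(annual_bites=20, uptake=0.5, effectiveness=0.6, years=30)
```

With those made-up inputs, that comes to 180 bites avoided over three decades; the paper derives its projections from the Australian Shark Attack File rather than a flat annual rate.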





Need to predict population trends, but can’t code? No problem

2 12 2020

Yes, yes. I know. Another R Shiny app.

However, this time I’ve strayed from my recent bibliometric musings and developed something that’s more compatible with the core of my main research and interests.

Welcome to LeslieMatrixShiny!

Over the years I’ve taught many students the basics of population modelling, with the cohort-based approaches dominating the curriculum. Of these, the simpler ‘Leslie’ (age-classified) matrix models are both the easiest to understand and for which data can often be obtained without too many dramas.

But unless you’re willing to sit down and learn the code, they can be daunting to the novice.

Sure, there are plenty of software alternatives out there, such as Bob Lacy‘s Vortex (a free individual-based model available for PCs only), Resit Akçakaya & co’s RAMAS Metapop ($; PC only), Stéphane Legendre‘s Unified Life Models (ULM; open-source; all platforms), and Charles Todd‘s Essential (open-source; PC only) to name a few. If you’re already an avid R user and already into population modelling, you might be familiar with the population-modelling packages popdemo, OptiPopd, or sPop. I’m sure there are still other good resources out there of which I’m not aware.

But, even to install the relevant software or invoke particular packages in R takes a bit of time and learning. It’s probably safe to assume that many people find the prospect daunting.

It’s for this reason that I turned my newly acquired R Shiny skills to matrix population models so that even complete coding novices can run their own stochastic population models.

I call the app LeslieMatrixShiny.
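For anyone curious about what the app computes under the hood, the deterministic core of a Leslie matrix model is only a few lines of code. The vital rates below are hypothetical:

```python
def project(leslie, n0, years):
    """Project an age-structured population forward: n(t+1) = L n(t),
    using plain lists so no packages are required."""
    n = list(n0)
    totals = [sum(n)]
    for _ in range(years):
        n = [sum(l_ij * n_j for l_ij, n_j in zip(row, n)) for row in leslie]
        totals.append(sum(n))
    return n, totals

# Hypothetical 3-age-class Leslie matrix: fecundities on the top row,
# age-specific survival probabilities on the sub-diagonal
L = [
    [0.0, 1.2, 2.0],
    [0.5, 0.0, 0.0],
    [0.0, 0.8, 0.0],
]
n_final, totals = project(L, [100, 50, 20], years=50)
lam = totals[-1] / totals[-2]   # converges to the dominant eigenvalue (lambda)
```

Layering uncertainty on top — resampling the matrix entries each year — is the ‘stochastic’ part the app handles for you.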





Grand Challenges in Global Biodiversity Threats

8 10 2020

Last week I mentioned that the new journal Frontiers in Conservation Science is now open for business. As promised, I wrote a short article outlining our vision for the Global Biodiversity Threats section of the journal. It’s open-access, of course, so I’m also copying it here on ConservationBytes.com.


Most conservation research and its applications tend to happen at reasonably fine spatial and temporal scales — for example, mesocosm experiments, single-species population viability analyses, recovery plans, patch-level restoration approaches, site-specific biodiversity surveys, et cetera. Yet, at the other end of the scale spectrum, there have been many overviews of biodiversity loss and degradation, accompanied by the development of multinational policy recommendations to encourage more sustainable decision making at lower levels of sovereign governance (e.g., national, subnational).

Yet truly global research in conservation science is in fact comparatively rare, as poignantly demonstrated by the debates surrounding the evidence for and measurement of planetary tipping points (Barnosky et al., 2012; Brook et al., 2013; Lenton, 2013). Apart from the planetary scale of human-driven disruption to Earth’s climate system (Lenton, 2011), both scientific evidence and policy levers tend to be applied most often at finer, more tractable research and administrative scales. But as the massive ecological footprint of humanity has grown exponentially over the last century (footprintnetwork.org), robust, truly global-scale evidence of our damage to the biosphere is now starting to emerge (Díaz et al., 2019). Consequently, our responses to these planet-wide phenomena must also become more global in scope.

Conservation scientists are adept at chronicling patterns and trends — from the thousands of vertebrate surveys indicating an average reduction of 68% in the numbers of individuals in populations since the 1970s (WWF, 2020), to global estimates of modern extinction rates (Ceballos and Ehrlich, 2002; Pimm et al., 2014; Ceballos et al., 2015; Ceballos et al., 2017), future models of co-extinction cascades (Strona and Bradshaw, 2018), the negative consequences of invasive species across the planet (Simberloff et al., 2013; Diagne et al., 2020), discussions surrounding the evidence for the collapse of insect populations (Goulson, 2019; Komonen et al., 2019; Sánchez-Bayo and Wyckhuys, 2019; Cardoso et al., 2020; Crossley et al., 2020), the threats to soil biodiversity (Orgiazzi et al., 2016), and the ubiquity of plastic pollution (Beaumont et al., 2019) and other toxic substances (Cribb, 2014), to name only some of the major themes in global conservation. 







