Australia’s motto: “Screw the environment!”

2 09 2015
Mmmmm! I love coal!

An article originally posted on ALERT by April Reside (with permission to reproduce).

The conservative Tony Abbott government in Australia is proposing alarming changes to Australia’s Environment Protection and Biodiversity Conservation (EPBC) Act 1999, a remarkable move that would prevent environment groups from challenging many damaging development projects.

This has all come to a head over the Carmichael Coal Mine: a plan to build a massive mine in central Queensland to export 60 million tonnes of coal to India each year.

Coal, of course, is the dirtiest of all fossil fuels, and India’s plan to burn it by the shipload for electricity is bad news for the planet.

The Abbott government is in a tizzy after a community organisation, the Mackay Conservation Group, challenged the approval of the Carmichael Mine in Australia’s Federal Court.

The community group says Environment Minister Greg Hunt didn’t properly consider the impact the mine would have on two threatened species, the yakka skink and the ornamental snake.

The mine site also sustains the largest population of the southern subspecies of the black-throated finch, which is endangered.

The implications of the mega-mine go well beyond a few imperilled species. If the mine goes ahead, it will be one of the biggest in the world — and the emissions from burning its mountains of coal would cancel out all gains made from Australia’s current emissions-reduction strategy.

On top of the frightening precedent it would set, the Abbott government appears to be double-dealing. Read the rest of this entry »

Ice Age? No. Abrupt warmings and hunting together polished off Holarctic megafauna

24 07 2015
Oh shit oh shit oh shit …

Did ice ages cause the Pleistocene megafauna to go extinct? Contrary to popular opinion, no, they didn’t. But climate change did have something to do with the extinctions; it was just the abrupt warming events, not the cold, that did the damage.

Just out today in Science, our long-time-coming (9 years in total if you count the time from the original idea to today) paper ‘Abrupt warmings drove Late Pleistocene Holarctic megafaunal turnover’ led by Alan Cooper of the Australian Centre for Ancient DNA and Chris Turney of the UNSW Climate Change Research Centre demonstrates for the first time that abrupt warming periods over the last 60,000 years were at least partially responsible for the collapse of the megafauna in Eurasia and North America.

You might recall that I’ve been a bit sceptical of claims that climate change had much to do with megafauna extinctions during the Late Pleistocene and early Holocene, mainly because of the overwhelming evidence that humans had a big part to play in their demise (surprise, surprise). To be clear, what I rejected wasn’t the idea that climate played any role at all; rather, I took issue with claims that climate change was the dominant driver. I’ve also had problems with blanket claims that it was ‘always this’ or ‘always that’, when the complexity of biogeography and community dynamics means that it was most assuredly more complicated than most people think.

I’m happy to say that our latest paper indeed demonstrates the complexity of megafauna extinctions, and it took a heap of fairly complex datasets and analyses to do so. Not only were the data varied; the combination of scientists involved was just as eclectic, with ancient DNA specialists, palaeo-climatologists and ecological modellers (including yours truly) assembled to make sense of the complicated story the data ultimately revealed. Read the rest of this entry »

Cartoon guide to biodiversity loss 30

27 05 2015

[10.06.2015 update: Because of all the people looking for cartoon porn, I’ve slightly altered the title of this post. Should have predicted that one]

Third batch of six biodiversity cartoons for 2015 (see full stock of previous ‘Cartoon guide to biodiversity loss’ compendia here).

Read the rest of this entry »

Statistical explainer: average temperature increases can be deceiving

12 05 2015

Over the years I’ve used a simple graphic from the IPCC 2007 Report to explain to people without a strong background in statistics just why average temperature increases can be deceiving. If you’re not well-versed in probability theory (i.e., most people), it’s perhaps understandable why so few of us appear to be up in arms about climate change. I posit that if people had a better appreciation of mathematics, there would be far less inertia in dealing with the problem.

Instead of using the same image, I’ve done up a few basic graphs that explain why average increases in temperature can be deceiving; in other words, why focussing on the ‘average’ projected increases will not help you appreciate the most dangerous aspect of a disrupted climate: the frequency of extreme events. Please forgive me if you find this little explainer too basic; if you have a modicum of probability theory tucked away in your educational past, it will offer little new insight. However, you may wish to use these graphs to explain the problem to others who are less up-to-speed than you.

Let’s take, for example, all the maximum daily temperature data from a single location compiled over the last 100 years. We’ll assume for the moment that there has been no upward trend in the data over this time. If you plot the frequency of these temperatures in, say, 2-degree bins over those 100 years, you might get something like this:

[Figure: frequency histogram of 100 years of daily maximum temperatures in 2-degree bins]

This is simply an illustration, but here the long-term average daily maximum temperature is 25 degrees Celsius, and the standard deviation is 5 degrees. In other words, over those 100 years, the average daily maximum is 25 degrees, but there were a few days when the maximum was < 10 degrees, and a few others when it was > 40 degrees. This could represent a lot of different places in the world.
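If it helps to see this concretely, here is a minimal Python sketch (my own, not part of the original post) that simulates 100 years of daily maxima under the assumptions above (mean 25 degrees, standard deviation 5) and bins them into the same 2-degree histogram:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Illustration only: 100 years of daily maximum temperatures drawn from
# a normal distribution with mean 25 °C and standard deviation 5 °C
temps = rng.normal(loc=25.0, scale=5.0, size=100 * 365)

# Frequency histogram in 2-degree bins
bins = np.arange(5, 47, 2)
plt.hist(temps, bins=bins, edgecolor="white")
plt.xlabel("Daily maximum temperature (°C)")
plt.ylabel("Frequency (days)")
plt.title("100 years of simulated daily maxima")
plt.show()
```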

We can now fit what’s known as a ‘probability density function’ to this histogram to obtain a curve of expected probability of any temperature within that range:


[Figure: normal probability density function fitted over the temperature histogram]

If you’ve got some background in statistics, then you’ll know that this is simply a normal (Gaussian) distribution. With this density function, we can now calculate the probability of any particular day’s maximum temperature being above or below any threshold we choose. In the case of the mean (25 degrees), we know that exactly half (p = 0.50) of the days will have a maximum temperature below it, and exactly half above it. Such a probability is simply an area under the density function (the total area under the entire curve = 1). Read the rest of this entry »
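The rest of the explainer sits behind the link, but the punchline can be previewed in a few lines of Python (again mine, with the 2-degree shift in the mean chosen purely for illustration): shift the whole distribution only slightly, and the probability of extreme days inflates disproportionately.

```python
from scipy.stats import norm

baseline = norm(loc=25, scale=5)  # historical distribution from above
shifted = norm(loc=27, scale=5)   # same spread, mean raised by 2 °C

# Half the days fall below the historical mean, as stated above
print(baseline.cdf(25))           # 0.5

# Probability of an extreme day hotter than 35 °C (2 SD above the old mean)
p_old = baseline.sf(35)           # sf(x) = 1 - cdf(x), the upper-tail area
p_new = shifted.sf(35)
print(p_old)                      # ≈ 0.023 (roughly 8 days per year)
print(p_new)                      # ≈ 0.055 (roughly 20 days per year)
print(p_new / p_old)              # ≈ 2.4
```

A modest 2-degree rise in the mean more than doubles the frequency of 35-degree days, which is exactly the ‘deceiving’ part of an average increase.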

Lomborg: a detailed citation analysis

24 04 2015

There’s been quite a bit of palaver recently about the invasion of Lomborg’s ‘Consensus’ Centre into the University of Western Australia, including, inter alia, that there was no competitive process for the award of $4 million of taxpayer money from the Commonwealth Government, that Lomborg is a charlatan with a not-terribly-well-hidden anti-climate-change agenda, and that he is not an academic and possesses no credibility, so he should have no right to an academic appointment at one of Australia’s leading research universities.

On that last point, there’s been much confusion among non-academics about what it means to have no credible academic track record. In my previous post, I reproduced a letter from the Head of UWA’s School of Animal Biology, Professor Sarah Dunlop, in which she stated that Lomborg had a laughably low h-index of only 3. The Australian, in all its brilliant capacity to report the unvarnished truth, claimed that a certain Professor Ian Hall of Griffith University had instead determined that Lomborg’s h-index was 21, based on Harzing’s Publish or Perish software tool. As I show below, if Professor Hall did indeed conclude this, he knows next to nothing about citation indices.

What is an ‘h-index’ and why does it matter? Below I provide an explainer, as well as some rigorous analysis of Lomborg’s track record.
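In the meantime, the metric itself is easy to state: a researcher’s h-index is the largest number h such that h of their papers have each been cited at least h times. A minimal Python sketch (using made-up citation counts, purely for illustration) shows the calculation:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for ten papers (illustration only)
print(h_index([50, 30, 22, 15, 8, 6, 5, 4, 1, 0]))  # prints 6
```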

Read the rest of this entry »

Missing the forest despite its trees

21 04 2015

An exchange over the intactness of boreal forests has just erupted. Bill Laurance asked me to weigh in as an independent appraiser of the debate, so I copy my thoughts below. You can read the original exchange between Jeff Wells and Nick Haddad (& colleagues) here.

Despite its immense size, there is little doubt that the ugly second cousin of forest conservation is the boreal region covering much of Alaska, Canada, Fennoscandia and Russia. Indeed, extending some 1.4 billion hectares, of which well over 60% is found in Russia alone (1, 2), the entirety of the boreal forest is more than double the area of the Amazon forest. Yet despite this massive expanse, the impressive biota it shelters (2), and its important contribution to the global carbon (1), nitrogen (3) and oxygen (4) cycles, the boreal is an oft-overlooked region in terms of global conservation priorities and possibilities (5).

The exchange between Haddad & Sexton and Wells regarding the former researchers’ recent paper (6) highlights this problem, of which even many expert ecologists are only vaguely aware. Wells takes particular issue with Haddad and colleagues’ assertion that the boreal forest is highly fragmented, claiming to the contrary that the (North American) boreal forest is “… truly intact …”. While Haddad et al. respond that they did not differentiate between ‘natural’ and human-caused fragmentation, my view is that the exchange misses some important concerns about the state of the boreal forest.

Wells correctly points out that the boreal zone in North America is “massive”, but can his other claim, that it is “truly intact”, stand up to scrutiny? Citing one of my own papers from 2009 (2) to demonstrate (correctly) that the boreal forest of North America holds a stunning array of species, Wells neglects to highlight that in the same paper we also identified the extensive, artificial fragmentation that has occurred there and in other parts of the boreal zone over the last few decades. For example, we showed clearly that only 44% of the entire biome can be considered ‘intact’, defining the term precisely as “areas ≥ 500 km2, internally undivided by infrastructure (e.g., roads) and with linear dimensions ≥ 10 km”. Satellite imagery has also confirmed that between 2000 and 2005, the boreal biome experienced the largest area of gross forest-cover loss of any biome (7). Despite recent evidence that so-called edge effects (characteristics of a disturbed matrix that penetrate some distance into habitat fragments) are probably of smaller spatial magnitude in boreal than in other biomes (8), it is disingenuous to claim that North America’s boreal forests are “truly intact”. Read the rest of this entry »

How things have (not) changed

13 04 2015

The other night I had the pleasure of dining with the former Australian Democrats leader and senator, Dr John Coulter, at the home of Dr Paul Willis (Director of the Royal Institution of Australia). It was an enlightening evening.

While we discussed many things, the 84-year-old Dr Coulter showed me a rather amazing advert that he and several hundred other scientists, technologists and economists constructed to alert the leaders of Australia that the country was heading down the wrong path. It was amazing for three reasons: (i) it was written in 1971, (ii) it was published in The Australian, and (iii) it could have, with a few modifications, been written for today’s Australia.

If you’re an Australian and have even a modicum of environmental understanding, you’ll know that The Australian is a Murdochian rag infamous for its war on science and reason. Even I have had a run-in with its outdated, consumerist and blinkered editorial board. You certainly wouldn’t find an article like Dr Coulter’s in today’s Australian.

More importantly, this 44-year-old article contains much that is still relevant today. While the language is a little outdated (and sexist), the grammar could use a few updates, and some predictions clearly never came true, it’s telling that scientists and others have been worrying about the same things for quite some time.

In reading the article (reproduced below), one could accuse the authors of being naïve about how society has managed to survive, and even prosper, despite a declining ecological life-support system. But when I once queried Paul Ehrlich about some of his particularly doomerist predictions from over 50 years ago, he politely pointed out that much of what he predicted did, in fact, come true. There are over 1 billion people today who are starving, and another billion or so who are malnourished; combined, this is greater than the entire world population when Paul was born.

So while we might have delayed the crises, we certainly haven’t averted them. Technology does potentially play a positive role, but by increasing our short-term carrying capacity and buffering the system against shocks, it can also mask the underlying decline. We then tend to ignore the indirect causes of failures like wars, famines and political instability because we do not recognise the real drivers: resource scarcity and ecosystem malfunction.

Australia has yet to learn its lesson.

To Those Who Shape Australia’s Destiny

We believe that western technological society has ignored two vital facts: Read the rest of this entry »

