Grim tale of global shark declines

25 06 2015
Please don't eat me

How do you prevent declines of species you cannot even see? This is (and has always been) the dilemma for fisheries because, well, humans don’t live underwater. Even when we strap on a metal tank full of air and a pair of fins, we’re still more or less like wounded astronauts peering through a narrow window of glass at the huge, largely empty, ocean space. It’s little wonder then that we have a fairly crap system of estimating fish abundance, and an even worse track record of managing them sustainably.

But humans love to eat fish – the total world estimate of legal fisheries landings was something in the vicinity of 190 million tonnes in 2013, up from 18 million tonnes in 1950 (according to the FAO). We’re probably familiar with some of the losers of that massive harvest, with species like tunas, billfishes and orange roughy making the news for catastrophic declines in abundance over the last 30-40 years. And we’re not even talking about the additional tragedy that is illegal, unreported and unregulated (IUU) fishing.

Back in 1999, the FAO started to report that sharks – the new-ish target of many world fisheries following the commercial extinction of many other fin-fish fisheries – were starting to take the hit. Once generally ignored by fishing industries, sharks soon became popular target species. Then in 2003, Julia Baum and colleagues famously (and somewhat controversially) sounded the alarm for sharks in the Gulf of Mexico with claims of major and catastrophic declines of large, predatory sharks. While some of the subsequent to-ing and fro-ing in the literature challenged these claims, Baum’s excellent work was ultimately vindicated.

Since then, more and more evidence that sharks are in trouble has surfaced, including an assessment of the reported (again, only legal) catch indicating heavy depletion of coastal sharks even by 1975, and the estimate that 25% of all shark and ray species have an elevated extinction risk, mainly resulting from overfishing. Now even the direct fisheries landings statistics are confirming this grim tale.

National commitment to conservation brings biodiversity benefits

16 06 2015

What makes some conservation endeavours successful where so many fail to protect biodiversity? Or, how long is a piece of string?

Yes, it’s a difficult question because it’s not just about the biology – such as resilience and area relationships – in fact, it’s probably more about the socio-economic setting that will ultimately dictate how the biodiversity in any particular area fares in response to disturbance.

In the case of protected areas (which I’ll just refer to as ‘reserves’ for the remainder of this post), there’s been a lot of work on the things that make them ‘work’ (or not) in terms of biodiversity preservation. Yes, we can measure investment, how much the community supports and is involved with the reserve, how much emphasis is put on enforcement, the types of management done within (and outside) the reserves, et cetera. All of these things can be (and to some extent have been) correlated with indices of the fate of the biodiversity within reserves, such as rates and patterns of deforestation, the amount of illegal hunting, and the survival probability of particular taxa.

But the problem with these indices is that they are just indices – they probably do not encapsulate the overall ‘health’ of the biodiversity within a reserve (be that trends in the overall abundance of organisms, the resilience of the community as a whole to future disturbances, or the combined phylogenetic diversity of the ecosystem). This is because there are few long-term monitoring programmes of sufficient taxonomic and temporal breadth to summarise these components of complex ecosystems (i.e., ecology is complex). It’s no real surprise, and even though we should put a lot more emphasis on targeted, efficient, long-term biodiversity monitoring inside and outside of all major biodiversity reserves, the cold, hard truth of it is that we’ll never manage to get the required systems in place. Humanity just doesn’t value it enough.

Scientific conspiracies are impossible

9 06 2015

We’ve all heard it somewhere before: “It’s all just a big conspiracy and those bloody scientists are just trying to protect their funding sources.”

Whether it’s about climate change, pharmacology, genetically modified organisms or down-to-earth environmentalism, people who don’t want to agree with a particular scientific finding often invoke the conspiracy argument.

There are three main reasons why conspiracies among scientists are impossible. First, most scientists are just not that organised, nor do they have the time to get together to plan such elaborate practical jokes on the public. We can barely keep our own shit together, much less construct a water-tight conspiracy. I’ve never met a scientist who would be capable of doing this, let alone who would want to.

But this doesn’t necessarily prove my claim that it is ‘impossible’. Most importantly, the idea that a conspiracy could form among scientists ignores one of the most fundamental components of scientific progress — dissension; and bloody hell, can we dissent! The scientific approach is one where successive lines of evidence testing hypotheses are eventually amassed into a concept, then perhaps a rule of thumb.

An appeal to extinction chronologists

2 06 2015

Extinction is forever, right? Yes, it’s true that once the last individual of a species dies (apart from insane notions that de-extinction will do anything to resurrect a species in perpetuity), the species is extinct. However, the answer can also be ‘no’ when you are limited by poor sampling. In other words, when you think something went extinct when in reality you just missed it.

Most of you are familiar with the concept of Lazarus species – when we’ve thought of something long extinct that suddenly gets re-discovered by a wandering naturalist or a wayward fisher. In palaeontological (and modern conservation biological) terms, the problem is formally described as the ‘Signor-Lipps’ effect, named after two American palaeontologists, Phil Signor and Jere Lipps. It’s a fairly simple concept, but it’s unfortunately ignored in most palaeontological, and to a lesser extent, conservation studies.

The Signor-Lipps effect arises because the last (or first) evidence (fossil or sighting) of a species’ presence has a nearly zero chance of coinciding with its actual timing of extinction (or appearance). In palaeontological terms, it’s easy to see why. Fossilisation is in fact a nearly impossible phenomenon – all the right conditions have to be in place for a once-living organism to be fossilised: it has to be buried quickly, in a place where nothing can decompose it (usually, an anoxic environment), and then turned to rock by the process of mineral replacement. It then has to resist transformation by not undergoing metamorphosis (e.g., volcanism, extensive crushing, etc.). For more recent specimens, preservation can occur without the mineralisation process itself (e.g., bones or flesh in an anoxic bog). Then the bloody things have to be found by a diligent geologist or palaeontologist! In other words, the chances that any one organism is preserved as a fossil after it dies are extremely small. In more modern terms, individuals can go undetected if they are extremely rare or remote, such that sighting records alone are usually insufficient to establish the true timing of extinction. The dodo is a great example of this problem. Remember too that all this works in reverse – the first fossil or observation is very unlikely to represent the first time the species was actually there.
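One common way of correcting sighting records for this sampling effect is a simple stationary-Poisson test, often attributed to Solow. Below is a minimal sketch of that approach – the function name is mine, the method assumes sightings occur at a roughly constant rate while the species is extant, and the dodo sighting years are approximate values used purely for illustration, not data from this post:

```python
def extant_p_value(sightings, now):
    """Probability of a sighting gap at least this long if the species
    were still extant, assuming sightings follow a stationary Poisson
    process (a Solow-style test). Small values suggest extinction."""
    s = sorted(sightings)
    t_last = s[-1] - s[0]   # span from first to last sighting
    window = now - s[0]     # full observation window, first sighting to now
    n = len(s) - 1          # number of sightings after the first
    return (t_last / window) ** n

# Approximate dodo sighting years (illustrative only):
dodo = [1598, 1601, 1602, 1607, 1611, 1628, 1631, 1638, 1662]
print(extant_p_value(dodo, 1700))  # small value -> extinction likely by 1700
```

The key point is that the estimate uses the whole sighting record, not just the last observation – exactly the correction the Signor-Lipps effect demands.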

Cartoon guide to biodiversity loss 30

27 05 2015

[10.06.2015 update: Because of all the people looking for cartoon porn, I’ve slightly altered the title of this post. Should have predicted that one]

Third batch of six biodiversity cartoons for 2015 (see full stock of previous ‘Cartoon guide to biodiversity loss’ compendia here).


Dawn of life

18 05 2015
Looking east toward the northern Flinders Ranges from Ediacara Conservation Park. © CJA Bradshaw

I’ve had one of the most mind-blowing weeks of scientific discovery in my career, and it’s not even about a subject from within my field.

As some of you might know, I’ve been getting more and more interested in palaeo-ecology over the past few years. I’m fascinated by the challenge of reconstructing past communities and understanding how and why they changed. It’s a natural progression for someone interested in modern extinction dynamics.

Most of my recent interests have focussed on palaeo-communities of the Late Quaternary, mainly in the range of 100 thousand years ago to the present. We’ve started publishing a few things in this area, and I can confirm that there’ll be plenty more to come in the following months and years. Despite plenty more to do in the youngest of palaeo-communities, I’ve now been bitten by the deep-time bug.

The giant Dickinsonia rex - a flat, worm-like discoid animal. © D. García-Bellido

When I write ‘deep time’, I bloody well mean it: back to 580 million years ago, to be accurate. This is the time before the great Cambrian explosion of life popularised by the late Stephen Jay Gould in his brilliant book, Wonderful Life. I’m talking about the Ediacaran period from 635-541 million years ago.

I’ve lived in South Australia now for over seven years, but it was only in the last few that I realised the Ediacaran was named after the Ediacara Hills in the northern Flinders Ranges, some 650 km north of Adelaide where I live. And it wasn’t until last week that I had the extremely gratifying privilege of visiting the region with some of the world’s top Ediacaran specialists. If you have even the remotest interest in geological time and the origin of life on Earth, you should make a pilgrimage to the Flinders Ranges at some point before you die.


Statistical explainer: average temperature increases can be deceiving

12 05 2015

Over the years I’ve used a simple graphic from the IPCC 2007 Report to explain to people without a strong background in statistics just why average temperature increases can be deceiving. If you’re not well-versed in probability theory (i.e., most people), it’s perhaps understandable why so few of us appear to be up-in-arms about climate change. I posit that if people had a better appreciation of mathematics, there would be far less inertia in dealing with the problem.

Instead of using the same image, I’ve done up a few basic graphs that explain the concept of why average increases in temperature can be deceiving; in other words, I explain why focussing on the ‘average’ projected increases will not enable you to appreciate the most dangerous aspects of a disrupted climate – the frequency of extreme events. Please forgive me if you find this little explainer too basic – if you have a modicum of probability theory tucked away in your educational past, then this will be of little insight. However, you may wish to use these graphs to explain the problem to others who are less up-to-speed than you.

Let’s take, for example, all the maximum daily temperature data from a single location compiled over the last 100 years. We’ll assume for the moment that there has been no upward trend in the data over this time. If you plot the frequency of these temperatures in, say, 2-degree bins over those 100 years, you might get something like this:


This is simply an illustration, but here the long-term annual average temperature is 25 degrees Celsius, and the standard deviation is 5 degrees. In other words, over those 100 years, the average daily maximum temperature is 25 degrees, but there were a few days when the maximum was < 10 degrees, and a few others where it was > 40 degrees. This could represent a lot of different places in the world.
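This thought experiment is easy to simulate. Here is a minimal sketch using only the Python standard library, assuming the illustrative mean of 25 degrees and standard deviation of 5 degrees from the text (none of this is real station data):

```python
import random
from collections import Counter

random.seed(42)

# 100 years of daily maximum temperatures drawn from a normal
# distribution with mean 25 and standard deviation 5 (illustrative only)
temps = [random.gauss(25, 5) for _ in range(100 * 365)]

# tally the temperatures into 2-degree bins, as described in the text
bins = Counter(2 * int(t // 2) for t in temps)
for lo in sorted(bins):
    print(f"{lo:3d} to {lo + 2:3d} degrees: {bins[lo]:6d} days")
```

Plotting those bin counts as a bar chart reproduces the kind of histogram described above, including the rare days below 10 and above 40 degrees in the tails.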

We can now fit what’s known as a ‘probability density function’ to this histogram to obtain a curve of expected probability of any temperature within that range:


If you’ve got some background in statistics, then you’ll know that this is simply a normal (Gaussian) distribution. With this density function, we can now calculate the probability of any particular day’s maximum temperature being above or below any particular threshold we choose. In the case of the mean (25 degrees), we know that exactly half (p = 0.50) of the days will have a maximum temperature below it, and exactly half above it. In other words, this is simply the area under the density function itself (the total area under the entire curve = 1).
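That area under the curve is exactly what makes small average shifts so deceptive, and it takes only a few lines to compute. In this sketch, the 25-degree mean, 5-degree standard deviation and the 2-degree warming shift are the same illustrative values as above, not projections for any real place:

```python
from math import erf, sqrt

def p_exceed(threshold, mean, sd):
    """P(T > threshold) for a normal distribution, via the error function."""
    z = (threshold - mean) / (sd * sqrt(2))
    return 0.5 * (1.0 - erf(z))

# Sanity check: half of all days fall below the mean of 25 degrees
print(p_exceed(25, 25, 5))   # 0.5

# Chance of a > 40-degree day under the baseline climate (mean 25, sd 5)
p_now = p_exceed(40, 25, 5)  # ~0.0013, i.e., about half a day per year

# Now shift the mean up by only 2 degrees, leaving the spread unchanged
p_warm = p_exceed(40, 27, 5)

print(p_warm / p_now)  # extreme days become roughly 3-4 times more frequent
```

And that is the punchline of the whole explainer: the mean moved by well under one standard deviation, yet the probability mass in the dangerous tail more than tripled.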

