Lomborg: a detailed citation analysis

24 04 2015

There’s been quite a bit of palaver recently about the invasion of Lomborg’s ‘Consensus’ Centre into the University of Western Australia, including inter alia that there was no competitive process for the award of $4 million of taxpayer money from the Commonwealth Government, that Lomborg is a charlatan with a not-terribly-well-hidden anti-climate change agenda, and that he is not an academic and possesses no credibility, so he should have no right to be given an academic appointment at one of Australia’s leading research universities.

On that last point, there’s been much confusion among non-academics about what it means to have no credible academic track record. In my previous post, I reproduced a letter from the Head of UWA’s School of Animal Biology, Professor Sarah Dunlop, in which she stated that Lomborg had a laughably low h-index of only 3. The Australian, in all their brilliant capacity to report the unvarnished truth, claimed that a certain Professor Ian Hall of Griffith University had instead determined that Lomborg’s h-index was 21 based on Harzing’s Publish or Perish software tool. As I show below, if Professor Hall did indeed conclude this, it shows he knows next to nothing about citation indices.

What is a ‘h-index’ and why does it matter? Below I provide an explainer as well as some rigorous analysis of Lomborg’s track record.
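For the curious, the h-index calculation itself is simple enough to sketch in a few lines of Python. This is a generic illustration of the definition (the largest h such that h of your papers have at least h citations each), not tied to any particular citation database, and the citation counts shown are hypothetical:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= `rank` citations
        else:
            break
    return h

# A hypothetical record of per-paper citation counts:
# 4 papers have at least 4 citations each, so h = 4
print(h_index([10, 8, 5, 4, 2]))  # -> 4
```

This also makes clear why the index is robust to a single highly cited outlier: one paper with 1,000 citations still only contributes one paper toward h.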

Read the rest of this entry »

Hate journal impact factors? Try Google rankings instead

18 11 2013

A lot of people hate journal impact factors (IF). The hatred arises for many reasons, some of which are logical. For example, Thomson Reuters ISI Web of Knowledge® keeps the process fairly opaque, so it’s sometimes difficult to tell if journals are fairly ranked. Others hate IF because it does not adequately rank papers within or among sub-disciplines. Still others hate the idea that citations should have anything to do with science quality (debatable, in my view). Whatever your reason though, IF are more or less here to stay.

Yes, individual scientists shouldn’t be ranked based only on the IF of the journals in which they publish; there are decent alternatives such as the h-index (which can grow even after you die), or even better, the m-index (or m-quotient; think of the latter as a rate of citation accumulation). Others would rather ditch the whole citation thing altogether and measure some element of ‘impact’, although that elusive little beast has yet to be captured and applied objectively.
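The m-index (or m-quotient) mentioned above is just the h-index divided by career length, which is why it behaves like a rate of citation accumulation. A minimal sketch, assuming ‘career length’ is simply the number of years since an author’s first publication:

```python
def m_quotient(h, first_pub_year, current_year):
    """m-quotient: h-index divided by years since first publication,
    i.e. a rate of h-index accumulation over a career."""
    years = max(current_year - first_pub_year, 1)  # avoid division by zero
    return h / years

# Hypothetical example: an h-index of 20 built over a 10-year career
print(m_quotient(20, 2003, 2013))  # -> 2.0
```

The appeal is that it puts early-career and senior researchers on a more comparable footing, since raw h-indices inevitably grow with time (even, as noted above, after you die).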

So just in case you haven’t already seen it, Google has recently put its journal-ranking hat in the ring with its journal metrics. Having firmly wrested the cumbersome (and expensive) personal citation accumulators from ISI and Scopus (for example) with their very popular (and free!) Google Scholar (which, as I’ve said before, all researchers should set up and make available), they now seem poised to do the same for journal rankings.

So for your viewing and arguing pleasure, here are the ‘top’ 20 journals in Biodiversity and Conservation Biology according to Google’s h5-index (the h-index for articles published in that journal in the last 5 complete years; it is the largest number h such that h articles published in 2008-2012 have at least h citations each):
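The h5-index definition above is just the ordinary h-index restricted to a five-year publication window. A hedged sketch in Python, using made-up (year, citation-count) pairs rather than real journal data:

```python
def h5_index(articles, end_year=2012, window=5):
    """h5-index: the h-index computed only over articles published
    in the last `window` complete years (2008-2012 by default).
    `articles` is an iterable of (publication_year, citations) pairs."""
    start_year = end_year - window + 1
    cites = sorted(
        (c for year, c in articles if start_year <= year <= end_year),
        reverse=True,
    )
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical (year, citations) pairs for one journal;
# the 2007 paper falls outside the 2008-2012 window and is ignored
papers = [(2007, 50), (2008, 12), (2009, 9), (2010, 4), (2011, 3), (2012, 1)]
print(h5_index(papers))  # -> 3
```

Restricting the window is what makes the h5-index useful for ranking journals: it reflects recent performance rather than a back-catalogue of old classics.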

Read the rest of this entry »

A posthumous citation tribute for Sodhi

6 11 2012

I’m sitting at a friend’s house in Sydney writing this quick entry before jumping on a plane to London. It’s been a busy few days, and will be an even busier next few weeks.

I met with Paul and Anne Ehrlich yesterday (who are visiting Australia) and we finalised the first complete draft of our book – I will keep you posted on that. In London, I will be meeting with the Journal of Animal Ecology crew on Wednesday night (I’m on the editorial board), followed by two very interesting days at the Zoological Society of London‘s Protected Areas Symposium at Regent’s Park. Then I’ll be off to the Universities of Liverpool and York for a quick lecture tour, followed by a very long trip back home. I’m already tired.

In the meantime, I thought I’d share a little bit of news about our dear and recently deceased friend and colleague, Navjot Sodhi. We’ve already written several personal tributes (see here, here and here) to this great mind of conservation thinking who disappeared from us far too soon, but this is a little different. Barry Brook, as is his wont, came up with a great idea to get Navjot up posthumously on Google Scholar.

Read the rest of this entry »

Arguing for scientific socialism in ecology funding

26 06 2012

What makes an ecologist ‘successful’? How do you measure ‘success’? We’d all like to believe that success is measured by our results’ transformation of ecological theory and practice – in a conservation sense, this would ultimately mean our work’s ability to prevent (or at least, slow down) extinctions.

Alas, we’re not that good at quantifying such successes, and if you use global metrics such as species threats, deforestation, pollution, invasive species and habitat degradation, we’ve failed utterly.

So instead, we measure scientific ‘success’ via peer-reviewed publications, and the citations (essentially, scientific cross-referencing) that arise from these. These are blunt instruments, to be sure, but they are the only real metrics we have. If you’re not being cited, no one is reading your work; and if no one is reading your work, your cleverness goes unnoticed and you help nothing and no one.

A paper I just read in the latest issue of Oikos goes some way toward examining what makes a ‘successful’ ecologist (i.e., in terms of publications, citations and funding), and there are some very interesting results. Read the rest of this entry »

