Lomborg: a detailed citation analysis

24 04 2015

There’s been quite a bit of palaver recently about the invasion of the University of Western Australia by Lomborg’s ‘Consensus’ Centre, including, inter alia, that there was no competitive process for the award of $4 million of taxpayer money from the Commonwealth Government, that Lomborg is a charlatan with a not-terribly-well-hidden anti-climate change agenda, and that he is not an academic and possesses no credibility, so he should not be given an academic appointment at one of Australia’s leading research universities.

On that last point, there’s been much confusion among non-academics about what it means to have no credible academic track record. In my previous post, I reproduced a letter from the Head of UWA’s School of Animal Biology, Professor Sarah Dunlop, in which she stated that Lomborg had a laughably low h-index of only 3. The Australian, in all its brilliant capacity to report the unvarnished truth, claimed that a certain Professor Ian Hall of Griffith University had instead determined that Lomborg’s h-index was 21, based on Harzing’s Publish or Perish software tool. As I show below, if Professor Hall did indeed conclude this, it shows he knows next to nothing about citation indices.

What is an ‘h-index’ and why does it matter? Below I provide an explainer, as well as some rigorous analysis of Lomborg’s track record.

The ‘h’ in h-index stands for Hirsch; the index was created by Jorge Hirsch of the University of California, San Diego, ten years ago. Put simply, it is the largest number h of academic (peer-reviewed) papers one has published that have each been cited at least h times. As an example, let’s say I have published ten peer-reviewed journal articles (‘papers’) in my life to date. In descending order, they have been cited by different authors (of other peer-reviewed journal articles) 256, 150, 10, 4, 3, 3, 3, 2, 1 and 0 times. Even though I have a total of 432 citations from those 10 papers, giving a mean of 43.2 citations per paper, this total is driven largely by only 2 papers. As such, my h-index would be 4 (I have 4 papers with at least 4 citations each). For me to increase my h-index to 5, I would need at least one more citation for the paper currently sitting at 4 citations, and at least 2 more citations for one of the papers that currently have only 3.
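
To make that worked example concrete, here is a minimal sketch in Python (my own illustration, not part of Hirsch’s definition) that computes the h-index for the hypothetical citation counts above:

    def h_index(citations):
        """Return the largest h such that h papers each have at least h citations."""
        ranked = sorted(citations, reverse=True)  # most-cited paper first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank  # this paper still supports an h equal to its rank
            else:
                break
        return h

    # The ten hypothetical papers from the example above
    papers = [256, 150, 10, 4, 3, 3, 3, 2, 1, 0]
    print(h_index(papers))            # 4
    print(sum(papers) / len(papers))  # mean of 43.2 citations per paper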

You can see the advantages of using such an index: it isn’t influenced to the same degree by wild outliers, and it resists artificial inflation by auto-citation (citing your previous papers in your latest ones). The disadvantage of the h-index is that it can only ever increase as time progresses (even after you die), such that the older you are, the higher your h-index tends to be. Some have proposed correcting for this by dividing the h-index by the number of years since your first publication (called the ‘m-index’). This essentially indicates your ‘speed’ of accumulating citations. As a general rule, if your m-index is over 1, you’re an active, publishing researcher. If your m-index is greater than 2, you’re doing very well.
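
The m-index follows directly from the same sketch; the start year below is invented purely for illustration:

    from datetime import date

    def m_index(citations, first_publication_year, current_year=None):
        """h-index divided by the number of years since the first publication."""
        current_year = current_year or date.today().year
        ranked = sorted(citations, reverse=True)
        h = sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)
        years_active = max(current_year - first_publication_year, 1)  # avoid dividing by zero
        return h / years_active

    # The hypothetical researcher above (h = 4), assuming a first paper in 2003
    print(m_index([256, 150, 10, 4, 3, 3, 3, 2, 1, 0], 2003, 2015))  # 4 / 12 ≈ 0.33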

But what can you count as a ‘paper’? This is in fact the crux of the bullshit floating around the web on this particular issue with respect to Lomborg. As a general rule, the two main services used to calculate the h-index, Scopus (Elsevier) and Web of Science (Thomson Reuters), are perhaps a little conservative (i.e., they don’t necessarily count all your works or all the citations to them). This is because they have very stringent rules for what counts as a ‘paper’: it has to be a recognised, accredited, peer-reviewed academic journal article (books, magazine and newspaper articles are disallowed), and in the case of Web of Science, it has to appear in an ISI-indexed journal.

Lomborg only has a Scopus profile (you need a subscription to see this), which gives an h-index = 3 based on 31 articles and a total of 71 citations (see adjacent screenshot). He hasn’t set up a public ResearcherID, which would give his Web of Science h-index, but I took the liberty of combing through all of his Web of Science-listed articles and came up with the exact same h-index (3), based on 25 articles and 54 total citations.

Google Scholar is the new kid on the block for calculating researcher citation profiles, but to use it, each individual researcher needs to set up a Google Scholar profile (you can see mine here). Every academic should do this because it’s free to use and browse. If a researcher hasn’t done this (and Lomborg hasn’t), then you have to search Google Scholar for their individual publications.

Herein lies the problem for those who submit that Lomborg’s h-index is 21. Software tools like Harzing’s Publish or Perish are merely Google Scholar aggregators; in other words, they act as a front end for the Google Scholar search engine. They are not individual profiles. In fact, ever since Google Scholar introduced profiles a few years ago, Publish or Perish has become effectively obsolete. Why? Because it aggregates everything, including all the inappropriate stuff, and so vastly overestimates one’s h-index. Further, it doesn’t weed out duplicate entries, makes no distinction between peer-reviewed and non-peer-reviewed articles, and it counts any mention of the author’s name, even if it isn’t one of their own articles! In other words, it’s utterly flawed.

So, in the absence of a Google Scholar profile for Lomborg, I combed through his Google Scholar entries and dumped all the duplicates, ignored all the magazine and newspaper articles (you can’t count opinion editorials in The Wall Street Journal as evidence of an academic track record), cut out all the non-articles (things Lomborg hadn’t actually written), omitted any website diatribes (blog posts and the like), and then calculated his citation profile.
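
In effect, that culling amounts to something like the following sketch (the records here are invented placeholders to show the filtering logic, not Lomborg’s actual Google Scholar entries):

    # Hypothetical, Google Scholar-style records (titles and counts are made up)
    records = [
        {"title": "Paper A", "type": "journal-article", "citations": 12},
        {"title": "Paper A", "type": "journal-article", "citations": 12},  # duplicate entry
        {"title": "Op-ed B", "type": "newspaper", "citations": 40},        # not peer-reviewed
        {"title": "Book C", "type": "book", "citations": 300},             # excluded from the strict count
        {"title": "Paper D", "type": "journal-article", "citations": 5},
        {"title": "Blog E", "type": "blog", "citations": 2},               # website diatribe
    ]

    # Keep only peer-reviewed journal articles and drop duplicates by title
    seen, kept = set(), []
    for rec in records:
        if rec["type"] == "journal-article" and rec["title"] not in seen:
            seen.add(rec["title"])
            kept.append(rec["citations"])

    ranked = sorted(kept, reverse=True)
    h = sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)
    print(h)  # 2 for this toy list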

Based on my analysis, Lomborg’s Google Scholar h-index is 4 for his peer-reviewed articles. If I were being particularly generous and included all of Lomborg’s books, which have by far the most citations, his h-index would climb to 9. However, none of his books is peer-reviewed, and his most infamous book, The Skeptical Environmentalist, has been entirely discredited. As such, any reasonable academic selection committee would omit any metrics based on opinion-based books.

So, the best-case scenario is that Lomborg’s h-index is no more than 4. Given his appointment to Level D (Associate Professor) at a world-class university, the suggestion that he earned it on academic merit is not only laughable, it’s completely fraudulent. There is no way that his academic credentials had anything to do with the appointment.

CJA Bradshaw

Addendum

I suppose I should have contextualised what an h-index of 3 or 4 means.

Even a fresh-out-of-the-PhD postdoc with an h-index of only 3 or 4 would have trouble finding a job. As a rule of thumb, the h-index of a Level D appointment should be in the 20-30 range (this would vary among disciplines). Despite this variation, Lomborg’s h-index is so far off the mark that even accounting for uncertainty and difference of opinion, it’s nowhere near a senior academic appointment.

Guidelines are guidelines, and no one is going to commit to a particular h-index as a minimum. Instead, an applicant’s h-index is usually benchmarked against others in the field and/or school of appointment. All one has to do is search for academics of the same or similar academic appointment level on Google Scholar as a point of comparison.


33 responses

11 11 2015
Lomborg: If we emit more, we’ll warm more | …and Then There's Physics

[…] a paper on the impact of current climate proposals. Possibly he’s also trying to improve his h-index, but given the Impact Factor of the Journal in which he’s published, it could take a […]

15 10 2015
Aleksey Belikov

I’ve recently proposed a novel index for evaluation of individual researchers that does not depend on the number of publications, accounts for different co-author contributions and age of publications, and scales from 0.0 to 9.9 (http://f1000research.com/articles/4-884). Moreover, it can be calculated with the help of freely available software. Please, share your thoughts on it. Would you use it along with the h-index, or maybe even instead of it, for evaluating your peers, potential collaborators or job applicants? If you’ve tried it on the people you know, do you find the results fair?

30 06 2015
What the H? Explaining That Citation Metric | Business Daily Report

[…] scientists who were opposed to Lomborg’s new research center pointed out that his H-index score was 3. Usually, someone appointed to a professorship in the natural sciences would be expected to have an […]

23 05 2015
What the H? Explaining That Citation Metric

[…] scientists who were opposed to Lomborg’s new research center pointed out that his H-index score was 3. Usually, someone appointed to a professorship in the natural sciences would be expected to have an […]

21 05 2015
We need real consensus, not Bjorn Lomborg's illusion of it | Em News

[…] is not active as an academic (with a relatively low h-index of 3) and has forged his reputation largely by publishing non-peer-reviewed books, with environmental […]

21 05 2015
Explainer: what is an H-index and how is it calculated? | Em News

[…] scientists who were opposed to Lomborg’s new research centre pointed out that his H-index score was 3. Usually, someone appointed to a professorship in the natural sciences would be expected to have an […]

21 05 2015
We need real consensus, not Bjorn Lomborg's illusion of it - ParisProtocol.com

[…] is not active as an academic (with a relatively low h-index of 3) and has forged his reputation largely by publishing non-peer-reviewed books, with environmental […]

10 05 2015
The Answer is Blowing in the Wind: Lomborg on Renewable Electricity Subsidies | Graham K. Brown

[…] record in the metrics that are often used to judge academic output, notably attempts to calculate his h-index score. The Australian has come to Lomborg’s defence on this, suggesting that Lomborg does not score […]

27 04 2015
Michael McCarthy

While the politics of this are certainly up for debate, two aspects need to be considered. Firstly, journal citations are less relevant in political science (Lomborg’s discipline) than in many other disciplines. For example, journal citation metrics are used by the ARC’s “Excellence in Research for Australia” initiative for many disciplines, but notably not for political science. So I think having a crack at Lomborg’s h-index is not very relevant and risks distracting from legitimate concerns about the appointment.

Secondly, there is an interesting discussion of this topic on Dave Pannell’s blog. Dave Pannell is at UWA in a relevant field, so his perspective is well worth reading (as are the comments!):

http://www.pannelldiscussions.net/2015/04/280-lomborg-at-uwa/

27 04 2015
CJAB

I think you’re missing the point, Mick. If his appointment was merit-based, then I’d like to see the evidence. Whether or not his h-index, total citations or m-index (whatever) are the best metrics is irrelevant, because he should be judged against the rest of us. He clearly wasn’t, so my analysis merely shows one of the many, many ways his appointment was fraudulent.

27 04 2015
James

“As a rule of thumb, the h-index of a Level D appointment should be in the 20-30 range (this would vary among disciplines). Despite this variation, Lomborg’s h-index is so far off the mark that even accounting for uncertainty and difference of opinion, it’s nowhere near a senior academic appointment.”

That’s not true. Have a chat with your humanities colleagues. Anyone producing predominantly book- or chapter-based research could easily be a Level D/Associate Professor with a Scopus h-index of 3. It was pretty easy to find people at UWA in that camp.
I know from talking to my School of Business colleagues as well that they don’t fare well on these metrics. The first two Assoc. Profs at UWA in business I found had Scopus numbers of 5 and 7.
The h-index is really too blunt a tool for almost anything, unless you are comparing two very, very similar academics.
Better to stick to the line that Lomborg’s outputs aren’t academic.

27 04 2015
CJAB

I’m sure you can find people of equally poor citation standing at UWA and elsewhere. However, there are many more with much greater citation track records (some examples include John Quiggin (http://goo.gl/dET3LN), Frank Jotzo (http://goo.gl/6RrD0a), James Hamilton (http://t.co/l1tCbgaR69), Marylene Gagne (http://goo.gl/knrlKc), Wei Liu (http://goo.gl/2xD4mc) and Dave Webb (http://goo.gl/cJNu30)). Now, if you’re going to give $4 million to a high-profile person to run a centre, and that person is ‘controversial’ (a gross understatement in this particular case), you had better make damn sure that at least his/her track record is more robust than the average person’s in the field. Exceptions to the rule are irrelevant here. A man with no academic track record to speak of has no place in this system.

27 04 2015
James

I *know* that he has no academic track record, and you *know* that, but it’s not because of his h-index.
I was not looking for exceptions at all, only Assoc. Profs at UWA outside the hard sciences. You look like you’ve gone trawling for economists (possibly with a climate bent), thereby undermining your own argument.
Yes, Google Scholar is more optimistic: it overestimates your citations by ~53%. However, if you look up your own examples on Scopus, you’ll find that Scholar has up to 370% more hits for some of them, and well over 200% for most. And in fact, our datasets have one point in common: you may be surprised to find that one of those people has an h-index of “only” 7, and another has 10. Which gives the lie to your rule of thumb of “20-30” for this level. These are not exceptions. They are your own examples.

So yes, the New Zealand scientific community is outraged at this UWA appointment, but having outstanding academics such as yourself perpetuating very misleading rules of thumb for citation metrics is massively dispiriting to junior researchers outside the hard sciences. Which I guess is my real point.

27 04 2015
CJAB

No, Google Scholar doesn’t ‘overestimate’ your citations; it merely includes more sources. So, if you follow my logic about Google Scholar (to which the 20-30 was referring, in fact), then I’m spot on regarding Level D appointments. Regardless, you seem to be moving the goal posts here: the point I was making was that Lomborg is not, no matter how you look at it, an academic. My goal was to bust the notion that he had anything approaching a reasonable (or even minimum) track record as measured by citations. Your frustration seems to be personal, so do not attack me. You make your own bed in academia, and the best way to do that is by publishing. I didn’t make the rules, but that’s how it works.

27 04 2015
James

Sure, it’s a bit personal. I’ve watched people with citation metrics similar to yours giving advice to my junior colleagues pretty similar to what you are giving to me.
I know that is *extremely* tangential to the point that you’re trying to make, but you’re perpetuating very damaging myths about the h-index. For what I’d consider at least a quarter of the average university, an h-index of 10-15 is entirely adequate to be a full Level E professor. Within my own department, I have a colleague who is within 10% of each of your numbers and is a full professor, and another full professor with an h-index of 16. The latter is no more an exception than the former. This reflects not that the h-index is reasonably valid but requires some consideration of discipline, but rather that it is often orders of magnitude out between, and sometimes within, disciplines.
In a great irony, my own h-index looks very reasonable, but for the wrong reasons. I publish across a few fields, and have some poor-quality papers that superficially seem good (high journal IF, high number of citations) that I know are worse than papers that superficially seem bad (poor or no IF, few citations).
So, I guess my point is, I’m very pleased with your public advocacy against Lomborg. It’s important that academics speak out. But I think if you have massive citation metrics, it’s very easy not to realise how rotten the system really is, and in trashing Lomborg in this particular way, you’re giving it more credibility than it deserves. The reality is that your critique is much more embedded in your judgement that his output is crap. As a ‘hard’ scientist, taking refuge in the (dodgy) numbers probably holds some comfort, but as a social scientist, you should back your (allegedly subjective but far more valid) judgement.

27 04 2015
CJAB

If you’ve got a better (objective) metric, I’d be pleased to hear it. Whingeing about the system doesn’t help. As I mentioned in the post, the h-index divided by years publishing (the m-index) is probably the best metric we have, and Lomborg’s track record would be much, much worse according to that. All metrics are flawed in some way, but without objective alternatives, it’s pointless to complain. I started out like everyone else, with an h-index of 0, and then grew it over the years. If academic performance weren’t assessed by citations, they would be irrelevant. But it is.

27 04 2015
James

If an individual version of the SNIP were developed, it might show some potential.
For each paper (output), you’d have to determine its size (is it in a venue where a top researcher would produce 1-2 outputs a year, or 20-30?), the relative citation rate for that venue, and where the citations were coming from, and potentially even control for the number of authors (though I suspect this would merely disenfranchise lots of junior authors, rather than senior authors whose major contribution was their name).

So, in theory, a sole-authored output that was the product of six months of the academic’s own fieldwork and had received few citations would count orders of magnitude more than many citations to a paper by a large team based on a week’s labwork. But it should also account for the possibility that the former paper might draw citations from papers in the latter venue (e.g., an anthropologist’s paper being cited by geneticists).

Personally, I don’t hold out much hope. In my field, we prefer the higher validity of subjective measures to the higher reliability of objective measures. Actually, when I teach research methods, citation metrics are the punchline to the validity/reliability dilemma.

As a final note, academic performance is not really assessed by citations. Or at least they are so heavily contextualised as to not be recognisable in their original form. If citation metrics were actually useful, then the application forms for promotion would be a lot shorter.

27 04 2015
CJAB

Theory is nice, but completely impractical. Why do you think the Australian RQF and then the ERA failed? Because they were too onerous, subjective, qualitative and corruptible to be of any realistic use. Yes, citation-based metrics are simplistic, but they are quantifiable and objective. Some, but very few, people might be disadvantaged, but if you do good science (and other types of research), your citations will more or less reflect this. Correcting for age is the only real thing we can do to compare among experience levels. I maintain that citation-based metrics are still the best indices we have.

2 06 2015
MLM

You are looking at it from a top-tier position. Is a Level D an Assoc. Prof at a top-tier university? I only ask because at regional state universities and at SLACs in the US, there are tons of full professors with h-indexes below 10. In herpetology, and this is based on Soc. of European Herpetologists data, the overall mean h is 9.35, and no age bracket had a mean higher than 27.7 (herpetologists working > 50 years!). Further, having surfed the job market this year, I have seen a multitude of people get jobs whose h < 9! So, I ask where you come up with this 30 h score, because the eminent ecologist Eric Pianka (UT Austin) is only a 53, David Wake (NAS member at UC Berkeley) is a 63, David Hillis (NAS member at UT Austin) is a 66, and J. Losos (NAS member at Harvard) is a 68. Drop it down a notch: Chris Phillips (UIUC) has an h of 19 and Bruce Kingsbury (IPFW) 14. So, from where I stand, at least in herpetology, the best of the best who have been around 30+ years are in the 60s, and the middle-aged run between 10 and 20, even at world-renowned American universities. Is Europe that much different? This is a question, not meant to be an attack!!!! :)

27 04 2015
CJAB

Further, I can’t understand your personal frustration. I dare say your Google Scholar profile is rather respectable, you clearly have an appointment as a ‘junior’ researcher, and you massively overshadow the likes of Lomborg. Well done. He’s not an academic; you are. This is well represented by your h-index.

27 04 2015
NewzMonitors.com | The Australian Consensus Centre: what are the costs and benefits to UWA?

[…] of citation metrics is made complex by the way it is collected and reported. As an article in conservationbytes.com explains, depending on which database is used the citations increase. Further, since 2001 Lomborg […]

27 04 2015
The Australian Consensus Centre: what are the costs and benefits to UWA? | Em News

[…] of citation metrics is made complex by the way it is collected and reported. As an article in conservationbytes.com explains, depending on which database is used the citations increase. Further, since 2001 Lomborg […]

25 04 2015
jpsigur

Happy to see your reaction to this appointment. At Pondicherry University (India), the Vice-Chancellor, with an h-index of 0, has been accused of plagiarism and of faking her CV. Such jobs are obviously not obtained by merit and must be costing a lot. It has been one and a half years and they can’t get rid of her, while the scientific output of the university is stalling. Does a negative h-index exist?

25 04 2015
intrepidaussies

Excellent explanation. Now we need an analysis of how many of those citations either support or discredit Lomborg’s view. A myriad of citations refuting one’s arguments should not add to one’s ‘approval’ rating as an academic.

25 04 2015
Lomborg: a detailed citation analysis | Enjeux énergies et environnement

[…] This is a re-post from ConservationBytes […]

24 04 2015
Dan Monceaux

Thanks Corey, I found your analysis very useful and informative. Good on you for having the tenacity to test this disparity and explain your process so clearly.

24 04 2015
Bill DeMott

Nice analysis. Rather surprising that an academic with so much “name recognition” would have such a poor citation record and such a low h-index.

24 04 2015
Barry Brook

Also, playing further devil’s advocate: did anyone question Julia Gillard’s h-index of 9 (using the same criteria for inclusion that I used for Lomborg in the above comment) when she was appointed as Adjunct Professor at the University of Adelaide? Or what about former Premier Geoff Gallop (h-index = 3), even after a number of years as a full Professor at the University of Sydney? Let’s be consistent!

24 04 2015
Barry Brook

Google Scholar puts his h-index at 13, which isn’t too bad. That includes his books plus some popular articles, but they were nevertheless cited in academic journals etc. (the majority, at least). His 2001 book has over 2,300 citations. Obviously much of its content is highly contestable (!), but that’s high impact, especially when you also consider the debate it generated. Like or detest Lomborg, his influence is undeniable, and his academic track record is probably better than that of most political scientists. Context is vital.

24 04 2015
A G (Tony) Fane

Thanks for this analysis. I am also an academic and had just checked Lomborg’s h-index on Web of Science when I found your article. I was shocked to see it was so low. I am frequently asked to act as a referee for promotions (in engineering) and consider an h-index in the 30s to be reasonable for a professorial appointment. As you point out, an h-index of 3 or 4 would barely rate appointment as a Lecturer! I am very surprised that UWA has been lured into this charade, which is an insult to all serious academics.

24 04 2015
Andrew Smith

Hi CJAB, thanks for this. I don’t know if you read the Guardian. I have attached their award-winning cartoonist First Dog on the Moon’s take on the proposed institute. FD’s cartoons are well worth following (Guardian website, usually front page or under ‘Opinion’).

Regards,
Andrew Smith

24 04 2015
CJAB

I certainly do – First Dog is one of my favourite cartoonists!
