New journal: Frontiers in Conservation Science

29 09 2020

Several months ago, Daniel Blumstein of UCLA approached me with an offer — fancy leading a Special Section in a new Frontiers journal dedicated to conservation science?

I admit that my gut reaction was a visceral ‘no’, both because of the extra time it would require and because of my automatic reflex of ‘not another journal, please’.

I had, for example, spent a good deal of blood, sweat, and tears helping to launch Conservation Letters when I acted as Senior Editor for the first 3.5 years of its existence (I can’t believe that it has been nearly a decade since I left the journal). While certainly an educational and reputational boost, I can’t claim that the experience was always a pleasant one — as has been said many times before, the fastest way to make enemies is to become an editor.

But then Dan explained what he had in mind for Frontiers in Conservation Science, and the more I spoke with him, the more I started to think that it wasn’t a bad idea after all for me to join.






A fairer way to rank conservation and ecology journals in 2014

1 08 2014

Normally I just report the Thomson-Reuters ISI Web of Knowledge Impact Factors for conservation-orientated journals each year, with some commentary on the rankings of other journals that also publish conservation-related material from time to time (see my lists of the 2008, 2009, 2010, 2011 and 2012 Impact Factor rankings).

This year, however, I’m doing something different given the growing negativity towards Thomson-Reuters’ secretive behaviour (which they’ve promised this year to rectify by being more transparent) and the generally poor indication of quality that the Impact Factor represents. Although the 2013 Impact Factors have just been released (very late this year, for some reason), I’m going to compare them to the increasingly reputable Google Scholar Journal Metrics, which intuitively make more sense to me, are transparent and turn a little of the rankings dogma on its ear.

In addition to providing both the Google metric and the Impact Factor rankings, I’ve come up with a composite (average) rank from the two systems. I think ranks are potentially more useful than raw corrected citation metrics because you must first explicitly define the set of journals being compared. I also go one step further and modify the average rank with a penalty term: the coefficient of variation of the rank disparity between the two systems is added to the average, so journals that the two systems rank inconsistently slip down the composite list.
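For concreteness, here is a minimal Python sketch of how such a penalised composite rank might be computed. The journal names and rank values below are invented for illustration, not taken from the actual lists.

```python
from statistics import mean, pstdev

def composite_rank(google_rank, if_rank):
    """Average the two ranks, then add a penalty equal to the
    coefficient of variation (std / mean) of the pair, so journals
    the two systems disagree about slip down the composite list."""
    ranks = [google_rank, if_rank]
    m = mean(ranks)
    return m + pstdev(ranks) / m  # average rank + CV penalty

# Hypothetical data: journal -> (Google Scholar rank, Impact Factor rank)
journals = {
    "Journal A": (1, 3),
    "Journal B": (2, 2),
    "Journal C": (3, 1),
}
scores = {j: composite_rank(g, i) for j, (g, i) in journals.items()}
final = sorted(journals, key=lambda j: scores[j])
# Journal B, ranked consistently by both systems, tops the composite
# list (score 2.0) ahead of A and C (2.5 each), despite never
# finishing first in either system alone.
```

The point of the CV penalty is that a journal both systems place, say, 5th keeps a composite score of 5, whereas one ranked 1st by one system and 9th by the other also averages 5 but is pushed down to 5.8.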

Read on for the results.






Conservation and ecology journal Impact Factors 2012

20 06 2013

It’s the time of year that scientists love to hate – the latest (2012) journal rankings have been released by ISI Web of Knowledge. Many people despise this system, despite its major role in driving publishing trends.

I’ve previously listed the 2008, 2009, 2010 and 2011 IF for major conservation and ecology journals. As before, I’ve included the previous year’s IF alongside the latest values to see how journals have improved or worsened (but take note – average IFs rise anyway simply because publication frequency is increasing, so small jumps aren’t necessarily meaningful; I suspect that declines are therefore more telling).






Conservation and Ecology Impact Factors 2011

29 06 2012

Here we go – another year, another set of citations, and another journal ranking by ISI Web of Knowledge Journal Citation Reports. Love them or loathe them, Impact Factors (IF) are immensely important for dictating publication trends. No, a high Impact Factor doesn’t mean your paper will receive hundreds of citations, but the two are correlated.

I’ve previously listed the 2008, 2009 and 2010 IF for major conservation and ecology journals – now here are the 2011 IF fresh off the press (so to speak). I’ve included the 2010 values alongside to see how journals have improved or worsened (but take note – average IFs rise anyway simply because publication frequency is increasing, so small jumps aren’t necessarily meaningful).






2010 ISI Impact Factors out now (with some surprises)

29 06 2011

It’s been another year of citations and now the latest list of ISI Impact Factors (2010) has come out. Regardless of how much stock you put in these (see here for a damning review), you cannot ignore their influence on publishing trends and author journal choices.

As I’ve done for 2008 and 2009, I’ve taken the liberty of providing the new IFs for some prominent conservation and ecology journals, and a few other journals occasionally publishing conservation-related material.

One particular journal deserves special attention here. Many of you might know that I was Senior Editor with Conservation Letters from 2008-2010, and I (with other editorial staff) made some predictions about where the journal’s first impact factor might be on the scale (see also here). Well, I have to say the result exceeded my expectations (although Hugh Possingham was closer to the truth in the end – bugger!). So the journal’s first 2010 impact factor (for which I take a modicum of credit ;-)) is a whopping… 4.694 (3rd among all ‘conservation’ journals). Well done to all and sundry who have edited and published in the journal. My best wishes to the team that has to deal with the inevitable rush of submissions this will likely bring!

So here are the rest of the 2010 Impact Factors with the 2009 values for comparison:





Demise of the Australian ERA journal rankings

3 06 2011

Earlier this week Australian Senator Kim Carr (Minister for Innovation, Industry, Science and Research) announced the removal of the somewhat controversial ERA rankings for scientific journals.

Early last year I posted about the Excellence in Research for Australia (ERA) journal rankings for ecology and conservation journals. To remind you, the ERA has ranked > 20,000 unique peer-reviewed journals, each given a single quality rating – and they are careful to say that “A journal’s quality rating represents the overall quality of the journal. This is defined in terms of how it compares with other journals and should not be confused with its relevance or importance to a particular discipline.”

Now, after much to-ing and fro-ing about what the four rankings actually mean (A*, A, B & C), Senator Carr has announced that he’s dumping them on the advice of the Australian Research Council.