Normally I just report the Thomson-Reuters ISI Web of Knowledge Impact Factors for conservation-orientated journals each year, with some commentary on the rankings of other journals that also publish conservation-related material from time to time (see my lists of the 2008, 2009, 2010, 2011 and 2012 Impact Factor rankings).
This year, however, I’m doing something different given the growing negativity towards Thomson-Reuters’ secretive behaviour (which they’ve promised this year to rectify by being more transparent) and the generally poor indication of quality that the Impact Factor represents. Although the 2013 Impact Factors have just been released (very late this year, for some reason), I’m going to compare them to the increasingly reputable Google Scholar Journal Metrics, which intuitively make more sense to me, are transparent and turn a little of the rankings dogma on its ear.
In addition to providing both the Google metric and the Impact Factor rankings, I’ve come up with a composite (average) rank from the two systems. I think ranks are potentially more useful than raw corrected citation metrics because you must first explicitly define the set of journals you want to compare. I also go one step further and modify the average rank with a penalty term: essentially, I add the coefficient of variation of the rank disparity between the two systems.
Read on for the results.
Google Scholar Journal Metrics are to me a more intuitive and open way to categorise journal ‘quality’. They also seem to be more in keeping with my own subjective views of relative journal merit. While Google Journal Metrics have specific sub-categories (like Biodiversity & Conservation Biology, Ecology, and Marine Science & Fisheries), they rank only the top 20 journals in the sub-discipline and do not provide good cross-disciplinary comparisons. In other words, you have to search for specific journals manually.
I have therefore combed through various sub-categories and other generalist journals to provide a bit of a ranking guide for conservation ecologists. The first table lists 75 ecology/conservation journals along with a smattering of high-impact generalist journals that I personally think cover most of our discipline. I do not contend that the list is complete, and it is most certainly biased toward the journals in which I most regularly publish; however, I think if you’re in this business, the list represents the most common journals to which one would consider submitting.
Column 1 gives the Google Journal Metric (h5-index: the h-index for articles published in the last 5 complete years; it is the largest number h such that h articles published in 2009-2013 have at least h citations each), and Column 2 gives this year’s (and last year’s for comparison) ISI Impact Factor:
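For anyone who wants to check the h5-index definition above against a journal’s own citation counts, here is a minimal sketch (the example citation counts are hypothetical, not from any real journal):

```python
def h5_index(citations):
    """h5-index: the largest number h such that h articles published
    in the last 5 complete years have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:   # the i-th most-cited article has >= i citations
            h = i
        else:
            break
    return h

# Hypothetical citation counts for five articles
print(h5_index([10, 8, 5, 4, 3]))  # 4: four articles have >= 4 citations
```

The same greedy scan works for the ordinary h-index; the ‘5’ in h5 only restricts which articles (2009–2013) are counted.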
Of course, a rank-only assessment here is a little false, for you wouldn’t submit a paper on animals to a plant journal, nor a plant conservation or ecology paper to an insect journal. But in general if you’re fishing for a good target journal, these rankings could help you decide.
A few things jump out here. The Google rankings are in places quite a bit different to the IF-based ones. For example, check out PLoS One – it has a respectable, if somewhat mediocre, IF of 3.534, but its Google h5-index is a whopping 148, placing it 4th of the 75 journals listed here (even higher than PLoS Biology, Current Biology, Trends in Ecology and Evolution and Ecology Letters). Maybe this open-access model is working after all.
Another ranking anomaly is Nature Climate Change: it has a good IF at 15.295, but it’s ranked 21st in this list according to Google. For conservation journals, Biological Conservation beats Conservation Biology according to Google, but Conservation Letters wins according to Impact Factors (followed by Conservation Biology and Biological Conservation).
Overall, most journals in this list (57 %) increased their IF from last year, but this isn’t really meaningful considering that journals’ IFs tend to increase over time on average. However, there was one notably big drop: Ecology Letters went from 17.949 (2012) to 13.042 (2013).
As mentioned above, the next thing I did was to calculate a simple rank average from the two systems (Column 1). I also penalised the rank if there was considerable disparity between the two ranking systems. Here, I just added the coefficient of variation of the rank (rank SD/mean rank) to the average rank (Column 2), such that journals with identical ranks in the two systems received a zero penalty, and those with widely divergent ranks slipped partially in their average rank.
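That penalised average can be sketched as follows (I’m assuming the population standard deviation here, since the post doesn’t specify which form of the SD was used for a pair of ranks):

```python
import statistics

def penalised_rank(google_rank, if_rank):
    """Average the two ranks, then add the coefficient of variation
    (rank SD / mean rank) as a disparity penalty. Journals ranked
    identically by both systems incur a zero penalty."""
    ranks = [google_rank, if_rank]
    mean_rank = statistics.mean(ranks)
    cv = statistics.pstdev(ranks) / mean_rank  # population SD assumed
    return mean_rank + cv

# Identical ranks: no penalty
print(penalised_rank(3, 3))    # 3.0
# Widely divergent ranks slip a little below their plain average
print(penalised_rank(1, 21))   # 11.0 + 10/11 ~ 11.91
```

Because the CV is at most 1 for two positive ranks, the penalty can only nudge a journal down by a fraction of a rank position, so it breaks ties and flags disagreement without reshuffling the whole list.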
[Table: Average rank + CV penalty]
Note that I’ve boldfaced what I consider to be mainly ‘conservation’ journals in these two lists to help focus your decision if a conservation audience is your primary target. The take-home message here is that Conservation Biology and Biological Conservation are more or less equally ranked (above Conservation Letters), and Frontiers in Ecology and the Environment (if you consider that to be a ‘conservation’ journal) is the clear winner.
Finally, some argue that the 5-year Impact Factor is a better reflection of a journal’s citation quality than the 2-year Impact Factor. Just for shits & giggles then, I recalculated the average rank between Google and the 5-year Impact Factor. The results are only a little different:
[Table: Average rank, Google h5-index + 5-year Impact Factor]
I’m sure I’ve missed some people’s favourite journals here, but I think my approach can be used for any sub-discipline or collection of journals you might choose to compare. I’m just glad we no longer have to rely solely on Impact Factors to make important decisions about where we submit our work.