Normally I just report the Thomson-Reuters ISI Web of Knowledge Impact Factors for conservation-orientated journals each year, with some commentary on the rankings of other journals that also publish conservation-related material from time to time (see my lists of the 2008, 2009, 2010, 2011 and 2012 Impact Factor rankings).
This year, however, I’m doing something different given the growing negativity towards Thomson-Reuters’ secretive behaviour (which they’ve promised this year to rectify by being more transparent) and the generally poor indication of quality that the Impact Factor represents. Although the 2013 Impact Factors have just been released (very late this year, for some reason), I’m going to compare them to the increasingly reputable Google Scholar Journal Metrics, which intuitively make more sense to me, are transparent and turn a little of the rankings dogma on its ear.
In addition to providing both the Google metric and the Impact Factor rankings, I’ve come up with a composite (average) rank from the two systems. I think ranks are potentially more useful than raw, corrected citation metrics because you must first explicitly define the set of journals you want to compare. I also go one step further and modify the average ranking with a penalty term: essentially, the coefficient of variation of the rank disparity between the two systems, added to the average rank.
Read on for the results.
Google Scholar Journal Metrics are to me a more intuitive and open way to categorise journal ‘quality’. They also seem to be more in keeping with my own subjective views of relative journal merit. While Google Journal Metrics have specific sub-categories (like Biodiversity & Conservation Biology, Ecology, and Marine Science & Fisheries), they rank only the top 20 journals in the sub-discipline and do not provide good cross-disciplinary comparisons. In other words, you have to search for specific journals manually.
I have therefore combed through various sub-categories and other generalist journals to provide a bit of a ranking guide for conservation ecologists. The first table lists 75 ecology/conservation journals along with a smattering of high-impact generalist journals that I personally think cover most of our discipline. I do not contend that the list is complete, and it is most certainly biased toward the journals in which I most regularly publish; however, I think if you’re in this business, the list represents the most common journals to which one would consider submitting.
Column 1 gives the Google Journal Metric (h5-index: the h-index for articles published in the last 5 complete years; it is the largest number h such that h articles published in 2009-2013 have at least h citations each), and Column 2 gives this year’s (and last year’s for comparison) ISI Impact Factor:
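For anyone unfamiliar with how the h5-index behaves, here’s a quick sketch of the calculation in Python (the citation counts are made up purely to illustrate; they don’t correspond to any real journal):

```python
def h5_index(citations):
    """Largest number h such that h articles (published in the last
    5 complete years) have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited articles first
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:   # the i-th most-cited article has at least i citations
            h = i
        else:
            break
    return h

# Five hypothetical articles published 2009-2013:
print(h5_index([10, 8, 5, 4, 1]))  # 4 (four articles have >= 4 citations each)
```

Note that because h can never exceed the number of articles published, journals that publish more papers (like PLoS One) have more room to accumulate a large h5-index, a point raised in the comments below.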
Of course, a rank-only assessment here is a little false, for you wouldn’t submit a paper on animals to a plant journal, nor a plant conservation or ecology paper to an insect journal. But in general if you’re fishing for a good target journal, these rankings could help you decide.
A few things jump out here. The Google rankings are in places quite different from the IF-based ones. For example, check out PLoS One – it has a respectable, if somewhat mediocre, IF = 3.534, but its Google h5-index is a whopping 148, placing it 4th of the 75 journals listed here (even higher than PLoS Biology, Current Biology, Trends in Ecology and Evolution and Ecology Letters). Maybe this open-access model is working after all.
Another ranking anomaly is Nature Climate Change: it has a good IF at 15.295, but it’s ranked 21st in this list according to Google. For conservation journals, Biological Conservation beats Conservation Biology according to Google, but Conservation Letters wins according to Impact Factors (followed by Conservation Biology and Biological Conservation).
Overall, most journals in this list (57 %) increased their IF from last year, but this isn’t really meaningful considering that journals increase their IF over time on average. However, there was one noticeable big drop: Ecology Letters went from 17.949 (2012) to 13.042 (2013).
As mentioned above, the next thing I did was to calculate a simple rank average from the two systems (Column 1). I also penalised the rank if there was considerable disparity between the two ranking systems. Here, I just added the coefficient of variation of the rank (rank SD/mean rank) to the average rank (Column 2), such that journals with identical ranks in the two systems received a zero penalty, and those with widely divergent ranks slipped partially in their average rank.
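For the curious, the penalised rank works out like this in Python (the example ranks are invented, and I’ve used the population standard deviation of the two ranks; a sample standard deviation would give a slightly larger penalty):

```python
import statistics

def penalised_rank(google_rank, if_rank):
    """Average of the two ranks, plus the coefficient of variation
    (rank SD / mean rank) as a penalty for disparity between systems."""
    ranks = [google_rank, if_rank]
    mean = statistics.mean(ranks)
    cv = statistics.pstdev(ranks) / mean  # zero when the two ranks agree
    return mean + cv

# Identical ranks in both systems incur no penalty:
print(penalised_rank(5, 5))    # 5.0
# Widely divergent ranks slip a little: mean = 11, SD = 10, CV = 10/11
print(penalised_rank(1, 21))   # 11.909...
```

Because the CV is at most 1 for any pair of positive ranks, the penalty can shift a journal by no more than one rank position, so it breaks ties and nudges inconsistent journals down without reshuffling the whole list.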
[Table: the journals re-ordered by Average Rank (Column 1) and Average Rank + CV penalty (Column 2)]
Note that I’ve boldfaced what I consider to be mainly ‘conservation’ journals in these two lists to help focus your decision if a conservation audience is your primary target. The take-home message here is that Conservation Biology and Biological Conservation are more or less equally ranked (above Conservation Letters), and Frontiers in Ecology and the Environment (if you consider that to be a ‘conservation’ journal) is the clear winner.
Finally, some argue that the 5-year Impact Factor is a better reflection of a journal’s citation quality than the 2-year Impact Factor. Just for shits & giggles then, I recalculated the average rank between Google and the 5-year Impact Factor. The results are only a little different:
[Table: the journals re-ordered by Average Rank (Google + 5-yr IF)]
I’m sure I’ve missed some people’s favourite journals here, but I think my approach can be used for any sub-discipline or collection of journals you might choose to compare. I’m just glad we no longer have to rely solely on Impact Factors to make important decisions about where we submit our work.
I agree that larger journals will get a higher rank (we’ll never see Wildlife Monographs on this list because each issue only has one “article”), and that individual papers rather than the journal should be the basis of the rank. It seems to me that each paper should be ranked by how often it’s cited, and then a journal rank created by averaging over the number of papers in that issue. This would increase the rank of smaller journals when their articles are great.
Hi Corey,
Nice to see you pushing to address the limitations of our current ways of measuring the “quality” of published research. Goodness knows we need it. However, a true reflection of research quality is not offered when journals with large numbers of publications are favoured (thanks for raising the point above, Florian).
In your proposed system, as you note, the ranks of journals vary between the Thomson-Reuters measure and your proposed ones. Given the differences among the various options you present for measuring quality, I’d be interested to see how journals rank when the number of publications is accounted for.
This discussion raises the more important question, in our field, of what the real impact of a piece of research is on the ground. If conservation policy-makers and managers searching for relevant and useful science are guided by the Thomson-Reuters measure or the measures proposed here, then they’re likely to be disappointed – overwhelmingly, most of the literature is not policy relevant. It has no “impact”. My impression is that, generally speaking, the higher the impact factor, the lower the relevance to those implementing conservation action.
We need to be really careful about how we define the difference between “quality” and “impact”, and the relative importance of each. A large number of articles is definitely not a useful guide. Let’s send the most useful message to conservation scientists, policy-makers and managers.
Hi Corey,
Thanks for the post. Most researchers in China, at least in our university, are evaluated by the ISI Impact Factor. As a result, we pay close attention to it. For example, I need to publish at least one paper in a journal with an IF > 3 to apply for my degree.
The average rank you propose is very interesting. I’ve translated it into Chinese and posted it on my website for a broader audience: http://sixf.org/cn/2014/08/google-scholar-journal-matrix-and-impact-factor/
You’ll find the original link to your post at the end of the translation, so readers can come back here if they’d like to read the English version.
Regards,
Xingfeng
The Google Scholar index is an h-index per journal (http://scholar.google.com/intl/en/scholar/metrics.html#metrics). Journals that publish more papers naturally have a higher chance of achieving a high index, which explains the high value of PLOS ONE.
I’m not a big fan of ISI, but I find the definition of the IF quite transparent and a lot more sensible as a measure of quality than Google’s h-index, which is more an importance/size index of a journal.
My opinion is that it is papers rather than journals that have impact. Today, papers are found readily by topic in Google Scholar or other places.
So I’d look at the top 30 papers to see which journal was ‘better’ for my topic? Very transparent, and some will prefer to make their own decisions, but it will still lead to someone creating a summary by journal, taking us back to some kind of ranking by journal.