Journal ranks 2021

4 07 2022

Now that Clarivate, Google, and Scopus have published their respective journal citation scores for 2021, I can present — for the 14th year running on ConservationBytes.com — the 2021 conservation/ecology/sustainability journal ranks based on my journal-ranking method.

Like last year, I’ve added a few journals. I’ve also included in the ranking the Journal Citation Indicator (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI, and the CiteScore (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. 

You can access the raw data for 2021 and use my RShiny app to derive your own samples of journal ranks.

I therefore present the new 2021 ranks for: (i) 106 ecology, conservation and multidisciplinary journals, (ii) 27 open-access (i.e., you have to pay to publish) journals from the previous category, (iii) 64 ‘ecology’ journals, (iv) 32 ‘conservation’ journals, (v) 43 ‘sustainability’ journals (with general and energy-focussed journals included), and (vi) 21 ‘marine & freshwater’ journals.

Remember not to take much notice if a journal boasts about how its Impact Factor has increased this year, because these tend to increase over time anyway. What’s important is a journal’s rank relative to other journals.

Here are the results:

Read the rest of this entry »




Journal ranks 2020

23 07 2021

This is the 13th year in a row that I’ve generated journal ranks based on the journal-ranking method we published several years ago.

There are a few differences in how I calculated this year’s ranks, as well as some relevant updates:

  1. As always, I’ve added a few new journals (either those that have only recently been scored with the component metrics, or ones I’d simply missed before);
  2. I’ve included the new ‘Journal Citation Indicator’ (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI. The JCI is “a field-normalised metric [that] represents the average category-normalised citation impact for papers published in the prior three-year period”. In other words, it’s supposed to correct for field-specific citation trends;
  3. While this isn’t my change, the Clarivate metrics are now calculated based on when an article is first published online, rather than just in an issue. You would have thought that this should have been the case for many years, but they’ve only just done it;
  4. I’ve also added the ‘CiteScore’ (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. CS is “the number of citations, received in that year and previous 3 years, for documents published in the journal during that period (four years), divided by the total number of published documents … in the journal during the same four-year period” (see the worked example after this list);
  5. Finally, you can access the raw data for 2020 (I’ve done the hard work for you) and use my RShiny app to derive your own samples of journal ranks (also see the relevant blog post). You can also add new journals to the list if my sample isn’t comprehensive enough for you.
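To make the CiteScore arithmetic concrete, here is a minimal worked example in R; the journal, years and numbers are invented for illustration, not Scopus data:

    # CiteScore 2020 for an imaginary journal (hypothetical numbers):
    # citations received in 2017-2020 by documents published in 2017-2020,
    # divided by the number of documents published in 2017-2020
    citations_2017_2020 <- 1200
    documents_2017_2020 <- 400
    citescore_2020 <- citations_2017_2020 / documents_2017_2020
    citescore_2020  # = 3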

Since the Google Scholar metrics were just released today, I present the new 2020 ranks for: (i) 101 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 29 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

One final observation. I’ve noted that several journals are boasting about how their Impact Factors have increased this year, while failing to mention that this is the norm across most journals. As you’ll see below, relative ranks don’t actually change that much for most journals. In fact, this is a redacted email I received from a journal that I will not identify here:

We’re pleased to let you know that the new Impact Factor for [JOURNAL NAME] marks a remarkable increase, as it now stands at X.XXX, compared to last year’s X.XXX. And what is even more important: [JOURNAL NAME] increased its rank in the relevant disciplines: [DISCIPLINE NAME].

Although the Impact Factor may not be the perfect indicator of success, it remains the most widely recognised one at journal level. Therefore, we’re excited to share this achievement with you, as it wouldn’t have been possible, had it not been for all of your contributions and support as authors, reviewers, editors and readers. A huge ‘THANK YOU’ goes to all of you!

What bullshit.

Anyway, on to the results:

Read the rest of this entry »





Rank your own sample of journals

29 12 2020

If you follow my blog regularly, you’ll know that around the middle of each year I publish a list of journals in conservation and ecology ranked according to a multi-index algorithm we developed back in 2016. The rank I release coincides with the release of the Web of Knowledge Impact Factors, various Scopus indices, and the Google Scholar journal ranks.

The reasons we developed a multi-index rank are many (summarised here), but they essentially boil down to the following rationale:

(i) No single existing index is without its own faults; (ii) ranks are only really meaningful when expressed on a relative scale; and (iii) different disciplines have wildly different index values, so generally disciplines aren’t easily compared.

That’s why I made the R code available to anyone wishing to reproduce their own ranked sample of journals. However, given that implementing the R code takes a bit of know-how, I decided to apply my new-found addiction to R Shiny to create (yet another) app.

Welcome to the JournalRankShiny app.

This new app takes a pre-defined list of journals and the required indices, and does the resampled ranking for you based on a few input parameters that you can set. It also provides a few nice graphs for the ranks (and their uncertainties), as well as a plot showing the relationship between the resulting ranks and the journal’s Impact Factor (for comparison).
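For the curious, here is a rough R sketch of what a resampled mean-rank procedure can look like. The journals, index values, and resampling scheme below are my own illustrative assumptions, not the app’s actual code:

    # Sketch of a resampled mean rank across several citation indices;
    # all data are made up, and the app's real algorithm may differ
    set.seed(1)
    idx <- data.frame(
      journal = c("J1", "J2", "J3", "J4"),
      IF   = c(4.2, 9.1, 2.4, 5.0),  # Journal Impact Factor
      SNIP = c(1.1, 2.3, 0.8, 1.4),  # Source-Normalised Impact Per Paper
      h5   = c(45, 80, 20, 50)       # Google Scholar h5-index
    )
    # rank journals separately under each metric (rank 1 = best)
    rank_by_metric <- apply(-idx[, -1], 2, rank)
    # resample the metrics with replacement, recomputing mean ranks each time
    boot <- replicate(1000,
      rowMeans(rank_by_metric[, sample(ncol(rank_by_metric), replace = TRUE)]))
    # summarise each journal's rank distribution (mean and 95% uncertainty bounds)
    out <- data.frame(
      journal   = idx$journal,
      mean_rank = rowMeans(boot),
      lwr       = apply(boot, 1, quantile, 0.025),
      upr       = apply(boot, 1, quantile, 0.975)
    )
    out[order(out$mean_rank), ]

Intervals like these are the kind of rank uncertainty the app plots.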

Read the rest of this entry »




Journal ranks 2019

8 07 2020


For the last 12 years running, I’ve been generating journal ranks based on the journal-ranking method we published several years ago. Since the Google journal h-indices were just released, here are the new 2019 ranks for: (i) 99 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 27 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

See also the previous years’ rankings (2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Does high exposure on social and traditional media lead to more citations?

18 12 2019

One of the things that I’ve often wondered about is whether making the effort to spread your scientific article’s message as far and wide as possible on social media actually brings you more citations.

While there’s more than enough justification to promote your work widely for non-academic purposes, there is some doubt as to whether the effort reaps academic rewards as well.

Back in 2011 (the Pleistocene of social media in science), Gunther Eysenbach examined 286 articles in the obscure Journal of Medical Internet Research, finding that yes, highly cited papers did indeed have more tweets. But he concluded:

Social media activity either increases citations or reflects the underlying qualities of the article that also predict citations …

Subsequent work has established similar positive relationships between social-media exposure and citation rates (e.g., for 208,739 PubMed articles, and for > 10,000 blog posts of articles published in > 20 journals), weak relationships (e.g., using 27,856 PLoS One articles, and based on 1,380,143 articles from PubMed in 2013), or none at all (e.g., for 130 papers in the International Journal of Public Health).

While the research available suggests that, on average, the more social-media exposure a paper gets, the more likely it is to be cited, the potential confounding problem raised by Eysenbach remains — are interesting papers that command a lot of social-media attention also those that would garner scientific interest anyway? In other words, are popular papers just popular in both realms, meaning that such papers are going to achieve high citation rates anyway?

Read the rest of this entry »





Journal ranks 2018

23 07 2019


As has become my custom (11 years and running), and based on the journal-ranking method we published several years ago, here are the new 2018 ranks for (i) 90 ecology, conservation and multidisciplinary journals, and a subset of (ii) 56 ‘ecology’ journals and (iii) 26 ‘conservation’ journals. I’ve also included two other categories — (iv) 40 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for the watery types.

See also the previous years’ rankings (2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Journal ranks 2017

27 08 2018


A few years ago we wrote a bibliometric paper describing a new way to rank journals, and I still think it is one of the better ways to rank them based on a composite index of relative, citation-based metrics. I apologise for taking so long to do the analysis this year, but it took Google Scholar a while to post their 2017 data.

So, here are the 2017 ranks for (i) 88 ecology, conservation and multidisciplinary journals, and a subset of (ii) 55 ‘ecology’ journals and (iii) 24 ‘conservation’ journals. Also this year, I’ve included two new categories — (iv) 38 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for you watery types.

See also the previous years’ rankings (2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Journal ranks 2016

14 07 2017


Last year we wrote a bibliometric paper describing a new way to rank journals, which I contend is a fairer representation of relative citation-based rankings by combining existing ones (e.g., ISI, Google Scholar and Scopus) into a composite rank. So, here are the 2016 ranks for (i) 93 ecology, conservation and multidisciplinary journals, and a subset of (ii) 46 ecology journals and (iii) 21 conservation journals, just as I have done in previous years (2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Journal ranks 2015

26 07 2016

Back in February I wrote about our new bibliometric paper describing a new way to rank journals, which I still contend is a fairer representation of relative citation-based rankings. Given that the technique requires ISI, Google Scholar and Scopus data to calculate the composite ranks, I had to wait for the last straggler (Google) to publish the 2015 values before I could present this year’s rankings to you. Google has finally done that.

So in what has become a bit of an annual tradition, I’m publishing the ranks of a mixed list of ecology, conservation and multidisciplinary disciplines that probably cover most of the journals you might be interested in comparing. Like for last year, I make no claims that this list is comprehensive or representative. For previous lists based on ISI Impact Factors (except 2014), see the following links (2008, 2009, 2010, 2011, 2012, 2013).

So here are the rankings of (i) 84 ecology, conservation and multidisciplinary journals, and a subset of (ii) 42 ecology journals, (iii) 21 conservation journals, and (iv) 12 marine and freshwater journals. Read the rest of this entry »





How to rank journals

18 02 2016

… properly, or at least ‘better’.

In the past I have provided ranked lists of journals in conservation ecology according to their ISI® Impact Factor (see lists for 2008, 2009, 2010, 2011, 2012 & 2013). These lists have proven to be exceedingly popular.

Why are journal metrics and the rankings they imply so in-demand? Despite many people loathing the entire concept of citation-based journal metrics, we scientists, our administrators, granting agencies, award committees and promotion panellists use them with such merciless frequency that our academic fates are intimately bound to the ‘quality’ of the journals in which we publish.

Human beings love to rank themselves and others, the things they make, and the institutions to which they belong, so it’s a natural expectation that scientific journals are ranked as well.

I’m certainly not the first to suggest that journal quality cannot be fully captured by some formulation of the number of citations its papers receive; ‘quality’ is an elusive characteristic that includes inter alia things like speed of publication, fairness of the review process, prevalence of gate-keeping, reputation of the editors, writing style, within-discipline reputation, longevity, cost, specialisation, open-access options and even its ‘look’.

It would be impossible to include all of these aspects into a single ‘quality’ metric, although one could conceivably rank journals according to one or several of those features. ‘Reputation’ is perhaps the most quantitative characteristic when measured as citations, so we academics have chosen the lowest-hanging fruit and built our quality-ranking universe around them, for better or worse.

I was never really satisfied with metrics like black-box Impact Factors, so when I started discovering other ways to express the citation performance of the journals to which I regularly submit papers, I became a little more interested in the field of bibliometrics.

In 2014 I wrote a post about what I thought was a fairer way to judge peer-reviewed journal ‘quality’ than the default option of relying solely on ISI® Impact Factors. I was particularly interested in why the new kid on the block — Google Scholar Metrics — gave at times rather wildly different ranks of the journals in which I was interested.

So I came up with a simple mean ranking method to get some idea of the relative citation-based ‘quality’ of these journals.

It was a bit of a laugh, really, but my long-time collaborator, Barry Brook, suggested that I formalise the approach and include a wider array of citation-based metrics in the mean ranks.

Because Barry’s ideas are usually rather good, I followed his advice and together we constructed a more comprehensive, although still decidedly simple, approach to estimate the relative ranks of journals from any selection one would care to cobble together. In this case, however, we also included a rank-placement resampler to estimate the uncertainty associated with each rank.

I’m pleased to announce that the final version is now published in PLoS One. Read the rest of this entry »





A fairer way to rank conservation and ecology journals in 2014

1 08 2014

Normally I just report the Thomson Reuters ISI Web of Knowledge Impact Factors for conservation-orientated journals each year, with some commentary on the rankings of other journals that also publish conservation-related material from time to time (see my lists of the 2008, 2009, 2010, 2011 and 2012 Impact Factor rankings).

This year, however, I’m doing something different given the growing negativity towards Thomson Reuters’ secretive behaviour (which they’ve promised this year to rectify by being more transparent) and the generally poor indication of quality that the Impact Factor represents. Although the 2013 Impact Factors have just been released (very late this year, for some reason), I’m going to compare them to the increasingly reputable Google Scholar Journal Metrics, which intuitively make more sense to me, are transparent, and turn a little of the rankings dogma on its ear.

In addition to providing both the Google metric and the Impact Factor rankings, I’ve come up with a composite (average) rank from the two systems. I think ranks are potentially more useful than raw, corrected citation metrics because you must first explicitly define the set of journals you want to compare. I also go one step further and modify the average rank with a penalty term that is essentially the addition of the coefficient of variation of rank disparity between the two systems (see the sketch below).
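As a rough illustration, here is one plausible reading of that penalised composite rank in R; the journals and ranks are invented, and the exact formulation in the full post may differ:

    # Composite (average) rank from two systems, penalised by the coefficient
    # of variation (sd/mean) of each journal's pair of ranks; hypothetical data
    ranks <- data.frame(
      journal = c("J1", "J2", "J3"),
      rank_if = c(1, 3, 2),  # rank by ISI Impact Factor
      rank_gs = c(2, 1, 3)   # rank by Google Scholar Metrics
    )
    ranks$mean_rank <- rowMeans(ranks[, c("rank_if", "rank_gs")])
    ranks$cv <- apply(ranks[, c("rank_if", "rank_gs")], 1, sd) / ranks$mean_rank
    ranks$penalised <- ranks$mean_rank + ranks$cv  # disagreement worsens the rank
    ranks[order(ranks$penalised), ]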

Read on for the results.

Read the rest of this entry »





Hate journal impact factors? Try Google rankings instead

18 11 2013

A lot of people hate journal impact factors (IF). The hatred arises for many reasons, some of which are logical. For example, Thomson Reuters ISI Web of Knowledge® keeps the process fairly opaque, so it’s sometimes difficult to tell if journals are fairly ranked. Others hate IF because they do not adequately rank papers within or among sub-disciplines. Still others hate the idea that citations should have anything to do with science quality (debatable, in my view). Whatever your reason though, IF are more or less here to stay.

Yes, individual scientists shouldn’t be ranked based only on the IF of the journals in which they publish; there are decent alternatives such as the h-index (which can grow even after you die), or even better, the m-index (or m-quotient; think of the latter as a rate of citation accumulation). Others would rather ditch the whole citation thing altogether and measure some element of ‘impact’, although that elusive little beast has yet to be captured and applied objectively.
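For concreteness, here is a minimal R sketch of both metrics computed from a vector of per-paper citation counts (the counts and career length are invented); Google’s h5-index, discussed below, is the same h calculation restricted to articles from the last five complete years:

    # h-index: the largest number h such that h papers have >= h citations each
    citations <- c(50, 32, 20, 11, 8, 8, 5, 2, 1, 0)  # hypothetical counts
    h_index <- sum(sort(citations, decreasing = TRUE) >= seq_along(citations))
    h_index  # 6: six papers have at least 6 citations each
    # m-quotient: the h-index divided by years since first publication,
    # i.e., a rate of citation accumulation
    years_active <- 12
    m_quotient <- h_index / years_active  # 0.5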

So just in case you haven’t already seen it, Google has recently put its journal-ranking hat in the ring with its journal metrics. Having firmly wrested the cumbersome (and expensive) personal citation accumulators from ISI and Scopus (for example) with their very popular (and free!) Google Scholar (which, as I’ve said before, all researchers should set up and make available), they now seem poised to do the same for journal rankings.

So for your viewing and arguing pleasure, here are the ‘top’ 20 journals in Biodiversity and Conservation Biology according to Google’s h5-index (the h-index for articles published in that journal in the last 5 complete years; it is the largest number h such that h articles published in 2008-2012 have at least h citations each):

Read the rest of this entry »





Conservation and ecology journal Impact Factors 2012

20 06 2013

It’s the time of year that scientists love to hate – the latest (2012) journal rankings have been released by ISI Web of Knowledge. Many people despise this system, despite its major role in driving publishing trends.

I’ve previously listed the 2008, 2009, 2010 and 2011 IF for major conservation and ecology journals. As before, I’ve included the previous year’s IF alongside the latest values to see how journals have improved or worsened (but take note – journals increase their IF on average anyway merely by the fact that publication frequency is increasing, so small jumps aren’t necessarily meaningful; I suspect that declines are therefore more telling).

Read the rest of this entry »





Conservation and Ecology Impact Factors 2011

29 06 2012

Here we go – another year, another set of citations, and another journal ranking by ISI Web of Knowledge Journal Citation Reports. Love them or loathe them, Impact Factors (IF) are immensely important for dictating publication trends. No, a high Impact Factor doesn’t mean your paper will receive hundreds of citations, but the two are correlated.

I’ve previously listed the 2008, 2009 and 2010 IF for major conservation and ecology journals – now here are the 2011 IF fresh off the press (so to speak). I’ve included the 2010 alongside to see how journals have improved or worsened (but take note – journals increase their IF on average anyway merely by the fact that publication frequency is increasing, so small jumps aren’t necessarily meaningful).

Read the rest of this entry »





Arguing for scientific socialism in ecology funding

26 06 2012

What makes an ecologist ‘successful’? How do you measure ‘success’? We’d all like to believe that success is measured by our results’ transformation of ecological theory and practice – in a conservation sense, this would ultimately mean our work’s ability to prevent (or at least, slow down) extinctions.

Alas, we’re not that good at quantifying such successes, and if you use the global metric of species threats, deforestation, pollution, invasive species and habitat degradation, we’ve failed utterly.

So instead, we measure scientific ‘success’ via peer-reviewed publications, and the citations (essentially, scientific cross-referencing) that arise from these. These are blunt instruments, to be sure, but they are the only real metrics we have. If you’re not being cited, no one is reading your work; and if no one is reading your work, your cleverness goes unnoticed and you help nothing and no one.

A paper I just read in the latest issue of Oikos goes some way to examine what makes a ‘successful’ ecologist (i.e., in terms of publications, citations and funding), and there are some very interesting results. Read the rest of this entry »





2010 ISI Impact Factors out now (with some surprises)

29 06 2011

It’s been another year of citations and now the latest list of ISI Impact Factors (2010) has come out. Regardless of how much stock you put in these (see here for a damning review), you cannot ignore their influence on publishing trends and author journal choices.

As I’ve done for 2008 and 2009, I’ve taken the liberty of providing the new IFs for some prominent conservation and ecology journals, and a few other journals occasionally publishing conservation-related material.

One particular journal deserves special attention here. Many of you might know that I was Senior Editor with Conservation Letters from 2008-2010, and I (with other editorial staff) made some predictions about where the journal’s first impact factor might be on the scale (see also here). Well, I have to say the result exceeded my expectations (although Hugh Possingham was closer to the truth in the end – bugger!). So the journal’s first 2010 impact factor (for which I take a modicum of credit ;-) is a whopping… 4.694 (3rd among all ‘conservation’ journals). Well done to all and sundry who have edited and published in the journal. My best wishes to the team that has to deal with the inevitable rush of submissions this will likely bring!

So here are the rest of the 2010 Impact Factors with the 2009 values for comparison: Read the rest of this entry »





Demise of the Australian ERA journal rankings

3 06 2011

Earlier this week Australian Senator Kim Carr (Minister for Innovation, Industry, Science and Research) announced the removal of the somewhat controversial ERA rankings for scientific journals.

Early last year I posted about the Excellence in Research for Australia (ERA) journal rankings for ecology and conservation journals. To remind you, the ERA has ranked > 20,000 unique peer-reviewed journals, with each given a single quality rating – and they are careful to say that “A journal’s quality rating represents the overall quality of the journal. This is defined in terms of how it compares with other journals and should not be confused with its relevance or importance to a particular discipline”.

Now, after much to-ing and fro-ing about what the four rankings actually mean (A*, A, B & C), Senator Carr has announced that he’s dumping them under the advice of the Australian Research Council. Read the rest of this entry »





Conservation Letters citation statistics

15 07 2010

As most CB readers will know, the ‘new’ (as of 2008) conservation journal kid on the block that I co-edit, Conservation Letters, was ISI-listed this year. This allows us to examine our citation statistics and make some informed guesses about the journal’s Impact Factor that should be ascribed next year. Here are some stats:

  • We published 31 articles in 5 issues in 2008, 37 articles in 6 issues in 2009, and so far 24 articles in 3 issues in 2010
  • Most authors were from the USA (53), followed by the UK (29), Australia (28), Canada (10), France (7) and South Africa (4)
  • The published articles have received a total of 248 citations, with an average citation rate per article of 2.70
  • The journal’s h-index = 8 (8 articles have been cited at least 8 times)
  • The 31 articles published in 2008 have received thus far 180 citations (average of 5.81 citations per article)
  • The top 10 most cited articles are (in descending order): Read the rest of this entry »




ISI 2009 Impact Factors now out

18 06 2010

Last year I reported the 2008 ISI Impact Factors for some prominent conservation journals and a few other journals occasionally publishing conservation-related material. ISI just released the 2009 Impact Factors, so I’ll do the same again this year, and add some general ecology journals as well. For all you Australians, I also recently reported the ERA Journal Rankings.

So here are the 2009 Impact Factors for the journals listed on this site’s Journals page and their 2008 values for comparison: Read the rest of this entry »





New Impact Factors for conservation journals

23 06 2009

For those of you who follow the ISI Impact Factors for journals (the number of citations in year i+2 to papers published in years i and i+1, divided by the total number of citable papers published in years i and i+1), you might know that the 2008 IFs have just been published. Now, whether you put stock in these or not is somewhat irrelevant – enough people do to make it relevant to who publishes what where, and who cites or does not cite scientific papers. It’s also in our scientific culture – pretty much everyone in a field will have a rough idea of the range of IFs their specific discipline’s journals span, and so it acts as a kind of target for varying qualities of science. Far from perfect, but it’s what we have to deal with.
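As a quick worked example of that definition (with invented numbers): for the 2008 Impact Factor, i = 2006, so we count citations received in 2008 by papers published in 2006 and 2007:

    # 2008 Impact Factor for an imaginary journal (hypothetical numbers)
    citations_in_2008 <- 650         # citations in 2008 to papers published 2006-2007
    citable_papers_2006_2007 <- 250  # citable papers published in 2006-2007
    if_2008 <- citations_in_2008 / citable_papers_2006_2007
    if_2008  # = 2.6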

So, I thought I’d publish the 2008 Impact Factors for the journals listed on this site’s Journals page and compare them to the 2007 values:

and for some more general journals that occasionally publish conservation papers:

Almost across the board, conservation journals have seen an increase in their Impact Factors. There are many other good conservation papers published in other journals, but this list probably represents the main outlets. I hope we continue to focus more on conservation outcomes rather than scientific kudos per se, although I’m certainly cognisant of the hand that feeds. Good luck with your publishing.

CJA Bradshaw








