Journal ranks 2021

4 07 2022

Now that Clarivate, Google, and Scopus have published their respective journal citation scores for 2021, I can present — for the 14th year running on ConservationBytes.com — the 2021 conservation/ecology/sustainability journal ranks based on my journal-ranking method.

Like last year, I’ve added a few journals. I’ve also included in the ranking the Journal Citation Indicator (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI, and the CiteScore (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. 

You can access the raw data for 2021 and use my RShiny app to derive your own samples of journal ranks.

I therefore present the new 2021 ranks for: (i) 106 ecology, conservation and multidisciplinary journals, (ii) 27 open-access (i.e., you have to pay to publish) journals from the previous category, (iii) 64 ‘ecology’ journals, (iv) 32 ‘conservation’ journals, (v) 43 ‘sustainability’ journals (with general and energy-focussed journals included), and (vi) 21 ‘marine & freshwater’ journals.

Remember not to take much notice if a journal boasts about how its Impact Factor has increased this year, because these tend to increase over time anyway. What’s important is a journal’s rank relative to other journals.
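A toy illustration of the point, with invented numbers: if across-the-board citation inflation lifts every journal’s Impact Factor by a similar proportion, the absolute values all rise, but the relative ranks don’t budge. A minimal sketch in R (the journal names and values are hypothetical):

```r
# Invented Impact Factors for two hypothetical journals
if_2020 <- c(JournalA = 4.0, JournalB = 2.0)
if_2021 <- if_2020 * 1.15   # assume ~15% across-the-board citation inflation
rank(-if_2020)              # JournalA = 1, JournalB = 2
rank(-if_2021)              # JournalA = 1, JournalB = 2: 'higher' IFs, same ranks
```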

Here are the results:





Journal ranks 2020

23 07 2021

This is the 13th year in a row that I’ve generated journal ranks based on the journal-ranking method we published several years ago.

There are few differences in how I calculated this year’s ranks, as well as some relevant updates:

  1. As always, I’ve added a few new journals (either journals that have only recently been scored with the component metrics, or ones I’ve simply missed before);
  2. I’ve included the new ‘Journal Citation Indicator’ (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI. The JCI, “… a field-normalised metric, represents the average category-normalised citation impact for papers published in the prior three-year period”. In other words, it’s supposed to correct for field-specific citation trends;
  3. While this isn’t my change, the Clarivate metrics are now calculated based on when an article is first published online, rather than just in an issue. You would have thought that this should have been the case for many years, but they’ve only just done it;
  4. I’ve also added the ‘CiteScore’ (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. CS is “the number of citations, received in that year and previous 3 years, for documents published in the journal during that period (four years), divided by the total number of published documents … in the journal during the same four-year period” (see the arithmetic sketch just after this list);
  5. Finally, you can access the raw data for 2020 (I’ve done the hard work for you) and use my RShiny app to derive your own samples of journal ranks (also see the relevant blog post). You can also add new journals to the list if my sample isn’t comprehensive enough for you.
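To make the CiteScore definition in point 4 concrete, here’s a minimal sketch of the arithmetic in R; the citation and document counts are invented for illustration, not real data for any journal:

```r
# Hypothetical counts for a 2020 CiteScore
citations_2017_2020 <- 5200   # citations received in 2020 and the 3 prior years
documents_2017_2020 <- 1300   # documents published over the same 4-year window
cs_2020 <- citations_2017_2020 / documents_2017_2020
cs_2020                       # 4.0
```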

Since the Google Scholar metrics were just released today, I present the new 2020 ranks for: (i) 101 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 29 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

One final observation. I’ve noted that several journals are boasting about how their Impact Factors have increased this year, when they fail to mention that this is the norm across most journals. As you’ll see below, relative ranks don’t actually change that much for most journals. In fact, this is a redacted email I received from a journal that I will not identify here:

We’re pleased to let you know that the new Impact Factor for [JOURNAL NAME] marks a remarkable increase, as it now stands at X.XXX, compared to last year’s X.XXX. And what is even more important: [JOURNAL NAME] increased its rank in the relevant disciplines: [DISCIPLINE NAME].

Although the Impact Factor may not be the perfect indicator of success, it remains the most widely recognised one at journal level. Therefore, we’re excited to share this achievement with you, as it wouldn’t have been possible, had it not been for all of your contributions and support as authors, reviewers, editors and readers. A huge ‘THANK YOU’ goes to all of you!

What bullshit.

Anyway, on to the results:






Journal ranks 2019

8 07 2020


For the 12th year running, I’ve generated journal ranks based on the journal-ranking method we published several years ago. Since the Google journal h-indices were just released, here are the new 2019 ranks for: (i) 99 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 27 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

See also the previous years’ rankings (2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).






Journal ranks 2018

23 07 2019


As has become my custom (11 years and running), and based on the journal-ranking method we published several years ago, here are the new 2018 ranks for (i) 90 ecology, conservation and multidisciplinary journals, and a subset of (ii) 56 ‘ecology’ journals, and (iii) 26 ‘conservation’ journals. I’ve also included two other categories — (iv) 40 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for the watery types.

See also the previous years’ rankings (2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).






Journal ranks 2017

27 08 2018


A few years ago we wrote a bibliometric paper describing a new way to rank journals, and I still think it is one of the better ways to rank them based on a composite index of relative citation-based metrics. I apologise for taking so long to do the analysis this year, but it took Google Scholar a while to post their 2017 data.

So, here are the 2017 ranks for (i) 88 ecology, conservation and multidisciplinary journals, and a subset of (ii) 55 ‘ecology’ journals, and (iii) 24 ‘conservation’ journals. Also this year, I’ve included two new categories — (iv) 38 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for you watery types.

See also the previous years’ rankings (2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).






Journal ranks 2016

14 07 2017


Last year we wrote a bibliometric paper describing a new way to rank journals, which I contend is a fairer representation of relative citation-based rankings by combining existing ones (e.g., ISI, Google Scholar and Scopus) into a composite rank. So, here are the 2016 ranks for (i) 93 ecology, conservation and multidisciplinary journals, and a subset of (ii) 46 ecology journals, (iii) 21 conservation journals, just as I have done in previous years (2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).






Dealing with rejection

8 02 2017

We scientists can unfortunately be real bastards to each other, and no other interaction brings out that tendency more than peer review. Of course no one, no matter how experienced, likes to have a manuscript rejected. People hate to be on the receiving end of any criticism, and scientists are certainly no different. Many reviews can be harsh and unfair; many reviewers ‘miss the point’ or are just plain nasty.

It is inevitable that you will be rejected outright many times after the first attempt. Sometimes you can counter this negative decision via an appeal, but more often than not the rejection is final no matter what you argue or modify. So your only recourse is to move on to a lower-ranked journal. If you consistently submit to low-ranked journals, you would obviously receive far fewer rejections during the course of your scientific career, but you would also probably minimise the number of citations arising from your work as a consequence.

So your manuscript has been REJECTED. What now? The first thing to remember is that you and your colleagues have not been rejected, only your manuscript has. This might seem obvious as you read these words, but nearly everyone — save the chronically narcissistic — goes through some feelings of self-doubt and inadequacy following a rejection letter. At this point it is essential to remind yourself that your capacity to do science is not being judged here; rather, the most likely explanation is that given your strategy to maximise your paper’s citation potential, you have probably just overshot the target journal. What this really means is that the editor (and/or reviewers) are of the opinion that your paper is not likely to gain as many citations as they think papers in their journal should. Look closely at the rejection letter — does it say anything about “… lacking novelty …”?





Journal ranks 2015

26 07 2016

Back in February I wrote about our new bibliometric paper describing a new way to rank journals, which I still contend is a fairer representation of relative citation-based rankings. Given that the technique requires ISI, Google Scholar and Scopus data to calculate the composite ranks, I had to wait for the last straggler (Google) to publish the 2015 values before I could present this year’s rankings to you. Google has finally done that.

So in what has become a bit of an annual tradition, I’m publishing the ranks of a mixed list of ecology, conservation and multidisciplinary journals that probably covers most of the journals you might be interested in comparing. As with last year, I make no claim that this list is comprehensive or representative. For previous lists based on ISI Impact Factors (except 2014), see the following links (2008, 2009, 2010, 2011, 2012, 2013).

So here are the rankings of (i) 84 ecology, conservation and multidisciplinary journals, and a subset of (ii) 42 ecology journals, (iii) 21 conservation journals, and (iv) 12 marine and freshwater journals.





How to rank journals

18 02 2016

… properly, or at least ‘better’.

In the past I have provided ranked lists of journals in conservation ecology according to their ISI® Impact Factor (see lists for 2008, 2009, 2010, 2011, 2012 & 2013). These lists have proven to be exceedingly popular.

Why are journal metrics and the rankings they imply so in demand? Despite many people loathing the entire concept of citation-based journal metrics, we scientists, our administrators, granting agencies, award committees and promotion panellists use them with such merciless frequency that our academic fates are intimately bound to the ‘quality’ of the journals in which we publish.

Human beings love to rank themselves and others, the things they make, and the institutions to which they belong, so it’s a natural expectation that scientific journals are ranked as well.

I’m certainly not the first to suggest that journal quality cannot be fully captured by some formulation of the number of citations its papers receive; ‘quality’ is an elusive characteristic that includes, inter alia, speed of publication, fairness of the review process, prevalence of gate-keeping, reputation of the editors, writing style, within-discipline reputation, longevity, cost, specialisation, open-access options and even its ‘look’.

It would be impossible to include all of these aspects into a single ‘quality’ metric, although one could conceivably rank journals according to one or several of those features. ‘Reputation’ is perhaps the most quantitative characteristic when measured as citations, so we academics have chosen the lowest-hanging fruit and built our quality-ranking universe around them, for better or worse.

I was never really satisfied with metrics like black-box Impact Factors, so when I started discovering other ways to express the citation performance of the journals to which I regularly submit papers, I became a little more interested in the field of bibliometrics.

In 2014 I wrote a post about what I thought was a fairer way to judge peer-reviewed journal ‘quality’ than the default option of relying solely on ISI® Impact Factors. I was particularly interested in why the new kid on the block — Google Scholar Metrics — gave at times rather wildly different ranks of the journals in which I was interested.

So I came up with a simple mean ranking method to get some idea of the relative citation-based ‘quality’ of these journals.

It was a bit of a laugh, really, but my long-time collaborator, Barry Brook, suggested that I formalise the approach and include a wider array of citation-based metrics in the mean ranks.

Because Barry’s ideas are usually rather good, I followed his advice and together we constructed a more comprehensive, although still decidedly simple, approach to estimate the relative ranks of journals from any selection one would care to cobble together. In this case, however, we also included a rank-placement resampler to estimate the uncertainty associated with each rank.
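For the curious, the gist of the approach is simple enough to sketch in a few lines of base R. This is only a toy illustration — invented metric values, and a plain column-resampler standing in for the paper’s rank-placement resampler — not the code from the paper itself:

```r
# Toy example: rank journals within each citation metric, average the ranks,
# and resample to put rough uncertainty bounds on each mean rank.
set.seed(42)
metrics <- data.frame(
  journal = c("J1", "J2", "J3", "J4"),  # hypothetical journals
  IF   = c(9.8, 4.2, 6.1, 3.3),         # Journal Impact Factor (invented)
  SNIP = c(2.9, 1.1, 1.8, 0.9),         # Source-Normalised Impact Per Paper
  SJR  = c(5.0, 1.6, 2.4, 1.2),         # SCImago Journal Rank
  h5   = c(120, 48, 77, 40)             # Google Scholar h5-index
)

ranks <- apply(-metrics[, -1], 2, rank) # rank within each metric (1 = best)
mean_rank <- rowMeans(ranks)            # composite score = mean of the ranks

boot <- replicate(1000, {               # resample metric columns with replacement
  cols <- sample(ncol(ranks), replace = TRUE)
  rowMeans(ranks[, cols, drop = FALSE])
})
data.frame(
  journal   = metrics$journal,
  mean_rank = mean_rank,
  lower     = apply(boot, 1, quantile, probs = 0.025),
  upper     = apply(boot, 1, quantile, probs = 0.975)
)
```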

I’m pleased to announce that the final version is now published in PLoS One.







