Journal ranks 2022

21 07 2023

As I’ve done every year for the last 15 years, I can now present the 2022 conservation / ecology / sustainability journal ranks based on my (published) journal-ranking method.

Although both the Clarivate (Impact Factor, Journal Citation Indicator, Immediacy Index) and Scopus (CiteScore, Source-Normalised Impact Per Paper, SCImago Journal Rank) values have been out for about a month or so, the Google (h5-index, h5-median) scores only came out yesterday.

This year’s also a bit weird from the perspective of the Clarivate ranks. First, Impact Factors are no longer reported to three decimal places, but only to one (e.g., 7.2 instead of 7.162). That’s not such a big deal, and it removes some of the false precision underlying relative ranks. However, the biggest changes are methodological — Impact Factors now take online-first articles into account (in the denominator), so most journals will have a lower Impact Factor this year compared to last. In fact, of the 105 journals in the ecology/conservation/multidisciplinary category that have data for both 2021 and 2022, the 2022 Impact Factors are a median 15% lower than the 2021 values.
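As a rough sketch of why enlarging the denominator deflates the score (a Python toy with hypothetical counts; the real Journal Impact Factor calculation has more moving parts):

```python
# Two-year Impact Factor sketch: citations in year Y to items published
# in years Y-1 and Y-2, divided by the citable items from those years.
# All counts below are hypothetical, for illustration only.
citations = 1800             # 2022 citations to 2020-2021 items
citable_in_issues = 250      # items assigned to a 2020-2021 issue
citable_online_only = 50     # early-access items now also counted

if_before = citations / citable_in_issues                         # 7.2
if_after = citations / (citable_in_issues + citable_online_only)  # 6.0
print(round(if_before, 1), round(if_after, 1))
```

Same citations, bigger denominator, lower score — even with no change in how often the journal is cited.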

Another effect in play appears to have been the pandemic. The worst of it happened right during the assessment period, and I’m pretty sure this is reflected both in the number of articles published per journal (down a median of 10%) and in the total number of citations per journal over the assessment period (down 7%).

But using my method, these changes are somewhat irrelevant because I calculate relative ranks, not absolute scores.
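A toy illustration of that point (hypothetical journals and scores, sketched in Python rather than my R code): deflating every journal’s score by the same proportion leaves the rank order, and hence the relative ranks, untouched.

```python
# A uniform deflation of absolute scores leaves relative ranks unchanged.
# Hypothetical journals and Impact Factors, for illustration only.
scores_2021 = {"Journal A": 7.2, "Journal B": 5.1, "Journal C": 3.4}

# Apply the median 15% drop observed between 2021 and 2022.
scores_2022 = {j: round(s * 0.85, 2) for j, s in scores_2021.items()}

def ranks(scores):
    # Rank 1 = highest score.
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {j: i + 1 for i, j in enumerate(ordered)}

print(ranks(scores_2021) == ranks(scores_2022))  # True
```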

I therefore present the new 2022 ranks for: (i) 108 ecology, conservation and multidisciplinary journals, (ii) 28 open-access (i.e., author-pays) journals from the previous category, (iii) 66 ‘ecology’ journals, (iv) 31 ‘conservation’ journals, (v) 43 ‘sustainability’ journals (with general and energy-focussed journals included), and (vi) 21 ‘marine & freshwater’ journals.

Here are the results:

Read the rest of this entry »




Journal ranks 2021

4 07 2022

Now that Clarivate, Google, and Scopus have published their respective journal citation scores for 2021, I can present — for the 14th year running on ConservationBytes.com — the 2021 conservation/ecology/sustainability journal ranks based on my journal-ranking method.

Like last year, I’ve added a few journals. I’ve also included in the ranking the Journal Citation Indicator (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI, and the CiteScore (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. 

You can access the raw data for 2021 and use my RShiny app to derive your own samples of journal ranks.

I therefore present the new 2021 ranks for: (i) 106 ecology, conservation and multidisciplinary journals, (ii) 27 open-access (i.e., author-pays) journals from the previous category, (iii) 64 ‘ecology’ journals, (iv) 32 ‘conservation’ journals, (v) 43 ‘sustainability’ journals (with general and energy-focussed journals included), and (vi) 21 ‘marine & freshwater’ journals.

Remember not to take much notice if a journal boasts about how its Impact Factor has increased this year, because these tend to increase over time anyway. What’s important is a journal’s rank relative to other journals.

Here are the results:

Read the rest of this entry »




Journal ranks 2020

23 07 2021

This is the 13th year in a row that I’ve generated journal ranks based on the journal-ranking method we published several years ago.

There are a few differences in how I calculated this year’s ranks, as well as some relevant updates:

  1. As always, I’ve added a few new journals (either those who have only recently been scored with the component metrics, or ones I’ve just missed before);
  2. I’ve included the new ‘Journal Citation Indicator’ (JCI) in addition to the Journal Impact Factor and Immediacy Index from Clarivate ISI. The JCI, “… a field-normalised metric, represents the average category-normalised citation impact for papers published in the prior three-year period”. In other words, it’s supposed to correct for field-specific citation trends;
  3. While this isn’t my change, the Clarivate metrics are now calculated based on when an article is first published online, rather than just in an issue. You would have thought that this should have been the case for many years, but they’ve only just done it;
  4. I’ve also added the ‘CiteScore’ (CS) in addition to the Source-Normalised Impact Per Paper (SNIP) and SCImago Journal Rank (SJR) from Scopus. CS is “the number of citations, received in that year and previous 3 years, for documents published in the journal during that period (four years), divided by the total number of published documents … in the journal during the same four-year period”;
  5. Finally, you can access the raw data for 2020 (I’ve done the hard work for you) and use my RShiny app to derive your own samples of journal ranks (also see the relevant blog post). You can also add new journals to the list if my sample isn’t comprehensive enough for you.
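The CiteScore definition in point 4 translates directly into a one-line calculation (a Python sketch with hypothetical counts, not Scopus’s actual pipeline):

```python
# CiteScore 2020 sketch: citations received in 2017-2020 to documents
# published in 2017-2020, divided by the number of documents published
# in 2017-2020. Hypothetical counts, for illustration only.
citations_2017_2020 = 5200
documents_2017_2020 = 800

citescore_2020 = citations_2017_2020 / documents_2017_2020
print(citescore_2020)  # 6.5
```

Because the same four-year window sits in both numerator and denominator, CiteScore is less sensitive to a single anomalous year than the two-year Impact Factor.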

Since the Google Scholar metrics were just released today, I present the new 2020 ranks for: (i) 101 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 29 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

One final observation. I’ve noted that several journals are boasting about how their Impact Factors have increased this year, when they fail to mention that this is the norm across most journals. As you’ll see below, relative ranks don’t actually change that much for most journals. In fact, this is a redacted email I received from a journal that I will not identify here:

We’re pleased to let you know that the new Impact Factor for [JOURNAL NAME] marks a remarkable increase, as it now stands at X.XXX, compared to last year’s X.XXX. And what is even more important: [JOURNAL NAME] increased its rank in the relevant disciplines: [DISCIPLINE NAME].

Although the Impact Factor may not be the perfect indicator of success, it remains the most widely recognised one at journal level. Therefore, we’re excited to share this achievement with you, as it wouldn’t have been possible, had it not been for all of your contributions and support as authors, reviewers, editors and readers. A huge ‘THANK YOU’ goes to all of you!

What bullshit.

Anyway, on to the results:

Read the rest of this entry »





Rank your own sample of journals

29 12 2020

If you follow my blog regularly, you’ll know that around the middle of each year I publish a list of journals in conservation and ecology ranked according to a multi-index algorithm we developed back in 2016. The rank I release coincides with the release of the Web of Knowledge Impact Factors, various Scopus indices, and the Google Scholar journal ranks.

The reasons we developed a multi-index rank are many (summarised here), but they essentially boil down to the following rationale:

(i) No single existing index is without its own faults; (ii) ranks are only really meaningful when expressed on a relative scale; and (iii) different disciplines have wildly different index values, so generally disciplines aren’t easily compared.

That’s why I made the R code available to anyone wishing to reproduce their own ranked sample of journals. However, given that implementing the R code takes a bit of know-how, I decided to apply my new-found addiction to R Shiny to create (yet another) app.

Welcome to the JournalRankShiny app.

This new app takes a pre-defined list of journals and the required indices, and does the resampled ranking for you based on a few input parameters that you can set. It also provides a few nice graphs for the ranks (and their uncertainties), as well as a plot showing the relationship between the resulting ranks and the journal’s Impact Factor (for comparison).
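For a flavour of what a multi-index relative rank looks like — this is not the published algorithm or the app’s R code, just a minimal Python sketch with hypothetical journals and index values — scale each index by the sample maximum so the indices become comparable, then average:

```python
import statistics

# Minimal sketch of a composite, multi-index relative rank.
# NOT the published method; hypothetical journals and index values.
journals = {
    "Journal A": {"IF": 7.2, "SNIP": 2.1, "h5": 85},
    "Journal B": {"IF": 5.1, "SNIP": 1.8, "h5": 60},
    "Journal C": {"IF": 3.4, "SNIP": 1.1, "h5": 40},
}

indices = ["IF", "SNIP", "h5"]

# Express each index relative to the sample maximum (0-1 scale),
# then average across indices to get a composite score.
maxima = {k: max(v[k] for v in journals.values()) for k in indices}
composite = {j: statistics.mean(v[k] / maxima[k] for k in indices)
             for j, v in journals.items()}

ranked = sorted(composite, key=composite.get, reverse=True)
print(ranked)  # ['Journal A', 'Journal B', 'Journal C']
```

The app layers resampling on top of a composite like this, which is where the rank uncertainties come from.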

Read the rest of this entry »




Journal ranks 2019

8 07 2020


For the last 12 years and running, I’ve been generating journal ranks based on the journal-ranking method we published several years ago. Since the Google journal h-indices were just released, here are the new 2019 ranks for: (i) 99 ecology, conservation and multidisciplinary journals, and a subset of (ii) 61 ‘ecology’ journals, (iii) 27 ‘conservation’ journals, (iv) 41 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 20 ‘marine & freshwater’ journals.

See also the previous years’ rankings (2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Journal ranks 2018

23 07 2019


As has become my custom (11 years and running), and based on the journal-ranking method we published several years ago, here are the new 2018 ranks for (i) 90 ecology, conservation and multidisciplinary journals, and a subset of (ii) 56 ‘ecology’ journals and (iii) 26 ‘conservation’ journals. I’ve also included two other categories — (iv) 40 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for the watery types.

See also the previous years’ rankings (2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »





Journal ranks 2017

27 08 2018


A few years ago we wrote a bibliometric paper describing a new way to rank journals, and I still think it is one of the better ways to rank them, based on a composite index of relative, citation-based metrics. I apologise for taking so long to do the analysis this year, but it took Google Scholar a while to post their 2017 data.

So, here are the 2017 ranks for (i) 88 ecology, conservation and multidisciplinary journals, and a subset of (ii) 55 ‘ecology’ journals and (iii) 24 ‘conservation’ journals. Also this year, I’ve included two new categories — (iv) 38 ‘sustainability’ journals (with general and energy-focussed journals included), and (v) 19 ‘marine & freshwater’ journals for you watery types.

See also the previous years’ rankings (2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008).

Read the rest of this entry »