Journal ranks 2015

26 07 2016

Back in February I wrote about our new bibliometric paper describing a new way to rank journals, which I still contend is a fairer representation of relative citation-based rankings. Given that the technique requires ISI, Google Scholar and Scopus data to calculate the composite ranks, I had to wait for the last straggler (Google) to publish the 2015 values before I could present this year’s rankings to you. Google has finally done that.

So in what has become a bit of an annual tradition, I’m publishing the ranks of a mixed list of ecology, conservation and multidisciplinary journals that probably covers most of those you might be interested in comparing. As with last year, I make no claim that this list is comprehensive or representative. For previous lists based on ISI Impact Factors (except 2014), see the following links (2008, 2009, 2010, 2011, 2012, 2013).

Here are the rankings of (i) 84 ecology, conservation and multidisciplinary journals, and subsets of (ii) 42 ecology journals, (iii) 21 conservation journals, and (iv) 12 marine and freshwater journals.

How to rank journals

18 02 2016

… properly, or at least ‘better’.

In the past I have provided ranked lists of journals in conservation ecology according to their ISI® Impact Factor (see lists for 2008, 2009, 2010, 2011, 2012 & 2013). These lists have proven to be exceedingly popular.

Why are journal metrics and the rankings they imply so in-demand? Despite many people loathing the entire concept of citation-based journal metrics, we scientists, our administrators, granting agencies, award committees and promotion panellists use them with such merciless frequency that our academic fates are intimately bound to the ‘quality’ of the journals in which we publish.

Human beings love to rank themselves and others, the things they make, and the institutions to which they belong, so it’s a natural expectation that scientific journals are ranked as well.

I’m certainly not the first to suggest that journal quality cannot be fully captured by some formulation of the number of citations its papers receive; ‘quality’ is an elusive characteristic that includes inter alia things like speed of publication, fairness of the review process, prevalence of gate-keeping, reputation of the editors, writing style, within-discipline reputation, longevity, cost, specialisation, open-access options and even its ‘look’.

It would be impossible to include all of these aspects into a single ‘quality’ metric, although one could conceivably rank journals according to one or several of those features. ‘Reputation’ is perhaps the most quantitative characteristic when measured as citations, so we academics have chosen the lowest-hanging fruit and built our quality-ranking universe around them, for better or worse.

I was never really satisfied with metrics like black-box Impact Factors, so when I started discovering other ways to express the citation performance of the journals to which I regularly submit papers, I became a little more interested in the field of bibliometrics.

In 2014 I wrote a post about what I thought was a fairer way to judge peer-reviewed journal ‘quality’ than the default option of relying solely on ISI® Impact Factors. I was particularly interested in why the new kid on the block — Google Scholar Metrics — gave at times rather wildly different ranks of the journals in which I was interested.

So I came up with a simple mean ranking method to get some idea of the relative citation-based ‘quality’ of these journals.

It was a bit of a laugh, really, but my long-time collaborator, Barry Brook, suggested that I formalise the approach and include a wider array of citation-based metrics in the mean ranks.

Because Barry’s ideas are usually rather good, I followed his advice and together we constructed a more comprehensive, although still decidedly simple, approach to estimate the relative ranks of journals from any selection one would care to cobble together. In this case, however, we also included a rank-placement resampler to estimate the uncertainty associated with each rank.
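The gist of the approach can be sketched in a few lines of code. To be clear, this is a simplified illustration under my own assumptions, not the published method itself: the journal names and metric values below are invented, and the resampler here simply bootstraps over the metrics to see how stable each journal’s placement is.

```python
import random

# Hypothetical journals, each with three invented citation metrics
# (e.g. Impact Factor, Google Scholar h5-index, Scopus score).
metrics = {
    "Journal A": [12.0, 85, 6.1],
    "Journal B": [9.5, 90, 5.2],
    "Journal C": [4.2, 40, 1.8],
}

def mean_rank(metrics):
    """Rank journals within each metric (1 = best), then average the ranks."""
    names = list(metrics)
    n_metrics = len(next(iter(metrics.values())))
    ranks = {name: [] for name in names}
    for j in range(n_metrics):
        # Higher metric value earns a better (lower) rank.
        ordered = sorted(names, key=lambda n: metrics[n][j], reverse=True)
        for r, name in enumerate(ordered, start=1):
            ranks[name].append(r)
    return {name: sum(rs) / len(rs) for name, rs in ranks.items()}

def bootstrap_placements(metrics, n_boot=1000, seed=1):
    """Resample the metrics with replacement to gauge rank uncertainty,
    returning the (best, worst) placement seen for each journal."""
    rng = random.Random(seed)
    n_metrics = len(next(iter(metrics.values())))
    placements = {name: [] for name in metrics}
    for _ in range(n_boot):
        idx = [rng.randrange(n_metrics) for _ in range(n_metrics)]
        sample = {name: [vals[i] for i in idx] for name, vals in metrics.items()}
        mr = mean_rank(sample)
        for pos, name in enumerate(sorted(mr, key=mr.get), start=1):
            placements[name].append(pos)
    return {name: (min(p), max(p)) for name, p in placements.items()}

print(mean_rank(metrics))
print(bootstrap_placements(metrics))
```

A journal whose (best, worst) interval stays narrow across resamples holds its rank robustly; a wide interval flags a rank that depends heavily on which metric you trust.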

I’m pleased to announce that the final version is now published in PLoS One.

Scientists should blog

27 05 2014
© Bill Porter

As this blog is about to tick over 1 million hits since its inception in mid-2008, I thought I’d share why I think more scientists should blog about their work and interests.

As many of you know, I regularly give talks and short courses on the value of social and other media for scientists; in fact, my next planned ‘workshop’ (Make Your Science Matter) on this and related subjects will be held at the Ecological Society of Australia’s Annual Conference in Alice Springs later this year.

I’ve written before about the importance of having a vibrant, attractive and up-to-date online profile (along with plenty of other tips), but I don’t think I’ve ever put down my thoughts on blogging in particular. So here goes.

  1. The main reason scientists should consider blogging is the hard, cold fact that not nearly enough people read scientific papers. Most scientists are lucky if a few of their papers ever top 100 citations, and I’d wager that most are read by only a handful of specialists (there are exceptions, of course, but these are rare). If you’re a scientist, I don’t have to tell you the disappointment of realising that the blood, sweat and tears shed over each and every paper are largely for nought considering just how few people will ever read our hard-won results. It’s simply too depressing to contemplate, especially considering that the sum of human knowledge is so vast and expanding that this trend will only ever get worse. For those reasons alone, blogging about your own work widens the readership by orders of magnitude. More people read my blog every day than will probably ever read the majority of my papers.

A posthumous citation tribute for Sodhi

6 11 2012

I’m sitting at a friend’s house in Sydney writing this quick entry before jumping on a plane to London. It’s been a busy few days, and will be an even busier next few weeks.

I met with Paul and Anne Ehrlich yesterday (who are visiting Australia) and we finalised the first complete draft of our book – I will keep you posted on that. In London, I will be meeting with the Journal of Animal Ecology crew on Wednesday night (I’m on the editorial board), followed by two very interesting days at the Zoological Society of London’s Protected Areas Symposium at Regent’s Park. Then I’ll be off to the Universities of Liverpool and York for a quick lecture tour, followed by a very long trip back home. I’m already tired.

In the meantime, I thought I’d share a little bit of news about our dear and recently deceased friend and colleague, Navjot Sodhi. We’ve already written several personal tributes (see here, here and here) to this great mind of conservation thinking who disappeared from us far too soon, but this is a little different. Barry Brook, as is his wont, came up with a great idea to get Navjot’s work up posthumously on Google Scholar.

Conservation Letters citation statistics

15 07 2010

As most CB readers will know, the ‘new’ (as of 2008) conservation journal kid on the block that I co-edit, Conservation Letters, was ISI-listed this year. This allows us to examine our citation statistics and make some informed guesses about the Impact Factor the journal should be assigned next year. Here are some stats:

  • We published 31 articles in 5 issues in 2008, 37 articles in 6 issues in 2009, and so far 24 articles in 3 issues in 2010
  • Most authors were from the USA (53), followed by the UK (29), Australia (28), Canada (10), France (7) and South Africa (4)
  • The published articles have received a total of 248 citations, with an average citation rate per article of 2.70
  • The journal’s h-index = 8 (8 articles have been cited at least 8 times)
  • The 31 articles published in 2008 have received thus far 180 citations (average of 5.81 citations per article)
  • The top 10 most cited articles are (in descending order):
