As the sharp-eyed among you might have noticed, Nature Protocols is running an ad campaign at the moment. If you haven't seen it, here is one of our banners:
With this campaign we are offering anyone a month's free access to Nature Protocols. There are no catches, other than that we want to know who you are; that we will send you an email after your month is up (which you can cheerfully ignore if you wish) asking what you thought of the journal and prompting you to recommend us to your institution's library; and that you will need to create an account for yourself on nature.com (which is also free). In return you can spend a month happily exploring the entire archive of Nature Protocols content.
Anyway, in the ads we are making a very specific claim: that Nature Protocols is the most referenced source for protocols. A bold claim, which I think I should justify here.
Normally when journals want to talk about their citations they turn to the Impact Factors calculated by ISI. That's fine in the main, provided you treat the number with some caution and only use it to compare very similar journals covering similar topics. Nature Protocols had an IF of 8.4 for 2010 (the most recent figure), which we are very happy with, although we hope that the number for 2011 (released in June 2012) will be a little higher.
However, laboratory protocols are useful long past the two year window from the year of publication that IFs consider. We could look at a 5-year impact factor (also 8.4 although higher if you believe in accuracy to more decimal places, which I don’t) which would be better, but the fact is that ISI don’t index or calculate IFs for the other publishers of research protocols: Current Protocols, Cold Spring Harbor Protocols, or Springer’s Methods In … series; and those are the people we would like to compare ourselves against.
So I had to do the calculating myself, using the Scopus citation database maintained by Elsevier. I looked at all the papers published by Nature Protocols and our three big rivals from 2006, when Nature Protocols launched, to the present (well, to January this year, when I did the number crunching). I counted every citation to any paper classified as 'Article' or 'Review', and I restricted the "Methods in …" series to the biological and biomedical subject areas. These are the data I got:
| Published since 2006 | Nature Protocols | Current Protocols | Cold Spring Harbor | Methods in … |
| --- | --- | --- | --- | --- |
| Average citations per article | 23.46 | 1.45 | 1.98 | 2.15 |
| Average citations per cited article | 25.60 | 3.39 | 2.90 | 4.10 |
| Number in top 100 cited | 91 | 1 | 0 | 8 |
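The two averages in the table are straightforward to reproduce given a list of per-article citation counts. Here is a minimal sketch in Python; the counts are invented for illustration and are not the real Scopus data:

```python
def citation_metrics(citations):
    """Return (average citations per article, average citations per
    cited article) for a journal, given per-article citation counts."""
    total = sum(citations)
    cited = [c for c in citations if c > 0]  # articles cited at least once
    per_article = total / len(citations)
    per_cited_article = total / len(cited) if cited else 0.0
    return per_article, per_cited_article

# Toy example: five articles, one of them never cited.
counts = [10, 0, 3, 5, 2]
per_article, per_cited = citation_metrics(counts)
print(per_article)  # 4.0
print(per_cited)    # 5.0
```

The gap between the two numbers is a rough indicator of how many articles go uncited: when every article has been cited at least once, the two averages coincide.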
As you can imagine, that is very pleasing. Even if you mistrust simple means for a distribution that is far from Normal, Nature Protocols as a whole has been cited more than any of our competitors, and that despite publishing a much smaller number of protocols than "Methods in …". (I would say we are more selective, but I feel I'm already crowing too much.) And this isn't due to one or two exceptionally highly cited protocols. It is true that one protocol* has been cited over 1,300 times (1,533 times as of today), but of the 100 best-cited protocols in this period from these sources, 91 were published in Nature Protocols.
So there is the justification. Nature Protocols protocols are cited more often, and so presumably used more often, than any others.
* Da Wei Huang, Brad T. Sherman & Richard A. Lempicki, Systematic and integrative analysis of large gene lists using DAVID bioinformatics resources. Nature Protocols 4, 44–57 (2009). doi:10.1038/nprot.2008.211