In a recent trial Nature explored ways to improve the peer review system.
You are one of only a handful of experts who can really judge your latest findings. You know the experimental limitations of your model system best, not to mention the political realities (be it competition, overdue PhDs or grant deadlines). Why submit to the agonies of peer review to publish in a scientific journal? Why not just upload the data to a preprint server, as physicists have done for years? Surely your data should speak for themselves, and powerful search engines will make them available to those who look for them. Well yes, but it's a jungle out there: biology is growing, with 90% of biologists who ever lived extant and the number of papers doubling every decade. Research funding is booming, but not keeping up with the growth in researchers and research; indeed, the spending shortfall has become acute in the US, resulting in unprecedented grant-rejection rates. Cell biology data are reproduced relatively easily and consequently the field is highly competitive. Competition energizes research, but in excess it can stifle cooperation and progress. The increasingly frenzied and secretive research landscape presents a slippery slope that in rare cases leads to plagiarism and data manipulation. A paper validated by peer review therefore serves as an important document of research achievement.
Although the archival role of science journals can be provided by databases, it is a second function that has made it ever more important to publish in prestigious journals: the filtering of information. Experimental (data quality, scope and ethics) and editorial (conceptual advance and general interest) filtering criteria define the quality of a journal. Thousands of papers are published even on relatively specialized topics, and judging academic merit is anything but trivial. An important corollary of filtering is the establishment of a journal-based value system. In many fields there is a fairly well accepted hierarchy of journals: publication in a certain journal bestows a certain level of academic credit. We have previously argued that an over-formalization of this assessment system by employers and grant agencies is convenient but hardly fair. Editors select the best and most appropriate papers for their journal, but they are not the sole arbiters of academic achievement.
Yet it is utopian to expect the academic credit system to be dissociated from journal publishing anytime soon. Given this, we have to ensure that the selection process is as good and as fair as possible. Confidential peer review is employed almost universally to provide an informed decision-making process. Peers are essential to provide authoritative specialist expertise, and confidentiality has been assumed necessary to ensure that the views of the referees are incisive, as potential recriminations and behind-the-scenes manoeuvring can be circumvented. Although peer review is universal, journals have developed different versions of the process, with more or less prominent roles for academic or professional editors and editorial boards. We have commented previously on attempts to break the mould, which have not yet demonstrably improved the system (Nature Cell Biology, editorial Nov 2005). Obviously, confidential peer review has to be undertaken carefully, as an anonymous referee can all too easily dominate a decision. The Nature journals have several safeguards against this: first, all journals employ professional editors with academic credentials, who are independent yet sufficiently expert to adjudicate between author and referee. Second, referees receive feedback with the comments of the other referees; this cross-refereeing has helped balance the decision-making process. Third, the process is kept as transparent as possible and authors may appeal. It is essentially impossible to anonymize manuscripts to referees, and it is not meaningful to publish referee comments unless they are rewritten as a 'news and views'. But are there other ways to improve the system?
Nature recently completed a four-month trial of a voluntary open peer review system: the authors of manuscripts selected for formal review were invited to allow posting of their manuscript on a preprint site. Anyone willing to sign their comments could post a review or comment during the conventional peer-review process. Substantive comments were made public and used alongside the referee reports to inform the editorial decision. Five percent of authors across all fields (71) took up the invitation to expose their manuscripts to community-wide review, although many were concerned about the prepublication release of competitive information. Ninety-two comments were posted on 54% of the posted manuscripts (half of the comments centred on only eight manuscripts). The six cell/molecular biology manuscripts posted received 2.8 comments on average. Generally, the comments did not present information over and above the referee reports, and no decision was altered on account of the comments. Indeed, both biology and physics editors found the comments of limited help, although the majority of authors who received comments found them useful. The low comment rate does not seem to reflect a lack of interest, as there was high web traffic to both the manuscript preprint site and the associated invited commentaries on peer review. Clearly there is significant interest in alternative systems, but community peer review is not sufficiently informative at this time. It would be interesting to see whether confidential comments would increase the rate and thoroughness of the responses, and curated comments alongside published work could be a valuable addition.
We will keep an open mind about revisiting this hybrid peer review system in the future. However, the trial has shown that, for now, this process would be of limited use in improving the decision making process. Confidential peer review is a bit like democracy: the limitations are clear, but it seems to be the best we've got.