22 September 2022 – As part of Peer Review Week 2022, Bernd Pulverer, Head of EMBO Scientific Publications, talks about how EMBO Press journals are increasing trust in the research they publish with a special focus on their transparent peer review initiative
Why and when did EMBO initiate transparency in peer review?
EMBO Press journals first introduced transparent peer review at The EMBO Journal as an experiment in 2009, when Pernille Rørth was Editor-in-Chief. The initial impetus was that we had regularly witnessed discussion about perceived issues with the peer review process at bioscience journals. These ranged from concerns about bias, whether based on science, geography, gender or name, through incompetence or superficiality, to gross misconduct – often captured with allusions to ‘that darn referee 3’. These concerns seemed hypothetical and even unusual to us, as in our experience the peer review process was almost invariably remarkably thorough, informed and constructive. We also felt a little sorry that the many hours we knew most referees were putting into their reports at the EMBO journals were not more widely recognised. It occurred to us, based on earlier experiments with opening up the peer review process at other journals such as the British Medical Journal and Atmospheric Chemistry and Physics, that it would be easiest to showcase the referee reports alongside the published articles. We felt the quality was such that we had nothing to hide and instead much to add to the scholarly record.
Why did EMBO choose transparent peer review rather than fully open peer review?
The difference from open peer review is that we don’t mandate the signing of review reports, especially as the appetite among referees to sign their reports remains rather low. There are many pragmatic reasons for this, but the important point is that we wanted to focus on the scientific dialogue and scholarly argument, rather than name recognition. Consequently, we encourage, but don’t require, referees to sign their reports, and in fact very few do. We have followed this basic principle through in other publishing innovations such as anonymous Referee Cross-Commenting, where referees are invited to comment on each other’s reports. This consultative practice has been immensely helpful in aiding the editorial decision-making process. In addition to being transparent about the referees’ comments, we always publish their reports together with the authors’ point-by-point rebuttals – just like in any court case, this presents both sides of the argument. Finally, we also publish our editorial decision letters and author correspondence with all the relevant dates – full transparency at every step of the process.
How has it been going so far?
We were very pleased that both referees and authors overwhelmingly accepted this initiative, and as a result we have applied the policy across all our journals for a decade now. We came to realize there are actually four key benefits of transparent review. Firstly, the referee reports provide three detailed expert views on a given dataset to complement the authors’ own interpretation of their data; this is analogous to three published ‘News & Views’ on every article. Secondly, the reports add potential academic credit for the referees. Since the reports are mostly anonymous, however, attributing this recognition is not a trivial task, but we are now very actively working with funders and research institutions on mechanisms to make this a reality. Thirdly, the other side of credit is accountability; even if the reports are not signed, we feel that publishing them ensures referees think a little more carefully about how they write their reports. However, we struggle to quantify this effect, partially because our reports were already pretty good to start with! Finally, and very importantly, we believe transparent peer review is excellent for training and education purposes. Those who are new to reviewing can learn by example how to become expert referees in their areas of expertise.
The whole initiative has been unequivocally a success and we are so pleased that many other journals have also decided to adopt more transparency. The community feedback agrees with our perspective: for many years now our ‘opt out’ rate for both authors and referees has been below 1%.
Are there other ways to enhance the peer review process?
This year’s Peer Review Week focusses on research integrity, and quality peer review is absolutely part of research integrity: abuse of the review process is an integrity breach, albeit one that is rarely reported. This needs to change. If there is evidence that referees exploit the system, for example by rejecting a paper and then using the data to publish their own study without having declared a conflict of interest, this must have consequences and journals need to follow up with institutions in these cases.
I advocate inviting early career researchers to provide a more technical level of peer review, focussed on specific deep dives into real data. For example, because we now mandate the posting of source data at our journals, it is possible to uncover potential research integrity issues hidden deeper in the data through statistical analyses. This is what we are working on now at EMBO Press, and we are excited that the US White House OSTP recently released its data management and sharing directive, calling for mandatory data sharing along the lines we have encouraged for years. Open Science is about to become much more real!
Have you any other recent developments to share?
Yes, we are making the transparent review files citable formal units of publication in their own right, and we have also added a number of features to enhance the process further. Beyond Referee Cross-Commenting mentioned earlier, more recently we started to include an ‘author pre-consultation’ step, where the authors see the referee reports before the editors make their final decision and can respond with a revision plan, often via a Zoom call. Altogether, transparent peer review thus aims to level the playing field between authors, referees and editors, while at the same time opening the perceived black box of peer review to the public.
Another key enhancement is applying a scooping protection policy from the day of preprint posting or journal submission, which allows authors to undertake thorough revisions without the constant fear of losing out on reporting priority.
In addition, our in-house research integrity screening process is designed to complement peer review. When potential issues arise, the editors will sometimes involve the referees again to contextualize the scientific impact of the issues uncovered.
Finally, we have for years now encouraged source data posting – that is publishing the data underlying the figures in our papers. Importantly, we offer a free data curation process that renders the figures and source data machine readable and thus discoverable by data-directed search technology. This is very important, as it will allow direct discovery of experimental data in published articles, rather than the blunt tool of searching for keywords in the authors’ textual interpretation of their data.
What else does EMBO do to support research integrity?
Incentivizing referees to do a thorough job in peer review – through training and by including peer review in research assessment – is essential for a reliable, professional process and we are focussing on both aspects.
Research integrity screening works at a different level to a scientific assessment and a separate, complementary process is in my view essential. In cooperation with other publishers, we are developing community tools and policies to support this process. It must be stated that these services are very resource intensive and add costs to publishing, which will hopefully be covered by funders. Not addressing research integrity would lead to the far higher costs of an unreliable literature.
Finally, EMBO has been very active for a decade in supporting open science initiatives such as data sharing and posting of preprints. We believe that journals can play an important role in fostering open science and don’t believe these modalities of sharing science necessarily compete. Of note, the OSTP policy directive and the NIH Policy for Data Management and Sharing both specifically note the importance of data sharing and research integrity. We strongly agree with this position at EMBO Press.
The interview is re-published with kind permission from Wiley.