
Interview

Richard Sever: Speeding up scientific communication

Biochemist suggests all researchers should be required to publish draft manuscripts on public repositories

Sever believes publishing research results on preprint servers should be a mandatory requirement for agency funding (photo: personal archive)

Amid discussions on how to implement Plan S, a European initiative that requires all publicly funded articles to be published via open access by 2021, British biochemist Richard Sever is proposing a simpler strategy for making scientific knowledge readily accessible: all research results should be made immediately available on preprint servers—open repositories of draft manuscripts that have not yet undergone peer review by scientific journals. In the model suggested by Sever, this would be a mandatory requirement imposed by funding agencies.

Sever is one of the founders of bioRxiv, a life sciences preprint repository run by Cold Spring Harbor Laboratory in the USA, where he is assistant director. He presented his proposal, which he calls Plan U (for Universal), in an article published in PLOS Biology in June, alongside John Inglis, cofounder of bioRxiv, and Michael Eisen, one of the creators of the Public Library of Science (PLOS), which publishes a collection of open-access journals. In the following interview, Sever talks about how the system could accelerate scientific progress.

What are the main benefits of adopting Plan U?
Preprints help accelerate research developments and communication by giving other scientists immediate access to academic findings. Researchers can use preprints as a basis for new studies or to refine ongoing research. Preprints also help increase and speed up the exchange of information between authors and their peers around the world. However, the benefits of this new approach to scientific communication can only be fully realized when all papers are published in preprint repositories. For that to happen, research funding agencies need to demand that the results of studies they fund be shared as preprints before being submitted to scientific journals for peer review.

Why is this important?
Because funding agencies control the resources—they are the only ones with the power to impose this requirement on researchers. If they did so, it would be quicker, simpler, and cheaper to implement a system of free and fast access to the world’s scientific output.

Have any funders already adopted this requirement?
There has been some movement in this direction in recent years. The Chan Zuckerberg Initiative, created by Mark Zuckerberg [Facebook founder] and his wife Priscilla Chan in 2015, requires funding recipients to publish their research results in preprint repositories. The Wellcome Trust has also adopted this policy for studies whose findings may be of immediate public interest.

How would the peer review system work if Plan U were adopted?
The idea is that studies will continue to go through the peer review process, but only after they have been published as preprints. The current process for publishing scientific articles is long. In general, researchers submit their manuscript to a journal, which decides whether or not to submit it for peer review. If accepted, the reviewers often ask authors to conduct more experiments to support their conclusions. Once this is done, the article is submitted again, and has to go through the same steps all over again before it is published. This entire process takes an average of eight months. That is a lot of wasted time that could have been used by other scientists to access the findings and advance their research. By separating the publication of preprints from this slow review process performed by journals, we could accelerate the dissemination of scientific results.

And the peer review process would remain the same?
Yes. The difference is that the findings behind the articles would already be public and available on preprint repositories for anyone interested.

This practice is already common in some fields, such as physics. What can be learned from that experience?
Yes, thanks to arXiv, which has been in operation for almost 30 years. The initiative was a success and inspired the emergence of other servers. Several fields now have similar repositories, such as the life sciences with bioRxiv, the chemical sciences with ChemRxiv, and the earth sciences with EarthArXiv.

What are the implications of Plan U for the business model of scientific publications?
It’s hard to say what the financial impact would be, but it would certainly put the onus on restricted-access journals to be much more transparent about the article evaluation process and encourage them to add value to justify their costs.

One of the main concerns about preprints is that they may be seen as inferior articles because they have not undergone peer review. How do you respond to those concerns?
I think the point is that manuscripts posted on preprint repositories are not of inferior quality, but they are “unreviewed”. The peer review process will continue to be performed by scientific journals.

Some researchers argue that there is little feedback on repositories in terms of critique and comments. How does Plan U intend to encourage researchers to comment more on preprints?
Yes, at the moment there is a relatively low level of feedback on papers on bioRxiv, but we know that many people receive responses privately via email and other channels, such as Twitter. The biomedical sciences were slower to adopt preprints, but bioRxiv is experiencing exponential growth, with more and more users every day. I believe the amount of public feedback will increase with greater adoption and people just getting used to the idea. And that’s where Plan U comes in.

Could the implementation of Plan S accelerate the adoption of Plan U?
Plan S is more concerned with changing the traditional publishing model, and while it encourages the use of preprints, I don’t see it as a stimulus for the adoption of Plan U, even though the two are compatible and complementary in a way.

What are you doing to put Plan U into practice?
For now, we are trying to spread the word and get more people to think about the possibilities.

ArXiv’s success paved the way for repositories in other fields

Created in August 1991 by American physicist Paul Ginsparg of Cornell University in the USA, arXiv is the world’s best-known preprint repository. It offers free access to nearly 1.5 million articles, with 140,000 manuscripts added each year and over 1.2 million hits per day. Initially created as an online repository for physics articles, arXiv has since been expanded and now includes manuscripts from fields such as astronomy, computer science, mathematics, and statistics. The repository is maintained by donations from libraries and philanthropic institutions and is moderated by approximately 150 volunteers. They do not perform a traditional peer review, but they assess the manuscripts submitted and reject those that cannot be classified as scientific papers. Some of the papers available on arXiv are later published in scientific journals. Some, however, remain as preprints only. One example of the latter is a 2002 manuscript by Russian mathematician Grigori Perelman, which outlined a proof of Thurston’s geometrization conjecture, one of the biggest unsolved problems in mathematics at the time. The paper was never published in a scientific journal, but it earned Perelman the Fields Medal in 2006. In Brazil, the SciELO online library, established in 1997 with funding from FAPESP, last year announced a new partnership with the Public Knowledge Project (PKP) to create a preprints server by 2020.
