
Good practices

Cash reward for finding errors in influential articles

Over the next four years, a program based at the University of Bern, Switzerland, will pay experts to look for errors in over one hundred influential scientific articles. The more problems they find, the more money they will receive. Each reviewer will be paid a minimum of US$285 even if they find nothing, rising to more than US$1,000 if they find significant errors. If a reviewer’s analysis leads to a recommendation that the paper should be retracted, they will receive an extra US$2,835.

The program, named ERROR (Estimating the Reliability and Robustness of Research), is led by psychologist Malte Elson, who says that one of its objectives is to show funding agencies the importance of investing in reviewing and replicating the studies they fund. “When I build my research on top of something that’s erroneous, that’s a cost because my research is built on false assumptions,” he told the Chronicle of Higher Education. Elson, a professor at the University of Bern who specializes in the psychology of digitalization, is managing the initiative with fellow psychologist Ruben Arslan, who is doing a postdoctoral fellowship at Leipzig University, Germany.

The reviewers have to look for errors of all kinds, including miscalculations, discrepancies between what was observed and what was reported, and biases in conclusions, and then compile them into a report. An ERROR supervisor will then evaluate the results and submit them to the article authors, giving them the opportunity to respond and provide explanations. At the end of the process, a summary of the main problems will be produced, along with a recommendation on how to respond, ranging from corrections of minor mistakes to retraction in the event of serious errors.

The program will begin with the review of three articles, including one published in 2020 that described a strategy for discouraging the online spread of misinformation about COVID-19. Psychologist Gordon Pennycook, lead author of the study and a researcher at Cornell University, USA, said he is excited to see the results of the work being replicated by other groups.

Articles will only be evaluated if the authors agree to subject them to this scrutiny, since the reviewers can only perform a thorough analysis if they have full access to the data, code, and other materials related to the paper. Elson believes the process will be useful for article authors. “People can then point to the error report that will be public and say, ‘They checked my work and it’s fine,’” he said. But some researchers are concerned about damage to their reputations if the results are unfavorable. So far, the authors of two articles selected by the program have declined to participate.
