Good practices

Authenticated research results

Initiative compiles scientific articles accompanied by studies that verify their conclusions

The Framework for Open and Reproducible Research Training (FORRT) has compiled an experimental database of 1,100 psychology, medicine, and administration articles that are accompanied by replication studies (carried out by other researchers exclusively to verify the original findings). The FORRT initiative was created in 2018 by researchers from 28 countries with the aim of improving access to replication studies, a self-correction mechanism that serves to determine the reliability of scientific articles and highlight results that cannot be repeated.

“We wanted to make it easier to track replication studies. It’s often difficult to find them, partly because journals tend not to link them to the original work,” psychologist Lukas Röseler, one of the leaders of the project and director of the University of Münster’s Center for Open Science in Germany, told the journal Nature Index. Another of FORRT’s goals is to encourage researchers to publicize their attempts to confirm results obtained by colleagues. “It’s often difficult to publish these studies, regardless of their outcome,” said the researcher, referring to the limited interest most prestigious scientific journals show in replication studies.

The original articles and their confirmations have so far been compiled manually: project volunteers search for replication studies and then link them to the original findings in an online spreadsheet. Some papers have been linked to up to four other studies, verifying one or more of their results. The initiative has benefited from efforts made by the scientific community over the last 15 years to reevaluate articles from various fields in order to determine whether they are truly reliable.

The objective was to address what has been called the “replication crisis,” sparked by a succession of articles that were discredited after being disproved in subsequent experiments or yielding less successful results than initially promised, including potential cancer treatments. Most of the research listed in the database, for now, is from the field of psychology, which has been particularly affected by this breach of trust, at least in part due to scandals such as the case of social psychologist Diederik Stapel, who was fired from Tilburg University in the Netherlands in 2011 and subsequently had 58 articles retracted for data fabrication and manipulation. In 2016, an international collaboration replicated 100 experimental psychology studies published in three journals and was only able to reproduce the results of 36 of them. In the same year, some 50,000 psychology articles were scrutinized by software capable of detecting statistical inconsistencies, with problems found in half of them (see Pesquisa FAPESP issue n° 253).

Outcomes varied among the articles gathered by FORRT. In 40% of them, attempts to confirm the original findings were successful. One example is an article published in Personality and Social Psychology Bulletin in 2015 by Benjamin Cheung and Steven Heine, from the University of British Columbia, Canada. The study stated that using genetic factors to explain criminal behavior leads to different perceptions than when the same crime is attributed to environmental influences. This, however, does not produce any advantage for a defendant facing a jury, according to the authors. The reason is that although the perception that the accused had no control over their actions is favorable to them, it is offset by the negative perception that they remain a threat and could easily reoffend. The conclusion was based on more than 600 interviews with university students and individuals who responded to a paid online questionnaire. In 2016, psychologist Jarret Crawford of The College of New Jersey, USA, repeated the study and reviewed 16 effects observed by Cheung and Heine. He obtained the same findings for 14 of them, and a different but similar result for the fifteenth. Only one effect was not observed. “The authors of the original paper paid their research participants three times as much as I did, but this does not appear to have influenced the results,” Crawford wrote.

Replication attempts failed for 58% of the FORRT database articles, for various reasons. In some, the data pointed in the opposite direction to the original finding, while in others, the results were similar but not statistically significant, or the samples did not allow for a reliable comparison. One unsuccessful case was a 2008 article published in the Journal of Personality and Social Psychology by a group from Leiden University in the Netherlands. According to their analysis, individuals who display anger in the midst of a negotiation are not necessarily more likely to persuade their opponents to give in than those who display happiness while bargaining, as the academic literature had previously described. According to the paper, in some specific situations, the use of anger can hinder negotiations. The replication study, carried out in 2014 by two psychologists from the University of Vienna, did not observe this disadvantage when the experiment was repeated with 27 men and 53 women from Austria using the same methodology. The remaining 2% of the replication studies in the FORRT database were either inconclusive or had mixed results, with some attempts confirming the original data while others did not.

The initiative also has other ambitions, such as generating knowledge about reproducibility based on analyses of the database. Röseler offers an example: scientists today are encouraged to publish registered reports on their research projects before carrying them out, stating the design, methodology, and hypotheses to be evaluated. The aim is to prevent researchers from manipulating these premises to fit the results when they publish their findings. “We would like to empirically see whether interventions such as these affect how likely a study is to be replicable,” he said. The group also plans to launch an open-access, peer-reviewed scientific journal exclusively for replication studies from different disciplines.
