
Good Practices

Strictness and transparency in biomedical research

In an opinion article published in the journal Cell Metabolism, the Canadian researcher Daniel Drucker offers five suggestions to help biomedical researchers overcome what he calls a “crisis of reproducibility in translational research.” He defines this as the chasm between the hopes raised by preclinical tests suggesting the viability of new therapies and drugs and the subsequent failures to turn those hopes into actual treatments. Drucker, a professor at the Lunenfeld-Tanenbaum Research Institute of Mount Sinai Hospital in Toronto, Canada, known for helping to develop new treatments for diabetes as well as for short bowel syndrome, suggests, first, that the same rules adopted in clinical trials be followed in preclinical testing. He observes that in clinical trials, omitting unfavorable results and publishing only the positive ones is prohibited, whereas in preclinical research it is common for scientists to conduct a large number of basic experiments with animals and publish only those that were successful.

“Most negative, contradictory, or divergent results are not reported,” writes Drucker, who recommends that those results be presented in an organized way when an article is submitted to a journal. “Transparency can frame the promising results in a broader and more realistic context.” The researcher also suggests that experiments be run in different animal models before their findings are considered for testing in human cells and tissues. According to Drucker, this strategy would slow research down but reduce the chance that an encouraging finding turns out to be a false hope.

Drucker’s other recommendations are more general. One of them is to organize panels at medical and scientific conferences to discuss problems related to the reproducibility of research, drawing on the recent experiences of scientists, funding agencies, and journal publishers. “The most common problems could be highlighted and new solutions proposed,” Drucker says in the article.

Another suggestion is to require that research project leaders, when submitting funding applications, list their most frequently cited published works and give examples of how their principal results were validated in other papers.

Lastly, Drucker argues that researchers should provide more details about their experiments at the time of publication, describing the reagents and their sources as well as the cell lines and animal models used. This would make it easier for others to reproduce the results.
