
Good practices

A new reward system

World conference in Hong Kong proposes changes to evaluation processes to help improve scientific integrity

The 6th World Conference on Scientific Integrity, held in Hong Kong, China, on June 2–5, took a different approach to the usual debate on ethics in research. Rather than addressing issues directly related to the problem, such as plagiarism, fabrication, and fraud, or the need to train students and researchers on good practices, the meeting focused on the reliability of science and the influence that evaluation processes have on cases of misconduct. The suggestion is that research reward systems, which are still largely based on quantitative performance indicators and a limited set of scientific activities, foster negative behavior, from publishing in poor-quality journals to inflate a scientist’s publication count, to publishing biased results to exaggerate the impact of a study. “It is important to change the way we assess research,” epidemiologist Lex Bouter, co-chair of the conference and rector of Vrije Universiteit Amsterdam from 2006 to 2013, told German public radio station Deutschlandfunk. According to Bouter, the emphasis on evaluation criteria such as number of publications and citations is not helpful. “Other activities, which are also important to research, end up neglected. It sends the wrong message and creates negative incentives.”

The discussions at the conference were guided by a draft manifesto written by the organizers, the final version of which will be released shortly. The Hong Kong Manifesto lists five principles to consider when promoting researchers and assessing their contributions. The first is to reduce the emphasis on quantitative metrics, such as number of articles published, the H-index, and journal impact factor, and to instead reward practices that encourage research integrity and quality. One example given by the document is to publicly register research protocols in advance, making the premises and methods of a study transparent and reinforcing confidence in the results.

The second principle is to recognize the efforts of researchers who disclose all of their findings, including negative results. By publishing only positive results, scientists give a partial picture of the challenges they faced, making it more difficult for colleagues to reproduce the research and replicate the results. The manifesto encourages the disclosure of preliminary results in preprint repositories, allowing a wide range of experts on the subject matter to scrutinize them prior to publication, as well as the publication of registered reports—papers that detail the peer-reviewed methods and analysis plans of a study before it begins. Journals that specialize in publishing these reports commit to publishing the final results even if they are null, enabling a true comparison between the hypothesis and the outcome.

The other three principles address the need to make science more open by sharing more data from trials and experiments; recognizing a broader range of scientific activities, such as the replication of studies by other researchers; and rewarding essential tasks such as peer review and the mentoring of students and young researchers.

According to Sonia Vasconcelos, a researcher with the Bioscience Education, Management, and Dissemination Program at the Institute of Medical Biochemistry of the Federal University of Rio de Janeiro (UFRJ), the manifesto reinforces the need to adopt more responsible assessment metrics, an issue already raised in other documents, such as the 2012 San Francisco Declaration on Research Assessment (DORA). “The emphasis is stronger now and the document places a greater focus on the responsibility of funding agencies to reward responsible practices, which should be better explained and represented in requirements for grant and funding applications,” says Vasconcelos, who presented a study in Hong Kong on the role of corresponding authors in building a responsible authorship culture in emerging countries. In 2015, she headed the local organizing committee for the 4th edition of the conference, held that year in Rio de Janeiro.

Olavo Amaral, a researcher from UFRJ who presented the Brazilian Reproducibility Initiative at the conference, notes that discussions on scientific integrity have evolved in recent years: “Historically, the issue of scientific integrity was structured around the debate on misconduct and rule-breaking by scientists. But it is becoming increasingly clear that most of the problems come from the way researchers respond to existing rules and incentives—and the issue may be that these incentives do not favor the development of reliable science.”

Participation by Brazilian researchers

Eight Brazilian researchers attended the 6th World Conference on Scientific Integrity in Hong Kong, a smaller number than in previous years. In Amsterdam in 2017, there were about 20 Brazilians, and in Rio de Janeiro in 2015, more than 200. But the work they presented this year still had significant repercussions. Biochemist Mariana Dias Ribeiro was one of three winners of the Excellence in Doctoral Research Awards at the Doctoral Forum, a conference session where PhD students present their projects to a panel of experts to receive comments and recommendations. Ribeiro, who is pursuing her PhD in the Bioscience Education, Management, and Dissemination Program at UFRJ’s Institute of Medical Biochemistry, plans to evaluate what impact the retraction of a scientific article has on the author’s career. She will select participants from a sample of 2,000 biomedical researchers who have had articles retracted, and then interview them to find out what kind of consequences they suffered and whether they lost students, jobs, or funding. The project should be completed in 2022. “It was very important to present the project in its early stages and receive positive feedback. The judges considered my research ambitious,” says Ribeiro, who is supervised by Sonia Vasconcelos, from UFRJ.

Another highlight was the Brazilian Reproducibility Initiative, presented by physician Olavo Amaral, another researcher at UFRJ. Amaral’s project aims to reproduce the results of up to 100 biomedical studies described in scientific papers by Brazilian authors. “The initiative was highly praised at the conference and it was a great opportunity to discuss similar projects and results from other groups in other countries,” says Sonia Vasconcelos. Rafaelly Stavale, a nurse and master’s student at the University of Brasília (UnB), presented the results of a survey on health science articles retracted between 2004 and 2017 whose authors were affiliated with Brazilian institutions. She identified 65 articles and found that most were retracted due to misconduct, with 60% involving plagiarism. She also observed that some of these articles were still available in various databases with no warning that they had been retracted: 63% of them continued to be cited even after retraction. The results were published in the journal PLOS ONE in April. “My goal is to turn the study into a PhD project, to assess why these researchers were involved in misconduct and how much they understand the seriousness of their actions,” Stavale explains.
