
Good practices

A recipe to follow

Consortium proposes checklists to ensure that microscopy images in articles can be understood and confirmed

Sergey Lobodenko / Getty Images

A group of 554 researchers from 39 countries has created a set of guidelines on publishing optical microscopy images in biological and biomedical science articles. The objective of the initiative is to ensure figures are correctly interpreted and, above all, to facilitate the replication of results by other research groups. In an article published in the journal Nature Methods in September, representatives of the consortium recommended that authors, reviewers, and scientific journal editors use checklists to determine whether an image was properly processed and whether it contains enough information to be understood and replicated. The goal is to detect and correct problems before the figure is published. “These checklists give clear and easy instructions on how to publish and analyze images, in a great step towards making them more reproducible, understandable, and accessible,” said the paper’s lead author, data scientist Christopher Schmied of the Leibniz Research Institute for Molecular Pharmacology in Berlin, Germany, on his LinkedIn profile.

The proposed checklists cover different image parameters, such as formatting standards, color treatment, and annotations or labels added by the author, and are divided into three requirement levels: minimum, recommended, and ideal. The minimum level covers the information indispensable for other groups to confirm the scientific result an image represents, such as the origin of the figure and the processing methods used. Fulfilling these requirements, which include sharing the methods used to enlarge details of an image, the brightness and contrast adjustments applied, and the original data, can help identify crucial shortfalls before a paper is published and head off later questions and disputes.
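
Purely as an illustration (the paper does not prescribe a file format, and every field name and value below is hypothetical), here is a minimal sketch in Python of how a lab might record this minimum-level information in a machine-readable file kept alongside a figure:

```python
import json

# Hypothetical sidecar record for one published figure panel; the field
# names are illustrative and not part of the QUAREP-LiMi checklists.
figure_metadata = {
    "figure": "Fig2b",
    "source_image": "raw/experiment_042/channel1.tif",  # link to original data
    "microscope": {"modality": "confocal", "objective": "63x/1.4 NA oil"},
    "processing": [
        {"step": "crop", "region_px": [512, 512, 1024, 1024]},
        {"step": "brightness_contrast", "display_min": 120, "display_max": 3800},
        {"step": "gamma", "value": 1.0},  # 1.0 = no nonlinear adjustment
    ],
    "software": {"name": "Fiji/ImageJ", "version": "2.14.0"},
}

# Write the record next to the figure so readers can trace how it was made.
with open("Fig2b_metadata.json", "w") as fh:
    json.dump(figure_metadata, fh, indent=2)
```

With a record like this, a reviewer or reader could in principle reproduce the displayed panel from the original data by replaying the listed processing steps.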

The recommended requirements are more numerous and seek to ensure that the finding demonstrated in the image is easily understood. One example is the suggestion to provide an intensity scale that explains what the displayed colors mean. The third level, classified as ideal, establishes additional practices, such as depositing copies of the original figures in specialized databases and offering grayscale versions of images to allow comparison with the color originals.

According to the guidelines, publishing in black and white is often preferable to publishing in color. “We advise everyone to publish their images in black and white rather than in color, because your eye is much more sensitive to details in monochrome,” explained English biologist Alison North, one of the participants in the initiative and director of the Bio-Imaging Resource Center (BIRC) at Rockefeller University, USA, to the ScienceDaily website. “Many investigators like color images because they’re pretty and they look impressive. But they don’t realize they’re actually throwing away a lot of information.” As well as suggesting strategies for obtaining reliable images, the group also listed methods that should be avoided in certain situations. One example is interpolation, the creation of new pixels estimated from those already in the image, which is sometimes used to make an enlarged detail look sharper. The technique is not recommended for enlarging details of a low-resolution image, because the risk of introducing distortions that are not in the original data is high.
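
To make the interpolation caveat concrete, here is a small sketch, assuming Pillow 9.1 or later and a hypothetical file named micrograph.tif, that enlarges the same low-resolution detail in two ways: nearest-neighbor resampling only repeats measured pixels, while bicubic interpolation invents intermediate values the microscope never recorded:

```python
from PIL import Image

# Hypothetical input file; any low-resolution micrograph would do.
img = Image.open("micrograph.tif")

# Take a small detail and enlarge it 8x.
crop = img.crop((100, 100, 164, 164))          # 64 x 64 pixel region
size = (crop.width * 8, crop.height * 8)

# Nearest-neighbor repeats the measured pixels: blocky, but faithful.
blocky = crop.resize(size, Image.Resampling.NEAREST)

# Bicubic interpolation estimates new pixel values between the measured
# ones; the result looks smoother, but those values were never recorded
# by the microscope, which is why the checklists discourage it here.
smoothed = crop.resize(size, Image.Resampling.BICUBIC)

blocky.save("detail_nearest.png")
smoothed.save("detail_bicubic.png")
```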

The guidelines are the result of a community effort that began in 2020 with the formation of Quality Assessment and Reproducibility for Instruments & Images in Light Microscopy (QUAREP-LiMi), an international initiative started by researchers from universities and companies interested in creating standards and protocols for microscopy images. One of the group’s primary concerns was to reduce the number of mistakes and unnecessary manipulations of images, which are a frequent cause of articles being retracted. The aim is to combat what is known as the “replication crisis,” which has seen a number of high-profile cases where scientific articles, especially in fields such as medicine, life sciences, and psychology, have been discredited because it was not possible to replicate their results in subsequent studies.

“If scientists start adopting even the minimal standard for image publication, reproducibility would be so much easier for everyone,” BIRC image analyst Ved Sharma told ScienceDaily. “There is so much information that could be included for each image in a paper, but most of the time it’s not available, or the reader has to dig deep into the paper to find out where the information is in order to make sense of the image.”

The QUAREP-LiMi checklists also cover the phases and protocols used to obtain and process images, dubbed workflows, which can include steps such as reconstruction, segmentation, labeling, and statistical analysis. Workflows are grouped into three categories: established, novel, and machine learning. For an established workflow, defining every step is a minimum requirement, while providing a tutorial on how the image was processed is part of the ideal-level checklist. For novel workflows, the requirements are broader: it is essential to detail every step and component so that other researchers can recreate the same conditions and verify the results. For workflows that use machine learning, the recommendation is also to state which data were used to train the models and to make the code available to interested parties.
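
As a hedged sketch of what “defining every step” can mean in practice (the steps, parameters, and synthetic image below are illustrative, not taken from the paper), the following Python snippet applies a two-step analysis with NumPy and SciPy and logs each operation so another group could rerun it exactly; for a machine learning workflow, the same log would additionally point to the training data and a public code repository:

```python
import numpy as np
from scipy import ndimage

def run_workflow(image: np.ndarray):
    """Apply an example image-analysis workflow and log every step.

    The steps and parameters are illustrative; the point is that each
    operation is recorded so another group can reproduce the result.
    """
    provenance = []

    # Step 1: denoise with a Gaussian filter.
    sigma = 1.5
    smoothed = ndimage.gaussian_filter(image, sigma=sigma)
    provenance.append({"step": "gaussian_filter", "sigma": sigma})

    # Step 2: segment by a fixed intensity threshold.
    threshold = smoothed.mean() + 2 * smoothed.std()
    mask = smoothed > threshold
    provenance.append({"step": "threshold", "value": float(threshold)})

    return mask, provenance

# Synthetic stand-in for a real micrograph.
rng = np.random.default_rng(0)
image = rng.normal(loc=100, scale=10, size=(256, 256))
mask, provenance = run_workflow(image)
print(provenance)
```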

According to the Nature Methods article, the guidelines could also be useful for improving the reliability of other types of scientific images, such as photographs, diagnostic tests, or images obtained by scanning electron microscopes, although they were not specifically created for these purposes. One obstacle to implementing guidelines like these is the additional cost they tend to impose on researchers, which can be prohibitive for scientists from low- and middle-income countries. To address this problem, the QUAREP-LiMi scientists decided to relax some of the requirements, especially those at the ideal level. Storing data in specialized image databases, which is expensive, is considered a negotiable requirement. “To be inclusive, we do not enforce the use of online repositories but require as a minimal measure that scientists be prepared to share image data,” they wrote in the article.
