{"id":421222,"date":"2022-01-19T12:25:28","date_gmt":"2022-01-19T15:25:28","guid":{"rendered":"https:\/\/revistapesquisa.fapesp.br\/?p=421222"},"modified":"2022-01-19T12:25:28","modified_gmt":"2022-01-19T15:25:28","slug":"war-on-photoshop","status":"publish","type":"post","link":"https:\/\/revistapesquisa.fapesp.br\/en\/war-on-photoshop\/","title":{"rendered":"War on Photoshop"},"content":{"rendered":"<p>In September, a group of scientific communication companies released a good practices manual to help editors deal with a type of misconduct that can be hard to detect: the alteration or duplication of images in articles and academic papers. The document, <a href=\"https:\/\/osf.io\/xp58v\/\" target=\"_blank\" rel=\"noopener\">available on the OSF repository<\/a>, is currently receiving suggestions for improvements, with a final version due in December.<\/p>\n<p>The manual provides precise guidelines on three different levels of manipulation. Level 1 refers to photos that have been duplicated or slightly modified, but the changes do not influence the conclusions of the paper. This includes, for example, contrast or color adjustments made to highlight a finding, or the same image being posted twice in the same manuscript to illustrate different experiments. Journal editors are advised to accept a corrected version of the images and close the case if the problem is discovered before the article is published and the authors prove that they did not act in bad faith. All coauthors of the manuscript, however, must be informed of and agree with the rectification. If the altered images are identified after publication and are not malicious, the article should receive a correction.<\/p>\n<p>Level 2 covers significant modifications that are contrary to normally acceptable standards and alter critical characteristics of an image. One example is the inversion or repositioning of bands in a western blot, a test used to identify proteins in molecular biology. 
In cases like these, the authors&#8217; institution must be notified so that it can investigate potential misconduct and any messages or information exchanged between the authors and the editor must be recorded in the peer review file. If the authors are able to show that it was a mistake and not fraud, editors can accept a corrected version of the image. If the work has already been published, it can be retracted and republished, or simply retracted, if there is no justification for the manipulation.<\/p>\n<p>Level 3 includes articles in which multiple images have been modified using editing software, where an intention to doctor the images is evident and the original data are no longer represented due to selectivity. Unless the researchers can provide an excellent explanation, the manuscript should be rejected and the authors&#8217; institutions informed so that they can be investigated. If the fraud is discovered after publication, an \u201cexpression of concern\u201d should be issued for the paper, stating that it may contain errors and is being reviewed and could be retracted.<\/p>\n<p>The manual was written by representatives from publishers Elsevier, Wiley, Springer Nature, Embo Press, and Taylor &amp; Francis and journal collections <em>JAMA,<\/em> <em>Cell,<\/em> and the <em>American Chemical Society<\/em>, coordinated by the scientific publishers association STM, based in the United Kingdom. It offers more detailed advice than the guidelines created in 2018 by the Committee on Publication Ethics (COPE), a UK-based group that discusses issues related to integrity in science. COPE designed a flowchart for editors to follow\u2014a step-by-step guide of what to do when suspicion is aroused. But it does not define separate categories of image manipulation.<\/p>\n<p>Microbiologist Elizabeth Bik, an expert at identifying altered scientific images, believes the new manual is more advanced. 
According to her, it is not uncommon for institutions to put off investigating complaints about their researchers, neglect to punish them even when falsification is clear, or fail to inform editors about the results of internal investigations. \u201cThe recommendations state that journals can act by themselves, even if they disagree with the institutions\u2019 conclusions,&#8221; Bik told the journal <em>Nature<\/em>. &#8220;The rules will not prevent science misconduct, but they provide stronger scrutiny both at the submission stage, as well as after publication.&#8221;<\/p>\n<p>The guide advises that strong accusations of tampering be investigated even if they are made anonymously, and says editors are responsible for protecting the identity of accusers. It is common for suspicions to be reported on online scientific discussion forums and even on social networks. Editors may respond to these comments at their discretion.<\/p>\n<p>Image alteration and duplication are recurrent problems faced by scientific journal editors. In 2016, Elizabeth Bik manually analyzed over 20,000 biomedical articles and found some form of adulteration in 4% of them (<a href=\"https:\/\/revistapesquisa.fapesp.br\/en\/the-specter-of-duplicate-images\/\" target=\"_blank\" rel=\"noopener\"><em>see<\/em> Pesquisa FAPESP <em>issue n\u00ba 245<\/em><\/a>). Detecting this type of fraud still relies largely on the human eye, although software to help is under development.<\/p>\n<p>Misconduct cases relating to doctored images can be complex and do not always involve aberrant changes. One of the main challenges faced by editors today is identifying manuscripts produced by \u201cpaper mills\u201d\u2014illegal services that sell scientific papers on demand, often containing falsified data (<a href=\"https:\/\/revistapesquisa.fapesp.br\/en\/punishment-for-paper-mills\/\" target=\"_blank\" rel=\"noopener\"><em>see<\/em> Pesquisa FAPESP <em>issue n\u00ba 296<\/em><\/a>). 
A group of researchers recently identified 400 articles with images so similar that they must have come from a common origin\u2014a paper mill. To identify this type of misconduct, the human eye alone is not enough. All of the images in an article need to be analyzed by a computer and compared to databases of images from other papers. &#8220;There are various ways to accomplish this systematic, universal screening, including algorithmic methods that are now coming online and need to be vetted for effectiveness by comparing them to visual screening,&#8221; wrote Mike Rossner, former editor of the <em>Journal of Cell Biology<\/em>, in the comments section of the draft manual. Rossner is president of Image Data Integrity, an American consultancy that advises research institutions, funding agencies, and scientific journals on image manipulation in biomedical science studies (<a href=\"https:\/\/revistapesquisa.fapesp.br\/en\/outsourced-quality-control\/\" target=\"_blank\" rel=\"noopener\"><em>see<\/em> Pesquisa FAPESP <em>issue n\u00ba 287<\/em><\/a>). 
&#8220;The editors&#8217; working group might consider creating recommendations for this screening process.&#8221;<\/p>\n","protected":false},"excerpt":{"rendered":"Scientific publishers write a manual on identifying articles containing doctored or duplicated images and how to combat this type of misconduct","protected":false},"author":11,"featured_media":420633,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_exactmetrics_skip_tracking":false,"_exactmetrics_sitenote_active":false,"_exactmetrics_sitenote_note":"","_exactmetrics_sitenote_category":0,"footnotes":""},"categories":[155],"tags":[230],"coauthors":[98],"class_list":["post-421222","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-good-practices","tag-ethics"],"acf":[],"_links":{"self":[{"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/posts\/421222","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/users\/11"}],"replies":[{"embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/comments?post=421222"}],"version-history":[{"count":3,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/posts\/421222\/revisions"}],"predecessor-version":[{"id":421224,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/posts\/421222\/revisions\/421224"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/media\/420633"}],"wp:attachment":[{"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/media?parent=421222"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-j
son\/wp\/v2\/categories?post=421222"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/tags?post=421222"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/revistapesquisa.fapesp.br\/en\/wp-json\/wp\/v2\/coauthors?post=421222"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}