An online tool has been winning over users with its ability to help researchers identify reputable scientific journals in which to publish their work. The Think.Check.Submit. website provides a list of ways to prevent unsuspecting authors from falling for the false advertising of predatory journals—low-reputation publications that publish articles in exchange for a fee without conducting a genuine peer review.
The website’s name is based on the key steps a scientist should take when choosing a journal. The first step (think) is rhetorical. Its purpose is to draw attention to the danger of predatory titles and the importance of identifying reliable journals and publishers. The second (check) is to use the checklist itself. Researchers are given a series of questions to ask about any journal they are considering. Some relate to its reputation and characteristics: Do you know the journal? Have you read any articles published in it before? Is the journal name easily confused with that of another?
Another set of questions examines a journal’s accessibility: Is the publisher name clearly displayed on the journal website? Can you contact the publisher by telephone, email, and post? The checklist also focuses on the peer-review process, which should seek to identify errors and reject articles that contain inconsistencies: Does the journal’s website mention whether the process involves independent or external reviewers? How many reviewers assess each paper? One question on the list deserves special mention because it relates to a practice that is highly characteristic of predatory journals—the absence of authentic peer review: Does the journal guarantee acceptance or a very short peer-review time? There are also questions about whether a copy of the article may be shared on open-access repositories and whether there are guidelines on potential conflicts of interest for authors, editors, and reviewers.
Finally, authors are encouraged to verify whether the journal is affiliated with any recognized initiatives that ensure good publishing practices, such as the UK-based Committee on Publication Ethics (COPE), which is dedicated to the formulation of standards on scientific integrity, or the Directory of Open Access Journals (DOAJ), signatories to which agree to meet minimum quality requirements. “A ‘no’ to this question is another red flag because the industry initiatives listed carry out stringent checks on a publisher’s integrity before accepting them as a member,” explained Lorraine Estelle, head of communications for Think.Check.Submit., in an article published on the COPE website.
The last step (submit) is short and simple: researchers are advised to submit their work only if they are comfortable with the answers to most or all of the questions on the checklist.
The service was founded in 2015 by nine institutions, including COPE, publishers such as Springer Nature and BioMed Central, and the Association of European Research Libraries. Today, it is also supported by several organizations that promote open-access publications. Gradually, it has expanded its reach. The checklist is currently offered in more than 40 languages and has recently been updated, with a new version tailored to authors seeking to publish books rather than articles. It has also been incorporated into scientific integrity training programs, such as those at Texas Tech University (TTU) in the USA.
According to Estelle, the checklist was designed to help authors make informed choices. “Our website provides a unique resource for researchers. Preparing to publish a research output can be daunting, especially if it is for the first time,” she says. She explains that the main way of identifying dishonest journals—by consulting lists posted online—can be problematic. “Some of these lists follow subjective criteria and include publishers who do not intentionally deceive anyone, but lack the resources needed to implement the best editorial or technical standards.”
Furthermore, the rate at which new predatory journals appear means these lists are always out of date. A 2014 study by scientists from the Hanken School of Economics in Finland estimated that around 8,000 predatory journals existed at the time. Many disappear as soon as they are identified, while others appear in emerging research topics. In 2021, a database managed by American analytics company Cabell International listed 15,059 predatory journals.
In an article published on the Think.Check.Submit. website, Spanish biochemist Iratxe Puebla highlighted a new front that predatory journals may be using to dupe authors: preprint servers. Preprints—papers describing preliminary results that have not yet undergone peer review—are often assessed by editors and later published in journals. But some authors have been approached by suspicious publications offering to publish their results. “Predatory journals are brazen in their solicitation practices,” says Puebla, associate director of ASAPbio, an organization created in 2015 to promote the use of preprints in the life sciences. According to her, dishonest journals usually approach authors based on previously published articles, but the growing use of preprints makes their task even easier.
The checklist is also valid for preprint authors, but Puebla highlights some other suspicious practices to be aware of. If an email is not addressed to the researcher by name or has no editor’s signature, it is probably a template used to obtain information rather than a real invitation. If the email does not include specific comments about the work described in the preprint, it is likely that the sender has not even read the paper in question.