One of the most common questions Cabells is asked about its Predatory Reports database of journals is whether it has ever “changed its mind” about listing a journal. As Simon Linacre reports, it is less a question of changing the outcome of a decision, but more of a leopard changing its spots.

This week saw the annual release of Journal Impact Factors from Clarivate Analytics, and along with it the rather less august list of journals whose Impact Factors have been suppressed in Web of Science. This year 33 journals were suppressed, all for “anomalous citation patterns found in the 2019 citation data” pertaining to high levels of self-citation. Such a result is a publisher’s worst nightmare, as while the anomalies can be due to gamed citation levels, they can also sometimes reflect the niche nature of a subject area or other quirks of a journal.

Sometimes the decision can be changed, although it is often a year or two before the data can prove a journal has changed its ways. Similarly, Cabells offers a review process for every journal it lists in its Predatory Reports database, and when I arrived at the company in 2018, one of the first things I asked, like many people, was: has Cabells ever had a successful review to delist a journal?

Open for debate

The answer is yes, but the details of those cases are quite instructive as to why journals are included in the database in the first place, and perhaps more importantly why they are not. Firstly, however, some context. It is three years since the Predatory Reports database was first launched, and in that time almost 13,500 journals have been included. Each journal has a link next to the violations on its report through which anyone associated with that journal can view the policy and appeal the decision.

This policy clearly states:

The Cabells Review Board will consider Predatory Journal appeals with a frequency of one appeal request per year, per journal. Publications in Predatory Reports, those with unacceptable practices, are encouraged to amend their procedures to comply with accepted industry standards.

Since 2017, there have been just 20 appeals against decisions to list journals in Predatory Reports (0.15% of all listed journals), and only three have been successful (0.02%). In the first case (Journal A), the journal’s peer review processes were checked and it was determined that some peer reviews were being completed, albeit very lightly. In addition, Cabells’ investigators found a previous example of dual publication. However, following the listing, the journal dealt with the problems and retracted the article in question, as it appeared the author had submitted two identical articles simultaneously. This in turn led Cabells to revise its evaluations so that this particular violation no longer penalizes journals for conduct where an author was to blame.

In the second review (Journal B), Cabells evaluated the journal’s peer review process and found that it, too, was not completing full peer reviews, and that it had a number of other issues: it displayed metrics in a misleading way, lacked editorial policies on its website and did not have a process for plagiarism screening. After its listing in Predatory Reports, the journal’s publisher fixed the misleading elements on its website and demonstrated improvements to its editorial processes. In this second case, it was clear that the journal’s practices had been misleading and deceptive, but it chose to change and improve them.

Finally, a third journal (Journal C) has just had a successful appeal completed. In this case, there were several problems that the journal was able to correct by being more transparent on its website: it added missing policies, cleared up confusion about existing ones, and made information about its author fees available. Cabells was also able to evaluate its peer review process after the journal submitted peer review notes on a few articles, and it was evident that the journal editor was managing good-quality peer review. It has therefore now been removed from the Predatory Reports database (it should be noted that, as with the other two successful appeals, journals removed from Predatory Reports are not then automatically included in the Cabells Journalytics database).

Learning curve

Cabells’ takeaway from these reviews was that they were successful in two senses: they showed that the original identification was correct, and they prompted improvements that marked the journals out as better, and certainly non-predatory, publications. They also fed into the continuing improvement Cabells seeks in refining its Predatory Reports criteria, with a further update due to be published later this summer.

There are also things to learn from unsuccessful reviews. In one case, a publisher appealed the listing of a number of its journals in Predatory Reports. However, the appeal only highlighted how bad the journals actually were. Indeed, an in-depth review of each journal not only uncovered new violations that were subsequently added to the journals’ reports, but also led to the addition of a brand new violation that will be included in the upcoming revision of the Predatory Reports criteria.
