What to know about ISSNs

There are many ways to skin a cat, and many ways to infer a journal could be predatory. In his latest blog post, Simon Linacre looks at the role the International Standard Serial Number, or ISSN, can play in the production of predatory journals. 

For many reasons, the year 2020 will be remembered for the sheer volume of numbers that have invaded our consciousness. Some of these are big numbers – 80 million votes for Joe Biden, four million cases of COVID in the US in 2020 – and some of these will be small, such as the number of countries (1) leaving the EU at the end of the year. Wherever we look, we see numbers of varying degrees of import at seemingly every turn.

But while numbers were once regarded as gospel, data has joined news and UFO sightings (seemingly one of the few phenomena NOT to increase in 2020) as something to be treated with suspicion or assumed to be faked in some way. And one piece of data trusted by many authors in determining the validity or otherwise of a journal is the International Standard Serial Number, or ISSN.

An ISSN can be obtained relatively easily via either a national or international office as long as a journal can be identified as an existing publication. As the ISSN’s own website states, an ISSN is “a digital code without any intrinsic meaning” and does not include any information about the contents of that publication. Perhaps most importantly, an ISSN “does not guarantee the quality or the validity of the contents”. This goes some way to explaining why predatory journals can often display an ISSN on their websites. Indeed, more than 40% of the journals in Cabells’ Predatory Reports database list an ISSN in their journal information.

But sometimes predatory publishers can’t obtain an ISSN – or at least can’t be bothered to – and will fake the ISSN code. Of the 6,000 or so journals with an ISSN in Predatory Reports, 288 or nearly 5% have a fake ISSN, and this is included as one of the database’s behavioural indicators to help identify predatory activity. It is instructive to look at these fake ISSNs to see the lengths predatory publishers will go to in order to achieve some semblance of credibility in their site presence.

For some journals, it is obvious that the ISSN is fake because it looks wrong. In the example above for the Journal of Advanced Statistics and Probability, the familiar format of two groups of four digits separated by a hyphen is missing, replaced by nine digits and a forward slash, which is incorrect.
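The well-formed shape described above can be checked mechanically. Here is a minimal Python sketch using the standard ISSN check-digit rule (a weighted sum modulo 11, with ‘X’ standing in for a check value of 10). Note the important caveat from above: a passing check only means the code is well-formed; it says nothing about whether the ISSN is actually registered, so a lookup in the ISSN portal is still needed.

```python
import re

def issn_check_digit(first7: str) -> str:
    # Weighted sum of the first seven digits, weights running 8 down to 2
    total = sum(int(d) * w for d, w in zip(first7, range(8, 1, -1)))
    remainder = total % 11
    if remainder == 0:
        return "0"
    check = 11 - remainder
    return "X" if check == 10 else str(check)

def is_plausible_issn(issn: str) -> bool:
    # Correct shape: four digits, hyphen, three digits, then a digit or X
    if not re.fullmatch(r"\d{4}-\d{3}[\dX]", issn):
        return False
    digits = issn.replace("-", "")
    return issn_check_digit(digits[:7]) == digits[7]

print(is_plausible_issn("0378-5955"))  # True: valid shape and check digit
print(is_plausible_issn("1234-5678"))  # False: check digit does not match
```

A code like the nine-digits-and-a-slash example above fails at the very first step, the format test, before the check digit is even computed.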

For other journals, such as the Global Journal of Nuclear Medicine and Biology below, the format is correct, but a search using the ISSN portal brings up no results, so the ISSN code is simply made up.

More worrying are the few publications that have hijacked existing, legitimate journals and appropriated their identity, including the ISSN. In the example below, the Wulfenia Journal has had its identity hijacked, with the fake journal website pictured below.

If you compare it to the genuine journal shown below (the German homepage can be found here), you can see they list the same ISSN.

One can only imagine the chaos caused for a legitimate journal when its identity is hijacked, and this is just part of wider concerns about the effect the sharing of fake information has on society. As always, arming yourself with the right information – and taking a critical approach to any information directed your way – will help see you through the morass of misinformation we seem to be bombarded with in the online world.

Guest Post: A look at citation activity of predatory marketing journals

This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.

Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit/receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal to which one would like to submit a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals – and the increasing “Publish or Perish” pressure – makes this decision even more difficult.

The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring the market of marketing journals for the last ten years, I have noticed that a new type of journal has tapped into it: open access (OA) journals.

The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing I remember most from that visit is that IBIMA’s website looked odd to me. Then, a few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating the case of that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.

While undertaking this research, I discovered terms such as “Predatory publishers”, “Beall’s List”, and “Think, Check, Submit” for the first time. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).

Beall’s list was no longer available (shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly Science, Technology, and Medicine) to be useful. So, I searched for journals with titles identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and then visited each of their websites to collect information about both the publisher and the journal: whether the journal is OA or not, its Article Processing Charges, whether it had an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, etc. I even emailed an eminent marketing scholar whose name I was stunned to see included on the editorial board of a suspicious journal.

With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).

Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was wrong or right in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These three lists, however, provided no reasons for the inclusion of any particular publisher or journal.

To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.

To be brief, that year of endeavor resulted in a paper I submitted to the prestigious academic journal Scientometrics, published by Springer Nature; my paper was accepted and published online in late October 2020. In that paper, I reported the findings of a study that examined the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”; that is, one with no archives). The results indicated that some of these journals received quite a few citations, with a median of 490 citations per journal and one journal receiving 6,296 citations (see also Case Study below).

I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory. 

A few months earlier, having no access to Cabells’ databases, I had read each of the posts on their blog trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (about 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.

Case Study

The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.

Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-used strategy by predatory publishers to deceive authors who do not make themselves familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of potential journals in this subject discipline.

As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not individually conclusive of predatory behavior, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), fake claims of indexation in databases (e.g., DOAJ), no editor contact details, and/or a fake editor identity.

What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below for ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the most citations received by a single article are significant, and represent what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proof checks, their content is unreliable and skews citation data for reputable research and journals.

  • Predatory journal: Journal I (BJMS)
  • Total number of citations received: 1,331
  • Number of citations received by the most cited article: 99
  • The most cited article was published in: 2014
  • Number of citations received from SSCI-indexed journals: 3
  • Number of citations received from FT50 listed journals: 0
Predatory Reports entry for BJMS

It is a familiar refrain from The Source, but it bears repeating – as an author you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher for publication and not just what you want to publish will save a huge amount of pain in the future, both for avoiding the bad journals and choosing the good ones.

Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination'” now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

The A-Z’s of predatory publishing

Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one Tweet a week going through the entire alphabet. In this week’s blog, Simon Linacre republishes all 26 tweets in one place as a primer on how to successfully deal with the phenomenon

A = American. US probable source of #PredatoryJournals activity as it lends credence to claims of legitimacy a journal may adopt to hoodwink authors into submitting articles #PredatoryAlphabet #PredatoryJournal #PublicationEthics

B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals

C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.

D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal

E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature

F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior

G = Germany, which takes #PredatoryJournals seriously through university-level checks, highlighting the issue and exposing problems in a 2018 investigation

H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity

I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)

J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivate Master Journal List

K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases

L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database

M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases

N = Nature. Using a trusted scholarly brand such as @nature can help identify, understand and define #PredatoryJournals, with dozens of articles on the subject via @NatureNews

O = OMICS. In 2019, the FTC fined publisher OMICS over $50m for deceptive publishing practices

P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals

Q = Quick publication. Peer review is a long process, typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks

R = Research. Academic #Research should also include research on #Journals to ensure #PublicationEthics and #ResearchIntegrity are upheld. Use @CabellsPublish Predatory Reports to support your research

S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database

T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices

U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (i.e., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)

X = Check digit. An ISSN’s final character is a check digit, which can be the letter ‘X’ (standing in for 10) and appears at most once, at the very end. Be wary of fake ISSNs containing multiple X’s

Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article

Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article

Empowering India’s Academia

According to some research, India has the unfortunate distinction of having both the highest number of predatory journals based there, and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted article by Shen and Björk in 2015, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries of origin for predatory journals.

There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to try to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out as some journals have been cloned or hijacked, while others use the predatory journal tactic of simply lying about their listing by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authorship. Following one webinar last week for an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than was allowed, so I thought it would be worth sharing some of those questions and my answers here so others can hopefully pick up a few tips when making that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? Well, they both look and feel the same, but the Impact Factor (IF) divides the citations a journal receives in a given year (from other Web of Science-indexed journals) by the number of articles it published over the previous two years, whereas the CiteScore performs the same calculation over a three-year window of published documents.
  2. How do I know if an Impact Factor is fake? Until recently, this was tricky, but now Clarivate Analytics has made its IFs for the journals it indexes for the previous year available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A submission can be made to the publisher for an article to be retracted; however, a predatory publisher is very unlikely to accede to the request and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago which provides country-specific data on journals.
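Returning to the first question above, the difference between the two metrics is easiest to see with a worked example. The sketch below uses invented counts for an imaginary journal (real IFs and CiteScores are computed by Clarivate and Scopus respectively, from citations within their own indexes):

```python
# Hypothetical counts for an imaginary journal, 2020 metrics year.
# Impact Factor: citations in 2020 to items published 2018-2019,
# divided by the number of articles published in that 2-year window.
citations_to_2018_2019_articles = 300
articles_published_2018_2019 = 120

# CiteScore (as described above): the same ratio over a 3-year window
# of published documents.
citations_to_2017_2019_documents = 450
documents_published_2017_2019 = 200

impact_factor = citations_to_2018_2019_articles / articles_published_2018_2019
cite_score = citations_to_2017_2019_documents / documents_published_2017_2019

print(f"Impact Factor: {impact_factor:.2f}")  # 2.50
print(f"CiteScore:     {cite_score:.2f}")     # 2.25
```

The same journal can therefore carry two quite different-looking scores simply because the two metrics count different windows of documents and citations, which is one reason a fake ‘Impact Factor’ on a predatory journal’s website should always be verified against the Master Journal List.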

Simon Linacre, Cabells

A case study of how bad science spreads

Fake news has been the go-to criticism of the media for some politicians, which in turn has been rejected as propaganda and fear-mongering by journalists. However, as former journalist Simon Linacre argues, the fourth estate needs to have its own house in order first, and ensure it is not tripped up by predatory journals.


I class myself as a ‘runner’, but only in the very loosest sense. Inspired by my wife taking up running a few years ago, I decided I should exercise consistently instead of the numerous half-hearted, unsuccessful attempts I had made over the years. Three years later I have done a couple of half-marathons, run every other day, and track my performance obsessively on Strava. I have also recently started to read articles on running online, and have subscribed to the magazine Runners World. So yes, I think I may actually be a runner now.

But I’m also an experienced journalist, a huge cynic, and have a bulls*** radar the size of the Grand Canyon, so even while relaxing with my magazine I like to think I can spot fakery a mile off. And so it proved earlier this summer while reading a piece on how hill running can improve your fitness. This was music to my ears as someone who lives half-way up a valley side, but my interest was then piqued when I saw a reference to the study that formed the basis for the piece, which was to an article in the International Journal of Scientific Research. Immediately, I smelt a rat. “There is no way that is the name of a reputable, peer-reviewed journal,” I thought. And I was right.

But that wasn’t even half of the problem.

After checking Cabells’ Predatory Reports database, I found not one but TWO journals listed with that name, both with long lists of breaches of Cabells’ criteria for identifying predatory journals. I was still curious as to the nature of the research, as it could have been legitimate research in an illegitimate journal, or just bad research, full stop. As it turned out, neither journal had ever published any research on hill running and the benefits of increasing VO2 max. So where was the story from?

After some more digging, an article matching the details in the Runners World piece could be found in a third similarly-named journal, the International Journal of Scientific and Research Publications. The article, far from the recent breakthrough suggested in the August 2020 issue of Runners World, was actually published in August 2017 by two authors from Addis Ababa University in Ethiopia. While the science of the article seems OK, the experiment that produced the results was on just 32 people over 12 weeks, which means it really needs further validation with larger samples to confirm its findings. Furthermore, while the journal itself was not included in Cabells’ Predatory Reports database, a review found significant failings, including unusually quick peer review processes and, more seriously, that the “owner/Editor of the journal or publisher falsely claims academic positions or qualifications”. The journal has subsequently been added to Predatory Reports, and the article itself has never been cited in the three years since publication.

Yet one question remains: how did a relatively obscure article, published in a predatory journal and that has never been cited, find its way into a news story in a leading consumer magazine? Interestingly, similar research was also quoted on MSN.com in May 2020 which also quoted the International Journal of Scientific Research, while other sites have also quoted the same research but from the International Journal of Scientific and Research Publications. It appears likely that, having been quoted online once, the same story has been doing the rounds for three years like a game of ‘Telephone,’ all based on uncited research that may not have been peer reviewed in the first place, that used a small sample size and was published in a predatory journal.

While no damage has been done here – underlying all this, it does make sense that hill running can aid overall performance – one need only think about the string of recent health news stories around the coronavirus to see how one unverified article could sweep like wildfire through news outlets and online. This is the danger that predatory journals pose.

They’re not doctors, but they play them on TV

Recently, while conducting investigations of suspected predatory journals, our team came across a lively candidate. At first, as is often the case, the journal in question seemed to look the part of a legitimate publication. However, after taking a closer look and reading through one of the journal’s articles (“Structural and functional brain differences in key opinion journal leaders“) it became clear that all was not as it seemed.

Neurology and Neurological Sciences: Open Access, from MedDocs Publishers, avoids a few of the more obvious red flags that indicate deceitful practices, even to neophyte researchers, but lurking just below the surface are several clear behavioral indicators common to predatory publications.


With a submission date of August 22, 2018, and a publication date of November 13, 2018, the timeline suggests that some sort of peer review of this article may have been carried out. A closer examination of the content makes it evident that little to no peer review actually took place. The first tip-off was the double-take inducing line in the “Material and methods” section, “To avoid gender bias, we recruited only males.” Wait, what? That’s not how that works.

It soon became clear to our team that even a rudimentary peer review process (or perhaps two minutes on Google) would have led to this article’s immediate rejection. While predatory journals are no laughing matter, especially when it comes to medical research in the time of a worldwide pandemic, it is hard not to get a chuckle from some of the “easter eggs” found within articles intended to expose predatory journals. Some of our favorites from this article:

  • Frasier Crane, a listed author, is the name of the psychiatrist from the popular sitcoms Cheers and Frasier
  • Another author, Alfred Bellow, is the name of the NASA psychiatrist from the TV show I Dream of Jeannie
  • Marvin Monroe is the counselor from The Simpsons
  • Katrina Cornwell is a therapist turned Starfleet officer on Star Trek: Discovery
  • Faber University is the name of the school in Animal House (Faber College in the film)
  • Orbison University, which also doesn’t exist, is likely a tribute to the late, great musician Roy Orbison

And, perhaps our favorite find and one we almost missed:

  • In the “Acknowledgments” section the authors thank “Prof Joseph Davola for his advice and assistance.” This is quite likely an homage to the Seinfeld character “Crazy Joe Davola.”

Though our team had a few laughs with this investigation, they were not long-lived, as this is yet another illustration of the thousands of journals like this one in operation (Predatory Reports currently lists well over 13,000 titles): outlets that publish almost (or literally) anything, usually for a fee, with no peer review or other oversight in place and with no consideration of the detrimental effect they may have on science and research.

Predatory Reports listing for Neurology and Neurological Sciences: Open Access

A more nuanced issue that deceptive publications create involves citations. If this were legitimate research, the included citations would not ‘count’ or be picked up anywhere, since this journal is not indexed in any citation databases. Furthermore, any citation in a predatory journal that cites a legitimate journal is ‘wasted’, as the legitimate journal cannot count or use that citation appropriately as a foundation for its legitimacy. However, these citations could be counted via Google Scholar, although (thankfully) this journal has zero. Citation ‘leakage’ can also occur, where a legitimate journal’s articles cite predatory journals, effectively ‘leaking’ those citations out of the legitimate scholarly publishing sphere into illegitimate areas. These practices can have the effect of skewing citation metrics, which are measures often relied upon (sometimes exclusively, often too heavily) to gauge the legitimacy and impact of academic journals.

When all is said and done, as this “study” concludes, “the importance of carefully selecting journals when considering the submission of manuscripts,” cannot be overstated. While there is some debate around the use of “sting” articles such as this one to expose predatory publications, not having them exposed at all is far more dangerous.

Cabells’ top 7 palpable points about predatory publishing practices

In his latest post, Simon Linacre looks at some new stats collated from the Cabells Predatory Reports database that should help inform and educate researchers, better equipping them to evade the clutches of predatory journals.


In recent weeks Cabells has been delighted to work with both The Economist and Nature Index to highlight some of the major issues for scholarly communication that predatory publishing practices represent. As part of the research for both pieces, a number of facts have been uncovered that not only help us understand the issues inherent in this malpractice much better, but should also point researchers away from some of the sadly typical behaviors we have come to expect.

So, for your perusing pleasure, here are Cabells’ Top 7 Palpable Points about Predatory Publishing Practices:

  1. There are now 13,500 predatory journals listed in the Predatory Reports database, which is currently growing by approximately 2,000 journals a year
  2. Over 4,300 journals claim to publish articles in the medical field (this includes multidisciplinary journals) – that’s a third of the journals in Predatory Reports. By discipline, medical and biological sciences have many more predatory journals than other disciplines
  3. Almost 700 journals in Predatory Reports start with ‘British’ (5.2%), while just 50 do on the Journalytics database (0.4%). Predatory journals often call themselves American, British or European to appear well established and legitimate, when in reality relatively few good quality journals have countries or regions in their titles
  4. There are over 5,300 journals listed in Predatory Reports with an ISSN (40%), although many of these are copied, faked, or simply made up. Having an ISSN is not a guarantee of legitimacy for journals
  5. Around 41% of Predatory Reports journals are based in the US, purport to be from the US, or are suspected of being from the US, based on information on journal websites and Cabells’ investigations. This is the highest count for any country, but only a fraction will really have their base in North America
  6. The average predatory journal publishes about 50 articles a year according to recent research from Bo-Christer Björk of the Hanken School of Economics in Helsinki, less than half the output of a legitimate title. Furthermore, around 60% of papers in such journals receive no future citations, compared with 10% of those in reliable ones
  7. Finally, it is worth noting that while we are in the throes of the Coronavirus pandemic, there are 41 journals listed in Predatory Reports (0.3%) specifically focused on epidemiology and another 35 on virology (0.6% in total). There could be further growth over the next 12 months, so researchers in these areas should be particularly careful about where they submit their papers.
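On point 4 above, one quick sanity check available to anyone is the ISSN’s own check digit: the eighth character is computed from the first seven, so a made-up code that fails the checksum is definitely fake (though passing it proves nothing, since real ISSNs can be copied). A minimal sketch in Python, with function names of our own invention:

```python
def issn_check_digit(first7: str) -> str:
    """Compute the ISSN check digit: weight the first seven digits
    8 down to 2, sum, and take the complement modulo 11 (10 -> 'X')."""
    total = sum(int(d) * w for d, w in zip(first7, range(8, 1, -1)))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

def is_valid_issn(issn: str) -> bool:
    """Validate an ISSN such as '0317-8471' (format plus check digit)."""
    code = issn.replace("-", "").upper()
    if len(code) != 8 or not code[:7].isdigit() or code[7] not in "0123456789X":
        return False
    return issn_check_digit(code[:7]) == code[7]
```

For example, `is_valid_issn("0317-8471")` returns `True` (a checksum-valid code), while `is_valid_issn("1234-5678")` returns `False`.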

Reversal of fortune

One of the most common questions Cabells is asked about its Predatory Reports database of journals is whether it has ever “changed its mind” about listing a journal. As Simon Linacre reports, it is less a question of changing the outcome of a decision, but more of a leopard changing its spots.


This week saw the annual release of Journal Impact Factors from Clarivate Analytics, and along with it the rather less august list of journals whose Impact Factors have been suppressed in Web of Science. This year 33 journals were suppressed, all for “anomalous citation patterns found in the 2019 citation data” pertaining to high levels of self-citation. Such a result is a publisher’s worst nightmare, as while suppression can be due to the gaming of citation levels, it can also reflect the niche nature of a subject area, or other anomalies about a journal.
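As a toy illustration of why heavy self-citation draws this scrutiny (all numbers below are invented for the example), recall that a Journal Impact Factor is simply citations received in the JIF year to items published in the previous two years, divided by the citable items from those years. A block of self-citations can multiply the figure:

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Citations in the JIF year to the prior two years' items,
    divided by citable items published in those two years."""
    return citations / citable_items

citable_items = 100        # items published 2017-2018 (invented)
external_citations = 150   # 2019 citations from other journals (invented)
self_citations = 250       # 2019 citations from the journal to itself (invented)

with_self = impact_factor(external_citations + self_citations, citable_items)
without_self = impact_factor(external_citations, citable_items)
self_citation_share = self_citations / (external_citations + self_citations)

print(with_self)            # 4.0
print(without_self)         # 1.5
print(self_citation_share)  # 0.625
```

In this hypothetical, self-citation lifts the journal from 1.5 to 4.0, and a self-citation share of 62.5% is exactly the kind of anomalous pattern the screening flags.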

Sometimes the decision can be changed, although it is often a year or two before the data can prove a journal has changed its ways. Similarly, Cabells offers a review process for every journal it lists in its Predatory Reports database, and when I arrived at the company in 2018, one of the first things I asked was the question many others ask: has Cabells ever had a successful review that delisted a journal?

Open for debate

The answer is yes, but the details of those cases are quite instructive as to why journals are included in the database in the first place and, perhaps more importantly, why they are not. Firstly, however, some context. It is three years since the Predatory Reports database was first launched, and in that time almost 13,500 journals have been included. Each journal has a link next to the violations on its report for anyone associated with that journal to view the policy and appeal the decision.

This policy clearly states:

The Cabells Review Board will consider Predatory Journal appeals with a frequency of one appeal request per year, per journal. Publications in Predatory Reports, those with unacceptable practices, are encouraged to amend their procedures to comply with accepted industry standards.

Since 2017, there have been just 20 appeals against decisions to list journals in Predatory Reports (0.15% of all listed journals), and only three have been successful (0.02%). In the first case (Journal A), the journal’s peer review processes were checked and it was determined that some peer reviews were being completed, albeit very light ones. In addition, Cabells’ investigators found a previous example of dual publication. However, following the listing, the journal dealt with the problems and retracted the article in question, as it seemed the author had submitted two identical articles simultaneously. This in turn led Cabells to revise its evaluations so that this particular violation does not penalize journals for something for which an author was to blame.

In the second review (Journal B), Cabells evaluated the journal’s peer review process and found that it, too, was not completing full peer reviews, and that it had a number of other issues: it displayed metrics in a misleading way, lacked editorial policies on its website and did not have a process for plagiarism screening. After its listing in Predatory Reports, the journal’s publisher fixed the misleading elements on its website and demonstrated improvements to its editorial processes. In this second case, it was clear that the journal’s practices were misleading and deceptive, but its publisher chose to change and improve them.

Finally, a third journal (Journal C) has just had a successful appeal completed. In this case, there were several problems that the journal was able to correct by being more transparent on its website: it added missing policies, cleared up confusion about existing ones, and made information about its author fees available. Cabells was also able to evaluate its peer review process after the journal submitted peer review notes on a few articles, and it was evident the journal’s editor was managing a good-quality peer review, hence it has now been removed from the Predatory Reports database. (It should be noted that, as with the other two successful appeals, journals removed from Predatory Reports are not then automatically included in the Cabells Journalytics database.)

Learning curve

Cabells’ takeaway from all of these successful reviews was that they were indeed successful on both sides: they showed that the original identification was correct, and they prompted improvements that established the journals as better, and certainly non-predatory, publications. They also fed into the continuing improvement Cabells seeks in refining its Predatory Reports criteria, with a further update due to be published later this summer.

There are also things to learn from unsuccessful reviews. In one case a publisher appealed the listing of a number of its journals in Predatory Reports. However, the appeal only highlighted how bad the journals actually were. Indeed, an in-depth review of each journal not only uncovered new violations that were subsequently added to the journals’ reports, but also led to the addition of a brand-new violation that will be included in the upcoming revision of the Predatory Reports criteria.

Announcement regarding brand-wide language changes, effective immediately

Since late last year, Cabells has been working on developing new branding for our products that better embody our ideals of integrity and equality in academic publishing and society as a whole. We set out to ensure that the changes represent a total departure from the symbolism inextricably tied to the idea of blacklists and whitelists. In support of, and in solidarity with, the fight against systemic racism that our country is facing, Cabells is implementing brand-wide language changes, effective immediately. The changes implemented today represent only a fraction of those that we will be launching in 2020, but it is an important start.

Users may experience temporary outages as the changes roll out, but normal operations should resume quickly. Customer access will function identically as before the changes, but look for the term “Journalytics” in place of “whitelist” and “Predatory Reports” in place of “blacklist.”

Please contact Mike Bisaccio at michael.bisaccio@cabells.com or (409) 767-8506 with any questions or for additional information.

Cabells thanks the entire community for their support of this effort.

Sincerely,
The Cabells Team