Cabells adds journals to its Predatory Reports database continuously, with over 10,000 added to the original 4,000 it launched with in 2017. But can we learn anything from the journals that have been added recently? To find out, Simon Linacre takes a look at the predatory journals listed throughout June 2021.
Fancy reading up on some research to learn about COVID-19? A lot of people will have been doing the same thing over the last 18 months as they try to figure out for themselves what on earth has been happening. They will do some Google searches and read articles published in journals like the British Medical Journal, New England Journal of Medicine and the Open Journal of Epidemiology. Sure, the third one doesn’t sound quite as prestigious as the other two, but it has a bunch of articles on epidemics, and it’s all free to read – so that’s good, right?
Sadly, it’s not so good. The Open Journal of Epidemiology is one of 99 journals that Cabells listed in its Predatory Reports database last month, and is published by SCIRP (Scientific Research Publishing), a well-known predatory publisher based in China. The journal – not to be confused with the British Open Journal of Epidemiology or the American Open Epidemiology Journal, both also in Predatory Reports – engages in dubious publication practices such as falsely claiming indexation in well-known databases, promising unusually quick peer review, and publishing authors several times in the same journal and/or issue.
The journal’s search function returns a handful of articles relating to ‘COVID’, including one on ex-patients and their recovery which, according to the website, has been downloaded 200 times and viewed nearly 600 times. But we know that this article is unlikely to have received a full peer review, if any at all, and the data on the website is difficult to trust – the Open Journal of Epidemiology was just one of 26 journals from the same publisher that Cabells listed last month.
In total, eight publishers had journals listed in June, the biggest being Bilingual Published Co., based in Singapore, with 30 journals. Other publishers had fewer journals listed and were based in several different countries – India, Hong Kong, Kenya and even Belgium – and it is worth pointing out that Cabells reviews each journal independently rather than the publisher as a whole.
What else can we glean from this selection of predatory titles? Just 11 out of 99 had no ISSN, further underlining the folly of using the existence of an ISSN as evidence of legitimacy. On average the journals were four-to-five years old, so reasonably well established, and predominantly based in STEM research areas. Of the 99 journals listed, just 13 were in non-STEM areas such as Education and Management. The most common subject was Medicine with no fewer than 38 journals represented. However, it is worth pointing out that many predatory publishers are either hopelessly generic, or will publish anything even if the article has nothing to do with the core topics of the journal.
Cabells is being kept rather busy by reviewing all these journals, but if you do spot a suspicious journal or receive those annoying spam emails, do let us know at firstname.lastname@example.org and we can perform a review so that others won’t be deceived or fall into the numerous traps being laid for them.
The latest meme to sweep Twitter in the last week has been a satirical look at typical journal articles. Simon Linacre introduces Cabells’ own take on the theme and reflects on the impact they can have on our shared conscience.
We all like memes, right? Those social media nuggets that we can all relate to and laugh at, a form of in-joke without having to be with a group of people, which under current circumstances has meant a kind of gold rush for this form of humor. Whether it is the boyfriend looking over his shoulder at another woman or the excerpt from the film Downfall with Hitler going berserk, the number of situations and news items that lend themselves to this form of parody is seemingly endless.
So, when the meme spotlight fell on our own corner of the scholarly publishing world, we couldn’t help but join in and adapt the scientific paper meme to predatory journals (see image). To be honest, it wasn’t too difficult to think of 12 journal titles that highlighted the problems predatory publishing causes, and a whole series of memes could easily be created to underscore the joke that is the predatory journal phenomenon.
It’s easy to spot the themes we chose to lampoon, but however familiar we become with the predatory journal tropes, new publications and journals are emerging all the time – as the total number of journals listed in Cabells’ Predatory Reports, which hit 14,500 this week, testifies. Among the issues we put under the spotlight in the graphic are the unethical and unaware authors publishing in predatory titles, how poor research or plagiarized content can easily be published, and some of the poor excuses offered by those who end up publishing in dodgy journals.
However, underneath the tomfoolery there is a serious point to be made. A recent op-ed in The Atlantic highlighted not just the shared joy and geekiness of the scientific paper meme, but also the existential dread it spotlights. As the article expertly points out, while academics recognize the hamster-in-a-wheel absurdity the meme represents, they cannot help but see themselves in the wheel, unable to stop running. Some will just shrug their shoulders and find the next piece of clickbait; for others, there is little consolation in the humor and plenty of angst to try to control in order to preserve their sanity.
When it comes to predatory journals, from a pure eyeballs perspective we can see that articles and social media posts about the often bizarre world of predatory publishing get the most traction – such as the fact that one predatory journal lists Yosemite Sam on its editorial board. And yet there is always a serious point behind these fun stories: predatory journals can make an unholy mess of scientific research, causing millions of funding dollars to be wasted and allowing junk or rank bad science to contaminate legitimate published research. This is the real punchline, and it rings pretty hollow sometimes.
Turkey has been making great strides in recent years as a force to be reckoned with on the international research stage. However, it seems to have encountered more problems than other countries with regard to predatory journals. Simon Linacre looks at the problems facing the country and highlights some resources available to help Turkish scholars.
A simple Google search of “predatory journals Turkey” provides quick insight into the concerns academic researchers there have regarding these deceptive publications. Numerous articles fill the first pages of results highlighting the particular issue Turkey seems to share with a few other countries such as India and Nigeria. Alongside, however, are anonymous websites offering unsupported claims about predatory publications. Validated information appears to be thin on the ground.
Luckily, the Turkish government understands there is a problem and in the Spring of 2019 it decided to take action. According to Professor Zafer Koçak in his article ‘Predatory Publishing and Turkey’, the Turkish Council of Higher Education decreed that “scientific papers published in predatory journals would not be taken into account in academic promotion and assignment. Thus, Turkey has taken the step of becoming one of the first countries to implement this in the world”.
According to its website, the Turkish Council of Higher Education believed the phenomenon was increasing, and was doing so internationally. A number of articles have been published recently that back this up – for example here and here – and there is the potential for Turkish authors to get caught up in this global swell due to their increasing publication output.
To support Turkish authors and institutions, Cabells has translated its information video on its Journalytics and Predatory Reports products, as well as translating this page, into Turkish. Hopefully, the availability of independently verified information on predatory journals and greater dialogue will improve the conditions for Turkey and its scholars to continue to grow their influence in global research.
This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.
Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit/receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal to which to submit a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals, and the increasing “Publish or Perish” pressure, makes this decision even more difficult.
The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring the market of marketing journals over the last ten years, I have noticed that a new type of journal has tapped into it: open access (OA) journals.
The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing I remember most from that visit is that the website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.
It was while undertaking this research that I first encountered terms such as “predatory publishers”, “Beall’s List”, and “Think, Check, Submit”. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).
Beall’s list was no longer available (it was shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly science, technology, and medicine) to be useful. So, I searched for journals with titles identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and then visited each journal’s website to collect information about both the publisher and the journal: whether the journal is OA or not, its article processing charges, whether it had an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, and so on. I even emailed an eminent marketing scholar whose name I was stunned to see on the editorial board of a suspicious journal.
With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).
Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was right in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These three lists, however, provided no reasons for the inclusion of any particular publisher or journal.
To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.
To be brief, that year of endeavor resulted in a paper submitted to the prestigious academic journal Scientometrics, published by Springer Nature; the paper was accepted and published online in late October 2020. In it, I reported the findings of a study that examined the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”, with no archives). The results indicated that some of these journals received quite a few citations, with a median of 490 citations and one journal receiving 6,296 citations (see also the case study below).
I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory.
A few months earlier, having no access to Cabells’ databases, I had read each of the posts on their blog trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (about 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.
The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.
Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-used strategy by predatory publishers to deceive authors who do not make themselves familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of potential journals in this subject discipline.
As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not singularly defining of predatory publishers, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), false claims of indexation in databases (e.g., DOAJ), no editor contact details, and/or a fake editor identity.
What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below on ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the most citations received by a single article are significant, and represent what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proof checks, their content is unreliable and skews citation data for reputable research and journals.
Predatory journal: Journal I (BJMS)
Total number of citations received: 1,331
Number of citations received by the most cited article: 99
The most cited article was published in: 2014
Number of citations received from SSCI-indexed journals: 3
Number of citations received from FT50 listed journals: 0
It is a familiar refrain from The Source, but it bears repeating: as an author, you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher on where to publish, and not just on what you want to publish, will save a huge amount of pain in the future, both in avoiding the bad journals and in choosing the good ones.
Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one Tweet a week going through the entire alphabet. In this week’s blog, Simon Linacre republishes the tweets in one place as a primer on how to deal successfully with the phenomenon.
B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals
C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.
D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal
E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature
F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior
H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity
I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)
J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivateMaster Journal List
K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases
L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database
M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases
P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals
Q = Quick publication. Peer review is a long process typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks
R = Research. Academic #Research should also include research on #Journals to ensure #PublicationEthics and #ResearchIntegrity are upheld. Use @CabellsPublish Predatory Reports to support your research
S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database
T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices
U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish
V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (i.e., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish
W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)
X = X marks the check digit. An ISSN’s final character may be an ‘X’, serving as its check digit, but no current ISSN ends in more than one. Be wary of fake ISSNs with multiple X’s
Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article
Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article
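The check digit mentioned under ‘X’ above can be verified programmatically. Below is a minimal sketch in Python of the ISO 3297 calculation: the first seven digits are weighted 8 down to 2, summed modulo 11, and a remainder of 10 is written as a single trailing ‘X’. The valid example, 0378-5955, is the worked example from the ISSN standard itself; the invalid strings are made up for illustration.

```python
# Sketch: validating an ISSN check digit (ISO 3297).
def issn_check_digit(first_seven: str) -> str:
    """Compute the check digit for the first seven ISSN digits."""
    total = sum(int(d) * w for d, w in zip(first_seven, range(8, 1, -1)))
    check = (11 - total % 11) % 11
    # A remainder of 10 is written as a single 'X'.
    return "X" if check == 10 else str(check)

def is_valid_issn(issn: str) -> bool:
    """Return True if an ISSN (with or without hyphen) self-validates."""
    chars = issn.replace("-", "").upper()
    if len(chars) != 8 or not chars[:7].isdigit():
        return False
    return issn_check_digit(chars[:7]) == chars[7]

print(is_valid_issn("0378-5955"))  # → True  (real, self-consistent ISSN)
print(is_valid_issn("0378-5956"))  # → False (one digit off)
print(is_valid_issn("1234-56XX"))  # → False (two X's cannot be genuine)
```

Because a remainder of 10 maps to exactly one trailing ‘X’, any ‘ISSN’ containing more than one X, or an X anywhere but the final position, cannot be genuine – which is the red flag the tweet describes.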
Fake news has been the go-to criticism of the media for some politicians, which in turn has been rejected as propaganda and fear-mongering by journalists. However, as former journalist Simon Linacre argues, the fourth estate needs to have its own house in order first, and ensure it is not tripped up by predatory journals.
I class myself as a ‘runner’, but only in the very loosest sense. Inspired by my wife taking up running a few years ago, I decided I should exercise consistently instead of the numerous half-hearted, unsuccessful attempts I had made over the years. Three years later I have done a couple of half-marathons, run every other day, and track my performance obsessively on Strava. I have also recently started to read articles on running online, and have subscribed to the magazine Runners World. So yes, I think I may actually be a runner now.
But I’m also an experienced journalist, a huge cynic, and have a bulls*** radar the size of the Grand Canyon, so even while relaxing with my magazine I like to think I can spot fakery a mile off. And so it proved earlier this summer while reading a piece on how hill running can improve your fitness. This was music to my ears as someone who lives half-way up a valley side, but my interest was then piqued when I saw a reference to the study that formed the basis for the piece, which was to an article in the International Journal of Scientific Research. Immediately, I smelt a rat. “There is no way that is the name of a reputable, peer-reviewed journal,” I thought. And I was right.
But that wasn’t even half of the problem.
After checking Cabells’ Predatory Reports database, I found not one but TWO journals listed with that name, both with long lists of breaches of the Cabells criteria used to identify predatory journals. I was still curious about the nature of the research, as it could have been legitimate research in an illegitimate journal, or just bad research, full stop. As it turned out, neither journal had ever published any research on hill running and the benefits of increasing VO2 max. So where was the story from?
After some more digging, an article matching the details in the Runners World piece could be found in a third similarly named journal, the International Journal of Scientific and Research Publications. The article, far from the recent breakthrough suggested in the August 2020 issue of Runners World, was actually published in August 2017 by two authors from Addis Ababa University in Ethiopia. While the science of the article seems OK, the experiment that produced the results involved just 32 people over 12 weeks, which means its findings really need further validation with larger samples. Furthermore, while the journal itself was not included in Cabells’ Predatory Reports database at the time, a review found significant failings, including unusually quick peer review processes and, more seriously, that the “owner/Editor of the journal or publisher falsely claims academic positions or qualifications”. The journal has subsequently been added to Predatory Reports, and the article itself has never been cited in the three years since publication.
Yet one question remains: how did a relatively obscure article, published in a predatory journal and that has never been cited, find its way into a news story in a leading consumer magazine? Interestingly, similar research was also quoted on MSN.com in May 2020 which also quoted the International Journal of Scientific Research, while other sites have also quoted the same research but from the International Journal of Scientific and Research Publications. It appears likely that, having been quoted online once, the same story has been doing the rounds for three years like a game of ‘Telephone,’ all based on uncited research that may not have been peer reviewed in the first place, that used a small sample size and was published in a predatory journal.
While no damage has been done here – underlying all this, it does make sense that hill running can aid overall performance – one need only think about the string of recent health news stories around the coronavirus to see how one unverified article could sweep like wildfire through news outlets and online. This is the danger that predatory journals pose.
Recently, while conducting investigations of suspected predatory journals, our team came across a lively candidate. At first, as is often the case, the journal in question seemed to look the part of a legitimate publication. However, after taking a closer look and reading through one of the journal’s articles (“Structural and functional brain differences in key opinion journal leaders“) it became clear that all was not as it seemed.
With a submission date of August 22, 2018, and a publication date of November 13, 2018, the timeline suggests that some sort of peer review of this article may have been carried out. A closer examination of the content makes it evident that little to no peer review actually took place. The first tip-off was the double-take-inducing line in the “Material and methods” section: “To avoid gender bias, we recruited only males.” Wait, what? That’s not how that works.
It soon became clear to our team that even a rudimentary peer review process (or perhaps two minutes on Google) would have led to this article’s immediate rejection. While predatory journals are no laughing matter, especially when it comes to medical research in the time of a worldwide pandemic, it is hard not to get a chuckle from some of the “easter eggs” found within articles intended to expose predatory journals. Some of our favorites from this article:
Frasier Crane, a listed author, is the name of the psychiatrist from the popular sitcoms Cheers and Frasier
Another author, Alfred Bellow, is the name of the NASA psychiatrist from the TV show I Dream of Jeannie
Marvin Monroe is the counselor from The Simpsons
Katrina Cornwell is a therapist turned Starfleet officer on Star Trek: Discovery
Faber University is the name of the school in Animal House (Faber College in the film)
Orbison University, which also doesn’t exist, is likely a tribute to the late, great musician Roy Orbison
And, perhaps our favorite find and one we almost missed:
In the “Acknowledgments” section the authors thank “Prof Joseph Davola for his advice and assistance.” This is quite likely an homage to the Seinfeld character “Crazy Joe Davola.”
Though our team had a few laughs with this investigation, they were not long-lived as this is yet another illustration of the thousands (Predatory Reports currently lists well over 13,000 titles) of journals such as this one in operation. Outlets that publish almost (or literally) anything, usually for a fee, with no peer review or other oversight in place and with no consideration of the detrimental effect it may have on science and research.
A more nuanced issue that deceptive publications create involves citations. Even if this were legitimate research, its citations would not ‘count’ or be picked up anywhere, since this journal is not indexed in any citation databases. Furthermore, any citation a predatory journal makes to a legitimate journal is ‘wasted’, as the legitimate journal cannot appropriately count or use that citation. However, these citations could be counted via Google Scholar, although (thankfully) this journal has zero. Citation ‘leakage’ can also occur, where a legitimate journal’s articles cite predatory journals, effectively ‘leaking’ those citations out of the legitimate scholarly publishing sphere into illegitimate areas. These practices can skew citation metrics, measures often relied upon (sometimes exclusively, often too heavily) to gauge the legitimacy and impact of academic journals.
When all is said and done, as this “study” concludes, “the importance of carefully selecting journals when considering the submission of manuscripts,” cannot be overstated. While there is some debate around the use of “sting” articles such as this one to expose predatory publications, not having them exposed at all is far more dangerous.
In his latest post, Simon Linacre looks at some new stats collated from the Cabells Predatory Reports database that should help inform and educate researchers, better equipping them to evade the clutches of predatory journals.
In recent weeks Cabells has been delighted to work with both The Economist and Nature Index to highlight some of the major issues for scholarly communication that predatory publishing practices represent. As part of the research for both pieces, a number of facts have been uncovered that not only help us understand the issues inherent in this malpractice much better, but should also point researchers away from some of the sadly typical behaviors we have come to expect.
So, for your perusing pleasure, here are Cabells’ Top 7 Palpable Points about Predatory Publishing Practices:
There are now 13,500 predatory journals listed in the Predatory Reports database, which is currently growing by approximately 2,000 journals a year
Over 4,300 journals claim to publish articles in the medical field (this includes multidisciplinary journals) – that’s a third of the journals in Predatory Reports. By discipline, medical and biological sciences have many more predatory journals than other disciplines
Almost 700 journals in Predatory Reports have titles starting with ‘British’ (5.2%), while just 50 in the Journalytics database do (0.4%). Predatory journals often call themselves American, British or European to appear well established and legitimate, when in reality relatively few good quality journals have countries or regions in their titles
There are over 5,300 journals listed in Predatory Reports with an ISSN (40%), although many of these are copied, faked, or simply made up. Having an ISSN is not a guarantee of legitimacy for journals
Around 41% of Predatory Reports journals are based in the US, purport to be from the US, or are suspected of being from the US, based on information on journal websites and Cabells’ investigations. This is the highest count for any country, but only a fraction will really have their base in North America
The average predatory journal publishes about 50 articles a year according to recent research from Bo-Christer Björk of the Hanken School of Economics in Helsinki, less than half the output of a legitimate title. Furthermore, around 60% of papers in such journals receive no future citations, compared with 10% of those in reliable ones
Finally, it is worth noting that while we are in the throes of the Coronavirus pandemic, there are 41 journals listed in Predatory Reports (0.3%) specifically focused on epidemiology, and another 35 on virology (0.6% combined). There could be further growth over the next 12 months, so researchers in these areas should be particularly careful about where they submit their papers.
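As a quick sanity check, the headline percentages in the list above can be reproduced from the raw counts against the ~13,500-journal total quoted in the post. A minimal sketch (the counts are the approximate figures given above, not exact database values):

```python
# Reproduce the percentages quoted in the post from its own counts.
# TOTAL is the stated size of Predatory Reports (~13,500 journals);
# all counts below are the approximate figures from the post itself.
TOTAL = 13_500

counts = {
    "medical-field journals": 4_300,        # quoted as "a third"
    "titles starting with 'British'": 700,  # quoted as 5.2%
    "journals listing an ISSN": 5_300,      # quoted as 40%
    "epidemiology journals": 41,            # quoted as 0.3%
    "epidemiology + virology": 41 + 35,     # quoted as 0.6% combined
}

for label, n in counts.items():
    print(f"{label}: {n / TOTAL:.1%}")
```

Running this confirms the shares are consistent with one another: the medical count works out to roughly a third, and the smaller categories round to the percentages quoted.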
One of the most common questions Cabells is asked about its Predatory Reports database of journals is whether it has ever “changed its mind” about listing a journal. As Simon Linacre reports, it is less a question of changing the outcome of a decision, but more of a leopard changing its spots.
This week saw the annual release of Journal Impact Factors from Clarivate Analytics, and along with it the rather less august list of journals whose Impact Factors have been suppressed in Web of Science. This year 33 journals were suppressed, all for “anomalous citation patterns found in the 2019 citation data” pertaining to high levels of self-citation. Such a result is a publisher’s worst nightmare: while anomalous patterns can be due to gaming citation levels, they can also reflect the niche nature of a subject area, or other quirks of a journal.
Sometimes the decision can be reversed, although it is often a year or two before the data can prove a journal has changed its ways. Similarly, Cabells offers a review process for every journal it lists in its Predatory Reports database, and when I arrived at the company in 2018, one of the first things I asked was the question many people ask: has Cabells ever had a successful review that delisted a journal?
Open for debate
The answer is yes, but the details of those cases are quite instructive as to why journals are included in the database in the first place and, perhaps more importantly, why they are not. Firstly, however, some context. It is three years since the Predatory Reports database was first launched, and in that time almost 13,500 journals have been included. Each journal has a link next to the violations on its report, allowing anyone associated with that journal to view the policy and appeal the decision:
This policy clearly states:
The Cabells Review Board will consider Predatory Journal appeals with a frequency of one appeal request per year, per journal. Publications in Predatory Reports, those with unacceptable practices, are encouraged to amend their procedures to comply with accepted industry standards.
Since 2017, there have been just 20 appeals against decisions to list journals in Predatory Reports (0.15% of all listed journals), and only three have been successful (0.02%). In the first case (Journal A), the journal’s peer review processes were checked and it was determined that some peer reviews were being completed, albeit very lightly. In addition, Cabells’ investigators found a previous example of dual publication. However, following the listing, the journal dealt with the problems and retracted the article in question, as it seemed the author had submitted two identical articles simultaneously. This in turn led Cabells to revise its evaluations so that this particular violation does not penalize journals when an author is to blame.
In the second review (Journal B), Cabells evaluated the journal’s peer review process and found that it too was not completing full peer reviews, and that it had a number of other issues: it displayed metrics in a misleading way, lacked editorial policies on its website and had no process for plagiarism screening. After its listing in Predatory Reports, the journal’s publisher fixed the misleading elements on its website and demonstrated improvements to its editorial processes. In this second case, the journal’s practices were clearly misleading and deceptive, but the publisher chose to change and improve them.
Finally, a third journal (Journal C) has just had a successful appeal completed. In this case, there were several problems that the journal was able to correct by being more transparent on its website. It added the necessary policies, cleared up confusion around them, and made information about its author fees available. Cabells was also able to evaluate its peer review process after the journal submitted peer review notes on a few articles, and it was evident the journal editor was managing a good-quality peer review process; hence it has now been removed from the Predatory Reports database. (It should be noted that, as with the other two successful appeals, journals removed from Predatory Reports are not then automatically included in the Cabells Journalytics database.)
Cabells’ takeaway from all of these successful reviews was that they were successful in two senses: they showed that the original identification was correct, and they enabled improvements that established the journals as better, and certainly non-predatory, publications. They also fed into the continuing improvement Cabells seeks in refining its Predatory Reports criteria, with a further update due to be published later this summer.
There are also things to learn from unsuccessful reviews. In one case a publisher appealed the listing of a number of its journals in Predatory Reports. However, the appeal only highlighted how bad the journals actually were: an in-depth review of each journal not only uncovered new violations that were subsequently added to the journals’ reports, but also led to the addition of a brand-new violation that will be included in the upcoming revision of the Predatory Reports criteria.
Publication ethics is at the core of everything that Cabells does, and it continually supports all scholarly communication bodies that seek to uphold the highest standards of publishing practice. As such, we would like to express our support for Simon Linacre (Cabells’ Director of International Marketing and Development) in his candidacy to become a COPE Trustee. COPE plays an essential role in ensuring scholarly publishing maintains the highest standards, and if you are a COPE member it is important you use your vote to support the organization’s progress.
Simon has been with Cabells for two years, and involved in academic publishing for almost 20. In that time he has gained wide experience of all aspects of journal publishing, and in particular the Open Access issues on which this role focuses.
If you would like to vote in the election, please go to the COPE website, log in and cast your vote for your favored candidate.
Thanks, The Cabells Team
It is three years since Cabells first launched its database on predatory journals, and a good deal has happened in that time in the world of scholarly publishing. In his latest post, Simon Linacre reflects on these changes and offers some ‘dos and don’ts’ on the latest version of the database.
In June 2017 – which seems a lifetime ago now for all sorts of reasons – Cabells launched a new database that included details on over 4,000 predatory journals. It was the first time that a resource of that size had been made available to researchers who wanted to check the legitimacy or otherwise of journals they may be considering as a destination for their articles. In the intervening years, it is to be hoped many authors have been alerted to the dangers of publishing their research in such journals and benefited from worthwhile publishing experiences in good journals.
At the time, Cabells chose to name the database the ‘Blacklist’ as the most straightforward description of the intent of the database. As some may have seen, we brought forward the decision to change its name to ‘Predatory Reports’ last week in the first of a number of changes Cabells intends to introduce in 2020 and beyond.
The new name includes the word ‘Reports’ for an important reason. The database has been designed as more than a simple list of predatory, fake or questionable journals – it has also been put together so that researchers can use the information that has been collated on all 13,400 journals to inform their understanding of scholarly communications, and as a result, make better decisions about their research publications and career into the future. In this spirit, here are FIVE DOS AND DON’TS of how to use the Cabells Predatory Reports database:
DO check all violations listed for each journal on Predatory Reports to fully understand what the journal is NOT doing properly, as this can help identify predatory behavior in the future
DON’T trust a journal because it has an ISSN on its website – over 40% of journals listed on Predatory Reports include one, with many copied from legitimate journals or simply invented
DO check the publisher’s name in the ‘Advanced Search’ option if a journal is not included on the database, as the same publisher could have created a new journal with the same predatory behaviors
DON’T visit a predatory journal website unnecessarily, as it could contain malware – hover the cursor over the link to view the full URL and check that it corresponds to the journal being investigated
DO send Cabells updates or information on potential new predatory journals by sending an email to ‘email@example.com’
And as a final ‘DO’, do click the link to our 70+ criteria that we use to identify predatory journals – these will be updated soon to streamline and clarify the process of reviewing journals for inclusion in Predatory Reports, and offer a much more robust checklist than currently exists to help researchers avoid falling into the predatory journal trap.