The rise and rise of predatory journals and conferences

Editor’s Note: Today’s post is by Tracey Elliott, Ph.D. Dr. Elliott is the Project Director at InterAcademy Partnership (IAP), currently overseeing Combatting Predatory Academic Journals and Conferences.


Predatory academic journals and, even more so, predatory conferences have received surprisingly little attention in academic circles, despite their rapid growth and sophistication in recent years.  Set against the pervasive “publish or perish” research assessment culture, where quantity trumps quality, the research community risks sleepwalking into a perfect storm.  Predatory academic practices are one manifestation of the surge in online scams and deceit deluging many sectors, fuelled further by changes in (post-)pandemic lifestyles, but their impact on the knowledge economy, research enterprise, and public policy is potentially profound. 

The InterAcademy Partnership (IAP) – the global network of over 140 academies of science, engineering and medicine – is leading an international project, “Combatting predatory journals and conferences”, which seeks to better understand the growing menace of these practices, gauge their extent and impact, identify what drives them, and determine what actions are required to curb them.  With the number of predatory journals now estimated to be at least 14,500 (Cabells) and predatory conferences believed to outnumber legitimate ones (THES), this project is imperative, and our recent survey of researchers all over the world is illuminating.

Conducted in November-December 2020, the survey gives concerning insight into the extent and impact of predatory practices across the world.  Based on the 1800+ respondents, two headlines are particularly striking:

1. Over 80% of respondents perceived predatory practices to be a serious problem or on the rise in their country.
2. At least a quarter of respondents had either published in a predatory journal, participated in a predatory conference, or did not know if they had.  Reasons cited for this included a lack of awareness of such scams and encouragement by their peers. Indeed, there is anecdotal evidence to suggest that the use of predatory journals and conferences is embedded, or at least tolerated, in some institutions/networks.

Contrary to some studies citing early career researchers as especially vulnerable, we found no correlation between a researcher’s career stage or discipline and their likelihood of publishing in a predatory journal or participating in a predatory conference.  However, there is a small correlation with the economic status of the country in which they work, with those in lower- and middle-income countries more likely to participate or publish than those in high-income countries. If left unchecked, the research gap between higher- and lower-income countries risks widening. Putting definitive guidance on predatory journals behind paywalls, whilst sometimes unavoidable, risks exacerbating this further.

A challenge for such essential services, whether paywalled or not, is how to distinguish fraudulent, deceitful journals from low-quality but well-intentioned and legitimate ones. Whilst bringing the clarity researchers crave, journal safelists and watchlists force an in-or-out binary decision that is increasingly inadequate and unfair.  In reality, there is a spectrum of fast-evolving and highly nuanced publishing practices that makes the work of Cabells and its counterparts very difficult. IAP is currently exploring a subset of Cabells-listed predatory journals using internet scraping and spidering techniques to gather data on predatory publishing.
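By way of illustration only (this is not IAP’s actual pipeline), a minimal scraping sketch along the following lines could pull simple signals from a journal’s homepage; the URL, the phrase list and the returned fields are assumptions made for the example.

```python
# Hypothetical sketch: fetch a journal homepage and extract simple signals
# (claimed ISSNs, "impact factor" language, suspect marketing phrases).
# Illustrative only; the URL and phrase list below are placeholders.
import re
import requests
from bs4 import BeautifulSoup

SUSPECT_PHRASES = ["global impact factor", "rapid publication", "publication within 72 hours"]

def scrape_journal_signals(url: str) -> dict:
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True).lower()
    return {
        "url": url,
        "claimed_issns": sorted(set(re.findall(r"\b\d{4}-\d{3}[\dXx]\b", text))),
        "suspect_phrases": [p for p in SUSPECT_PHRASES if p in text],
        "mentions_impact_factor": "impact factor" in text,
    }

if __name__ == "__main__":
    print(scrape_journal_signals("https://example-journal.org"))  # placeholder URL
```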

Our project report, anticipated by early 2022, will include recommendations for all key stakeholder communities – researchers, research funders, publishers, academies and universities, libraries, and indexing services. With IAP as a conduit to academies and research communities throughout the world, we will focus on awareness-raising, training, and mentoring resources, and mobilising governments, multilateral and intergovernmental organisations.

No laughing matter

The meme sweeping Twitter this week is a satirical look at typical journal articles. Simon Linacre introduces Cabells’ own take on the theme and reflects on the impact such memes can have on our shared conscience.


We all like memes, right? Those social media nuggets that we can all relate to and laugh at, a form of in-joke that doesn’t require being with a group of people, which under current circumstances has meant a kind of gold rush for this form of humor. Whether it is the boyfriend looking over his shoulder at another woman or the excerpt from the film Downfall with Hitler going berserk, the number of situations and news items that lend themselves to this form of parody is seemingly endless.

So, when the meme spotlight fell on our own corner of the scholarly publishing world, we couldn’t help but join in and adapt the scientific paper meme to predatory journals (see image). To be honest, it wasn’t too difficult to think of 12 journal titles that highlighted the problems predatory publishing causes, and a whole series of memes could easily be created to underscore the joke that is the predatory journal phenomenon.

It’s easy to spot the themes we chose to lampoon, although however familiar we become with predatory journal tropes, new publications and journals are emerging all the time, as testified by the total number of journals listed in Cabells’ Predatory Reports hitting 14,500 this week. Among the issues we put under the spotlight in the graphic are the unethical and the unaware authors publishing in predatory titles, how easily poor research or plagiarized content can be published, and some of the poor excuses offered by those who end up publishing in dodgy journals.

However, underneath the tomfoolery there is a serious point to be made. A recent op-ed in The Atlantic highlighted not just the shared joy and geekiness of the scientific paper meme, but also the existential dread it spotlights. As the article expertly points out, while academics recognize the hamster-in-a-wheel absurdity the meme represents, they cannot help but see themselves in the wheel, unable to stop running. Some will simply shrug their shoulders and move on to the next piece of clickbait; for others, there is little consolation in the humor and plenty of angst to keep in check in order to preserve their sanity.

When it comes to predatory journals, from a pure eyeballs perspective we can see that articles and social media posts about the often bizarre world of predatory publishing get the most traction, such as the fact that one predatory journal lists Yosemite Sam on its editorial board. And yet there is always a serious point behind these fun stories, which is that predatory journals can make an unholy mess of scientific research, causing millions of funding dollars to be wasted and allowing junk or rank bad science to contaminate legitimate published research. This is the real punchline, and it rings pretty hollow at times.

Rewriting the scholarly* record books

Are predatory journals to academic publishing what PEDs are to Major League Baseball?


The 2021 Major League Baseball season is underway and for fans everywhere, the crack of the bat and pop of the mitt have come not a moment too soon. America’s ‘National Pastime’ is back and for at least a few weeks, players and fans for all 30 teams have reason to be optimistic (even if your team’s slugging first baseman is already out indefinitely with a partial meniscus tear…).

In baseball, what is known as the “Steroid Era” is thought to have run from the late ‘80s through the early 2000s. During this period, many players (some for certain, some suspected) used performance-enhancing drugs (PEDs), which resulted in an offensive explosion across baseball. As a result, home run records revered by generations of fans were smashed and rendered meaningless.

It wasn’t just star players looking to become superstars who were using PEDs; it was also the fringe players, the ones struggling to win or keep jobs as big-league ballplayers. They saw other players around them playing better, more often, and with fewer injuries. This resulted in promotions, from the minor leagues to the major leagues or from bench player to starter, and job security in the form of multi-year contracts.

So, there now existed a professional ecosystem in baseball where those who were willing to skirt the rules could take a relatively quick and easy route to the level of production necessary to succeed and advance in their industry: shortcuts that would enhance their track records, improve their chances of winning and keeping jobs, and help build their professional profiles to ‘superstar’ levels, greatly increasing compensation as a result.

Is this much different than the situation for researchers in today’s academic publishing ecosystem?

Some authors – called “parasite authors” by Dr. Serhiy Kozmenko in a guest post for The Source – deliberately “seek symbiosis with predatory journals” in order to boost their publication records, essentially amassing publication statistics on steroids. Other authors, those not willing to use predatory journals as a simple path to publication, must operate in the same system, but under a different set of rules that makes it more difficult to generate the same levels of production. In this situation, how many authors who would normally avoid predatory journals would be drawn to them, just to keep up with those who use them to publish easily and frequently?

Is it time for asterisks on CVs?

At academic conferences, on message boards, and in other forums for discussing issues in scholarly communication, a familiar refrain is that predatory journals are easy to identify and avoid, so predatory publishing, in general, is not a big problem for academic publishing. While there is some truth to the claim that many, though not all, predatory journals are relatively easy to spot and steer clear of, this argument ignores the existence of parasite authors. These researchers are unconcerned about the quality of the journal, as they are simply attempting to publish enough papers for promotion or tenure purposes.

Parasite authors are also likely to be undeterred by the fact that although many predatory journals are indexed in platforms such as Google Scholar, articles published in these journals have low visibility due to the algorithms used to rank research results in these engines. Research published in predatory journals is not easily discovered, not widely read, and not heavily cited, if at all. The work is marginalized and ultimately, the reputation of the researcher is damaged.

There are myriad reasons why an author might consider publishing in a predatory journal, some born out of desperation. The ‘publish or perish’ system places pressure on researchers at all career stages – how much blame for this should be placed on universities? In addition, researchers from the Global South are fighting an uphill battle when dealing with Western publishing institutions. Lacking the same resources, training, language skills, and overall opportunities as their Western counterparts, researchers from the developing world often see no choice but to use predatory journals (the majority of which are located in their part of the world) to keep pace with their fellow academics’ publishing activity.

To a large degree, Major League Baseball has been able to remove PEDs from the game, mostly due to increased random testing and more severe penalties for those testing positive. Stemming the flow of predatory publishing activity in academia will not be so straightforward. At the very least, the scholarly community must begin by increasing monitoring and screening for predatory publishing activity (with the help of resources like Cabells’ Predatory Reports) and instituting penalties for those found to have used predatory journals as publishing outlets. As in baseball, there will always be those looking to take shortcuts to success; having a system in place to protect those who do want to play by the rules should be of paramount importance.

Spotlight on Turkey

Turkey has been making great strides in recent years as a force to be reckoned with on the international research stage. However, it seems to have encountered more problems than other countries with regard to predatory journals. Simon Linacre looks at the problems facing the country and highlights some resources available to help Turkish scholars.

A simple Google search of “predatory journals Turkey” provides quick insight into the concerns academic researchers there have regarding these deceptive publications. Numerous articles fill the first pages of results highlighting the particular issue Turkey seems to share with a few other countries such as India and Nigeria. Alongside, however, are anonymous websites offering unsupported claims about predatory publications. Validated information appears to be thin on the ground.

Luckily, the Turkish government understands there is a problem and in the Spring of 2019 it decided to take action. According to Professor Zafer Koçak in his article ‘Predatory Publishing and Turkey’, the Turkish Council of Higher Education decreed that “scientific papers published in predatory journals would not be taken into account in academic promotion and assignment. Thus, Turkey has taken the step of becoming one of the first countries to implement this in the world”.

According to its website, the Turkish Council of Higher Education believed the phenomenon was increasing, and was doing so internationally. A number of articles have been published recently that back this up – for example here and here – and there is the potential for Turkish authors to get caught up in this global swell due to their increasing publication output.

To support Turkish authors and institutions, Cabells has translated its information video on its Journalytics and Predatory Reports products, as well as translating this page, into Turkish. Hopefully, the availability of independently verified information on predatory journals and greater dialogue will improve the conditions for Turkey and its scholars to continue to grow their influence in global research.



Turkey has been making great strides in recent years as a force to be reckoned with on the international research stage. However, it appears to encounter more problems with predatory journals than other countries do. Simon Linacre looks at the problems facing the country and highlights the resources available to help Turkish scholars.

A simple Google search for “predatory journals Turkey” quickly reveals the concerns academic researchers have about these deceptive publications. The first pages of results are filled with articles showing that Turkey shares this problem with a handful of other countries, such as India and Nigeria. Some of the results, however, are anonymous websites offering unsupported claims about predatory publications. Verified, reliable information is rare.

Fortunately, the Turkish government recognises that there is a problem and decided to take action in the spring of 2019. According to Professor Zafer Koçak’s article ‘Predatory Publishing and Turkey’, the Council of Higher Education ruled that “scientific papers published in predatory journals will not be taken into account in academic promotion. Turkey has thus taken the step of becoming one of the first countries in the world to put this into effect”.

According to its website, the Council of Higher Education believes predatory publishing is increasing both nationally and internationally. Many articles supporting this view have been published recently – examples can be seen here and here – and Turkish authors, with their rapidly growing publication output, risk being caught up in this global surge. To support Turkish authors and institutions, Cabells has translated the informational video for its Journalytics and Predatory Reports products, as well as this page, into Turkish. We hope that the availability of independently verified information about predatory journals, and stronger dialogue, will improve the conditions for Turkey and its scholars to continue growing their influence in global research.

Predatory journals vs. preprints: What’s the difference?

While working towards publication in a legitimate journal, however circuitous the route, is of course a much better path than publishing in an illegitimate journal, Simon Linacre examines why this is a useful question to consider.

A blog post this week in The Geyser pointed out the problems surrounding version control of the same article on multiple preprint servers and on the F1000 platform.

TL;DR? It isn’t pretty.

The article used as an example is unquestionably a legitimate study relating to the coronavirus pandemic, and as such is a small but important piece in the jigsaw being built around science’s pandemic response. The article has yet to be validated – and thus confirmed as a piece that fits the COVID-19 jigsaw – something that will presumably happen once it is published in a recognized peer-reviewed journal.

However, this does raise the following rather thorny question: how is the article any better served fragmented on different preprint servers and publishing platforms than it would be having been published as a single entity in a predatory journal?

I am being facetious here – working towards a legitimate publication, however circuitous the route, is far better than publishing in an illegitimate journal. However, comparing the two options is not as strange as one might think, and perhaps offers some guidance for authors uncertain about where to publish their research in the first place.

Firstly, early career researchers (ECRs), while often offered very little direction when it comes to publication ethics and decision-making, are understandably worried about sharing their data and findings on preprint servers for fear of being ‘scooped’ by other researchers who copy their results and get published first. This is a legitimate fear, and is one reason why a researcher might submit their work to an unfamiliar journal offering a low fee and a quick turnaround.

Secondly, ECRs or more experienced researchers may be incentivised by their institutions to simply achieve a publication without any checks on the type of journal they publish in. As such, they need a journal to validate their publication – even if the journal itself has not been validated – which is something preprints or non-journal platforms are unable to provide.

Finally, while recent research has shown that just over half of articles published in predatory journals do not receive any citations, just under half did receive them, and authors may prefer a single, accessible home for their research to multiple versions scattered across different preprint servers. This is not to say that preprints can’t receive citations – indeed, Google Scholar reveals 22 citations to the article above from its original posting on arXiv – but the perception may be that only journals can deliver citations, and they will therefore be the aim for some authors.

Of course, authors should know the very real difference between a predatory journal and a preprint, but the evidence of 14,000+ journals in the Cabells Predatory Reports database and the millions of spam emails sent daily by illegitimate journals points to at least some researchers falling for the same tricks and continuing to line the pockets of predatory publishers. While research publishing options remain as varied and as complex as they are – and while higher education institutions and funders simply assume every researcher has an effective publishing strategy – as many will fall into the predatory trap as have always done.

Book review – Gaming the Metrics: Misconduct and Manipulation in Academic Research

The issues of gaming metrics and predatory publishing undoubtedly go hand in hand, outputs from the same system that requires academic researchers the world over to sing for their supper in some form or other. However, the two practices are often treated separately, almost as if there were no link at all, so editors Biagioli and Lippman are to be congratulated for bringing them together under the same roof in their book Gaming the Metrics: Misconduct and Manipulation in Academic Research (MIT Press, 2020).

The book is a collection of chapters that cover the whole gamut of wrongheaded – or just plain wrong – publication decisions made by authors the world over on where to publish the fruits of their research. This ‘submission decision’ is unenviable, as it inevitably shapes academic careers to a greater or lesser degree. The main reason why authors make poor decisions is laid firmly at the door of the various ‘publish or perish’ systems which seek to quantify authors’ outputs with a view to… well, the reason why outputs are quantified is never really explained. However, the reason why such quantification should be a non-starter is well argued by Michael Power in Chapter 3, and by Barbara M. Kehm (Ch. 6) in terms of the ever-popular university rankings. Even peer review comes under attack from Paul Wouters (Ch. 4), but as with the other areas, any solutions are either absent or, in the case of Wouters, proffered with minimal detail or real-world context.

Once into the book, any author would quickly realize that their decision to publish is fraught with difficulty, with predatory publishers lurking on the internet to entice articles and APCs from them. As such, any would-be author would be well advised to heed the call ‘Caveat scriptor’ and read this book before sending off their manuscript to any journal.

That said, there is also a case for advising ‘caveat lector’ before would-be authors read the book, as there are other areas where additional context would greatly help in addressing the problems of gaming metrics and academic misconduct. When it comes to predatory journals, there is a good deal of useful information in several of the later chapters, especially the case studies in Chapters 7 and 15, which detail a suspiciously prolific Czech author and a sting operation, respectively.

Indeed, these cases provide the kind of context whose absence is perhaps the single biggest failing of the book, which, through its narrow academic lens, doesn’t quite capture the wider picture of why gaming metrics, and the scholarly communications system as a whole, is ethically wrong, both for those who perpetrate it and, arguably, for the architects of the systems. As with many academic texts that seek to tackle societal problems, the unwillingness to get dirt under the fingernails in pursuit of understanding what’s really going on simply distances the reader from the problem at hand.

As a result, after reading Gaming the Metrics, one is likely simply to shrug one’s shoulders in apathy about the plight of authors and their institutions, whereas a great deal more impact might have been achieved had the approach been less academic and included more case studies and insights into the negative impact of predatory publishing practices. After all, the problem with gaming the system is that, for those who suffer, it is anything but a game.

Gaming the Metrics: Misconduct and Manipulation in Academic Research, edited by Mario Biagioli and Alexandra Lippman (published Feb. 21 2020, MIT Press USA) ISBN: 978-0262537933.

The fake factor

On the day that the US says goodbye to its controversial President, we cannot bid farewell to one of his lasting achievements, which is to highlight issues of fake news and misinformation. Simon Linacre looks at how putting the issue in the spotlight could at least increase people’s awareness… and asks for readers’ help to do so.

Cabells completed around a dozen webinars with Indian universities towards the end of 2020 in order to share some of our knowledge of predatory publishing, and also to learn from librarians, faculty members and students about their experiences. Studies have shown that India is home to both the highest number of predatory journals and the most authors publishing in them, as well as having a government as committed as any to dealing with the problem, so any insight from the region is extremely valuable.

Q&A sessions following the webinars were especially rich, with a huge range of queries and concerns raised. One specific query raised a number of issues: how can researchers know whether the index a journal says it is listed in is legitimate or not? As some people will be aware, one of the tricks of the trade for predatory publishers is to promote the indices their journals are listed in, and these claims come in several types (a rough triage sketch follows the list):

  • Pure lies: These are journals that say they have an ‘Impact Factor’ but are not listed by Clarivate Analytics in its Master Journal List of titles indexed on Web of Science (and therefore cannot have an Impact Factor, unless only recently accepted)
  • Creative lies: These journals say they are listed by an index, which is true, but the index is little more than a list of journals that say they are listed by it, with the words ‘Impact Factor’ added to make it sound better (e.g. ‘Global Impact Factor’, ‘Scholarly Article Impact Factor’)
  • Nonsensical lies: These are links (or usually just images) to seemingly random words or universities that try to impart some semblance of recognition, but mean nothing. For example, it may be the name of a list, service or institution, but a quick search turns up nothing relating those names to the journal
  • White lies: One of the most common, many predatory journals say they are ‘listed’ or ‘indexed’ by Google Scholar. While it is true to say these journals can be discovered by Google Scholar, they are not listed or indexed for the simple reason that GS is not a list or an index
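As a rough illustration of how such claims might be triaged (a sketch only; the safelist below is an assumption for the example, and this is not a Cabells tool), the index names a journal advertises could be compared against a short list of genuinely recognised indexing services:

```python
# Hypothetical triage of the indexing claims found on a journal website.
# The safelist and the example claims are illustrative assumptions only.
RECOGNISED_INDEXES = {"web of science", "scopus", "doaj", "medline", "pubmed central"}

def flag_claimed_indexes(claimed: list[str]) -> dict:
    """Split claimed indexes into recognised and unrecognised names, and flag
    anything styling itself an 'impact factor' (only Clarivate awards the real one)."""
    recognised = [n for n in claimed if n.strip().lower() in RECOGNISED_INDEXES]
    suspicious = [n for n in claimed if n.strip().lower() not in RECOGNISED_INDEXES]
    fake_metrics = [n for n in claimed if "impact factor" in n.lower()]
    return {"recognised": recognised, "suspicious": suspicious, "fake_metric_names": fake_metrics}

print(flag_claimed_indexes(["Google Scholar", "Global Impact Factor", "Scopus"]))
# {'recognised': ['Scopus'], 'suspicious': ['Google Scholar', 'Global Impact Factor'],
#  'fake_metric_names': ['Global Impact Factor']}
```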

When Jeffrey Beall was active, he included a list of ‘Misleading Metrics’ on his blog that highlighted some of these issues. A version or versions of this can still be found today, but are not linked to here because (a) they are out of date by at least four years, and (b) the term ‘misleading’ is, well, misleading, as few of the indexes include metrics in the first place, and the metrics may not be the major problem with the index. However, this information is very valuable, and as such Cabells has begun its own research program to create an objective, independently verifiable and freely available list of fake indexes in 2021. And, what’s more, we need your help – if anyone would like to suggest a suspicious-looking journal index for us to look into, please write to me at simon.linacre@cabells.com and we will review the site for inclusion.

Back to basics

As we enter what is an uncertain 2021 for many both personally and professionally, it is worth perhaps taking the opportunity to reset and refocus on what matters most to us. In his latest blog post, Simon Linacre reflects on Cabells’ new video and how it endeavors to show what makes us tick.

It is one of the ironies of modern life that we seem to take comfort in ‘doomscrolling’, that addictive pastime of flicking through Twitter or other social media on the hunt for the next scandal to inflame our ire. Whether it is Brexit, the coronavirus pandemic or alleged election shenanigans, we can’t seem to get enough of the tolls of doom ringing out in our collective echo chambers. As the New Year dawns with little good news to cheer us, we may as well go all in as the world goes to hell in a handcart.

Of course, we also like the lighter moments that social media provides, such as cat videos and epic fails. And it is comforting to hear stories that renew our faith in humanity. One parent on Twitter remarked this week, as the UK’s schools closed and reverted to online learning, that she was proud of her child who, on hearing the news, immediately started blowing up an exercise ball, resolving not to waste the opportunity lockdown provided to get fit.

Reminding ourselves that the glass can be at least half full even if it looks completely empty is definitely a worthwhile exercise, even if it feels like the effort of constantly refilling it is totally overwhelming. At Cabells, our source of optimism has recently come from the launch of our new video. The aim of the video is to go back to basics and explain what Cabells does, why it does it, and how it does it through its two main products – Journalytics and Predatory Reports.

Making the video was a lot of fun, on what was a beautiful sunny Spring day in Edinburgh with one of my US colleagues at an academic conference (remember them?). While nerve-shredding and embarrassing, it was also good to go back to basics and underline why Cabells exists and what we hope to achieve through all the work we do auditing thousands of journals every year.

It also acted as a reminder that there is much to look forward to in 2021 that will keep our glasses at least half full most of the time. Cabells will launch its new medical journal database early this year, which will see over 5,000 medical journals indexed alongside the 11,000 journals already in Journalytics. And we also have major upgrades and enhancements planned for both the Journalytics and Predatory Reports databases that will help researchers, librarians and funders better analyse journal publishing activities. So, let’s raise a (half full) glass to the New Year, and focus on the light at the end of the tunnel rather than the darkness that seems to surround us in early January.

What to know about ISSNs

There are many ways to skin a cat, and many ways to infer a journal could be predatory. In his latest blog post, Simon Linacre looks at the role the International Standard Serial Number, or ISSN, can play in the production of predatory journals. 

For many reasons, the year 2020 will be remembered for the sheer volume of numbers that have invaded our consciousness. Some of these are big numbers – 80 million votes for Joe Biden, four million cases of COVID in the US in 2020 – and some of these will be small, such as the number of countries (1) leaving the EU at the end of the year. Wherever we look, we see numbers of varying degrees of import at seemingly every turn.

While numbers were previously regarded as gospel, however, data has now joined news and UFO sightings (seemingly one of the few phenomena NOT to increase in 2020) as something to be suspicious of, or that can be faked in some way. And one piece of data trusted by many authors in determining the validity or otherwise of a journal is the International Standard Serial Number, or ISSN.

An ISSN can be obtained relatively easily via either a national or international office as long as a journal can be identified as an existing publication. As the ISSN’s own website states, an ISSN is “a digital code without any intrinsic meaning” and does not include any information about the contents of that publication. Perhaps most importantly, an ISSN “does not guarantee the quality or the validity of the contents”. This perhaps goes some way to explain why predatory journals can often include an ISSN on their websites. Indeed, more than 40% of the journals included in Cabells’ Predatory Reports database include an ISSN in their journal information.

But sometimes predatory publishers can’t obtain an ISSN – or at least can’t be bothered to – and will fake the ISSN code. Of the 6,000 or so journals with an ISSN in Predatory Reports, 288 or nearly 5% have a fake ISSN, and this is included as one of the database’s behavioural indicators to help identify predatory activity. It is instructive to look at these fake ISSNs to see the lengths predatory publishers will go to in order to achieve some semblance of credibility in their site presence.

For some journals, it is obvious that the ISSN is fake because it looks wrong. In the example above for the Journal of Advanced Statistics and Probability, the familiar format of two groups of four digits separated by a hyphen is missing, replaced by nine digits and a forward slash, which is incorrect.

For other journals, such as the Global Journal of Nuclear Medicine and Biology below, the format is correct, but a search using the ISSN portal brings up no results, so the ISSN code is simply made up.
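As a quick first-pass check (a sketch only; a structurally valid code must still be confirmed against the ISSN Portal, since a well-formed ISSN can simply be copied or invented), the standard format and check digit can be verified in a few lines:

```python
# Check that a string is a structurally valid ISSN (NNNN-NNNC, where C is a
# check digit 0-9 or X). Passing only means the code is well formed; it must
# still be looked up in the ISSN Portal to confirm it is registered and
# actually belongs to the journal claiming it.
import re

def is_well_formed_issn(issn: str) -> bool:
    if not re.fullmatch(r"\d{4}-\d{3}[\dXx]", issn):
        return False
    digits = issn.replace("-", "").upper()
    total = sum(int(d) * w for d, w in zip(digits[:7], range(8, 1, -1)))
    remainder = total % 11
    expected = "0" if remainder == 0 else ("X" if remainder == 1 else str(11 - remainder))
    return digits[7] == expected

print(is_well_formed_issn("0378-5955"))   # True: a well-formed example code
print(is_well_formed_issn("123456789/"))  # False: nine digits and a slash
```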

More worrying are the few publications that have hijacked existing, legitimate journals and appropriated their identity, including the ISSN. One example is the Wulfenia Journal, whose identity has been hijacked; the fake journal’s website is pictured below.

If you compare it to the genuine journal shown below (the German homepage can be found here), you can see they list the same ISSN.

One can only imagine the chaos caused for a legitimate journal when its identity is hijacked, and this is just one part of wider concerns about the effects that sharing fake information has on society. As always, arming yourself with the right information – and taking a critical approach to any information directed your way – will help see you through the morass of misinformation we seem to be bombarded with in the online world.

Guest Post: A look at citation activity of predatory marketing journals

This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.

Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit and receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal to which to submit a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals – and the increasing “publish or perish” pressure – makes this decision even more difficult.

The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring the market of marketing journals over the last ten years, I have noticed that a new type of journal has tapped into it: the open access (OA) journal.

The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing I remember most from that visit is that the website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating the case of that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.

In undertaking this research, terms such as “predatory publishers”, “Beall’s List”, and “Think, Check, Submit” were new discoveries for me. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).

Beall’s list was no longer available (it was shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly science, technology, and medicine) to be useful. So, I searched for journals with titles that are identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and then visited the website of each to collect information about both the publisher and the journal; that is, whether the journal is OA or not, its article processing charges, whether it has an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, and so on. I even emailed an eminent marketing scholar whose name I was stunned to see on the editorial board of a suspicious journal.
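A rough sketch of that screening step (an illustration of the idea rather than Dr. Moussa’s actual procedure; the reference titles and the 0.6 threshold are assumptions) might score candidate titles against well-known journal titles with a simple string-similarity measure:

```python
# Flag journal titles that are confusingly similar to well-known marketing
# journals using a simple similarity ratio. Illustrative assumptions: the
# reference titles below and the 0.6 threshold.
from difflib import SequenceMatcher

KNOWN_TITLES = [
    "Journal of Marketing",
    "Journal of Marketing Management",
    "Journal of Marketing Research",
]

def similar_titles(candidate: str, threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return (known title, similarity) pairs whose similarity exceeds the threshold."""
    scores = [
        (known, SequenceMatcher(None, candidate.lower(), known.lower()).ratio())
        for known in KNOWN_TITLES
    ]
    return [(title, round(score, 2)) for title, score in scores if score >= threshold]

print(similar_titles("British Journal of Marketing Studies"))
```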

With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).

Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was right or wrong in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These lists, however, provided no reasons for the inclusion of any particular publisher or journal.

To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.

To be brief, the year of endeavor resulted in a paper I submitted to the prestigious academic journal Scientometrics, published by Springer Nature, and my paper was accepted and published online in late October 2020. In it, I reported the findings of a study that examined the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”, that is, one with no archives). The results indicated that some of these journals received quite a few citations, with a median of 490 and one journal receiving 6,296 citations (see also the Case Study below).
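For readers who want to attempt something similar, a minimal sketch along the lines below could tally citation counts for a journal’s articles via the Crossref REST API. This was not the method used in the study, and many predatory journals are not registered with Crossref at all, so coverage will vary; the ISSN in the example is a placeholder.

```python
# Hypothetical sketch: sum Crossref's "is-referenced-by-count" across works
# registered to a journal's ISSN. Only the first page of up to 1,000 works is
# fetched; pagination is omitted for brevity. Purely illustrative.
import requests

def crossref_citation_tally(issn: str, rows: int = 1000) -> dict:
    url = f"https://api.crossref.org/journals/{issn}/works"
    response = requests.get(url, params={"rows": rows}, timeout=30)
    items = response.json()["message"]["items"]
    counts = [item.get("is-referenced-by-count", 0) for item in items]
    return {
        "articles": len(counts),
        "total_citations": sum(counts),
        "most_cited_article": max(counts, default=0),
    }

# Example call with a placeholder ISSN:
# print(crossref_citation_tally("1234-5678"))
```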

I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory. 

A few months earlier, having no access to the Cabells databases, I had read each of the posts on their blog, trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (which represents 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.

Case Study

The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.

Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-used strategy by predatory publishers to deceive authors who do not make themselves familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of potential journals in this subject discipline.

As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not individually proof of predatory behaviour, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), fake claims of indexation in databases (e.g. DOAJ), no editor contact details, and/or a fake editor identity.

What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below for ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the citations received by its most cited article are significant, and represent what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proofreading, their content is unreliable and skews citation data for reputable research and journals.

  • Predatory journal: Journal I (BJMS)
  • Total number of citations received: 1,331
  • Number of citations received by the most cited article: 99
  • The most cited article was published in: 2014
  • Number of citations received from SSCI-indexed journals: 3
  • Number of citations received from FT50 listed journals: 0
Predatory Reports entry for BJMS

It is a familiar refrain from The Source, but it bears repeating – as an author you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher to decide where to publish, and not just what to publish, will save a huge amount of pain in the future, helping you avoid the bad journals and choose the good ones.